Cool Technology: The iPhone

Multi-touch screens

The iPhone’s most obvious contribution was to ditch the physical keyboard.

Prior to 2007, phones fell into two main camps: feature phones with a numeric keypad, or “smartphones” like the BlackBerry with a full QWERTY keyboard. The latter sometimes came with touchscreens, but those required a stylus to operate and weren’t really suitable for typing.

The iPhone instead featured a 3.5-inch (8.9-centimeter) LCD screen with multi-touch technology. Not only did this get rid of the stylus in favor of what Jobs called the ultimate pointing device, the finger, it also enabled “smart” functions like pinch-to-zoom and physics-based interactions that presented on-screen elements as real objects with weight, size and intuitive responses.

More importantly, it allowed the screen to cover the entire face of the phone, which became the basis for many of the device’s other innovations.
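
To make the gesture concrete, here is a minimal sketch, in Swift, of how an app might wire up pinch-to-zoom using UIKit’s UIPinchGestureRecognizer (an API that arrived after the original iPhone shipped); the class and image names are illustrative, not Apple’s actual code.

    import UIKit

    // A hypothetical view controller demonstrating pinch-to-zoom.
    final class ZoomableImageViewController: UIViewController {
        private let imageView = UIImageView(image: UIImage(named: "photo"))

        override func viewDidLoad() {
            super.viewDidLoad()
            imageView.isUserInteractionEnabled = true
            imageView.frame = view.bounds
            view.addSubview(imageView)

            // UIKit tracks both touch points and reports their distance
            // ratio as `scale` -- the essence of multi-touch.
            let pinch = UIPinchGestureRecognizer(target: self,
                                                 action: #selector(handlePinch(_:)))
            imageView.addGestureRecognizer(pinch)
        }

        @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
            guard let target = gesture.view else { return }
            // Apply the incremental scale, then reset it so each
            // callback delivers a delta rather than a running total.
            target.transform = target.transform.scaledBy(x: gesture.scale,
                                                         y: gesture.scale)
            gesture.scale = 1.0
        }
    }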

Google Maps

It may seem strange to list Google Maps as an innovation made popular by the iPhone, but Steve Jobs was instrumental in bringing Google’s mapping smarts to mobile devices when he asked Google to build an app for the iPhone.

The iPhone was the first smartphone to ship with the app, and although the original model lacked GPS, later versions added it, allowing Google to offer the turn-by-turn navigation that is now standard on smartphones.
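
As an illustration of the plumbing beneath such features, here is a minimal Swift sketch of reading GPS fixes through Apple’s Core Location framework; the class name and accuracy setting are assumptions for the example, not Google’s or Apple’s actual navigation code.

    import CoreLocation

    // A hypothetical reader that streams GPS fixes from Core Location.
    final class LocationReader: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        override init() {
            super.init()
            manager.delegate = self
            // Navigation-grade accuracy drains more battery, but it is
            // what turn-by-turn guidance needs.
            manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        }

        func start() {
            manager.requestWhenInUseAuthorization() // one-time user prompt
            manager.startUpdatingLocation()
        }

        // The system calls this whenever a new fix arrives.
        func locationManager(_ manager: CLLocationManager,
                             didUpdateLocations locations: [CLLocation]) {
            guard let fix = locations.last else { return }
            print("lat \(fix.coordinate.latitude), lon \(fix.coordinate.longitude)")
        }
    }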

iPhone at 10: The Best Is Yet to Come?

It seems like yesterday — not 10 years ago — that Steve Jobs took the stage at Macworld to debut Apple’s newest gadget: the iPhone.

The iPhone was three devices in one, he declared at Moscone West in San Francisco. It was a widescreen iPod with touch controls, a revolutionary mobile phone, and a breakthrough Internet communications device.

Apple’s “three-in-one device” has gone on to become a critical contributor to the company’s success, accounting for more than half its revenues annually, as well as a can’t-live-without tool for many people.

“iPhone is an essential part of our customers’ lives, and today more than ever it is redefining the way we communicate, entertain, work and live,” said Apple CEO Tim Cook.

“iPhone set the standard for mobile computing in its first decade and we are just getting started,” he continued. “The best is yet to come.”

Fundamental Change

What made the iPhone different from what passed for a mobile phone before Jobs introduced it that fateful day 10 years ago?

“What we’ve realized over the last few years is that the iPhone fundamentally changed how we thought about phones,” said Jack E. Gold, principal analyst at J.Gold Associates.

“Before the iPhone, we looked at phones as primarily communications devices,” he explained to TechNewsWorld. “iPhone changed that to ‘we’ve got a computer in our hands that happens to be a phone as well.'”

Prior to the iPhone’s arrival, most vendors believed there was scant need or demand for a palm-sized “smart” device supporting search, media consumption and other Internet-based functions, recalled Charles King, principal analyst at Pund-IT.

“The iPhone put an end to those assumptions and transformed the marketplace,” he told TechNewsWorld.

It also changed forever how mobile phones were used.

“Before iPhone, we talked on phones 90 percent of the time,” Gold said. “Now we talk 10 percent and do other stuff 90 percent of the time.”

App Colossus

Apple brought something else to mobile phones that hadn’t been there before: usability.

“In the early days of smartphones, they were awful to use,” Gold noted. “The iPhone wasn’t perfect, but it fundamentally changed how people perceived these things as fun and easy to use.”

Another notable change the iPhone pioneered was the use of mobile applications.

“A big innovation was the introduction of a development kit for creating apps and making apps a key part of the iPhone design,” said Tim Bajarin, president of Creative Strategies.

“That allowed them to create a broader ecosystem that included hardware, software and services,” he told TechNewsWorld.

There is no doubt that the iPhone’s app-centric approach created a monster, albeit a benevolent one. Since Apple began selling software from its App Store in 2008, it has returned more than US$50 billion to developers.

“The app economy opened the floodgates for the transformation of many industries — streaming music, mobile banking, mobile video, mobile retail and mobile games,” Reticle Research Principal Analyst Ross Rubin told TechNewsWorld.

A Few Blemishes

Perhaps what has distinguished the iPhone above all else during its 10 years of existence has been its design.

“Some of the technology in the iPhone was around before the iPhone, but the iconic design wasn’t seen before,” noted David McQueen, a research director at ABI Research.

“There’ve been a lot of copycats, but none have come up to the beauty of the iPhone’s designs,” he told TechNewsWorld.

Ironically, one of the iPhone’s biggest flubs was connected to the design of the iPhone 4. The so-called Antennagate problem occurred because the phone’s antenna was placed on the edge of the phone where a user’s hand could interfere with the call signal.

The introduction of Apple Maps created another blemish on the iPhone’s record that Apple would like to forget. The app was embarrassingly inaccurate when it made its debut.

Those mistakes were just bumps in the road for the iPhone, though, with little impact on its popularity or sales. That is attributable in part to Apple’s service, which is “fantastic,” according to McQueen.

Another factor is customer loyalty.

“People who love Apple love Apple,” Gold observed. “There aren’t a lot of people leaving Apple.”

Bright Future

If Apple can maintain the iPhone’s premium status, it should continue to thrive.

“Apple has done really well by staying at the high end, where margins are good, and by selling additional services,” Gold explained. “If they can maintain those margins, they’ll do fine.”

Apple is also making investments in artificial intelligence, which should help the iPhone keep pace with competitors.

“The prospects are good for the future iPhone as long as Apple continues to perfect next-generation use cases like AR, VR, MR and modularity,” Moor Insights and Strategy Principal Analyst Patrick Moorhead told TechNewsWorld.

Short-term prospects look good for the iPhone, too.

“I believe Apple will set sales records with the iPhone 8,” Bajarin said, “and start a new super cycle for upgrades that will drive strong revenue at least through 2019.”

WhatsApp Shaves Off a Little More Privacy

WhatsApp on Thursday announced an update to its terms and privacy policy — the first in four years.

Among other things, the changes will affect the ways users can communicate with businesses while continuing to avoid third-party banner ads or spam messages, according to the company.

However, WhatsApp will begin to share some personal details about its 1 billion users — such as phone numbers and other data — with Facebook, its parent company. The information sharing will permit better tracking of basic metrics, allowing Facebook to offer better friend suggestions, for example — and of course, to show more relevant ads.

Connected Network

The increased connectivity and information sharing might not be apparent to WhatsApp users initially. Further, neither WhatsApp nor Facebook will actually read any messages, which are end-to-end encrypted. Phone numbers and other personal data won’t be shared with advertisers.

Despite those limitations, the fact that WhatsApp will share any relevant information with Facebook has raised some flags.

“This announcement should be very concerning to WhatsApp users, who have been promised many times by both WhatsApp and Facebook that their privacy will be respected and protected,” said Claire T. Gartland, consumer protection counsel at the Electronic Privacy Information Center.

“That is why many individuals use WhatsApp in the first place,” she told the E-Commerce Times.

“WhatsApp may claim otherwise, but this is really the beginning of the end of privacy through that service,” warned Jim Purtilo, associate professor in the computer science department at the University of Maryland.

“We’ve seen this cycle before. Web users visiting sites with a browser once had some sense of privacy, but it didn’t take servers long to figure out how to share traffic data with one another and piece together profiles of each user,” he told the E-Commerce Times.

“Today, any time you visit a site which offers a Facebook login or an AddThis tag, you also transmit a trace of your activity to big corporations to analyze and use,” Purtilo added. “Just browsing is enough — traffic analysis lets companies fill in the blanks, and this paints a pretty rich picture of you. You’d be pretty naive to think they go to this trouble for your benefit.”

End of Privacy

The warnings over privacy concerns actually go back to 2014, when Facebook acquired WhatsApp for approximately US$19.3 billion.

“Jessica Rich, director of the FTC’s Consumer Protection Bureau, sent a letter to the companies during Facebook’s acquisition of WhatsApp warning the companies that the privacy promises made to WhatsApp users must be respected,” recalled EPIC’s Gartland.

“WhatsApp’s blog describes two different means of opting out of the proposed new sharing,” she noted, “and neither of these options appears consistent with Rich’s letter, which requires Facebook to get users’ affirmative consent before changing the way they use data collected via WhatsApp.”

Moreover, it does not appear as if WhatsApp even plans to secure what could be considered “meaningful, informed opt-in consent from its users to begin sharing this information with Facebook,” Gartland suggested.

Opt-Out Process

Users will be able to opt out, according to WhatsApp, but it likely will require reading the fine print — something few users actually do.

“WhatsApp says in a FAQ that existing users can opt out of sharing account information with Facebook for use by Facebook to improve the user’s ‘Facebook ads and products experiences’ in two ways,” said Karl Hochkammer, leader of the Honigman Law Firm’s information and technology transactions practice group.

“One way to opt out is to click the ‘read’ hyperlink before accepting the new terms of service and privacy policy, scroll to the bottom of the screen, and uncheck the box,” he told the E-Commerce Times.

“This is set up to make the default rule an opt-in, with the option of opting out. If someone agrees to the new terms and privacy policy without opting out, WhatsApp says the user still has 30 days to make this decision by changing the account’s settings,” Hochkammer explained. “Even if someone opts out, the information will still be shared with Facebook, but it won’t be used in connection with the user’s Facebook account.”

This method of opting out, in essence, could result in a user’s private information still being shared with Facebook.

“All WhatsApp has effectively said is that they are ready to apply the same analysis techniques to messaging as had previously been done for Web browsing,” remarked Purtilo.

“Privacy goes out the window at that point, even if bit by bit,” he added. “You can’t monetize such services without knowing how to tailor your advertising, and the only way to tailor it is by opening up the traffic and content for analysis, so that big corporations will have an even richer picture of you.”

Will Users Care?

It could be that WhatsApp can’t afford to disregard the wishes of an installed base of more than 1 billion users, but it’s questionable whether many of those users actually care about the new policies.

“On one level, this was probably inevitable. Facebook is a public company that faces investor scrutiny to make a profit,” observed Greg Sterling, vice president of strategy and insight at the Local Search Association.

“It is the logic of the market, and thus it was unlikely that WhatsApp could continue with its small subscription model,” he told the E-Commerce Times. “It simply has too large a user base for Facebook to ignore from the advertiser point of view.”

Though a vocal minority may object, most users will accept the changes.

“Look at the many changes that Facebook has made over the years,” said Sterling.

“That hasn’t had a detrimental impact on the company, even as many of its users are distrustful of Facebook,” he pointed out.

WhatsApp “is probably betting that users who would never try their service under these terms are now sufficiently dependent that they give up their data rather than invest the effort to find alternate products,” said Purtilo, “and we’ve seen that before as well. This is how privacy dies, bit by bit.”

Modern Technology Is Changing the Way Our Brains Work, Says Neuroscientist

Human identity, the idea that defines each and every one of us, could be facing an unprecedented crisis.
It is a crisis that would threaten long-held notions of who we are, what we do and how we behave.
It goes right to the heart – or the head – of us all. This crisis could reshape how we interact with each other, alter what makes us happy, and modify our capacity for reaching our full potential as individuals.
And it’s caused by one simple fact: the human brain, that most sensitive of organs, is under threat from the modern world.
Unless we wake up to the damage that the gadget-filled, pharmaceutically-enhanced 21st century is doing to our brains, we could be sleepwalking towards a future in which neuro-chip technology blurs the line between living and non-living machines, and between our bodies and the outside world.
It would be a world where such devices could enhance our muscle power, or our senses, beyond the norm, and where we all take a daily cocktail of drugs to control our moods and performance.
Already, an electronic chip is being developed that could allow a paralysed patient to move a robotic limb just by thinking about it. As for drug-manipulated moods, they’re already with us – although so far only to a medically prescribed extent.
Increasing numbers of people already take Prozac for depression, Paxil as an antidote for shyness, and give Ritalin to children to improve their concentration. But what if there were still more pills to enhance or “correct” a range of other specific mental functions?
What would such aspirations to be “perfect” or “better” do to our notions of identity, and what would it do to those who could not get their hands on the pills? Would some finally have become more equal than others, as George Orwell always feared?
Of course, there are benefits from technical progress – but there are great dangers as well, and I believe that we are seeing some of those today.
I’m a neuroscientist and my day-to-day research at Oxford University strives for an ever greater understanding – and therefore maybe, one day, a cure – for Alzheimer’s disease.
But one vital fact I have learnt is that the brain is not the unchanging organ that we might imagine. It not only goes on developing, changing and, in some tragic cases, eventually deteriorating with age, it is also substantially shaped by what we do to it and by the experience of daily life. When I say “shaped”, I’m not talking figuratively or metaphorically; I’m talking literally. At a micro-cellular level, the infinitely complex network of nerve cells that make up the constituent parts of the brain actually change in response to certain experiences and stimuli.
The brain, in other words, is malleable – not just in early childhood but right up to early adulthood, and, in certain instances, beyond. The surrounding environment has a huge impact both on the way our brains develop and how that brain is transformed into a unique human mind.
Of course, there’s nothing new about that: human brains have been changing, adapting and developing in response to outside stimuli for centuries.
What prompted me to write my book is that the pace of change in the outside environment and in the development of new technologies has increased dramatically. This will affect our brains over the next 100 years in ways we might never have imagined.
Our brains are under the influence of an ever-expanding world of new technology: multichannel television, video games, MP3 players, the internet, wireless networks, Bluetooth links – the list goes on and on.
But our modern brains are also having to adapt to other 21st-century intrusions, some of which, such as prescribed drugs like Ritalin and Prozac, are supposed to be of benefit, and some of which, such as widely available illegal drugs like cannabis and heroin, are not.
Electronic devices and pharmaceutical drugs all have an impact on the micro-cellular structure and complex biochemistry of our brains. And that, in turn, affects our personality, our behaviour and our characteristics. In short, the modern world could well be altering our human identity.
Three hundred years ago, our notions of human identity were vastly simpler: we were defined by the family we were born into and our position within that family. Social advancement was nigh on impossible and the concept of “individuality” took a back seat.
That only arrived with the Industrial Revolution, which for the first time offered rewards for initiative, ingenuity and ambition. Suddenly, people had their own life stories – ones which could be shaped by their own thoughts and actions. For the first time, individuals had a real sense of self.
But with our brains now under such widespread attack from the modern world, there’s a danger that that cherished sense of self could be diminished or even lost.
Anyone who doubts the malleability of the adult brain should consider a startling piece of research conducted at Harvard Medical School. There, a group of adult volunteers, none of whom could previously play the piano, were split into three groups.
The first group were taken into a room with a piano and given intensive piano practice for five days. The second group were taken into an identical room with an identical piano – but had nothing to do with the instrument at all.
And the third group were taken into an identical room with an identical piano and were then told that for the next five days they had to just imagine they were practising piano exercises.
The resultant brain scans were extraordinary. Not surprisingly, the brains of those who simply sat in the same room as the piano hadn’t changed at all.
Equally unsurprising was the fact that those who had performed the piano exercises saw marked structural changes in the area of the brain associated with finger movement.
But what was truly astonishing was that the group who had merely imagined doing the piano exercises saw changes in brain structure that were almost as pronounced as those in the group that had actually practised. “The power of imagination” is not a metaphor, it seems; it’s real, and it has a physical basis in your brain.
Alas, no neuroscientist can explain how the sort of changes that the Harvard experimenters reported at the micro-cellular level translate into changes in character, personality or behaviour. But we don’t need to know that to realise that changes in brain structure and our higher thoughts and feelings are incontrovertibly linked.
What worries me is that if something as innocuous as imagining a piano lesson can bring about a visible physical change in brain structure, and therefore some presumably minor change in the way the aspiring player performs, what changes might long stints playing violent computer games bring about? That eternal teenage protest of ‘it’s only a game, Mum’ certainly begins to ring alarmingly hollow.
Already, it’s pretty clear that the screen-based, two-dimensional world that so many teenagers – and a growing number of adults – choose to inhabit is producing changes in behaviour. Attention spans are shorter, personal communication skills are reduced and there’s a marked reduction in the ability to think abstractly.
This games-driven generation interpret the world through screen-shaped eyes. It’s almost as if something hasn’t really happened until it’s been posted on Facebook, Bebo or YouTube.
Add that to the huge amount of personal information now stored on the internet – births, marriages, telephone numbers, credit ratings, holiday pictures – and it’s sometimes difficult to know where the boundaries of our individuality actually lie. Only one thing is certain: those boundaries are weakening.
And they could weaken further still if, and when, neurochip technology becomes more widely available. These tiny devices will take advantage of the discovery that nerve cells and silicon chips can happily co-exist, allowing an interface between the electronic world and the human body. One of my colleagues recently suggested that someone could be fitted with a cochlear implant (a device that converts sound waves into electronic impulses, enabling the deaf to hear) and a skull-mounted microchip that converts brain waves into words (a prototype is under research).
Then, if both devices were connected to a wireless network, we really would have arrived at the point which science fiction writers have been getting excited about for years. Mind reading!
He was joking, but for how long the gag remains funny is far from clear.
Today’s technology is already producing a marked shift in the way we think and behave, particularly among the young.
I mustn’t, however, be too censorious, because what I’m talking about is pleasure. For some, pleasure means wine, women and song; for others, more recently, sex, drugs and rock ‘n’ roll; and for millions today, endless hours at the computer console.
But whatever your particular variety of pleasure (and energetic sport needs to be added to the list), it’s long been accepted that ‘pure’ pleasure – that is to say, activity during which you truly “let yourself go” – was part of the diverse portfolio of normal human life. Until now, that is.
Now, coinciding with the moment when technology and pharmaceutical companies are finding ever more ways to have a direct influence on the human brain, pleasure is becoming the sole be-all and end-all of many lives, especially among the young.
We could be raising a hedonistic generation who live only in the thrill of the computer-generated moment, and are in distinct danger of detaching themselves from what the rest of us would consider the real world.
This is a trend that worries me profoundly. For as any alcoholic or drug addict will tell you, nobody can be trapped in the moment of pleasure forever. Sooner or later, you have to come down.
I’m certainly not saying all video games are addictive (as yet, there is not enough research to back that up), and I genuinely welcome the new generation of “brain-training” computer games aimed at keeping the little grey cells active for longer.
As my Alzheimer’s research has shown me, when it comes to higher brain function, it’s clear that there is some truth in the adage “use it or lose it”.
However, playing certain games can mimic addiction, and the heaviest users of these games might soon begin to do a pretty good impersonation of an addict.
Throw in circumstantial evidence that links a sharp rise in diagnoses of Attention Deficit Hyperactivity Disorder and the associated three-fold increase in Ritalin prescriptions over the past ten years with the boom in computer games and you have an immensely worrying scenario.
But we mustn’t be too pessimistic about the future. It may sound frighteningly Orwellian, but there may be some potential advantages to be gained from our growing understanding of the human brain’s tremendous plasticity. What if we could create an environment that would allow the brain to develop in a way that was seen to be of universal benefit?
I’m not convinced that scientists will ever find a way of manipulating the brain to make us all much cleverer (it would probably be cheaper and far more effective to manipulate the education system). Nor do I believe that we can somehow be made much happier – not, at least, without somehow anaesthetising ourselves against the sadness and misery that is part and parcel of the human condition.
When someone I love dies, I still want to be able to cry.
But I do, paradoxically, see potential in one particular direction. I think it possible that we might one day be able to harness outside stimuli in such a way that creativity – surely the ultimate expression of individuality – is actually boosted rather than diminished.
I am optimistic and excited about what future research will reveal about the workings of the human brain, and about the extraordinary process by which it is translated into a uniquely individual mind.
But I’m also concerned that we seem to be so oblivious to the dangers that are already upon us.
The debate over those dangers must start now. Identity, the very essence of what it is to be human, is open to change – both good and bad. Our children, and certainly our grandchildren, will not thank us if we put off the discussion much longer.