Since 2006, computing giant IBM has been making annual predictions about which five innovations will change our lives in the next five years. This year, the company says the biggest impact will come from technological breakthroughs that augment our five senses.

These innovations will come as a result of cognitive computing. With this approach, computers are not programmed but instead use advanced algorithms and circuitry to learn through experiences, find patterns and correlations, create hypotheses and then remember the results — just like humans do. Cognitive computing systems will be able to see, smell, touch, taste and hear the world in real time, and react quickly and appropriately in ways that will greatly improve our lives. Here are a few examples of what that might mean:

1. SIGHT: Image recognition. Asking computers to look at a library of thousands of images could help a machine do what a human does intuitively. Forest scenes, for example, have a different distribution of colors than a cityscape. Once the computer learns what a forest is supposed to look like, a programmer will show it thousands of pictures of people doing something like hiking or picnicking. That way a computer can start to understand what a scene should look like without needing tags in the image.

If computers could recognize images in this way, they could pick out what matters in them — an important point when aggregating security camera video or using imaging devices to diagnose disease.
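The color-distribution idea above can be sketched in a few lines. This is an illustrative toy, not IBM's system: the "images" are just lists of RGB tuples, and a nearest-histogram rule stands in for real machine learning.

```python
from collections import Counter

def color_histogram(pixels, bins=4):
    """Bucket each RGB channel into coarse ranges and count pixel frequencies."""
    step = 256 // bins
    counts = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    return {bucket: n / len(pixels) for bucket, n in counts.items()}

def histogram_distance(h1, h2):
    """L1 distance between two normalized histograms."""
    return sum(abs(h1.get(k, 0) - h2.get(k, 0)) for k in set(h1) | set(h2))

def classify_scene(pixels, references):
    """Pick the label whose reference histogram is closest to this image's."""
    hist = color_histogram(pixels)
    return min(references, key=lambda label: histogram_distance(hist, references[label]))

# Toy training data: forests skew green and brown, cityscapes skew gray.
references = {
    "forest": color_histogram([(30, 120, 40)] * 80 + [(90, 60, 30)] * 20),
    "city": color_histogram([(128, 128, 128)] * 70 + [(60, 60, 70)] * 30),
}

# A mostly green scene lands closer to the forest histogram.
print(classify_scene([(25, 110, 35)] * 90 + [(100, 100, 100)] * 10, references))  # prints "forest"
```

A real system would learn from thousands of labeled photos rather than two hand-built histograms, but the principle — compare a new image's statistics against learned reference patterns — is the same.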

2. SOUND: Hearing and translation. For hearing, a similar issue arises: picking out what matters. Here computers are already pretty good, as speech recognition software has made its debut on our phones with apps such as Siri. But the same kind of pattern-learning systems could be applied to sounds as well as vision, resulting in computers that can, for instance, understand baby talk — and maybe even analyze your mood by the tone of your voice. Wouldn't it be great if those customer service robots knew how annoyed you were?
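The mood-from-tone idea can likewise be illustrated with two classic acoustic features: signal energy and zero-crossing rate. The thresholds and the two "mood" labels here are made up for the sketch; real emotion-recognition systems learn far richer features from training data.

```python
import math

def rms_energy(samples):
    """Root-mean-square loudness of a waveform (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs with differing signs (a rough noisiness proxy)."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return crossings / (len(samples) - 1)

def rough_mood(samples, energy_threshold=0.5, zcr_threshold=0.3):
    """Crude heuristic: loud, rapidly varying speech reads as agitated."""
    if rms_energy(samples) > energy_threshold and zero_crossing_rate(samples) > zcr_threshold:
        return "agitated"
    return "calm"

# A quiet, slowly varying signal vs. a loud, rapidly alternating one.
quiet = [0.1 * math.sin(i * 0.05) for i in range(1000)]
harsh = [0.9 if i % 2 == 0 else -0.9 for i in range(1000)]
print(rough_mood(quiet), rough_mood(harsh))  # prints "calm agitated"
```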

3. TASTE: Flavor breakdown. Then there is taste. A computer designed to experience flavor could break down foods and understand why some things taste good. That in turn could help chefs design nutritious food or come up with that perfect pairing of food and wine. (With any luck IBM will do better than the Nutrimatic.)

4. SMELL: Sensing dangerous chemicals. Computers could also learn to smell, picking up on gases that no human being would be able to detect. Breathalyzers can already pick up the alcohol content of your blood, but imagine one that could tell you if you had a kidney ailment or cancer. A machine that could pick up explosives or drugs the way dogs do would be very useful in port security — and possibly put the K-9 units out of work.

5. TOUCH: Feeling from afar. Haptics already give us some feedback: there are hands that transmit pressure, video games that transmit vibrations, and touch screens that let us control our devices. Take that one step further and you could actually feel the fabric of a suit on a clothing store's website, using the vibration capabilities of your phone, without having to go all the way to the store to try it on. Other uses could include remote medical diagnostics or even surgery.

It's all a part of making computers more human-like and also more useful. It might even change the way we use computers as profoundly as search engines and the Internet did. Of course, the question then arises: how human do we want our computers to be?

via IBM

Credit: IBM