IBM shows us the capabilities of future smartphones and computers (Video)

[youtube]http://youtu.be/wXkfrBJqVcQ[/youtube]

  Judging by the way IBM thinks about the future, I believe that within a few years mobile devices will be much smarter than today's. According to IBM, smartphones will let us feel what they display on their screens, computers will recognize objects and people far more easily and will also be able to "hear" and distinguish between different kinds of sounds, and finally, computers, and probably smartphones, will gain a sense of smell and will help us taste food with the help of sensors.

  Below are IBM's explanations, along with an illustrative video clip for each.

Touch

[youtube]http://youtu.be/Gg3tmZrwbDs[/youtube]

In the 1970s, when a telephone company encouraged us to "reach out and touch someone," it had no idea that a few decades later that could be more than a metaphor. Infrared and haptic technologies will enable a smart phone's touchscreen technology and vibration capabilities to simulate the physical sensation of touching something. So you could experience the silkiness of that catalog's Egyptian cotton sheets instead of just relying on some copywriter to convince you.

Sight

[youtube]http://youtu.be/YwfJVwknvRo[/youtube]

Recognition systems can pinpoint a face in a crowd. In the future, computer vision might save a life by analyzing patterns to make sense of visuals in the context of big data. In industries as varied as healthcare, retail and agriculture, a system could gather information and detect anomalies specific to the task—such as spotting a tiny area of diseased tissue in an MRI and applying it to the patient's medical history for faster, more accurate diagnosis and treatment.
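The core of that idea, flagging a small region that deviates from its surroundings, can be sketched in a few lines. This is a deliberately simplified illustration, not IBM's method: it treats a "scan" as a grid of numbers and flags patches whose mean intensity sits far outside the image-wide statistics.

```python
# Toy sketch of visual anomaly detection (illustrative only):
# flag patches whose mean intensity deviates strongly from the
# statistics of the whole image, the way a diagnostic system might
# highlight a small region of unusual tissue in a scan.

def find_anomalous_patches(image, patch=2, threshold=3.0):
    """Return (row, col) corners of patches whose mean intensity is
    more than `threshold` standard deviations from the mean of all
    patches. `image` is a 2D list of numbers."""
    h, w = len(image), len(image[0])
    means = {}
    for r in range(0, h - patch + 1, patch):
        for c in range(0, w - patch + 1, patch):
            vals = [image[r + i][c + j]
                    for i in range(patch) for j in range(patch)]
            means[(r, c)] = sum(vals) / len(vals)
    mu = sum(means.values()) / len(means)
    var = sum((m - mu) ** 2 for m in means.values()) / len(means)
    sigma = var ** 0.5 or 1.0
    return [pos for pos, m in means.items() if abs(m - mu) / sigma > threshold]

# A mostly uniform "scan" with one bright 2x2 region at (4, 4):
scan = [[10] * 8 for _ in range(8)]
for i in range(4, 6):
    for j in range(4, 6):
        scan[i][j] = 200

print(find_anomalous_patches(scan))  # → [(4, 4)]
```

Real systems work on learned features rather than raw intensity, but the principle is the same: model what "normal" looks like, then report what does not fit.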

Hearing

[youtube]http://youtu.be/-oKfWIgDTFs[/youtube]

Before the tree fell in the forest, did anyone hear it? Sensors that pick up sound patterns and frequency changes will be able to predict weakness in a bridge before it buckles, the deeper meaning of your baby's cry or, yes, a tree breaking down internally before it falls. By analyzing verbal traits and including multi-sensory information, machine hearing and speech recognition could even be sensitive enough to advance dialogue across languages and cultures.
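The "frequency changes" part of that prediction is concrete enough to sketch. The toy example below (my own illustration, not IBM's implementation) estimates each recording's dominant frequency with a discrete Fourier transform and flags a shift beyond a tolerance, the way a structural sensor might flag a changed resonance.

```python
# Toy sketch of frequency-shift detection (illustrative only):
# estimate the dominant frequency of two signal windows and flag a
# change, as a structural-health sensor might for a bridge resonance.

import cmath
import math

def dominant_frequency(samples, sample_rate):
    """Return the strongest frequency bin (Hz) of a real-valued signal,
    found by a plain discrete Fourier transform (O(n^2), fine for toys)."""
    n = len(samples)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip the DC bin, use positive bins
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * sample_rate / n

def tone(freq, sample_rate=1000, n=200):
    """A pure sine tone, standing in for a recorded vibration."""
    return [math.sin(2 * math.pi * freq * t / sample_rate) for t in range(n)]

baseline = dominant_frequency(tone(50), 1000)   # healthy reading: 50.0 Hz
current = dominant_frequency(tone(65), 1000)    # later reading: 65.0 Hz
print(abs(current - baseline) > 5)              # → True: flag for inspection
```

A production system would use an FFT and track many spectral features over time, but the decision rule, "has the signature moved?", is the same.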

Taste

[youtube]http://youtu.be/DNz23XXLa1E[/youtube]

The challenge of providing food—whether it's for impoverished populations, people on restricted diets or picky kids—is in finding a way to meet both nutritional needs and personal preferences. In the works: a way to calculate "perfect" meals using an algorithmic recipe of favorite flavors and optimal nutrition. No more need for substitute foods when you can have a personalized menu that satisfies both the calorie count and the palate.
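That "algorithmic recipe" amounts to an optimization over competing objectives. The sketch below is an invented minimal example, not IBM's system: each candidate meal gets a score that blends a flavor-preference rating with a nutritional-fit rating, and the best-scoring meal wins. The meal names and numbers are made up.

```python
# Toy sketch of the "algorithmic recipe" idea (illustrative only):
# score candidate meals by a weighted blend of flavor preference and
# nutritional fit, then pick the best. All data here is invented.

def score(meal, flavor_weight=0.5):
    """Blend a 0-1 flavor-preference score with a 0-1 nutrition score."""
    return flavor_weight * meal["flavor"] + (1 - flavor_weight) * meal["nutrition"]

candidates = [
    {"name": "pizza",        "flavor": 0.9, "nutrition": 0.3},
    {"name": "lentil curry", "flavor": 0.7, "nutrition": 0.9},
    {"name": "plain salad",  "flavor": 0.4, "nutrition": 0.8},
]

best = max(candidates, key=score)
print(best["name"])  # → lentil curry
```

Shifting `flavor_weight` toward 1.0 models the picky kid; shifting it toward 0.0 models the restricted diet, which is exactly the trade-off the paragraph describes.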

Smell

[youtube]http://youtu.be/RYkSvNKdyBM[/youtube]

When you call a friend to say how you're doing, your phone will know the full story. Soon, sensors will detect and distinguish odors: a chemical, a biomarker, even molecules in the breath that affect personal health. The same smell technology, combined with deep learning systems, could troubleshoot operating-room hygiene, crops' soil conditions or a city's sanitation system before the human nose knows there's a problem.
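At its simplest, "detect and distinguish odors" is a pattern-matching problem over an array of chemical sensors. The sketch below is an invented illustration, not how any real electronic nose works: it matches a sensor reading against known "signatures" by nearest Euclidean distance, with all chemicals and numbers made up for the example.

```python
# Toy sketch of odor identification (illustrative only): match a
# sensor-array reading to the nearest known chemical signature.
# Real electronic noses use many sensors and learned models; the
# signatures below are invented for the example.

import math

SIGNATURES = {
    "ammonia": (0.9, 0.1, 0.2),
    "ethanol": (0.2, 0.8, 0.3),
    "methane": (0.1, 0.2, 0.9),
}

def identify(reading):
    """Return the known odor whose signature is closest to `reading`."""
    return min(SIGNATURES,
               key=lambda name: math.dist(reading, SIGNATURES[name]))

print(identify((0.15, 0.75, 0.35)))  # → ethanol
```

The deep-learning systems the paragraph mentions replace the fixed signature table with representations learned from labeled samples, but the classification step is conceptually this nearest-match lookup.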