Touchscreens of the future will recognize different kinds of touches and use them to operate menus

  Current touchscreens can detect where a user presses and respond accordingly, but they cannot tell whether the press was made with a fingertip, a nail or a palm. A young American company wants to solve this problem and has developed a system that can distinguish whether a screen is touched with the nail, the side of the finger, the palm or the pad of the finger, and this is where it sees the future of smartphone screens.

There's more to our fingers than just the tips, though, and a startup called Qeexo aims to take advantage of this with technology that can differentiate between fingertips, nails, and knuckles. The San Jose, California-based company's technology, called FingerSense, can be used to do things like bring up a menu of options (akin to right-clicking on a mouse) on an e-mail with the knock of a knuckle, or enable new kinds of controls in games. Currently in talks with phone manufacturers, Qeexo hopes to have FingerSense installed in smartphones within a year.

  By recognizing how a screen is touched, the system can trigger different functions of the operating system. A touch with the pad of the finger might open an application or select an option, while a touch with the nail or the side of the finger might open a menu of options or activate copy and paste for text, and so on (a rough sketch of this kind of dispatch follows below). Screens of this kind would expand the functionality of a smartphone, but much also depends on how well mobile OS developers implement the capability.
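
As an illustration of the idea above, the sketch below shows how a handler might map an already-classified touch type to different actions. The TouchType values, function names and the overall Kotlin interface are assumptions made for this example, not Qeexo's actual FingerSense API.

```kotlin
// Hypothetical dispatch of UI actions based on a classified touch type.
// Names and handlers are illustrative, not Qeexo's real API.
enum class TouchType { FINGERTIP, KNUCKLE, NAIL, FINGER_SIDE, PALM }

fun onClassifiedTouch(type: TouchType, x: Float, y: Float) {
    when (type) {
        TouchType.FINGERTIP   -> openItemAt(x, y)        // ordinary tap: open or select
        TouchType.KNUCKLE     -> showContextMenuAt(x, y) // "right-click"-style menu
        TouchType.NAIL,
        TouchType.FINGER_SIDE -> showEditMenuAt(x, y)    // e.g. copy/paste options
        TouchType.PALM        -> ignoreTouch()           // reject accidental palm contact
    }
}

// Placeholder handlers so the sketch runs on its own.
fun openItemAt(x: Float, y: Float) = println("open item at ($x, $y)")
fun showContextMenuAt(x: Float, y: Float) = println("context menu at ($x, $y)")
fun showEditMenuAt(x: Float, y: Float) = println("copy/paste menu at ($x, $y)")
fun ignoreTouch() = println("palm touch ignored")

fun main() {
    onClassifiedTouch(TouchType.KNUCKLE, 120f, 340f) // simulate a knuckle knock
}
```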

Earlier research got Qeexo's founders thinking about how such technology could be used to differentiate between different parts of the finger on a smartphone screen. Qeexo's technology relies in part on an acoustic sensor that can capture the sounds, in other words the mechanical vibrations, made by different types of on-screen touches from the different parts of a finger. Software on the phone, which has been trained with multiple people to tell the difference between various touches, uses information gathered by the acoustic sensor, along with data like where the touch occurred and how big it was, to make an educated guess about how a person is touching the screen.
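
The paragraph above describes, in effect, a classifier that fuses acoustic features of the impact with the touch's position and contact size. The minimal sketch below shows that idea with a crude nearest-centroid comparison; the feature names, numeric values and training data are assumptions for illustration only and say nothing about how FingerSense is actually implemented.

```kotlin
// Minimal sketch of the classification idea: combine acoustic features of the
// impact with the reported contact size, then pick the closest per-class
// profile learned offline from many users. All numbers here are made up.
data class TouchFeatures(
    val spectralCentroidHz: Double, // rough "brightness" of the impact sound
    val peakAmplitude: Double,      // how hard the impact was (0..1)
    val contactAreaPx: Double       // contact size reported by the touchscreen
)

// Per-class average feature vectors, assumed to come from offline training.
val trainedCentroids = mapOf(
    "fingertip" to TouchFeatures(1200.0, 0.3, 180.0),
    "knuckle"   to TouchFeatures(2500.0, 0.8, 120.0),
    "nail"      to TouchFeatures(4000.0, 0.5, 60.0)
)

fun classify(sample: TouchFeatures): String =
    trainedCentroids.entries.minByOrNull { (_, c) ->
        // Squared distance in feature space; a real system would normalize the
        // features and use a properly trained model rather than raw centroids.
        val dCentroid = sample.spectralCentroidHz - c.spectralCentroidHz
        val dAmplitude = (sample.peakAmplitude - c.peakAmplitude) * 1000.0 // crude rescaling
        val dArea = sample.contactAreaPx - c.contactAreaPx
        dCentroid * dCentroid + dAmplitude * dAmplitude + dArea * dArea
    }!!.key

fun main() {
    val guess = classify(TouchFeatures(2600.0, 0.75, 110.0))
    println("Touch classified as: $guess") // expected: knuckle
}
```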

  Current mobile operating systems are limited: they cannot offer users the functionality normally found in a desktop OS, and a system like this could help close that gap. I think the first company to implement something like this in its devices will "revolutionize" the smartphone industry, but everything depends on how well the system is implemented.