Using ultrasound waves, researchers have created a "forcefield" that allows users to interact with a screen without actually touching it.

Touch surfaces, such as those used in iPhones, offer an easy-to-use way to navigate through large amounts of data. The drawback is that when a person places a finger on the screen, that part of the visual display is blocked from view. Users also cannot feel what they have "touched": they feel the screen itself, but not the actual play button -- or whatever control sits beneath their fingertip.

To solve this problem, scientists from the University of Bristol's Interaction and Graphics (BIG) research group have unveiled a multi-point haptic feedback system.

Haptics refers to the use of physical sensations -- such as vibrations -- as feedback for users; a familiar example is the rumble of a console game controller.

Called UltraHaptics, the new system relies on the principle of acoustic radiation force to project sensations directly through a screen and onto the user's hands. According to the press release outlining the project, it "not only allows people to feel what is on the screen, but also receive invisible information before they touch it."

The use of ultrasonic vibrations as a means of delivering tactile information to a user represents an entirely new approach to haptic technology, according to the researchers. The system emits very high-frequency sound waves from an array of sources; where those waves converge at the same place at the same time, they reinforce one another strongly enough to create a sensation on a person's hand.
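One way to picture that convergence is as phased-array focusing: each emitter is driven with a phase offset proportional to its distance from the desired focal point, so every wave peaks there at the same moment. The following is a minimal sketch under that assumption; the article gives no hardware details, so the array geometry, 40 kHz frequency, and function names below are illustrative, not the Bristol team's actual parameters.

```python
import numpy as np

# Illustrative parameters only -- typical values for airborne ultrasound,
# not figures from the article.
SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
FREQUENCY = 40_000.0     # Hz; 40 kHz is common for airborne ultrasound
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY  # roughly 8.6 mm

def phase_delays(transducer_positions, focal_point):
    """Phase offset (radians) for each transducer so that every emitted
    wave arrives at the focal point in phase with all the others."""
    distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
    return (2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

# A hypothetical 16 x 16 grid of transducers, 1 cm apart, in the z = 0 plane.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
array_positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

# Focus 20 cm above the centre of the array, where the user's hand would be.
focus = np.array([0.075, 0.075, 0.20])
print(phase_delays(array_positions, focus)[:5])
```

Because the waves only add up constructively at the chosen point, the sensation is confined to a small spot in mid-air, which is what lets the system target a user's fingertips without any contact.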

"Current systems with integrated interactive surfaces allow users to walk-up and use them with bare hands," Tom Carter, a PhD student in the Department of Computer Science's BIG research group, said. "Our goal was to integrate haptic feedback into these systems without sacrificing their simplicity and accessibility."

The resulting design, Carter explains, includes an ultrasound transducer array situated under "an acoustically transparent display."

"This arrangement allows the projection of focused ultrasound through the interactive surface and directly onto the users' bare hands," he explained. "By creating multiple simultaneous feedback points, and giving them individual tactile properties, users can receive [localized] feedback associated to their actions."