Shinoda Lab presents "touchable holography," a system that tracks where your hand is relative to a projected object, then uses an array of ultrasonic transducers to focus acoustic pressure at the appropriate points. The result is a simulated sense of tactile interaction with a virtual object.
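The core trick behind focusing airborne ultrasound is simple phased-array beamforming: drive each transducer with a phase offset proportional to its distance from the desired focal point, so all the waves arrive there in phase and their pressure adds up. Here's a minimal sketch of that idea; the 40 kHz frequency and the transducer layout in the usage example are assumptions, not details from Shinoda Lab's actual hardware.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
FREQ_HZ = 40_000.0      # a common airborne ultrasonic transducer frequency

def focus_phases(transducers, focal_point):
    """Phase offset (radians) for each transducer so that all emitted
    waves arrive in phase at focal_point, creating a pressure focus."""
    wavelength = SPEED_OF_SOUND / FREQ_HZ
    phases = []
    for pos in transducers:
        d = math.dist(pos, focal_point)  # path length to the focus
        # Advance each emitter's phase by its path length (mod one cycle),
        # cancelling the propagation delay at the focal point.
        phases.append((2 * math.pi * d / wavelength) % (2 * math.pi))
    return phases

# Sanity check: a small square array with an on-axis focus is equidistant
# from every element, so all four phases should come out identical.
array = [(-0.05, -0.05, 0.0), (0.05, -0.05, 0.0),
         (-0.05, 0.05, 0.0), (0.05, 0.05, 0.0)]
phases = focus_phases(array, (0.0, 0.0, 0.2))
```

Steering the focus to follow the tracked hand is then just recomputing these phases each frame for the new focal point.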
Imagine pointing your iPhone at different locations around you to reveal geographically pertinent annotations and other media that people have deposited there. Now there's an app for that. In futurist circles, this basic world-as-web scenario has been discussed for years (I even worked on one such forecasting project), if not decades. The simplest version of the concept has always been an application that intuitively and instantly blends real-time, first-person experience of the physical world with the data contained in Wikipedia, Yelp, or other websites, letting you instantly pull up stats about the restaurants, concert venues, parks, car dealerships, schools, and businesses you encounter in your view. Such an app could, for example, provide information about a certain shrub in your yard, with quick access to species data, historical photos, and related ads from local lawn-care services. Now, thanks to the convergence of smartphones and real-time geo-sensing, apps like this are finally arriving.
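Under the hood, the basic version of this scenario is just a proximity query: take the phone's GPS fix, compute great-circle distances to a store of geo-tagged annotations, and surface the ones within view. Here's a minimal sketch of that lookup; the annotation store and its contents are made-up placeholders, not data from any real app.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical annotation store: (lat, lon, text) left by other users.
ANNOTATIONS = [
    (37.8199, -122.4783, "Golden Gate Bridge: opened 1937"),
    (37.8024, -122.4058, "Coit Tower: murals inside"),
    (40.6892, -74.0445, "Statue of Liberty"),
]

def nearby_annotations(lat, lon, radius_m=5000):
    """Return annotation texts within radius_m of the viewer, nearest first."""
    hits = [(haversine_m(lat, lon, a, b), text) for a, b, text in ANNOTATIONS]
    return [text for d, text in sorted(hits) if d <= radius_m]
```

A real app would add the compass heading and camera field of view to filter this list down to what's actually on screen, but the distance query above is the heart of it.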