About 1:35 into the video above, Jeff Han impressively demonstrates a lava-lamp application on a multi-touch user interface.
Having spent considerable time in the past pondering the fluid dynamics (e.g., convection) of the Earth’s atmosphere and deep interior (i.e., mantle and core), I immediately saw a scientific use case in Han’s demonstration: Is it possible to computationally steer scientific simulations via multi-touch user interfaces?
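To make the idea concrete, here is a minimal sketch of what such steering might look like: a toy 2D heat-diffusion simulation with a hook through which a touch event could inject a perturbation while the simulation keeps running. The class and method names (`TouchSteeredSimulation`, `on_touch`) are hypothetical, invented purely for illustration, and no real multi-touch API is assumed.

```python
import numpy as np

class TouchSteeredSimulation:
    """Toy 2D heat-diffusion grid that a touch handler can perturb mid-run."""

    def __init__(self, size=64, alpha=0.1):
        self.grid = np.zeros((size, size))  # temperature field
        self.alpha = alpha                  # diffusion coefficient (stable for alpha <= 0.25)

    def step(self):
        # Explicit finite-difference update of the heat equation
        # with periodic boundaries (via np.roll).
        g = self.grid
        lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
               np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4 * g)
        self.grid = g + self.alpha * lap

    def on_touch(self, x, y, strength=10.0):
        # Steering hook: a (hypothetical) touch event deposits heat
        # at grid cell (x, y) while the time-stepping loop continues.
        self.grid[y, x] += strength

sim = TouchSteeredSimulation()
sim.on_touch(32, 32)   # user "touches" the center of the domain
for _ in range(100):
    sim.step()
print(round(float(sim.grid.max()), 4))  # the injected heat has diffused outward
```

In a real application the `on_touch` callback would be wired to the windowing toolkit's touch events, and the solver would be far more sophisticated, but the coupling pattern — user input mutating live simulation state between time steps — is the essence of computational steering.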
A quick search via Google returns almost 20,000 hits … In other words, I’m likely not the first to make this connection 😦
In my copious spare time, I plan to investigate further …
Also of note is how this connection was made: a friend sent me a link to an article on Apple’s anticipated tablet product. Since so much of the buzz around the Apple offering centers on its user interface, it’s not surprising that the article referenced Jeff Han’s TED talk (the video above). Cool.
If you have any thoughts to share on multi-touch computational steering, please feel free to chime in.
One more thought … I would imagine that the gaming industry would be quite interested in such a capability – if it isn’t already!