Unlike the voice-controlled Google Glass and the camera-based Kinect and Leap Motion controller, Thalmic Labs goes straight to the source of your hand and finger gestures: your forearm muscles. “In looking at wearable computers, we realized there are problems with input for augmented-reality devices,” says Thalmic Labs co-founder Stephen Lake. “You can use voice, but no one wants to be sitting on the subway talking to themselves, and cameras can’t follow wherever you go.”
Gesture control of robots, both aerial and ground-based, is prominently featured in the promo video. This is a notable innovation, given that the US military is already investing in gesture-control research to allow unmanned aerial vehicles to operate on the same flight decks as manned aircraft.
By harnessing a new sphere of science called “lovotics”, Hooman Samani, an artificial intelligence researcher at the Social Robotics Lab at the National University of Singapore, believes it is possible to engineer love between humans and robots.
In the era of cognitive computing, systems learn instead of relying passively on explicit programming. As a result, emerging technologies will continue to push past human limitations, enhancing and augmenting our senses with machine learning, artificial intelligence (AI), advanced speech recognition and more. No need to call for Superman when we have real super senses at hand.