News these days is replete with the progress of technology. Not a day goes by without hearing that AI is beating humans at yet another task. The other day I was struck by how much progress Boston Dynamics has made between their last-generation state of the art, a robot dog, and this year’s Atlas humanoid robot. Check out the difference below (dog on top, humanoid below it):
Spot is a four-legged robot designed for indoor and outdoor operation. It is electrically powered and hydraulically actuated. Spot has a sensor head that helps it navigate and negotiate rough terrain. Spot weighs about 160 lbs.
The new version of Atlas is designed to operate outdoors and inside buildings, and is specialized for mobile manipulation. It is electrically powered and hydraulically actuated. It uses sensors in its body and legs to balance, and LIDAR and stereo sensors in its head to avoid obstacles, assess the terrain, help with navigation, and manipulate objects.
The progress from one generation to the next, just a year later, is irrefutable. Interestingly, Boston Dynamics was purchased by Google at the end of 2013. One might ask, what would a search company want with a robotics company? Increasingly, a lot. This has to do with a concept and practice called cloud robotics. Cloud robotics takes advantage of intelligence in the cloud to infuse a rather dumb robot on the edge with intelligence.
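To make the cloud-robotics idea concrete, here is a minimal sketch of the pattern: a lightweight robot on the edge offloads the expensive thinking (here, a grasp-planning stub drawing on experience pooled from many robots) to a shared cloud service. All names, data, and the confidence scores are hypothetical illustrations, not any real Google or Boston Dynamics API.

```python
# Hypothetical sketch: edge robot + cloud brain.

def cloud_plan_grasp(object_shape, shared_experience):
    """Cloud side: pick the best-known grasp for a shape from pooled data."""
    candidates = shared_experience.get(object_shape, [])
    if not candidates:
        return {"strategy": "pinch", "confidence": 0.1}  # fallback guess
    # Return the grasp with the highest recorded success rate.
    return max(candidates, key=lambda g: g["confidence"])

class EdgeRobot:
    """Edge side: cheap sensors and actuators, no heavy compute on board."""
    def __init__(self, cloud_service, knowledge):
        self.cloud = cloud_service
        self.knowledge = knowledge

    def grasp(self, object_shape):
        plan = self.cloud(object_shape, self.knowledge)  # offload the thinking
        return plan["strategy"]

# Experience pooled from many robots, keyed by object shape.
shared = {
    "mug": [{"strategy": "handle-hook", "confidence": 0.9},
            {"strategy": "rim-pinch", "confidence": 0.6}],
}

robot = EdgeRobot(cloud_plan_grasp, shared)
print(robot.grasp("mug"))     # best pooled grasp: handle-hook
print(robot.grasp("sphere"))  # nothing pooled, falls back to pinch
```

The point of the design is that the robot itself stays "rather dumb": every robot that uploads a success makes every other robot on the network smarter.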
This becomes relevant to everything from Google's driverless cars, to how robots will learn to grasp a myriad of objects, to even augmenting the ability of surgeons to perform operations.
Check out this talk on cloud robotics, from none other than Google, which covers some of this ground:
Ken Goldberg is the craigslist Distinguished Professor of New Media and Professor of Industrial Engineering and Operations Research (IEOR) at the University of California, Berkeley. He also holds an appointment in the Department of Radiation Oncology at the University of California, San Francisco.
Of course, Google can highlight the more altruistic uses of this technology, like the cloud-enabled surgery in the aforementioned Google Talk, but imagine the bug- and bird-sized drones in the new movie Eye in the Sky, discussed in the post below, infused with the intelligence of the cloud.
Eye in the Sky is a tight British thriller starring Helen Mirren, Breaking Bad’s Aaron Paul, and the late and already missed Alan Rickman. That alone should be enough to get you down to the cinema to see this recently released movie. The film is a surprisingly cerebral look at the future of war.
As the Internet of Things, Big Data, and robotics converge, I think the web will transform yet again into something much more physical and immersive, whether for good or ill.
In the Google Talk video, the speaker explains how robots may be able to perform secondary, often repetitive surgical tasks like suturing. The robot could draw on a data bank of similar procedures to eventually outperform most surgeons. The video also suggests that the robot could then be sped up well beyond the speed at which a human surgeon could work. This really gets at a post I wrote a couple of years ago called The Future of Humans and the notion of alien intelligence.
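The data-bank idea from the talk can be sketched in a few lines: given a new suturing task, retrieve the most similar recorded procedure from the pool and replay its trajectory faster than a human could perform it. The feature vectors, distance metric, and speedup factor below are all illustrative assumptions, not details from the talk.

```python
# Hypothetical sketch: retrieve-and-replay from a pooled procedure bank.

def distance(a, b):
    """Euclidean distance between two feature vectors describing a task."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def retrieve_best(task_features, data_bank):
    """Find the recorded procedure whose task most resembles the new one."""
    return min(data_bank, key=lambda rec: distance(rec["features"], task_features))

def replay(record, speedup=3.0):
    """Replay a recorded trajectory at a multiple of the original speed."""
    return {"trajectory": record["trajectory"],
            "duration_s": record["duration_s"] / speedup}

# Recordings pooled from many surgeons, described here by two made-up
# features: (tissue thickness in mm, wound length in mm).
data_bank = [
    {"features": (2.0, 30.0), "trajectory": "suture-pattern-A", "duration_s": 120.0},
    {"features": (5.0, 80.0), "trajectory": "suture-pattern-B", "duration_s": 300.0},
]

best = retrieve_best((2.2, 28.0), data_bank)   # closest to pattern-A's task
plan = replay(best, speedup=3.0)               # 120 s recording in 40 s
print(plan)
```

Real systems would of course learn a policy rather than replay a single nearest neighbour, but even this toy version shows why the pool matters: the more procedures in the bank, the closer the best match.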
Alien intelligence is a concept James Martin writes about in his book After the Internet, and I think it can be applied equally to cloud robotics: by pooling collective intelligence with Big Data and machine learning, as alluded to above, an operation might be performed that is beyond the ability of any single human or small surgical team, just as 3D print designs already surpass what a lone sculptor or CNC machine operator can achieve.
A lot of this technology, especially around the da Vinci surgical machines and any hypothetical cloud-robotic add-ons, still seems tentative. But if the single year of progress between Boston Dynamics’ robot dog and its humanoid robot is the benchmark, alien surgeons might just be on the horizon for Web 4.0, 5.0, and so on.