Robots of all shapes and sizes are being used in agriculture and disaster relief. Some are known for being incredibly large, but new advances are also creating bug-sized robots to work alongside their lumbering comrades. And a new breed of synthetic fliers is getting an upgrade that any evil mad scientist could get excited about: lasers for eyes! One of the challenges is miniaturizing a robot bug’s eye to the size of…well, a bug’s eye.
A cool solution that a team of researchers at the University at Buffalo came up with is to equip the robots with tiny laser-powered sensors that act as eyes, allowing the tiny creatures to sense the size, shape and distance of approaching objects. “Essentially, it’s the same technology that automakers are using to ensure that driverless cars don’t crash into things,” says University at Buffalo computer scientist Karthik Dantu. “Only we need to shrink that technology so it works on robot bees that are no bigger than a penny.” The UB-led research project was funded by a $1.1 million National Science Foundation grant and includes researchers from Harvard University and the University of Florida. It builds on the work done in the RoboBee initiative, led by Harvard and Northeastern University, which is trying to create insect-inspired robots that could someday be used in agriculture and disaster relief.
Researchers have already been able to create robot bees capable of flying in a swarm and in tethered flight. The latter would allow a small swarm of bees to carry a net, for example, or work together to accomplish a task that no single bee could complete on its own. The miniature robots can even move underwater. But until now, all of these mini robots have lacked depth perception: a robot bee cannot sense what is in front of it or how far away it is. That is a big problem if you want your helper bee to avoid flying into a wall or to perform more difficult tasks, like landing on a flower, says Dantu.
The UB-led research team hopes to address this limitation by outfitting the insect-inspired robot bee with a car-inspired solution.
Driverless cars use a remote sensing technology called lidar, short for light detection and ranging, which works much like radar. The difference is that lidar emits invisible laser beams instead of microwaves. Sensors capture the light reflected from distant objects and measure the time it takes for the light to return. With this data, the system can calculate the distance and shape of the objects in front of it. Car-mounted systems are currently about the size of a traditional camping lantern, but the team Dantu leads hopes to shrink them to a size usable on their robots, a system they call “micro-lidar.”
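The ranging math behind lidar is simple time-of-flight arithmetic: the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name here is illustrative, not from any actual micro-lidar system):

```python
# Illustrative sketch of lidar time-of-flight ranging.
# A lidar unit emits a laser pulse, times how long the reflection
# takes to come back, and converts that round trip into a distance.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second, in vacuum

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance in metres for a measured round-trip time."""
    # The pulse travels to the object and back, so halve the total path.
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse returning after roughly 6.67 nanoseconds implies an object
# about one metre away.
print(round(distance_from_round_trip(6.67e-9), 2))  # → 1.0
```

The tiny round-trip times involved (nanoseconds per metre) are part of what makes shrinking lidar hard: the timing electronics must stay precise even as the hardware gets smaller.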
University of Florida researchers are developing the tiny sensor that will measure the light’s reflection, while Dantu will create the novel perception and navigation algorithms the bee needs to process and map the world around it. The team at Harvard will then incorporate the lidar technology into the flying robots. Once perfected, the technology won’t be limited to robot insects: the sensors could quickly make their way into wearable technology, endoscopic tools, smartphones, tablets and other mobile devices.