
Drones get smarter with honey-bee brains!

Researchers at the University of Sheffield in the U.K. are reverse-engineering the brain of the honey bee to design AI controllers for flying robots.
Developments in artificial intelligence (AI) are revolutionizing the robot and drone industry, with several companies offering solutions tailored to specific verticals. However, while most of these solutions rely on machine learning, some scientists are going back to nature for inspiration.

Researchers at the University of Sheffield in the U.K. are reverse-engineering the brain of the honey bee to design AI controllers for flying robots. The team works from published behavioral and neuroscientific data, as well as data it generates itself, to develop the models. The project is called 'Brains on Board'.

Speaking to asmag.com, James Marshall, a professor at the university, explained how the team's approach differs from conventional methods of applying AI.

“Unlike many approaches to robot control and AI, which rely on machine learning, our models require no training data,” Marshall said. “They have already been embodied and tested on real robot platforms and function effectively with minimal parameter tuning.”

Elaborating on the combination of hardware and software behind the model, Marshall noted that advances in processor technology have made it possible to run intelligent solutions even on tiny unmanned aerial vehicles.

“We specify our neural models and test them computationally, before translating this into code that runs on low-power, high-throughput parallel computing hardware,” Marshall said. “NVIDIA’s general-purpose graphics processing units (GPUs), for example, are now available for mobile devices such as tablets and smartphones; the low weight and low power requirements of these processors mean it is now feasible to run full, albeit small, brain simulations onboard a small quadcopter, for genuine robot autonomy.”
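
The article does not describe the team's actual model code, but as a rough illustration of what a compact, training-free neural controller might look like when stepped on embedded hardware, the following is a minimal sketch of a rate-based leaky-integrator network in Python with NumPy. The network size, time constant, and weights are hypothetical, chosen only to show the scale of simulation that could plausibly run onboard a small quadcopter; they are not taken from the Brains on Board project.

```python
import numpy as np

# Hypothetical sizes and constants -- illustrative only, not the project's values.
N_INPUT = 64      # e.g. coarse visual features from an onboard camera
N_NEURONS = 256   # a deliberately small "brain" that fits on embedded hardware
DT = 0.01         # simulation time step (seconds)
TAU = 0.05        # membrane time constant (seconds)

rng = np.random.default_rng(0)
W_in = rng.normal(0.0, 0.1, size=(N_NEURONS, N_INPUT))      # input weights (hand-set, no training)
W_rec = rng.normal(0.0, 0.05, size=(N_NEURONS, N_NEURONS))  # recurrent weights
W_out = rng.normal(0.0, 0.1, size=(4, N_NEURONS))           # 4 motor outputs (roll, pitch, yaw, thrust)

def step(state, visual_input):
    """Advance the leaky-integrator network by one time step and read out motor commands."""
    drive = W_in @ visual_input + W_rec @ np.tanh(state)
    state = state + DT / TAU * (-state + drive)
    motor = W_out @ np.tanh(state)
    return state, motor

state = np.zeros(N_NEURONS)
for _ in range(100):                      # one second of simulated control
    frame_features = rng.random(N_INPUT)  # stand-in for processed camera data
    state, motor_command = step(state, frame_features)
```

In practice such a loop would be compiled for the drone's parallel processor rather than run through NumPy, but the structure, a small fixed network updated every frame, is the point of the illustration.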

Asked about the impact of AI on the drone industry, Marshall added that so far drones have been flown almost exclusively by individual human operators, or even teams of operators. To truly fulfill their potential, for example in exploring challenging environments, drones need to become autonomous, so that a single human can assign multiple drones to objectives rather than piloting each one by hand.

Given the current state of the drone market, the researchers expect strong commercial demand for their solution.

“The commercial opportunities of AI for drones are almost limitless,” Marshall said. “In the short term, given how frequently user error causes drone crashes, there should be good demand for even our simplest algorithms, which can be used to prevent drones from being flown too close to obstacles using no more than data from their onboard cameras.”
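
Marshall does not detail how those simplest algorithms work, but bee-inspired controllers commonly judge proximity from optic flow, the apparent motion of the visual scene across the camera. As a hedged illustration only, the sketch below estimates dense optic flow between consecutive frames with OpenCV and flags when the average flow magnitude crosses a hypothetical threshold, which a flight controller could treat as "too close to an obstacle."

```python
import cv2
import numpy as np

# Hypothetical threshold (pixels of flow per frame); would need tuning for camera and flight speed.
PROXIMITY_THRESHOLD = 4.0

def too_close(prev_gray: np.ndarray, curr_gray: np.ndarray) -> bool:
    """Return True if the average optic-flow magnitude suggests an obstacle is near."""
    # Farneback dense optical flow: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)   # per-pixel flow magnitude
    return float(magnitude.mean()) > PROXIMITY_THRESHOLD

cap = cv2.VideoCapture(0)                       # stand-in for the drone's onboard camera
ok, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if too_close(prev_gray, gray):
        print("Obstacle warning: reduce speed or hold position")
    prev_gray = gray
```

A real controller would use flow asymmetries (for example, left versus right image halves) to steer away from the nearer surface rather than a single global average, but the principle of using nothing more than onboard camera data is the same.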

Image source: Alex Cope