UPenn’s new quadcopter uses a smartphone for autonomous flight, relying solely on on-board hardware and vision algorithms, with no GPS involved. The drone was engineered as a collaborative project between Qualcomm and a team of University of Pennsylvania researchers led by Vijay Kumar.
At CES 2015, we stopped by the Qualcomm booth to check out a collaborative project with University of Pennsylvania researchers led by Vijay Kumar: a quadrotor that uses a smartphone as its brain for autonomous flight, using only on-board hardware and vision algorithms, no GPS. Impressive.
Just to be clear on this: the only electronics the quadrotor itself carries are a motor controller and a battery. All of the clever stuff is handled entirely by the phone, which is just a stock Android smartphone with a Qualcomm Snapdragon inside. In other words, this is not a special device (like Google’s Project Tango phone, which the UPenn researchers used in a demo last year); it’s something that you can buy for yourself, and the UPenn guys only half-jokingly offered to install their app on my phone and let it fly the robot.
This is an amazing example of just how far smartphones have come: they’re certainly powerful computers, but it’s the integrated sensing that comes standard in most of them (things like gyros, accelerometers, IMUs, and high-resolution cameras) that makes them ideal low-cost brains for robots. What’s distinctive about the CES demo is that it’s the first time a sophisticated platform like this (vision-based real-time autonomous navigation of a flying robot is pretty darn sophisticated) has been controlled by such a basic consumer device.
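The article doesn’t reveal UPenn’s actual flight code, but to give a sense of why that on-board gyro and accelerometer matter, here’s a minimal sketch (not their implementation) of a complementary filter, a common building block for estimating a drone’s attitude from exactly those two phone sensors: the gyro is smooth but drifts over time, while the accelerometer is noisy but drift-free, so blending them gives a usable angle estimate.

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse a gyro reading (deg/s, smooth but drifts) with an
    accelerometer-derived pitch (deg, noisy but drift-free).

    alpha close to 1 trusts the integrated gyro for fast motion;
    the small (1 - alpha) share lets the accelerometer slowly
    correct the drift. All names here are illustrative.
    """
    gyro_estimate = pitch + gyro_rate * dt          # integrate angular rate
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch

# Toy scenario: the vehicle is actually held at 10 degrees of pitch,
# the gyro reads ~0 deg/s, and the accelerometer keeps nudging the
# estimate toward the true angle at 100 Hz.
pitch = 0.0
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_pitch=10.0, dt=0.01)
print(round(pitch, 1))  # converges to 10.0
```

A real flight stack would run this kind of fusion (usually a full Kalman or Madgwick-style filter over all three axes) at hundreds of hertz, which is well within a modern phone’s capability.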
So, what’s next? Vijay Kumar tells us where they’re headed:
“What we’d like to do is make these kinds of robots smaller, smarter, and faster. When you make things smaller, the number of things you can do increases, and that’s where we hope to use lots of these guys. So think about the one flying phone that you have today; tomorrow, you’ll see a swarm of flying phones. That’s what we’re working toward.”