Skywriting with a difference using Air Scrawl
from hackster.io
Drones have lots of useful and fun applications, but to get the most out of a drone, an operator first needs to acquire a good deal of piloting skill.
Getting that skill takes time and effort, and there may be broken propellers, smashed electronics, and all manner of other expensive repair work along the way. This is just the sort of scenario that should set an engineer to thinking that there has to be a better way.
Researchers from the Skolkovo Institute of Science and Technology believe that they have a better way to control drones, and have implemented it in their DronePaint system. DronePaint works by reading hand gestures made by an operator, which are translated into flight trajectories that control the movement of a drone, or even a swarm of drones.
To use DronePaint, the operator stands in front of a webcam, and gives a short series of hand gesture-based commands to input a flight plan and instruct the drones to carry it out. Hand tracking is achieved with the help of the Mediapipe framework, which is able to identify 21 key points on a human hand in each image frame.
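The paper does not include code, but a minimal sketch of this tracking stage, assuming OpenCV for webcam capture and MediaPipe's Python Hands solution, might look something like this:

```python
# Illustrative sketch (not the authors' code): read webcam frames with
# OpenCV and extract the 21 hand landmarks per frame with MediaPipe Hands.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(
    static_image_mode=False,       # video stream, so track between frames
    max_num_hands=1,               # a single operator hand
    min_detection_confidence=0.5,
)

cap = cv2.VideoCapture(0)          # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        landmarks = results.multi_hand_landmarks[0].landmark  # 21 key points
        # Each landmark carries normalized x, y (and relative z) coordinates
        keypoints = [(lm.x, lm.y) for lm in landmarks]
        # ... pass `keypoints` to the gesture classifier ...
cap.release()
```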
These key points are fed into a deep neural network that has been trained to recognize hand gestures. A vocabulary of eight gestures can be recognized by the system, and these gestures were able to be recognized correctly in 99.75% of cases.
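The published description does not specify the network architecture, so the classifier below is only an illustration: a small feed-forward network in PyTorch over the flattened 21 (x, y) keypoints, with one output per gesture in the eight-gesture vocabulary.

```python
# Illustrative gesture classifier; the layer sizes are assumptions,
# not the architecture used in the DronePaint paper.
import torch
import torch.nn as nn

class GestureClassifier(nn.Module):
    def __init__(self, num_keypoints: int = 21, num_gestures: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_keypoints * 2, 64),  # flattened (x, y) pairs
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, num_gestures),       # one logit per gesture
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Usage: keypoints from the hand tracker, flattened into a 42-element vector
model = GestureClassifier()
keypoints = torch.rand(1, 42)                  # placeholder input
gesture_id = model(keypoints).argmax(dim=-1)
```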
As you might expect, drawing the drone swarm trajectory in the air by hand can be a little bit rough, so the next step in the pipeline is a trajectory processing algorithm that smooths things out. An alpha-beta filter performs this smoothing, after which trajectory coordinates are translated from pixels on a screen to the flight zone coordinate system. These results are then passed to the swarm control algorithm, which determines the actual commands that are to be sent to the drones.
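The alpha-beta filter itself is a standard two-gain estimator that tracks position and velocity. A minimal sketch of the smoothing and pixel-to-flight-zone steps is shown below; the gains and the simple linear rescaling are assumptions for illustration, not values from the paper.

```python
# Sketch of the smoothing step, assuming a standard alpha-beta filter over
# the drawn (x, y) points; alpha and beta here are illustrative gains.
def alpha_beta_filter(points, alpha=0.4, beta=0.1, dt=1.0):
    """Smooth a hand-drawn trajectory of (x, y) pixel coordinates."""
    x_est, v_est = list(points[0]), [0.0, 0.0]
    smoothed = [tuple(x_est)]
    for meas in points[1:]:
        for i in range(2):                      # x and y axes independently
            x_pred = x_est[i] + v_est[i] * dt   # predict from current velocity
            residual = meas[i] - x_pred         # measurement innovation
            x_est[i] = x_pred + alpha * residual
            v_est[i] += (beta / dt) * residual
        smoothed.append(tuple(x_est))
    return smoothed

def pixels_to_flight_zone(point, frame_size, zone_size):
    """Linearly rescale a pixel coordinate into flight-zone units (metres)."""
    return tuple(p / f * z for p, f, z in zip(point, frame_size, zone_size))
```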
The new technique was evaluated in a small study involving seven participants. These participants were asked to trace trajectories both with hand gestures and with a computer mouse, with the latter method serving as a reference point to gauge the relative success of DronePaint.
On average, gesture-drawn trajectories deviated 2.5 centimeters further from the ground truth than mouse-drawn trajectories. While that level of error is acceptable in many use cases, it is a gap the DronePaint team will need to close. They suggest that a haptic feedback device may allow users to position their hands more accurately.
While DronePaint may need more work to provide sufficiently accurate results for some applications, it showed great promise in generating long-exposure light paintings, suggesting immediate utility in the art world. Hopefully this technology will continue to develop, as the concept behind DronePaint provides a very simple and intuitive way to pilot drones.