That's true, OpenPose requires a lot of processing power, but it seems there are other models for pose estimation that can run on smartphones. An example here (I never tried it): edvardHua/PoseEstimationForMobile

Thank you! This is the first Tello project I have seen that really impresses me. But it seems like OpenPose requires too much processing power for real-time use on a mobile device.
I have just realized that you are the author of an Android app. I have read very good reviews on it, I will try it! I imagine it must be quite difficult to deploy such a technology (OpenPose, or neural nets more generally), whose efficiency depends very much on the available processing power, and that has to work equally well on a wide selection of phones. The Morse code would surely be easier to integrate.

Yes, that one I had seen before. It does 2 fps on my reference phone (Samsung S4). I haven't tested this, but I guess 5-8 fps is the minimum for reliable control.
Anyway, thanks for this project. Really impressive work.
I particularly enjoyed the morse code for takeoff
Interesting catch. Ryze must have changed this in recent firmwares. Tello would give position data only in flight back when I started TelloFpv development.

Thanks to your post and some tests I've just made, I'm beginning to understand a bit better how the positioning system works. And it is not good news for my "camera operator" project. With the TelloPy package, the values I can get are labelled: mvo.vel_x, mvo.vel_y, mvo.vel_z, mvo.pos_x, mvo.pos_y, mvo.pos_z, imu.acc_x, imu.acc_y, imu.acc_z, imu.gyro_x, imu.gyro_y, imu.gyro_z, imu.q0, imu.q1, imu.q2, imu.q3, imu.vg_x, imu.vg_y, imu.vg_z.
I don't know what mvo stands for, but I imagine it corresponds to data coming from the VPS.
I don't need to make the drone fly to see change in values.
In the graph below, I have drawn the trajectory (mvo.pos_x, mvo.pos_y) as I was walking while holding the drone horizontally, paying attention not to cover the sensors.
I walked three times along the same rectangular path (5 m × 0.5 cm).
There is too much variation for this to be usable.
Another test: if I hold the drone perfectly still and move a book about 40 cm below it, the mvo.pos_* values change as if the drone were moving.
In contrast, if I calculate the yaw angle from the quaternion, the values I get seem much more consistent, even if I "shake" the drone in all directions. But I agree with you that the IMU alone will not give acceptable position estimates.
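For what it's worth, the yaw extraction can be sketched like this, assuming imu.q0..q3 form a unit quaternion in (w, x, y, z) order (that ordering is my assumption and is worth checking against the Tello logs):

```python
import math


def yaw_from_quaternion(q0, q1, q2, q3):
    """Yaw (rotation about the vertical axis), in radians, from a
    unit quaternion assumed to be in (w, x, y, z) order."""
    return math.atan2(2.0 * (q0 * q3 + q1 * q2),
                      1.0 - 2.0 * (q2 * q2 + q3 * q3))
```

For example, the identity quaternion (1, 0, 0, 0) gives a yaw of 0, and (cos 45°, 0, 0, sin 45°) gives π/2.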
The use of markers could help but would be too burdensome for my project.
Never mind, I have other ideas I want to explore.