JPL traced the cause of this glitch to the loss of a single image in the pipeline that Ingenuity uses to estimate its speed and position. The robot relies on a downward-pointing VGA camera for monocular feature tracking: the camera captures frames at 30 Hz, finds distinctive features in those frames, and then compares each frame with the one taken immediately before it to see how far those features have moved and in which direction. This allows Ingenuity’s navigation algorithm to figure out what direction it’s heading in and how fast it’s going.
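To get a feel for how this kind of visual odometry works in general (this is a simplified sketch, not Ingenuity’s actual flight software, which is not public in this form), here’s roughly what frame-to-frame feature tracking looks like using OpenCV: detect distinctive points in one frame, find them again in the next, and divide their average displacement by the time between frames.

```python
# Illustrative sketch of frame-to-frame feature tracking, NOT JPL's code.
# Assumes OpenCV and two consecutive grayscale frames plus the time between them.
import cv2
import numpy as np

def estimate_pixel_velocity(prev_frame, curr_frame, dt):
    """Return average feature displacement per second, in pixels."""
    # Find distinctive corner features in the earlier frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_frame, maxCorners=200,
                                       qualityLevel=0.01, minDistance=8)
    if prev_pts is None:
        return None
    # Track those features into the later frame with Lucas-Kanade optical flow.
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_frame, curr_frame,
                                                   prev_pts, None)
    good = status.ravel() == 1
    if not good.any():
        return None
    # Average motion of the successfully tracked features, in pixels.
    displacement = (curr_pts[good] - prev_pts[good]).reshape(-1, 2).mean(axis=0)
    # The apparent velocity depends directly on the assumed time between frames.
    return displacement / dt
```

At 30 Hz the nominal dt is 1/30 of a second, and turning a velocity in pixels per second into one in meters per second also requires the camera’s altitude and optics, which are left out here.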
This is a very well-established technique that works great, as long as you know exactly how much time elapsed between the two frames you’re comparing, and this is where Ingenuity got into trouble. When that one image was lost, it screwed up the timestamps of every subsequent image without the navigation algorithm realizing it, Grip says. “From this point on, each time the navigation algorithm performed a correction based on a navigation image, it was operating on the basis of incorrect information about when the image was taken. The resulting inconsistencies significantly degraded the information used to fly the helicopter, leading to estimates being constantly ‘corrected’ to account for phantom errors.”
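One way to picture the failure mode, under the simplifying assumption that image timestamps are derived from a running frame count (an illustration, not a claim about how JPL’s pipeline actually stamps images): once a single frame goes missing, every later image gets stamped one frame period earlier than it was really taken.

```python
# Illustrative sketch (not JPL's code): how one dropped frame can skew the
# timestamp of every later image if images are stamped by counting frames.
FRAME_PERIOD = 1.0 / 30.0  # nominal 30 Hz camera

# The camera actually captured frames 0, 1, 2, 3, 4, ... but frame 2 was lost
# in the pipeline, so the navigation code receives frames 0, 1, 3, 4, ...
# and numbers them 0, 1, 2, 3, ...
received_true_indices = [0, 1, 3, 4, 5]

for assumed_idx, true_idx in enumerate(received_true_indices):
    assumed_time = assumed_idx * FRAME_PERIOD   # when navigation thinks it was taken
    actual_time = true_idx * FRAME_PERIOD       # when it was really taken
    print(f"frame {assumed_idx}: assumed t={assumed_time:.3f}s, "
          f"actual t={actual_time:.3f}s, error={actual_time - assumed_time:+.3f}s")

# Every image after the dropped one is stamped one period too early, so each
# image-based correction is matched against the wrong moment in the estimated
# trajectory, producing the phantom errors Grip describes.
```

The same mismatch also inflates the apparent motion across the gap itself: features that really moved over two frame periods look like they moved in one, roughly doubling the estimated speed for that update.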
It would not have been at all surprising if this had been the end for Ingenuity, but the helicopter managed to keep flying along its pre-planned route. It was able to land safely because it’s programmed to stop using its visual navigation system just before landing, a decision that JPL originally made because of concerns about dust kicked up by the rotors obscuring the images.
Remember that Ingenuity has already done everything that it was designed to do, and that technically, its mission was a success after the very first flight. Everything that’s happening now is just gravy.