Hello Prophesee Team,
I am using a C++ application that periodically calls Metavision::Camera::get_last_timestamp() in order to synchronize the camera clock with the system clock and analyze the drift between them.
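For context, the sampling loop looks roughly like the following (a simplified sketch rather than my exact code; the header path and the no-op CD callback match what I use with SDK 4.x and may differ in other releases):

```cpp
// Simplified sketch of the sampling loop (error handling omitted).
// The header path matches Metavision SDK 4.x; newer releases may
// expose Camera under sdk/stream instead of sdk/driver.
#include <metavision/sdk/base/events/event_cd.h>
#include <metavision/sdk/driver/camera.h>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
    Metavision::Camera camera = Metavision::Camera::from_first_available();

    // A no-op CD callback keeps the decoding pipeline running so that
    // get_last_timestamp() keeps advancing while the camera streams.
    camera.cd().add_callback(
        [](const Metavision::EventCD *, const Metavision::EventCD *) {});
    camera.start();

    for (int i = 0; i < 600; ++i) {
        // System time (monotonic) in microseconds, sampled as close as
        // possible to the camera timestamp read-out.
        const auto now = std::chrono::steady_clock::now().time_since_epoch();
        const auto system_us =
            std::chrono::duration_cast<std::chrono::microseconds>(now).count();

        // Camera time in microseconds, as reported by the SDK.
        const Metavision::timestamp camera_us = camera.get_last_timestamp();

        std::cout << system_us << "," << camera_us << "\n";
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }

    camera.stop();
    return 0;
}
```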
My expectation was that, if I record get_last_timestamp() once per second and fit a line to the first 100 samples, the difference between the measured timestamps and the fit would be either a flat line (perfectly linear drift) or random noise (irregular drift).
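The fit and residuals are computed essentially like this (a self-contained sketch of the analysis step; loading of the recorded samples is omitted, and the least-squares fit is done relative to the first sample to keep the sums numerically stable):

```cpp
// Sketch of the analysis: fit camera time against system time by
// ordinary least squares over the first 100 samples, then print the
// residual (measured camera timestamp minus fitted value) for every sample.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

struct Fit {
    double a; // slope: camera microseconds per system microsecond
    double b; // offset in microseconds (relative to the first sample)
};

Fit fit_line(const std::vector<std::int64_t> &x, const std::vector<std::int64_t> &y,
             std::size_t n) {
    // Work relative to the first sample so the squared sums stay well
    // conditioned in double precision.
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (std::size_t i = 0; i < n; ++i) {
        const double xi = static_cast<double>(x[i] - x[0]);
        const double yi = static_cast<double>(y[i] - y[0]);
        sx += xi;
        sy += yi;
        sxx += xi * xi;
        sxy += xi * yi;
    }
    const double denom = n * sxx - sx * sx; // assumes n >= 2 distinct samples
    const double a     = (n * sxy - sx * sy) / denom;
    const double b     = (sy - a * sx) / n;
    return {a, b};
}

int main() {
    std::vector<std::int64_t> system_us, camera_us; // pairs recorded by the sampling loop
    // ... load the recorded samples here ...
    if (system_us.size() < 2)
        return 1; // nothing to fit

    const std::size_t n_fit = std::min<std::size_t>(100, system_us.size());
    const Fit f             = fit_line(system_us, camera_us, n_fit);

    for (std::size_t i = 0; i < system_us.size(); ++i) {
        const double dx        = static_cast<double>(system_us[i] - system_us[0]);
        const double predicted = static_cast<double>(camera_us[0]) + f.a * dx + f.b;
        std::cout << i << "," << static_cast<double>(camera_us[i]) - predicted << "\n";
    }
    return 0;
}
```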
However, when I plotted the residuals (the difference between the sampled timestamps and the linear fit), I observed a clear sawtooth pattern rather than noise or a flat line. You can see an example of this plot here:
Could you please explain what might cause this periodic sawtooth behavior?
Is this related to some internal synchronization mechanism within the camera?
Could this be a feature or limitation of how get_last_timestamp() interacts with internal buffering or timestamp counters?
Any insight into the origin of these periodic jumps would be very helpful for understanding the timing characteristics of the Metavision camera.
Thank you!