Camera Time Sync

Hello Prophesee Team,

I am using a C++ application that periodically calls Metavision::Camera::get_last_timestamp() in order to synchronize system time with camera time and to analyze the drift between them.
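For context, here is a minimal sketch of the sampling loop, assuming the Metavision SDK 4.x header layout (the Camera header moved to <metavision/sdk/stream/camera.h> in newer releases); the loop count and CSV output are just illustrative:

```cpp
#include <metavision/sdk/driver/camera.h>
#include <metavision/sdk/base/events/event_cd.h>

#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    auto camera = Metavision::Camera::from_first_available();

    // A no-op CD callback keeps the internal decoder running so that
    // get_last_timestamp() advances.
    camera.cd().add_callback([](const Metavision::EventCD *, const Metavision::EventCD *) {});
    camera.start();

    std::vector<double> sys_s, cam_s; // system / camera time, in seconds
    const auto t0 = std::chrono::steady_clock::now();

    for (int i = 0; i < 600; ++i) { // ten minutes of one-second samples
        std::this_thread::sleep_for(std::chrono::seconds(1));
        const Metavision::timestamp cam_us = camera.get_last_timestamp(); // microseconds
        const double sys =
            std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
        sys_s.push_back(sys);
        cam_s.push_back(cam_us * 1e-6);
        std::printf("%.6f,%.6f\n", sys, cam_us * 1e-6);
    }
    camera.stop();
}
```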

My expectation was that, when recording the value of get_last_timestamp() once per second, the difference between the measured timestamps and a linear fit (computed over the first 100 samples) would be either a flat line (perfectly linear drift) or random noise (irregular drift).
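The residuals are computed roughly as follows (an ordinary least-squares fit over the first 100 samples; sys_s and cam_s are the vectors filled by the loop above):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <vector>

// Fit cam = a * sys + b over the first (up to) 100 samples, then print the
// residual cam_s[i] - (a * sys_s[i] + b) for every sample. The sawtooth
// appears in this residual series instead of the expected flat line / noise.
void print_residuals(const std::vector<double> &sys_s, const std::vector<double> &cam_s) {
    const std::size_t n = std::min<std::size_t>(100, sys_s.size());
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (std::size_t i = 0; i < n; ++i) {
        sx += sys_s[i];
        sy += cam_s[i];
        sxx += sys_s[i] * sys_s[i];
        sxy += sys_s[i] * cam_s[i];
    }
    const double a = (n * sxy - sx * sy) / (n * sxx - sx * sx); // slope = relative clock rate
    const double b = (sy - a * sx) / n;                         // intercept
    for (std::size_t i = 0; i < sys_s.size(); ++i)
        std::printf("%.6f,%.6f\n", sys_s[i], cam_s[i] - (a * sys_s[i] + b));
}
```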

However, when I plotted the residuals (the difference between the sampled timestamps and the linear fit), I observed a clear sawtooth pattern rather than noise or a flat line. An example of this plot is attached:

[attached plot: residuals of the camera timestamps against the linear fit, showing a periodic sawtooth]

Could you please explain what might cause this periodic sawtooth behavior?

Is this related to some internal synchronization mechanism within the camera?
Could this be a feature or limitation of how get_last_timestamp() interacts with internal buffering or timestamp counters?
Any insight into the origin of these periodic jumps would be very helpful for understanding the timing characteristics of the Metavision camera.

Thank you!