Streaming Event data as frames to an external GUI based application using Tkinter

Hello,

I am having issues using my EVK4 with a GUI-based application I have designed to stream an event camera and a frame camera (semi) simultaneously. The script is written in Python and places event frames and regular frames into separate queues. Worker threads grab data from each camera and generate/enqueue frames asynchronously of each other, and a GUI update function then pulls from each queue to refresh the dual-view display at a regular interval. The camera threads, and the thread that writes the event stream to a .raw file, are started at initialization. I have noticed a few surprising behaviors with the EVK4:
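My producer/consumer skeleton looks roughly like this (a minimal sketch with the camera reads stubbed out; the queue sizes and the drop-oldest policy are my own choices, not SDK requirements):

```python
import queue
import threading

# Bounded queues cap the backlog if the GUI falls behind (maxsize is a guess).
event_frames = queue.Queue(maxsize=4)
gray_frames = queue.Queue(maxsize=4)

def producer(q, grab_frame, stop):
    """Camera thread: grab a frame and enqueue it, dropping the oldest
    frame instead of blocking when the queue is full."""
    while not stop.is_set():
        frame = grab_frame()            # stand-in for the actual camera read
        try:
            q.put_nowait(frame)
        except queue.Full:
            try:
                q.get_nowait()          # discard the stale frame
            except queue.Empty:
                pass
            q.put_nowait(frame)

def gui_tick(q):
    """Runs in the Tk main loop via root.after(33, gui_tick, q): fetch the
    newest frame for the canvas, or None if nothing arrived this tick."""
    try:
        return q.get_nowait()
    except queue.Empty:
        return None
```

The key point is that the Tkinter widgets are only ever touched from the main thread; the worker threads communicate with the GUI solely through the queues.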
1. When there are a large number of dynamic events, the EVK4 stops recording with the error:

       ERROR:root:Unexpected error in acquisition thread: RawReader buffer size too small. Please increase max_events

   When this happens, a single re-run of the script produces:

       [HAL][ERROR] Evt3 protocol violation detected: NonMonotonicTimeHigh
       ERROR:root:Unexpected error in acquisition thread: RawReader buffer size too small. Please increase max_events

This persists even when the max_events argument is set to 1,000,000 (I expect to see a maximum of 921,600 events in a single frame for a 1280 x 720 sensor).
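For what it's worth, here is the back-of-envelope sizing I have been working from. The 921,600 figure assumes each pixel fires at most once per accumulation window, which a burst can easily violate, so sizing by peak event rate may be safer (the rate and window below are hypothetical numbers I picked for illustration, not measurements):

```python
SENSOR_W, SENSOR_H = 1280, 720
ACC_TIME_US = 10_000                  # 10 ms accumulation window (assumed)
PEAK_RATE_EV_PER_S = 300e6            # hypothetical burst rate in events/s

per_pixel_once = SENSOR_W * SENSOR_H                               # 921,600
per_window_at_peak = int(PEAK_RATE_EV_PER_S * ACC_TIME_US / 1e6)   # 3,000,000

# Size max_events for the worst-case window, with 2x headroom:
max_events = 2 * max(per_pixel_once, per_window_at_peak)
```

With these (assumed) numbers, a burst window can hold several times more events than the one-event-per-pixel estimate, which would explain the buffer error at max_events = 1,000,000.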

2. Over time, the two streams start to lag, and a memory profiler shows a steady ramp in memory allocation over the course of the stream/recording. The behavior is only loosely correlated with the dynamics in the scene: sometimes a large jump in events results in additional memory being allocated (I assume by the RawReader), but other times usage levels back out to the baseline from when the camera was started. This only happens when I stream/record with the event camera; with the event camera disabled, resource usage is constant. This leads me to believe I am not using the RawReader streaming method in a memory-efficient way, but most of the examples only show how to stream the data to the Metavision window object, not to an external window.
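One mitigation I am experimenting with on the GUI side is draining each queue down to the newest frame on every tick, so a temporary burst cannot accumulate into a growing backlog (sketch; `q` is one of the stdlib queue.Queue instances the script uses):

```python
import queue

def latest(q):
    """Drain the queue and return only the newest item (or None if empty).
    Discarding the backlog keeps display lag and memory bounded when a
    producer temporarily outpaces the GUI."""
    item = None
    while True:
        try:
            item = q.get_nowait()
        except queue.Empty:
            return item
```

This keeps the display current, though it does not explain the ramp that appears only when the event camera is enabled.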

I am currently using RawReader and initiate_device from metavision_core.event_io to initialize the HAL device and read data from it. I had been using the Camera class from metavision_sdk_stream but found its frame generators slower than a simple NumPy approach. A version of the script that streams solely from the event camera is provided. I would like feedback on the following:
1. Is there a better way than my current approach to get events and stream generated frames to a GUI other than the metavision_viewer?
2. What is the proper way to close the device once the program exits or an acquisition-ending event is triggered?
3. What is the proper way to handle the NonMonotonicTimeHigh [HAL][ERROR] and restart the camera for an uninterrupted stream?
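For context on question 2, my current teardown looks roughly like this (the names are mine, and the `device.stop()` call is a placeholder for whatever the correct HAL teardown sequence is, which is exactly what I am unsure about):

```python
import threading
import time

class Acquisition:
    """Sketch of my acquisition thread and its shutdown path."""

    def __init__(self, device):
        self.device = device
        self.stop_evt = threading.Event()
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()             # threads start at initialization

    def _run(self):
        while not self.stop_evt.is_set():
            # Placeholder for: read events, build a frame, enqueue it.
            time.sleep(0.001)

    def close(self, timeout=2.0):
        """Stop the reader loop first, then release the device."""
        self.stop_evt.set()
        self.thread.join(timeout)
        if hasattr(self.device, "stop"):   # placeholder teardown call
            self.device.stop()
```

I call close() from the Tkinter window-close handler, but I am not confident this releases the device cleanly, given the NonMonotonicTimeHigh error on the next run.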

I am still in the preliminary stages of this project, so any advice on the best method for asynchronous streaming and memory handling with the metavision_core.event_io or metavision_sdk_stream modules would be helpful. Thank you.
