Disable Stage on Running Pipeline
Hello, Prophesee Team! I am trying to make FrameGenerationStage able to be: disabled (freeing all resources and processing time), enabled again, and reset (new accumulation_time_ms and fps). Questions: Do all resources and processing time actually get freed
How to change parameters or state of a stage from the scope of other stage?
Let's say at some point, while in the scope of Stage2, I want to change some fields of Stage1. I can access the stage with next_stages() or previous_stages() and, hopefully, change any public fields or callbacks without any problem, correct? What
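One pitfall with mutating another stage's public fields is that the two stages may run on different pipeline threads, so a plain field write is a data race. A minimal sketch of a safer pattern (the `Stage1`/`Stage2` types and the `accumulation_time_us` field here are illustrative stand-ins, not Metavision SDK classes):

```cpp
#include <atomic>
#include <cstdint>

// Hypothetical stage exposing its tunable parameter as an atomic so another
// stage, running on a different pipeline thread, can change it safely.
struct Stage1 {
    std::atomic<std::int64_t> accumulation_time_us{10000};
};

struct Stage2 {
    Stage1 *prev; // e.g. obtained once via previous_stages() and downcast

    void speed_up() {
        // A plain public field would be a data race if Stage1 reads it
        // concurrently; std::atomic makes the cross-thread write well defined.
        prev->accumulation_time_us.store(5000, std::memory_order_relaxed);
    }
};
```

For anything larger than a single scalar (e.g. replacing a callback), a mutex held by the owning stage is the safer choice.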
Skipping events based on performance problems
Hello, Prophesee Team! Let's say we have a simple pipeline: auto& cam_stage = p.add_stage(std::make_unique<Metavision::CameraStage>(std::move(cam), event_buffer_duration_ms)); auto& algo_stage = p.add_stage(std::make_unique<AlgoStage>(), cam_stage); auto&
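One common way to skip events when an algorithm stage cannot keep up is to bound the queue feeding it and shed the oldest buffers. The sketch below illustrates that policy with plain standard-library types (`DroppingQueue` and `EventBuffer` are illustrative names, not Metavision SDK API):

```cpp
#include <cstddef>
#include <deque>
#include <vector>

using EventBuffer = std::vector<int>; // stand-in for a buffer of EventCD

// Bounded queue that drops the oldest buffers instead of growing without
// limit when the consumer (the slow algo stage) falls behind.
class DroppingQueue {
public:
    explicit DroppingQueue(std::size_t max_buffers) : max_buffers_(max_buffers) {}

    // Enqueue a buffer; returns how many old buffers were dropped to stay
    // within the bound.
    std::size_t push(EventBuffer buf) {
        queue_.push_back(std::move(buf));
        std::size_t dropped = 0;
        while (queue_.size() > max_buffers_) { // shed oldest data first
            queue_.pop_front();
            ++dropped;
        }
        return dropped;
    }

    std::size_t size() const { return queue_.size(); }

private:
    std::size_t max_buffers_;
    std::deque<EventBuffer> queue_;
};
```

Dropping whole buffers keeps the remaining stream time-consistent, which is usually preferable to thinning events within a buffer.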
Resolving very slow motion without flickering
I am trying to image fluorescent beads in a water suspension under a microscope. The beads move very slowly. When viewed through a CMOS camera, you can see the beads slowly moving. In the event camera, though, it mostly just flickers. What causes this behavior
Slave Camera Recording file has obvious lag
I synchronized two event cameras and tested them. While recording, the pictures from the master and slave cameras look perfectly fine, with no visible lag. But when I open the RAW files, master.raw is fine, while slave.raw shows obvious lag (similar to slow motion),
Metavision Pipeline and IPC (Interprocess Communication)
Hello, I can't get my head around how to make the C++ Pipeline work with IPC (a named pipe, for example). I am interested in both directions: sending some data to another application. The best fit for the Pipeline would probably be a "SenderStage", similar to
Metavision build - Debug mode
Hello, I am trying to build the getting-started sample in Debug mode. See attached for my CMakeLists.txt and source code. I also attached the logs for my CMake configure and build steps. When configuring my build in Release mode, I am able to detect the
Raw to HDF5 conversion
Hi, I am trying to convert a RAW file to HDF5 and I am receiving the following errors and warnings; what may be the cause? Is there documentation on this? Thank you in advance. [HAL][WARNING] Evt3 protocol violation detected : InvalidVectBase [HAL][ERROR]
Downsampling the resolution of an event camera
Hi, I'm using an EVK4 for research and I wonder whether there is a way to downsample the camera resolution from 1280*720 to 640*480 to reduce the extremely high-bandwidth data. Since I use it for SLAM, I don't want to set an ROI to crop the FOV. Hope
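A software-side approach is to rescale each event's coordinates after decoding (this does not reduce the sensor's output bandwidth, only the downstream processing load). A minimal sketch, with an illustrative event struct rather than the SDK's `EventCD`:

```cpp
#include <cstdint>

// Illustrative event type: x/y coordinates, polarity, timestamp.
struct Evt {
    std::uint16_t x, y;
    std::int16_t p;
    std::int64_t t;
};

// Rescale 1280x720 coordinates into a 640x480 grid: x is halved, y is
// scaled by 480/720 = 2/3. Many input pixels collapse onto the same output
// pixel, so a per-pixel deduplication or rate-limiting step is usually
// added afterwards to actually cut the event count.
inline Evt downsample(const Evt &e) {
    return Evt{static_cast<std::uint16_t>(e.x * 640u / 1280u),
               static_cast<std::uint16_t>(e.y * 480u / 720u),
               e.p, e.t};
}
```

Note that 720 to 480 is a non-integer 2/3 scale, so neighboring input rows map unevenly; halving both axes to 640x360 gives a cleaner mapping if the exact output size is negotiable.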
Is it possible to receive 'External Trigger events' with Metavision Studio?
I want to synchronize a frame camera with an event-based camera using Metavision Studio. I would appreciate it if you could explain how
ExtTrigger events not working when multiple cameras opened in same address space
This problem arises with a pair of SilkyEVCams (Gen3 sensor) using Metavision 3.0.2 (compiled from OpenEB). For testing, one of the cameras generates external triggers and has its input set to "loopback". Running two executables side-by-side works
AccTime for sparse optical flow
Hi, am I correct that the sparse optical flow sample only works with an accumulation time of 33 ms? How can I make it work with 66 or 100 ms? Kind regards, Rik
The assert error when running classification inference
Hello, I am interested in the Metavision SDK and am trying to run the classification inference demo from https://docs.prophesee.ai/stable/metavision_sdk/modules/ml/samples/classification_inference.html?highlight=classification. I have installed the SDK (Python 3.7)
There is a very obvious display delay when developing our data collection software
While developing our data collection software with five EVK4 cameras, we ran into a display problem. We contacted local Prophesee support, and they told us to test the provided sample code (metavision\sdk\cv\samples\metavision_noise_filtering).
Metavision studio crashes without error messages
Hello, Metavision Studio crashes immediately after start, even before the GUI appears. No error message is produced and seemingly no logging output is generated. I am using Metavision SDK version 3.1 and an EVK2-4.1 with the latest firmware. Has anybody
ModuleNotFoundError: No module named 'metavision_core'
Hello, I have installed the Metavision SDK on Ubuntu 18.04 successfully, and Metavision Studio works well. I have also set up the Anaconda environment according to https://docs.prophesee.ai/stable/installation/linux.html. However, when I
Filtering the event video
Hi, after filtering the raw event stream using various filtering algorithms as shown in your code examples, is it possible to save the filtered events as a RAW file?
imu camera calibration
Good morning all! I'm trying to create an all-in-one device with a SilkyEvCam and an LSM9DS1 IMU. Can someone please tell me how to create the rototranslation matrix for calibrating this combined system? Thank you!
Index file created by metavision_raw_info.exe
When executing metavision_raw_info.exe from Metavision Intelligence 3.0.2, an index file named [input_raw_filename].tmp_index is automatically created. I remember this was not the case for older versions, which did not create any additional file each
Daisy Chain Build
Hi, In the synchronization application notes it says: "Daisy Chain is not yet supported natively in current software release. Contact us for more information on how to build it." I am using the OpenEB version of the driver, built for Fedora OS (which
Sparse Optical Flow
Is there a publication where I can read about how the algorithm used for sparse optical flow computation works?
How to produce() for FrameDisplayStage
Hi, I'd like to visualise images output by a custom stage. To do that, I'm inheriting from BaseStage and now need to call produce(xxx) appropriately. I saw that the FrameDisplayStage class tries to cast its input to FrameData in the function set_consuming_callback
When to use SDK vs HAL
After reading through the tutorials, it seems to me that many things that can be achieved through the HAL API can also be done through the SDK. For example, opening a camera and adding a callback for events can be done using the HAL (e.g. the HAL Viewer sample)
EVK4 Gen4.1 bias settings
Dear Prophesee team, the new bias files shipped with Metavision 2.3.2 have all bias settings in imx636_CD_standard.bias set to zero. Additionally, the EVK4 documentation contains a printout from the Metavision Intelligence Suite
Access to DominantFrequencyEventsAlgorithm class in Python
Hello, We are working with the program metavision_vibration_estimation.py and we would like to access the DominantFrequencyEventsAlgorithm class of the Analytics module in Python. We would like to redefine the use of the "min_pixel_count" parameter
DVS for Ultra high speed applications (Real time)
Dear Prophesee Team! Problem statement: I want to process the raw event (X, Y, P, T) data with a delta t of 4000 µs (250 FPS) in real time. No process (slow or fast) inside the main loop should affect the input stream of the camera. Below is the main loop
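The usual way to keep processing out of the camera's input path is to decouple acquisition from the main loop: the camera callback only appends events under a very short lock, and the processing thread swaps the buffer out and works on its own copy. A minimal sketch with standard-library types only (`SwapBuffer` and `Evt` are illustrative names, not SDK API):

```cpp
#include <mutex>
#include <utility>
#include <vector>

// Illustrative event type.
struct Evt { int x, y, p; long long t; };

// The camera callback appends into a front buffer under a brief lock; the
// processing loop swaps the buffer out, so however slow the processing is,
// acquisition is never blocked (at worst the front buffer grows).
class SwapBuffer {
public:
    // Called from the acquisition/callback thread.
    void produce(const Evt *begin, const Evt *end) {
        std::lock_guard<std::mutex> lk(m_);
        front_.insert(front_.end(), begin, end); // copy only, no processing
    }

    // Called from the processing thread; returns all events gathered so far.
    std::vector<Evt> consume() {
        std::vector<Evt> out;
        {
            std::lock_guard<std::mutex> lk(m_);
            out.swap(front_); // O(1) swap keeps the lock held very briefly
        }
        return out;
    }

private:
    std::mutex m_;
    std::vector<Evt> front_;
};
```

The processing thread then slices the consumed vector into fixed 4000 µs windows by timestamp; if it falls behind, only the front buffer's memory grows, never the camera's callback latency.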
RAW Files export with Python
Hello, I would like to export the video stream of the camera in RAW format with Python code. I have noticed that none of the available ".py" samples allows saving the video stream in this format. Have you already implemented this function?
calling set_mode_standalone() in metavision 2.3.2 causes funky device state
Moving from Metavision 2.2.2 to 2.3.2 revealed the following issue: when calling set_mode_standalone(), the device 1) stopped delivering CD callbacks and 2) returned -1 for all bias parameters. How to reproduce: insert the following line at line 220 of apps/metavision_viewer.cpp
Controlling EVK1 with LABVIEW
Hello, we are currently using the EVK1 camera from the Windows command prompt. We would like to be able to control it via a LabVIEW environment (call the program, set parameters, start acquisition, output events). Have you already done this? If yes,
Modify "process_to" during execution
Hello, I am currently working on the Python version of metavision_vibration_estimation. When launching the executable, we can specify the argument "process_to" to define the acquisition time. Is it possible to modify the value of this variable during
Is there a way to provide a memory for the driver to write into, with openEB?
We are using the OpenEB SDK, and we added a callback function to acquire the EventCD data like this: cam_ptr->cd().add_callback(cb). In cb, the events are copied from the driver's memory to our memory. After getting enough events, we feed them to the next-step
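The described pattern can at least avoid repeated reallocations by reserving the accumulation buffer's capacity up front and reusing it across batches. A sketch of that idea with illustrative types (this does not make the driver write into user memory directly; it only makes the one unavoidable copy cheap and allocation-free in steady state):

```cpp
#include <cstddef>
#include <functional>
#include <utility>
#include <vector>

// Illustrative event type standing in for Metavision::EventCD.
struct Evt { int x, y, p; long long t; };

// Callback helper: appends decoded events into a preallocated buffer and
// forwards the batch to the next processing step once it is large enough.
class Accumulator {
public:
    Accumulator(std::size_t threshold, std::function<void(std::vector<Evt> &)> next)
        : threshold_(threshold), next_(std::move(next)) {
        buf_.reserve(threshold); // one allocation up front
    }

    // Shaped like a CD callback: a [begin, end) range of decoded events.
    void operator()(const Evt *begin, const Evt *end) {
        buf_.insert(buf_.end(), begin, end);
        if (buf_.size() >= threshold_) {
            next_(buf_);
            buf_.clear(); // clear() keeps the reserved capacity for reuse
        }
    }

private:
    std::size_t threshold_;
    std::vector<Evt> buf_;
    std::function<void(std::vector<Evt> &)> next_;
};
```

If the next step runs on another thread, a pool of reusable buffers (swap a full one out, recycle it when processing finishes) extends the same idea without per-batch allocation.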
Error ‘Failed to load module “canberra-gtk-module”’ when launching metavision_viewer
Using Metavision Intelligence 2.3.1, I get the error ‘Failed to load module “canberra-gtk-module”' on Ubuntu when launching metavision_viewer. How can I fix this?