Accumulating events in a 40 us window generates an aliased image on Gen4
Hi, I was trying to capture a really fast event that lasts a few microseconds. After recording, I tried plotting the image by accumulating events in a 40 us time window. I see that the images are being formed row-wise. I have attached snapshots of the
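For reference, frame accumulation of this kind can be sketched with plain NumPy (the structured-array fields x, y, t used here are the usual CD-event fields, but this is a generic illustration, not the SDK's frame generator):

```python
import numpy as np

def accumulate_frame(events, width, height, t_start, acc_time_us):
    """Accumulate events falling in [t_start, t_start + acc_time_us)
    into a 2D per-pixel count image."""
    mask = (events["t"] >= t_start) & (events["t"] < t_start + acc_time_us)
    frame = np.zeros((height, width), dtype=np.int32)
    # np.add.at handles repeated (y, x) indices correctly, unlike frame[y, x] += 1
    np.add.at(frame, (events["y"][mask], events["x"][mask]), 1)
    return frame
```

With very short windows such as 40 us, each frame may only contain events from a few readout rows, which is one plausible source of the row-wise appearance.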
3–4 ms temporal interval between event callbacks
I'm testing the speed of callbacks using the following code (the whole code is based on your HAL sample):
if (i_cddecoder) {
    // Register a lambda function to be called on every CD event
    auto start_t = std::chrono::system_clock::now();
Event rate plotting
Hello, I recently viewed the demo video on your website that showcases plotting the event rate alongside real-time video. You can reference the video at: https://youtu.be/fL_kOaDPzjQ. I've been trying to replicate this feature and was searching for the
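A generic way to compute such an event-rate curve from timestamps, sketched with NumPy (this is an illustration, not the code used in the demo video):

```python
import numpy as np

def event_rate(timestamps_us, bin_us=1000):
    """Return (bin_centers_us, rate_events_per_second) from event
    timestamps given in microseconds, binned into fixed windows."""
    t0, t1 = timestamps_us.min(), timestamps_us.max()
    edges = np.arange(t0, t1 + bin_us, bin_us)
    counts, edges = np.histogram(timestamps_us, bins=edges)
    rates = counts / (bin_us * 1e-6)  # convert counts per bin to events/s
    centers = edges[:-1] + bin_us / 2
    return centers, rates
```

The resulting (centers, rates) pairs can then be fed to any plotting library alongside the generated frames.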
What is the correct way to handle the device object lifetime
I have written a class that wraps the RawReader class. I instantiate the reader as described in the documentation by first creating a device, setting all of the hal properties for the device, creating the reader from the device, and returning the reader.
Bias Bandwidth
Hi, As far as I understand, the Bias Bandwidth is a parameter that can be changed to avoid high or low lighting fluctuations, but it will also affect the bandwidth of the camera, which restricts the maximum events per second that can be processed.
Minimum accumulation time to visualize
Hello, As you know, the accumulation time is the variable used when accumulating events to approximate a certain frame rate. I would like to know what is the minimum accumulation time that I can use to visualize the
Accumulation time below 100us tracking_spatter.py
Hi, I'm doing some experiments to evaluate the performance of the EVK 2, and for that I'm tracking some dots that are rotating. I used the tracking_spatter.py algorithm from Prophesee, where I need to set the accumulation time, which is going to be also
Pipeline stage that outputs the coordinates of undistorted events
Hello. I am referring to the sample code (https://docs.prophesee.ai/stable/samples/modules/cv/undistortion.html#chapter-samples-cv-undistortion) and, using pipeline processing on events, I am trying to correct the distortion of the image output from the
What is the default bias value of IMX636? And how are events generated based on the bias?
We read the default biases with the provided software and obtained pos_thres = 102%, neg_thres = 73%, which is curious. If we set the intensities of two images as I1 and I2, we can get a positive event only if I2 - I1 > 102% * I1. That does not seem very reasonable.
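For context, the textbook event-camera model applies the contrast threshold in the log-intensity domain rather than linearly, which may explain why the linear reading looks unreasonable; the percentage shown by the tooling may also be relative to a default bias rather than a direct contrast. A sketch of the generic log-domain model (not Prophesee's internal bias-to-threshold mapping; theta is an illustrative parameter):

```python
import math

def count_positive_events(i1, i2, theta):
    """Generic (textbook) event-camera model: positive events fire each
    time the log intensity increases by theta, so for i2 > i1 the count
    is floor((log(i2) - log(i1)) / theta); no events if intensity falls."""
    delta = math.log(i2) - math.log(i1)
    return max(0, int(delta // theta))
```

For example, with theta = log(1.2) (a 20% contrast step), going from intensity 100 to 150 crosses the threshold twice.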
Band Pass filter bias units
I have an EVK3 and I'm trying to understand the units of the fall off and high pass filter settings. I'm hoping I can use this to target a specific frequency of motion if I know the expected frequency ahead of time. For instance, suppose I have a light
Relation of absolute bias values of Gen4.1 and IMX636
Hello, I read in the forum: "Setting the logging level to TRACE as described here : https://docs.prophesee.ai/3.0.1/faq.html#how-can-i-change-the-logging-level-at-runtime should help you to see absolute bias values on your Gen4.2 sensor (IMX636) and the
Where to get EVK4's Sensor Characterization?
Hi, Prophesee! I now have an EVK4, but I need to know its sensor characterization, such as latency, shot-noise rate, threshold, etc. At this link, https://support.prophesee.ai/portal/en/kb/prophesee-1/metavision-sensing/sensors, I found the documents of
How to save files (of my EVK4) by Python API?
Hello, I can use `metavision_core.event_io.EventsIterator` to capture event signals, but how can I save them to a RAW or HDF5 file with the Python API?
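Not an official answer, but since `EventsIterator` yields NumPy structured arrays, one generic way to persist them is a resizable h5py dataset (the field layout and the use of `h5py` here are assumptions for illustration, not the SDK's native HDF5 format):

```python
import h5py
import numpy as np

def save_events_h5(path, event_batches):
    """Append successive structured-array batches of events to one
    resizable HDF5 dataset. event_batches is any iterable of NumPy
    structured arrays sharing the same dtype (e.g. an EventsIterator)."""
    with h5py.File(path, "w") as f:
        dset = None
        for batch in event_batches:
            if len(batch) == 0:
                continue
            if dset is None:
                # First non-empty batch fixes the dtype; maxshape=(None,)
                # makes the dataset growable along axis 0
                dset = f.create_dataset("events", data=batch,
                                        maxshape=(None,), chunks=True)
            else:
                n = dset.shape[0]
                dset.resize(n + len(batch), axis=0)
                dset[n:] = batch
```

The saved dataset can be read back with `h5py.File(path)["events"][:]` as one structured array.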
Stereo Calibration
I am trying to add a stage to a stereo calibration pipeline. This stage must receive the two streams of events coming from two Prophesee cameras, filter them, and let through only the events coming from both cameras having the
Stage as class member
Hello, Prophesee Team! Small question: I have searched all the examples, and everywhere you have: Metavision::Pipeline p(true); auto &cam_stage = p.add_stage(std::make_unique<Metavision::CameraStage>(std::move(camera), ebd_ms)); but how would you properly
Feature tracking algorithm
Hi, Is there a feature tracker specifically made for, or compatible with, a Prophesee event camera? I want to compare such an algorithm to a conventional KLT tracker. Does Metavision or a third-party company provide such an algorithm? Kind regards, Rik van
Event Simulator and Pixel Pitch
Does the EventSimulator class (see documentation here) take into account the pixel pitch of the camera that shot the video? I have video from a CMOS sensor that has a pixel pitch of approximately 3 microns. The Prophesee EVK 3 has a pixel pitch of 15
Request for Assistance with Adjusting Event Rate Control in Python Binding
Dear Prophesee Team, I am writing to seek your assistance with adjusting the Event Rate Control in the Python binding for the sensor I am currently using. Specifically, I am using a sensor with a resolution of 720P, which produces an excessive number
Disable Stage on Running Pipeline
Hello, Prophesee Team! I am trying to make FrameGenerationStage able to be: disabled (freeing all resources and processing time), enabled back, and reset (new accumulation_time_ms and fps). Questions: Do all resources and processing time get freed
How to change parameters or state of a stage from the scope of other stage?
Let's say at some point, while in the scope of Stage2, I want to change some fields of Stage1. I can access the stage with next_stages() or previous_stages(), and hopefully change any public fields or callbacks without any problem, correct? What
Skipping events, based on the performance problems
Hello, Prophesee Team! Let's say we have a simple pipeline: auto& cam_stage = p.add_stage(std::make_unique<Metavision::CameraStage>(std::move(cam), event_buffer_duration_ms)); auto& algo_stage = p.add_stage(std::make_unique<AlgoStage>(), cam_stage); auto&
Resolving very slow motion without flickering
I am trying to image fluorescent beads in a water suspension under a microscope. The beads move very slowly. When viewed through a CMOS camera you can see the beads slowly moving. In the event camera though it mostly just flickers. What causes this behavior
Slave Camera Recording file has obvious lag
I synchronized two event cameras and tested them. While recording, the pictures from the master and the slave camera are perfectly fine and no lag is visible. But when I open the raw files, master.raw is fine while slave.raw shows obvious lag (similar to slow motion),
Metavision Pipeline and IPC (Interprocess Communication)
Hello, I can't get my head around how to make C++ Pipeline work with IPC (NamedPipe for example). I am interested in both directions: sending some data to another application. The most fitting to the Pipeline probably would be "SenderStage", similar to
Metavision build - Debug mode
Hello, I am trying to build the getting started sample in debug mode. See attached for my CMakeLists.txt and source code. I also attached the logs for my CMake config and build steps. When configuring my build in Release mode, I am able to detect the
Raw to HDF5 conversion
Hi, I am trying to convert a RAW file to HDF5 and I am receiving the following errors and warnings; what may be the cause? Is there documentation on it? Thank you in advance. [HAL][WARNING] Evt3 protocol violation detected : InvalidVectBase [HAL][ERROR]
How to evaluate the quality of synchronization between event camera and external sensor?
Hello, I'm working on synchronization between an IMU and an event camera (EVK4). I use a hardware trigger to synchronize, as the documentation describes. However, it is hard to evaluate the synchronization quality. A naive way is to compare the data between IMU acceleration
Down sample the resolution of event camera
Hi, I'm using the EVK4 for research, and I wonder whether there is a way to downsample the camera resolution from 1280*720 to 640*480 to reduce the extremely high-bandwidth data. Since I use it for SLAM, I don't want to set an ROI to crop the FOV. Hope
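One generic software-side option is coordinate binning after decoding, sketched below (illustrative only, not an SDK feature; note that 1280×720 binned by 2 gives 640×360, not 640×480):

```python
import numpy as np

def downsample_events(events, factor=2):
    """Spatially bin event coordinates by an integer factor.
    Returns a copy with x and y integer-divided by the factor; multiple
    events landing on the same coarse pixel are kept, preserving timing."""
    out = events.copy()
    out["x"] //= factor
    out["y"] //= factor
    return out
```

This keeps the full FOV (unlike an ROI) but does not reduce the USB bandwidth from the sensor itself, only the downstream processing load.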
Is it possible to receive 'External Trigger event' with Metavision studio?
I want to synchronize a frame camera with an event-based camera using Metavision Studio. I would appreciate it if you could explain how.
120 degree FOV lens for EVK4 HD
Hello everyone! Does anyone have a recommendation for a good quality, but not overly expensive lens for the EVK4 HD? 120 degree FOV. BR
cannot retrieve biases on SilkyEVCamHD
On a SilkyEVCamHD the reading of bias parameters through the SDK fails. The identical code works for the SilkyEVCamHD: const Metavision::Biases biases = cam_.biases(); Metavision::I_LL_Biases * hw_biases = biases.get_facility(); const auto pmap = hw_biases->get_all_biases();
ExtTrigger events not working when multiple cameras opened in same address space
This problem arises with a pair of SilkyEVCams (Gen3 sensor) using Metavision 3.0.2 (compiled from OpenEB). For testing, one of the cameras generates an external trigger and has the input set to "loopback". Running two executables side-by-side works
AccTime for sparse optical flow
Hi, Am I correct that the sparse optical flow sample only works with an accumulation time of 33 ms? How can I make it work with 66 or 100 ms? Kind regards, Rik
Research papers
Hi, Does Prophesee provide research papers on their algorithms? If yes, where can I find them? Kind regards, Rik van den Boogaard
OnDemandFrameGeneration Segmentation Fault and malloc error
I have run into a problem with the OnDemandFrameGeneration algorithm when encountering high event rates. Currently we are making our own multi-event camera calibration program (to calibrate intrinsics and extrinsics in one pass) and I found the OnDemandFrameGeneration
How to wire synchronization for the EVK4-HD event camera
Hello, if I want to synchronize the event camera's frames with an RGB camera, how should I wire it? I have a main controller that can send a frame-sync trigger to the RGB camera. On the event camera side, do I just need to connect that trigger to pin 5? Can its clock be left floating?
How to record a raw file with external trigger events?
Hello, What is the correct procedure to record raw files with external trigger events? When I record raw files either with metavision_studio, metavision_viewer or metavision_player, the recorded .raw file contains no external trigger events, as reported
Low Light Bias setting
I am intending to collect recordings in low light conditions. I have played around with the bias settings and found that the following settings give me the best visuals: 299 % bias_diff 228 % bias_diff_off 370 % bias_diff_on 1650 % bias_fo 1525 % bias_hpf
Where to get the Quantum Efficiency for the EVK4-HD chip?
Does the EVK4-HD use the Gen4.0 chip? Is it the following link, or the following file? https://support.prophesee.ai/portal/en/kb/articles/gen4-0-qe
The assert error when running classification inference
Hello, I am interested in the Metavision SDK and tried to run the classification inference demo from https://docs.prophesee.ai/stable/metavision_sdk/modules/ml/samples/classification_inference.html?highlight=classification. I have installed the SDK (Python 3.7)