No plugin found error on Jetson Orin Nano
Hi, I successfully compiled OpenEB 4.3.0 on my Jetson Orin Nano (arm64 Cortex, Ubuntu 20.04) and launched 'metavision_viewer' from the 'build' folder. I plugged an EVK3 into my Jetson, but I get the warning: 'no plugin found' and 'Metavision
LIBUSB_ERROR_TIMEOUT in Ubuntu (VMware Virtual Machine)
Hi! When I launch this example in a VMware virtual machine running Ubuntu, I receive this error: "[HAL][ERROR] Error while opening from plugin: hal_plugin_gen31_evk3 [HAL][ERROR] Integrator: Prophesee [HAL][ERROR] Device discovery: Metavision::TzCameraDiscovery
Python SDK samples problem
Dear All, I'm trying to interface with an EB vision camera by Imago (Linux image 1.3b) using Python and Metavision 2.3.1 on Windows 10. As far as I understand, the HAL plugin provided by Imago is linked against Metavision 2.3.1, so I believe I'm stuck
Mv_iter stuck when no events occurred
Hello, I am using the Metavision recorder (https://support.prophesee.ai/portal/en/kb/articles/how-to-record-data-with-metavision-recorder) and noticed that when no events are received from the camera (for example, after using a small ROI and strict biases),
Metavision interface for Mvtec Halcon
There is a problem installing the MVTec HALCON acquisition interface for the Prophesee camera: the latest version of your SDK doesn't match your DLL file for the interface. These attachments work only for version 4.1 and not for 4.3. Is it possible to
Overheating Gen4.1 sensor
Hi, I've been doing experiments with the camera where the event rate is always around 100 Mev/s. After around 10 minutes of use, the camera starts overheating and the Metavision Studio app stops suddenly. Could I have your opinion on this, please? Since
Discrepancy viewing vs processing raw file
The code below is meant to produce two black-and-white images, one for positive events and one for negative events: event_iterator.reader.reset() # Reset the reader event_iterator.start_ts = int(1e6) event_iterator.delta_t = int(1e6) imgp = np.zeros((height,
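A minimal sketch of one way to build both images with `EventsIterator`, assuming a placeholder file name and the 1 s start/window values from the excerpt; polarity 1/0 is the standard positive/negative CD encoding:

```python
import numpy as np
from metavision_core.event_io import EventsIterator

# Placeholder recording; start 1 s in and accumulate 1 s of events
mv_iterator = EventsIterator("recording.raw", start_ts=int(1e6),
                             max_duration=int(1e6), delta_t=int(1e6))
height, width = mv_iterator.get_size()

imgp = np.zeros((height, width), dtype=np.uint8)  # positive events
imgn = np.zeros((height, width), dtype=np.uint8)  # negative events

for evs in mv_iterator:
    pos = evs[evs["p"] == 1]
    neg = evs[evs["p"] == 0]
    imgp[pos["y"], pos["x"]] = 255
    imgn[neg["y"], neg["x"]] = 255
```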
Recording events straight to csv
While digging through the documentation, examples, and source code, I found a rather poorly documented feature of the Python HAL API, and I am wondering if it works the way I think it does. I have tried a few examples and had some promising results, but
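For reference, a hedged sketch of a simpler route that avoids the HAL API entirely: stream events with `EventsIterator` and write them out with the standard csv module (file names are placeholders):

```python
import csv
from metavision_core.event_io import EventsIterator

# "" would open the first live camera instead of a file
mv_iterator = EventsIterator("recording.raw", delta_t=10000)

with open("events.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["x", "y", "p", "t"])
    for evs in mv_iterator:
        # evs is a numpy structured array with fields x, y, p, t
        writer.writerows(zip(evs["x"], evs["y"], evs["p"], evs["t"]))
```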
Strange Behavior recording raw files in quick succession
Hello again, I apologize if I am spamming this forum; I've gone through the documentation, but I'm still running into some strangeness I cannot explain. I am trying to record several raw files in quick succession. I'm doing this with the HAL Python library
Event camera acquisition on Jetson Orin
Hello, I would be interested in making event camera recordings with a Jetson Orin (945-13730-0005-000 | NVIDIA Jetson AGX Orin Developer Kit | RS (rs-online.com)) and an Event Camera Evaluation Kit 4 HD IMX636 Prophesee-Sony. How can
Issue with Importing 'EventBbox' in Metavision SDK 4.1.0 While Using Machine Learning Labeling Tool
Hello, I am currently working with the Machine Learning Labeling Tool. While executing the 'label_tracking.py' file, I encountered an import error stating that 'EventBbox' could not be imported from 'metavision_sdk_ml'. Could you kindly assist me in resolving
Help understanding the difference in response between positive and negative events.
Hello everyone, I've been working with an EVK3 as part of my PhD thesis. In order to quantify the effect that the ON and OFF biases have on the response of the camera, I have set up a simple experiment that uses the camera to measure responses to a pulsing
Inference Pipeline of Detection and Tracking using C++
Hi! I downloaded LibTorch 1.13.1 for CUDA 11.7 and I'm trying to compile this example: Inference Pipeline of Detection and Tracking using C++ — Metavision SDK Docs 4.0.1 documentation (prophesee.ai). When I ran this line: cmake .. -DCMAKE_PREFIX_PATH=`LIBTORCH_DIR_PATH`
Image generated by accumulating events over 40 µs is aliased on Gen4
Hi, I was trying to capture a really fast event that lasts a few microseconds. After recording, I tried plotting the image by accumulating events in a 40 µs time window, and I see that the images are getting formed row-wise. I have attached snapshots of the
3~4 ms temporal interval between event callbacks
I'm testing the speed of callbacks using the following code (the whole code is based on your HAL sample): if (i_cddecoder) {
    // Register a lambda function to be called on every CD event
    auto start_t = std::chrono::system_clock::now();
Event rate plotting
Hello, I recently viewed the demo video on your website that showcases plotting the event rate alongside real-time video. You can reference the video at: https://youtu.be/fL_kOaDPzjQ. I've been trying to replicate this feature and was searching for the
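Not the official demo, but a rough sketch of how the curve can be reproduced: count events per fixed time bin and divide by the bin length (file name and bin size are placeholders):

```python
import matplotlib.pyplot as plt
from metavision_core.event_io import EventsIterator

BIN_US = 10_000  # 10 ms bins, an arbitrary choice
times_s, rates = [], []

for evs in EventsIterator("recording.raw", delta_t=BIN_US):
    if evs.size == 0:
        continue
    times_s.append(evs["t"][-1] * 1e-6)       # end of bin, in seconds
    rates.append(evs.size / (BIN_US * 1e-6))  # events per second

plt.plot(times_s, rates)
plt.xlabel("time (s)")
plt.ylabel("event rate (ev/s)")
plt.show()
```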
What is the correct way to handle the device object lifetime
I have written a class that wraps the RawReader class. I instantiate the reader as described in the documentation by first creating a device, setting all of the hal properties for the device, creating the reader from the device, and returning the reader.
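One pattern that avoids lifetime surprises in the Python bindings is to make the wrapper own the device for at least as long as the reader. A hedged sketch, assuming `RawReader.from_device` is available in your version:

```python
from metavision_core.event_io.raw_reader import RawReader
from metavision_hal import DeviceDiscovery

class CameraReader:
    """Owns both the device and the reader, so the device cannot be
    garbage-collected while the reader built from it is still in use."""

    def __init__(self, serial=""):
        self.device = DeviceDiscovery.open(serial)  # keep a reference
        # ... set HAL properties on self.device here ...
        self.reader = RawReader.from_device(self.device)
```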
Bias Bandwidth
Hi, as far as I understand, the bias bandwidth is a parameter that can be changed to avoid high or low lighting fluctuations, but it will also affect the bandwidth of the camera, which limits the maximum number of events per second that can be processed.
Minimum accumulation time to visualize
Hello, as you know, the accumulation time is the variable used when converting events to frames to approximate a certain frame rate. I would like to know the minimum accumulation time that I can use to visualize the
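As a reference point, the accumulation time can be set independently of the display rate in `PeriodicFrameGenerationAlgorithm`; a hedged sketch (the 500 µs value is only an example, and very short windows may produce mostly empty frames):

```python
import cv2
from metavision_core.event_io import EventsIterator
from metavision_sdk_core import PeriodicFrameGenerationAlgorithm

mv_iterator = EventsIterator("recording.raw", delta_t=1000)
height, width = mv_iterator.get_size()

frame_gen = PeriodicFrameGenerationAlgorithm(
    sensor_width=width, sensor_height=height,
    fps=25, accumulation_time_us=500)  # 0.5 ms accumulation window

def on_frame(ts, frame):
    cv2.imshow("events", frame)
    cv2.waitKey(1)

frame_gen.set_output_callback(on_frame)
for evs in mv_iterator:
    frame_gen.process_events(evs)
```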
Accumulation time below 100 µs in tracking_spatter.py
Hi, I'm doing some experiments to evaluate the performance of the EVK 2, and for that I'm tracking some dots that are rotating. I used the tracking_spatter.py algorithm from Prophesee, where I need to set the accumulation time, which is also going to be
Pipeline stage that outputs the coordinates of undistorted events
Hello. I am referring to the sample code (https://docs.prophesee.ai/stable/samples/modules/cv/undistortion.html#chapter-samples-cv-undistortion) and, using pipeline processing on events, I am trying to correct the distortion of the image output from the
What is the default bias value of the IMX636? And how to calculate events based on the bias?
We read the default biases with the provided software and obtained the defaults pos_thres = 102% and neg_thres = 73%, which is very curious. If we set the intensities of two images as I1 and I2, we can get a positive event only if I2 - I1 > 102% * I1. That does not seem very reasonable.
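One clarification that may resolve the confusion: these sensors respond to changes in log intensity, so the threshold is a relative contrast, not a linear intensity difference. A minimal sketch of the standard event-generation model (the theta values are illustrative, not IMX636 defaults):

```python
import math

def fires_positive(i1, i2, theta_on=0.2):
    """Standard model: a positive event fires when the log-intensity
    increase exceeds the ON contrast threshold."""
    return math.log(i2) - math.log(i1) > theta_on

def fires_negative(i1, i2, theta_off=0.2):
    """A negative event fires when the log-intensity decrease
    exceeds the OFF contrast threshold."""
    return math.log(i1) - math.log(i2) > theta_off
```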
Band Pass filter bias units
I have an EVK3 and I'm trying to understand the units of the fall-off and high-pass filter settings. I'm hoping I can use them to target a specific frequency of motion if I know the expected frequency ahead of time. For instance, suppose I have a light
Relation of absolute bias values of Gen4.1 and IMX636
Hello, I read in the forum: "Setting the logging level to TRACE as described here : https://docs.prophesee.ai/3.0.1/faq.html#how-can-i-change-the-logging-level-at-runtime should help you to see absolute bias values on your Gen4.2 sensor (IMX636) and the
Where to get EVK4's Sensor Characterization?
Hi, Prophesee! I now have an EVK4, but I need to know its sensor characterization, such as latency, shot noise rate, threshold, etc. At this link, https://support.prophesee.ai/portal/en/kb/prophesee-1/metavision-sensing/sensors, I found the documents of
How to save files (from my EVK4) with the Python API?
Hello, I can use `metavision_core.event_io.EventsIterator` to capture event signals, but how can I save them to a RAW file or HDF5 file with the Python API?
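A hedged sketch of one way this is commonly done: log the RAW stream at the HAL level with `I_EventsStream.log_raw_data()` while `EventsIterator` consumes events from the same device (paths are placeholders). The RAW file can then be converted to HDF5 offline, e.g. with the metavision_file_to_hdf5 tool shipped with the SDK.

```python
from metavision_core.event_io import EventsIterator
from metavision_hal import DeviceDiscovery

device = DeviceDiscovery.open("")              # first available camera
i_events_stream = device.get_i_events_stream()
i_events_stream.log_raw_data("out.raw")        # start RAW logging

mv_iterator = EventsIterator.from_device(device=device)
for evs in mv_iterator:
    pass                   # process events as usual; break when done

i_events_stream.stop_log_raw_data()
```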
Stereo Calibration
I am trying to add a stage to a stereo calibration pipeline. This stage must receive two streams of events coming from two Prophesee cameras, filter them, and let through only the events coming from both cameras having the
Stage as class member
Hello, Prophesee Team! A small question: I have searched all the examples, and everywhere you have: Metavision::Pipeline p(true); auto &cam_stage = p.add_stage(std::make_unique<Metavision::CameraStage>(std::move(camera), ebd_ms)); but how would you properly
Feature tracking algorithm
Hi, Is there a feature tracker specifically made for, or compatible with, a Prophesee event camera? I want to compare such an algorithm to a conventional KLT tracker. Does Metavision or a third-party company provide such an algorithm? Kind regards, Rik van
Event Simulator and Pixel Pitch
Does the EventSimulator class (see documentation here) take into account the pixel pitch of the camera that shot the video? I have video from a CMOS sensor that has a pixel pitch of approximately 3 microns. The Prophesee EVK 3 has a pixel pitch of 15
Request for Assistance with Adjusting Event Rate Control in Python Binding
Dear Prophesee Team, I am writing to seek your assistance with adjusting the Event Rate Control in the Python binding for the sensor I am currently using. Specifically, I am using a sensor with a resolution of 720P, which produces an excessive number
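A hedged sketch of what this might look like through the HAL Python bindings; `get_i_erc_module()` and its methods are an assumption to verify against your SDK version, since ERC availability depends on the sensor:

```python
from metavision_hal import DeviceDiscovery

device = DeviceDiscovery.open("")        # first available camera

# Assumption: the device exposes an ERC facility via HAL
i_erc = device.get_i_erc_module()
if i_erc is not None:
    i_erc.enable(True)
    i_erc.set_cd_event_rate(10_000_000)  # e.g. cap at 10 Mev/s
```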
Disable Stage on Running Pipeline
Hello, Prophesee Team! I am trying to make FrameGenerationStage able to be: disabled (freeing all resources and processing time), enabled back, and reset (new accumulation_time_ms and fps). Questions: do all resources and processing time get freed
How to change parameters or state of a stage from the scope of another stage?
Let's say at some point, while in the scope of Stage2, I want to change some fields of Stage1. I can access the stage with next_stages() or previous_stages() and, hopefully, change any public fields or callbacks without any problem, correct? What
Skipping events due to performance problems
Hello, Prophesee Team! Let's say we have a simple pipeline: auto& cam_stage = p.add_stage(std::make_unique<Metavision::CameraStage>(std::move(cam), event_buffer_duration_ms)); auto& algo_stage = p.add_stage(std::make_unique<AlgoStage>(), cam_stage); auto&
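Whatever the stage API offers, the underlying pattern is a bounded queue between producer and consumer that drops data when the consumer falls behind; a language-agnostic sketch in Python:

```python
import queue

# Small bounded queue between the camera callback (producer)
# and the algorithm (consumer); tune maxsize to taste
buf_queue = queue.Queue(maxsize=4)

def on_event_buffer(evs):
    try:
        buf_queue.put_nowait(evs)
    except queue.Full:
        buf_queue.get_nowait()     # drop the oldest buffer
        buf_queue.put_nowait(evs)  # keep the freshest data
```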
Resolving very slow motion without flickering
I am trying to image fluorescent beads in a water suspension under a microscope. The beads move very slowly. When viewed through a CMOS camera, you can see the beads slowly moving. Through the event camera, though, it mostly just flickers. What causes this behavior
Slave Camera Recording file has obvious lag
I synchronized two event cameras and tested them. While recording, the pictures from the master and the slave camera are perfectly fine and no lag is displayed. But when I open the raw files, master.raw is fine, while slave.raw shows obvious lag (similar to slow motion),
Metavision Pipeline and IPC (Interprocess Communication)
Hello, I can't get my head around how to make the C++ Pipeline work with IPC (a named pipe, for example). I am interested in both directions. For sending some data to another application, the most fitting for the Pipeline would probably be a "SenderStage", similar to
Metavision build - Debug mode
Hello, I am trying to build the getting started sample in debug mode. See attached for my CMakeLists.txt and source code. I also attached the logs for my CMake config and build steps. When configuring my build in Release mode, I am able to detect the
Raw to HDF5 conversion
Hi, I am trying to convert a RAW file to HDF5 and I am receiving the following errors and warnings; what may be the cause? Is there documentation on this? Thank you in advance. [HAL][WARNING] Evt3 protocol violation detected : InvalidVectBase [HAL][ERROR]
How to evaluate the quality of synchronization between event camera and external sensor?
Hello, I'm working on synchronization between an IMU and an event camera (EVK4). I use a hardware trigger to synchronize, as the documentation describes. However, it is hard to evaluate the synchronization quality. A naive way is to compare the data between the IMU acceleration
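One quantitative check, sketched under the assumption that you already have the camera's external-trigger timestamps and the IMU sample timestamps as arrays in the same units (the .npy files are hypothetical): the mean of the pairwise differences estimates the offset, and the standard deviation estimates the jitter.

```python
import numpy as np

cam_trigger_ts = np.load("cam_trigger_ts.npy")  # hypothetical input, in µs
imu_ts = np.load("imu_ts.npy")                  # hypothetical input, in µs

n = min(len(cam_trigger_ts), len(imu_ts))
offsets = cam_trigger_ts[:n].astype(np.int64) - imu_ts[:n].astype(np.int64)

print("mean offset: %.1f us" % offsets.mean())
print("jitter (std): %.1f us" % offsets.std())
```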