Advice to combine spatter_tracking and vibration_estimation algorithms
Good afternoon, I wanted to ask how best to combine the spatter_tracking and vibration_estimation algorithms in one program, so that when I run it, both features appear in the same window and run flawlessly in parallel. I used the provided scripts
Saving the results of the generic tracking sample to a numpy file
I'm attempting to adapt the generic tracking script to save the data to a NumPy file; however, whenever I inspect the data, the timestamps, when grouped by object_id, are exactly the same for each object. That is, object 10 will always have
Recording Application for two EVK4's and an RGB Camera
Greetings! In order to do research on event-based vision in the context of autonomous driving, I am currently working on developing an application that records two EVK4s as well as images from an industrial RGB camera. The application is written
How to set MV_FLAGS_EVT3_UNSAFE_DECODER in Python api
I'm currently utilizing the EventsIterator to read EVT3 files, but I've encountered some errors. I'd like to know how to set the MV_FLAGS_EVT3_UNSAFE_DECODER option in Python so that it can ignore these errors.
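One approach that may work, assuming the flag is exposed to the HAL plugin as an environment variable (as suggested by its `MV_FLAGS_` naming), is to set it via `os.environ` before the RAW file is opened. The `EventsIterator` usage shown in the comments is only a sketch with a placeholder file path, not a verified answer:

```python
import os

# Assumption: MV_FLAGS_EVT3_UNSAFE_DECODER is read by Metavision HAL as an
# environment variable, so it must be set BEFORE the EVT3 file is opened.
# Setting it to a non-empty value should select the tolerant (unsafe) decoder.
os.environ["MV_FLAGS_EVT3_UNSAFE_DECODER"] = "1"

# Hypothetical usage sketch (requires the Metavision SDK to be installed;
# "recording.raw" is a placeholder path):
# from metavision_core.event_io import EventsIterator
# mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)
# for evs in mv_iterator:
#     pass  # process events; decode errors should now be tolerated
```

If the variable has no effect, it is worth checking whether your SDK version expects the flag through a different mechanism (e.g. a RAW file stream config), since this sketch only covers the environment-variable route.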
EVK4 Stereo Camera ROS Driver Installation
We would appreciate your support in helping us resolve the ROS driver compatibility issue for our EVK4 stereo camera setup. We are currently working on setting up an event-based stereo vision system using two EVK4 cameras (serial numbers: P50905 and P50898).
Efficiently replaying sync data from sync camera
Dear community, thank you for reading. I am currently working with two EVK4 cameras synchronized with the sync module of the SDK. I recorded some synchronized RAW files and I want to play them back in real time, still synchronized. I cannot use the sync module
Interfacing Prophesee USB EVK with RT Linux CRIO
Is it possible to use a USB EVK* with a Linux-based NI cRIO? Ideally I would like to get an event stream into my control hardware, a National Instruments cRIO-9049. The cRIO uses Linux RT (not Ubuntu) and I am not sure if there is a simple way of doing this
How to play back data from raw files in real time?
When playing back data from a file using OpenEB (5.0.0) with the code below (simplified), the data packets are not delivered in real time (usually much faster). Is there anything I'm doing wrong? ``` const auto cfg = Metavision::FileConfigHints().real_time_playback(true);
Extrinsic Calibration Reference Point
Hello everyone, I am currently performing an extrinsic calibration with the EVK4 HD and would like to validate the results of my translation. I have already been able to measure the X (horizontal) and Y (vertical) directions in a CAD model or directly
EventsIterator does not work correctly
Dear Community, I am working on a project involving an event camera and have created a Python class to manage it. One of the class attributes, self.cam, is an initialized HAL device. The class includes a method, continuous_acquire_events, designed to
KV260 GenX320 to Raspberry Pi connection
Can we use the KV260 – GenX320 starter kit directly on a Raspberry Pi using an official Raspberry Pi standard-to-mini camera cable? Thank you very much for your answer!
KV260 + IMX636 not reading any events?
Dear Prophesee community, I have been doing extensive tests (a lot of trial and error) to set up my KV260 with the IMX636, following the quick start guide line by line... everything seems fine, but when metavision_viewer is launched the screen is black
Regarding embedded linux image for RDK2
Hi all, for an application, my Linux image size exceeds the flash partition range under the default settings in RDK2. In this regard, has anyone tried modifying the partition and flashing the image again? Thanks, Shankar
EVT2.1 format
Hello, I'm trying to understand how the EVT2.1 format works on the GenX320. To do so, I generated two light edges (positive then negative) with a contrast well above the threshold value. The generated .RAW file should contain data with a sequence
metavision_viewer unable to read camera_config.json correctly on KV260 + IMX636
This is the printout and the camera_config file. metavision_viewer (4.6.2) works well with the camera_config.json, but after upgrading to 5.0.0 it behaves like this. Has the format of the camera_config.json file changed? If so, could
Help Getting Started: Streaming GenX320 Output with STM32F746 (No GUI Needed)
Hi everyone, I'm new to event-based vision and have just started working with a Prophesee GenX320 camera and an STM32F746 microcontroller. My initial goal is simple: Just open the camera and stream its output — no GUI, no visualization — just raw event
Stereo Calibration
Hello, I calibrated my two event cameras in a stereo setup to perform depth mapping, but I am getting poor results, nowhere near the accuracy shown in the sample videos. I was wondering if you could provide the camera setup used for the courtyard stereo
eb-synced-pattern-detection not working in calibration pipeline
Hi, I am trying to calibrate two event cameras to perform depth mapping through a stereo setup. I am passing in my own calibration.json file to the metavision_calibration_pipeline script to extract the intrinsics and extrinsics parameters of each camera.
Camera not detected in Metavision Starter kit KV260 (IMX636)
Dear Prophesee Support Team, I am working with the Prophesee Metavision Starter Kit for AMD Kria KV260 and followed the instructions provided in the Quick Start Guide. After flashing the SD card and booting the board, I attempted to launch Metavision
UTC-Referenced Absolute Event Timing in Metavision Studio/GUI
For my application, I need to reference event times to UTC with microsecond precision. In Metavision Studio, the .raw recording names are only given to the nearest 1 second and the events are referenced to the camera connect time (not the recording start
Missing events in the images
Hi PROPHESEE Support, I have two problems. The first one is that when I use prophesee_ros_viewer.cpp from the prophesee_ros_wrapper (https://github.com/prophesee-ai/prophesee_ros_wrapper), I sometimes get blank lines in the middle of the image; see the
The specific value of the refractory period
Hello, I would like to know what is the refractory period in microseconds for the IMX636 version of EVK3 when the bias parameter is set to the default value.
Burst of negative event noise on startup
I have a custom setup where I am using a microcontroller to capture event data from the IMX636 via MIPI CSI; I've gotten to a point where data is captured reliably and efficiently. I've noticed that when I first initialize the camera and begin recording
Upper bound of get_illumination hal api function
Hello! I was wondering what the upper bound of the get_illumination function in the HAL API is. Experimentally, I've recorded 300 lux with the function, but I would be interested to know the upper bound of its sensing abilities.
About the temporal resolution of IMX636
I am currently using a high-speed projector with a minimum exposure time of 105 microseconds per frame. In my experiment, I project alternating all-white and all-black patterns continuously, with each frame having an exposure time of 105 µs. Under this
How the Reference Intensity Is Chosen After an Event Detection and on Start-Up
Hello everyone, I recently had some questions regarding the nature of the reference intensity for event detections, specifically how the reference intensity is chosen following an event detection and how it is chosen after a camera is started. I couldn't
Multiple DVS synchronization (more than 1 slave)
Hello everyone, Our team plans to synchronize more than one slave DVS (with one master and three slaves). We've followed the recommendations for multiple DVS synchronization, but during testing, we found that only one slave is precisely synced with the
Error modifying Metavision recipe
Hi! First, I wanted to point out a minor error in the KV260 Starter Kit Manual. In "Edit kria applications", the suggested command is petalinux-devtool modify metavision_x.x.x. However, petalinux-devtool modify takes the root recipe, without the version.
Metavision HAL Exception Error 105000:
Greetings! I wrote an application to configure and record event cameras. In this setup we are using an EVK4 with the IMX636. When changing bias_diff_off to above around 120, the camera loses its connection and a Metavision HAL Exception Error 105000 is
How do I know how the FrequencyMapAsyncAlgorithm works
Hello, I'm currently using the event camera to detect frequency information, but the FrequencyMapAsyncAlgorithm doesn't seem to be open to the public. How can I find out how the algorithm implements frequency detection?
USB and MIPI camera different bias defaults and ranges
I am seeing different default bias values and allowable ranges between the MIPI and USB versions of the IMX636. The following figure shows the bias values found in this article, as well as those obtained through testing in the Metavision Studio application. For
Are the Metavision SDK CMake files missing from the KV260 image?
Are the Metavision SDK CMake (.cmake) files missing from the KV260 Petalinux image? Same question for OpenCV. I am attempting to build some of the examples and CMake cannot find the MetavisionSDKConfig.cmake. I have attempted to locate any CMake files
Slice Condition using Python
Hi, I'm trying to use the CameraStreamSlicer() function to change the duration of each slice. I've tried looking through all the documentation, but I can't seem to figure out how to do this using the Python functions. Based on the information I
High latency and frame drops during event data capture for accumulation periods below 10 ms
We are using a KV260 and IMX636 MIPI camera setup under Kria Ubuntu for an event-camera application. We are capturing data at a high frame rate by setting the accumulation period to 1-10 ms, using OpenEB Metavision SDK 4.6.0. For
Uncertainty in lux measurement from .get_illumination()?
Greetings, I was wondering if the uncertainty in the lux measurement returned by `.get_illumination()` is known. Thank you.
Unable to locate the path to necessary DLLs for Metavision Python bindings
I am trying to follow the tutorial 'RAW File Loading using Python' (RAW File Loading using Python — Metavision SDK Docs 4.4.0 documentation (prophesee.ai)), but when I launch the program, it shows the following error: Unable to locate the path to necessary
No module named 'metavision_sdk_base_paths_internal'
Hi! When I tried to run the Python tutorials, there was a problem: No module named 'metavision_sdk_base_paths_internal'. Please see the attached photo below. I found some related files, but they are not named exactly the same as the module above. Could
IMX636 Bias Values for Automotive Driving Scenes
Hello, I am planning to use the EVK4 with the IMX636 sensor to record an automotive dataset. Are there any existing experiences or recommendations regarding bias adjustment for driving scenes or outdoor environments? If not, what would be the best
Bad x320 firmware? Driver development for Raspberry Pis
After a few weeks of playing around, our Genx320 can talk over i2c to a Raspberry Pi 5, but apparently not over MIPI/CSI2. This involved creating a device tree overlay, modifying the Linux drivers open sourced by Prophesee, patching v4l2 utils, compiling
About the IMX636 bias
I noticed in the documentation that for the IMX636, only bias_diff_on/off has an effect on sensitivity. But in my tests, I found that bias_diff also had an impact. I would like to know the details.