Multiple DVS synchronization (more than 1 slave)
Hello everyone, Our team plans to synchronize more than one slave DVS (with one master and three slaves). We've followed the recommendations for multiple DVS synchronization, but during testing, we found that only one slave is precisely synced with the
Help Getting Started: Streaming GenX320 Output with STM32F746 (No GUI Needed)
Hi everyone, I'm new to event-based vision and have just started working with a Prophesee GenX320 camera and an STM32F746 microcontroller. My initial goal is simple: Just open the camera and stream its output — no GUI, no visualization — just raw event
Error modifying Metavision recipe
Hi! First, I wanted to point out a minor error in the KV260 Starter Kit Manual. In "Edit kria applications", the suggested command is petalinux-devtool modify metavision_x.x.x. However, petalinux-devtool modify takes the root recipe, without the version.
Metavision HAL Exception Error 105000:
Greetings! I wrote an application to configure and record event cameras. In this setup we are using an EVK4 with IMX636. When changing bias_diff_off to above roughly 120, the camera loses its connection and a Metavision HAL Exception Error 105000: is
Upper bound of the get_illumination HAL API function
Hello! I was wondering what the upper bound is for the get_illumination function in the HAL API. Experimentally, I've recorded 300 lux with the function, but I would be interested to know the upper bound on its sensing abilities.
How to set MV_FLAGS_EVT3_UNSAFE_DECODER in the Python API
I'm currently utilizing the EventsIterator to read EVT3 files, but I've encountered some errors. I'd like to know how to set the MV_FLAGS_EVT3_UNSAFE_DECODER option in Python so that it can ignore these errors.
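The flag name suggests it is picked up from the environment rather than through a dedicated Python call; a minimal sketch under that assumption (the file path is a placeholder) would be:
```
import os

# Assumption: the flag is read from the environment by the EVT3 decoder,
# so it has to be set before the HAL plugins are loaded.
os.environ["MV_FLAGS_EVT3_UNSAFE_DECODER"] = "1"

from metavision_core.event_io import EventsIterator  # noqa: E402

# Placeholder path to the EVT3 RAW file that triggers the decoder errors
mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)
for evs in mv_iterator:
    print(evs.size)
```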
EVK4 Stereo Camera ROS Driver Installation
We would appreciate your support in helping us resolve the ROS driver compatibility issue for our EVK4 stereo camera setup. We are currently working on setting up an event-based stereo vision system using two EVK4 cameras (serial numbers: P50905 and P50898).
How do I know how the FrequencyMapAsyncAlgorithm works
Hello, I'm currently using the event camera to detect frequency information, but the FrequencyMapAsyncAlgorithm doesn't seem to be open to the public. How can I find out how the algorithm implements frequency detection?
The specific value of the refractory period
Hello, I would like to know what is the refractory period in microseconds for the IMX636 version of EVK3 when the bias parameter is set to the default value.
Efficiently replaying sync data from sync camera
Dear community, Thank you for reading. I am currently working with two EVK4 cameras synchronized using the sync module of the SDK. I recorded some synchronized RAW files and I want to play them back in real time, still synchronized. I cannot use the sync module
Interfacing Prophesee USB EVK with RT Linux CRIO
Is it possible to use a USB EVK* with a Linux-based NI cRIO? Ideally I would like to get an event stream into my control hardware, a National Instruments cRIO-9049. The cRIO uses Linux RT (not Ubuntu) and I am not sure if there is a simple way of doing this
USB and MIPI camera different bias defaults and ranges
I am seeing different default bias values and allowable ranges between the MIPI and USB versions of the IMX636. The following figure shows the bias values found in this article as well as through testing in the Metavision Studio application. For
Are Metavision SDK CMake files missing from KV260?
Are the Metavision SDK CMake (.cmake) files missing from the KV260 Petalinux image? Same question for OpenCV. I am attempting to build some of the examples and CMake cannot find the MetavisionSDKConfig.cmake. I have attempted to locate any CMake files
Saving the results of the generic tracking sample to a numpy file
I'm attempting to adapt the generic tracking script to allow me to save the data in a numpy file; however, whenever I inspect the data, the timestamps, when grouped by object_id, are exactly the same for each object. That is, object 10 will always have
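For reference, a common pitfall in this situation is storing a view of the SDK's output buffer, which gets reused on the next callback, so every stored slice ends up looking identical. A minimal sketch of copying before storing, assuming the tracker's output buffer exposes a .numpy() view like other Metavision output buffers (callback and variable names here are illustrative, not the exact sample code):
```
import numpy as np

all_tracks = []

def on_tracking_results(ts, tracked_buffer):
    # .numpy() typically returns a view over an internal buffer that is
    # reused by the SDK, so copy it before keeping a reference.
    data = tracked_buffer.numpy().copy()
    if data.size > 0:
        all_tracks.append(data)

# ... run the tracking pipeline with on_tracking_results as the output
# callback, then save everything at the end:
# np.save("tracks.npy", np.concatenate(all_tracks))
```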
Slice Condition using Python
Hi, I'm trying to make use of the CameraStreamSlicer() class to change the duration of each slice. I've tried looking through all the documentation but I can't seem to figure out how to do this using the Python functions. Based on the information I
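In the C++ API the slice duration is passed to the CameraStreamSlicer constructor as a slicing condition; assuming the Python bindings in metavision_sdk_stream mirror this (SliceCondition.make_n_us and the constructor argument order are assumptions here), a sketch might look like:
```
from metavision_sdk_stream import Camera, CameraStreamSlicer, SliceCondition

camera = Camera.from_file("recording.raw")  # or Camera.from_first_available()
condition = SliceCondition.make_n_us(5000)  # assumed factory: one slice every 5 ms
slicer = CameraStreamSlicer(camera.move(), condition)

for sl in slicer:
    # each slice should carry its timestamp and an event buffer
    print(sl.t, sl.events.size)
```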
Vibration estimation seems to have a weird bottleneck
Hi, I've been using the vibration estimation code that is provided, but I'm having some issues. The vibration estimation works fine even for detecting high frequencies; however, I want to run the detection itself at a higher frequency (i.e., how often the algorithm
High latency and frame drops during event data capture for accumulation periods below 10 ms
We are using a KV260 and IMX636 MIPI camera setup for our system under Kria Ubuntu, for one of our event-camera applications. We are capturing data at a high FPS by setting an accumulation period of 1-10 ms using OpenEB Metavision SDK 4.6.0. For
Uncertainty in lux measurement from .get_illumination()?
Greetings, I was wondering whether the uncertainty in the lux measurement returned by `.get_illumination()` is known? Thank you.
KV260 + IMX636 not reading any events?
Dear Prophesee community, I have been doing extensive tests (a lot of trial and error) to set up my KV260 with the IMX636, following the quick start guide line by line... everything seems fine, but when metavision_viewer is launched the screen is black
Unable to locate the path to necessary DLLs for Metavision Python bindings
I am trying to follow the tutorial 'RAW File Loading using Python' (RAW File Loading using Python — Metavision SDK Docs 4.4.0 documentation (prophesee.ai)), but when I launch the program, it shows the following error: Unable to locate the path to necessary
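For what it's worth, on Windows this usually means the directory containing the Metavision native DLLs is not visible to the Python bindings when they load. A minimal sketch of the usual workaround, where the install path is a placeholder you would adjust to your own setup:
```
import os

# Placeholder: folder of your Metavision installation that contains the DLLs
metavision_bin = r"C:\Program Files\Prophesee\bin"

# Make the DLLs findable before importing the bindings: prepend to PATH and,
# on Python 3.8+ for Windows, register the directory explicitly.
os.environ["PATH"] = metavision_bin + os.pathsep + os.environ.get("PATH", "")
if hasattr(os, "add_dll_directory"):
    os.add_dll_directory(metavision_bin)

from metavision_core.event_io import RawReader  # noqa: E402
```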
No module named 'metavision_sdk_base_paths_internal'
Hi! When I tried to run the Python tutorials, there is a problem: No module named 'metavision_sdk_base_paths_internal'. Please see the attached photo below: I found some related files, but they are not named exactly the same as the module above: Could
IMX636 Bias Values for Automotive Driving Scenes
Hello, I am planning to use the EVK4 with the IMX636 sensor to record an automotive dataset. Are there any existing experiences or recommendations regarding bias adjustment for driving scenes or outdoor environments? If not, what would be the best
EventsIterator does not work correctly
Dear Community, I am working on a project involving an event camera and have created a Python class to manage it. One of the class attributes, self.cam, is an initialized HAL device. The class includes a method, continuous_acquire_events, designed to
Bad x320 firmware? Driver development for Raspberry Pis
After a few weeks of playing around, our GenX320 can talk over I2C to a Raspberry Pi 5, but apparently not over MIPI/CSI-2. This involved creating a device tree overlay, modifying the Linux drivers open-sourced by Prophesee, patching v4l2 utils, compiling
Recording Application for Two EVK4s and an RGB Camera
Greetings! In order to do research on event-based vision in the context of autonomous driving, I am currently working on developing an application to record two EVK4s as well as images from an industrial RGB camera. The application is written
About the IMX636 bias
I noticed in the documentation that for the IMX636 model, only bias_diff_on/off has an effect on its sensitivity. But in my tests, I found that bias_diff also had an impact. I would like to know the details.
x320 writes bad data to file?
With an x320 ES (which doesn't support the EVT3 format), a patched OpenEB (to support V4L2 devices) can only write RAW files of collected data; HDF5 files open and then almost instantly close without any errors. If I convert the RAW files to HDF5 with `metavision_file_to_hdf5`
The exact threshold value of IMX636
I am currently utilizing the IMX636 sensor for my research. To proceed with my work, I need to determine the exact threshold value rather than the bias parameter. How can I get this information?
Camera not detected in Metavision Starter kit KV260 (IMX636)
Dear Prophesee Support Team, I am working with the Prophesee Metavision Starter Kit for AMD Kria KV260 and followed the instructions provided in the Quick Start Guide. After flashing the SD card and booting the board, I attempted to launch Metavision
Event Camera for Optical Communication
Has anyone used an event camera for optical communication? I would like to use an LED to transmit data that can be received with an event camera. Some scenarios include: 1) stationary transmitter 2) moving transmitter 3) multiple transmitters
Event camera noise
Hello, I have an EVK3-IMX636 camera, and my observation scenario involves detecting faint spatial targets in dark space. I would like to inquire about how to configure the camera so that it outputs as many events as possible generated by the sensor, without
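One knob usually involved here is the contrast-sensitivity biases exposed through the HAL I_LL_Biases facility. A minimal sketch of reading and changing them from Python; the specific offset below is a placeholder, not a recommendation for this scenario, and valid ranges depend on the sensor and SDK version:
```
from metavision_hal import DeviceDiscovery

device = DeviceDiscovery.open("")  # open the first available camera
biases = device.get_i_ll_biases()

print(biases.get_all_biases())  # inspect the current bias set

# Placeholder offset: decreasing bias_diff_on / bias_diff_off generally makes
# the sensor more sensitive (more events, but also more noise).
biases.set("bias_diff_on", -20)
biases.set("bias_diff_off", -20)
```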
Advice to combine spatter_tracking and vibration_estimation algorithms
Good afternoon, I wanted to ask how best to combine the spatter_tracking and vibration_estimation algorithms in one program, so that when I run it, both features appear in the same window and work flawlessly in parallel. I used the scripts provided
Spectral response curve for EVK4
Hello, I am trying to measure and verify the response probability S-curve of the EVK4 camera, but I cannot find the quantum efficiency or a detailed spectral response curve for the sensor in the preliminary datasheet. I only found a table with several wavelengths
How to play back data from raw files in real time?
When playing back data from a file using OpenEB (5.0.0) with the code below (simplified), the data packets are not delivered in real time (usually much faster). Is there anything that I'm doing wrong? ``` const auto cfg = Metavision::FileConfigHints().real_time_playback(true);
EVK4 Camera data transfer
The EVK4 camera with a USB 3 cable is rated for hundreds of MB/s of data transfer; however, when running the EVK4 camera in a rapidly changing environment (where almost all pixels across the camera's resolution are changing) for around 10 seconds, an
Expected response for large intensity changes
Hello, I have a question regarding the expected response from the event camera pixels for very large intensity changes. The working principle of event cameras is that if the log contrast between the intensity levels exceeds a certain threshold (determined
Some questions about positive events and negative events
I have an EVK4 device. When I use Python's EventsIterator to read the events generated by the device, with delta_t set to 10 ms and other parameters at their default values, I found that when the brightness of the scene captured by the event camera does
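For reference, the CD events returned by EventsIterator form a structured numpy array with x, y, p and t fields, so the ON/OFF split per 10 ms slice can be checked directly. A minimal sketch (the empty input path opens the first available live camera; a RAW file path also works):
```
from metavision_core.event_io import EventsIterator

mv_iterator = EventsIterator(input_path="", delta_t=10000)

for evs in mv_iterator:
    if evs.size == 0:
        continue
    n_on = int((evs["p"] == 1).sum())   # positive (ON) events
    n_off = int((evs["p"] == 0).sum())  # negative (OFF) events
    print(f"t={evs['t'][0]} us  ON={n_on}  OFF={n_off}")
```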
Details about internal circuits
To facilitate my research, I would like to understand the general logic of the integrated circuit inside the sensor, or a schematic diagram. Where could I obtain these materials? My sensor is the IMX636.
About the sensitivity of the sensor under different lighting conditions
I have a question regarding the reduced sensitivity of sensors under low light intensity. I noticed from the datasheet that a higher contrast change is required to generate an event under low light intensity. However, it seems that the same absolute value
Power consumption EVK3
Hello everyone! According to the EVK3 product brief, power consumption is 4.5 W, but there is no power consumption value in the EVK3 manual. For the EVK4, power consumption is 500 mW (typ.) and 1.5 W (max.), according to its manual. So we measured maximum