Metavision Sparse Optical Flow Questions
Hello, I am currently using the sparse optical flow algorithm and I have a few questions. As an object enters the FOV of the camera, its velocity is tracked as near zero. This is because, as the object moves into view, its "center" is staying apparently
http 404 error in train_detection.py script
Hello, I am trying to run the train_detection.py script in https://docs.prophesee.ai/stable/samples/modules/ml/train_detection.html#chapter-samples-ml-train-detection and I run it with the "toy_problem" path as shown below: python train_detection.py .
Frequency / bias value matching table
Hi, I just found this table that relates bias_hpf and bias_fo in terms of high-pass cut-off frequency and low-pass cut-off frequency, but it is for GEN 3.1. Can you please share the table or the graphs (https://support.prophesee.ai/portal/en/kb/articles/bias-tuning-flow)
metavision_hal python package
Hi, does anyone know where the metavision_hal Python module is supposed to be located? I went through all the installation steps but I can't find that package. I originally got the error that the metavision_core Python module wasn't found, but then I copied
Impact of 10Hz Sync Signal on EVK4 + RGB Camera Synchronization Precision
Hello Prophesee Team, We are currently developing a multi-sensor data acquisition system using the Prophesee EVK4 alongside a standard RGB camera, and we require precise time synchronization between them. I have carefully reviewed the official synchronization
IMX636 internal clock PPM
Hello, I am trying to find information about the expected internal clock accuracy in PPM compared to a "perfect" clock. Online sources give average "Standard Crystal Oscillators" somewhere around 20 PPM, which means around 12 seconds of drift per week. Does
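For reference, the 20 PPM figure in the question works out as follows (a quick sanity check on the arithmetic, not a specified value for the IMX636's oscillator):

```python
# Drift accumulated by a 20 PPM oscillator over one week:
# fractional frequency error multiplied by elapsed time.
ppm = 20
seconds_per_week = 7 * 24 * 3600        # 604800 s
drift = ppm * 1e-6 * seconds_per_week   # ~12.1 s of drift per week
print(f"{drift:.2f} s per week")
```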
Vibration estimation seems to have a weird bottleneck
Hi, I've been using the vibration estimation code that is provided, but I'm having some issues. The vibration estimation works fine even for detecting high frequencies; however, I want to run the detection itself at a higher frequency (that is, how often the algorithm
Stereo Calibration Depth Mapping
Hi, I am trying to use the metavision_stereo_metavision.py script to create a depth map using two synchronized recordings obtained from the metavision_sync.py script, but when I run the script I get a lot of NonMonotonicTimeHigh and InvalidVectBase errors
Advice to combine spatter_tracking and vibration_estimation algorithms
Good afternoon, I wanted to ask how best to combine the spatter_tracking and vibration_estimation algorithms in one program, so that when I run it, both features appear in the same window and work flawlessly in parallel. I used these scripts provided
Saving the results of the generic tracking sample to a numpy file
I'm attempting to adapt the generic tracking script to allow me to save the data in a numpy file, however, whenever I inspect the data, the timestamps when grouped by object_id are exactly the same for each object. That is, object 10 will always have
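A likely culprit for identical timestamps per object (a common pitfall when storing event data, though I can't see the adapted script): the iterator reuses its internal buffer between callbacks, so keeping the yielded arrays directly stores views that all end up pointing at the last batch. A minimal numpy sketch of the symptom and the fix; the field names mimic Metavision's event dtype, and the reused-buffer generator is a toy stand-in:

```python
import numpy as np

# Toy stand-in for event batches with Metavision-style fields.
dtype = np.dtype([("x", "u2"), ("y", "u2"), ("p", "i2"), ("t", "i8")])

def batches():
    buf = np.zeros(3, dtype=dtype)   # simulates a buffer reused by the iterator
    for t0 in (0, 100, 200):
        buf["t"] = t0 + np.arange(3)
        yield buf                    # same underlying memory every iteration

# Wrong: storing views -> every stored batch shows the LAST batch's timestamps.
views = [b for b in batches()]

# Right: copy each batch before keeping it.
copies = [b.copy() for b in batches()]

# Concatenate the copies into one array suitable for np.save().
all_events = np.concatenate(copies)
```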
Recording Application for two EVK4's and an RGB Camera
Greetings! In order to do research on event-based vision in the context of autonomous driving, I am currently working on developing an application that records two EVK4s as well as images from an industrial RGB camera. The application is written
How to set MV_FLAGS_EVT3_UNSAFE_DECODER in Python api
I'm currently utilizing the EventsIterator to read EVT3 files, but I've encountered some errors. I'd like to know how to set the MV_FLAGS_EVT3_UNSAFE_DECODER option in Python so that it can ignore these errors.
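For what it's worth, one approach is to set the flag through the environment before the file is opened; I'm assuming here that the HAL reads `MV_FLAGS_EVT3_UNSAFE_DECODER` as an environment variable at decoder-creation time, and `recording.raw` is a placeholder path:

```python
import os

# Set the flag BEFORE the RAW file is opened; the decoder checks the
# environment when it is instantiated (assumption: any value enables it).
os.environ["MV_FLAGS_EVT3_UNSAFE_DECODER"] = "1"

try:
    from metavision_core.event_io import EventsIterator
    for evs in EventsIterator("recording.raw", delta_t=10000):
        pass  # process events
except Exception:
    # SDK not installed or placeholder file missing; the sketch only
    # demonstrates the environment-variable setup above.
    print("Metavision SDK unavailable in this environment")
```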
EVK4 Stereo Camera ROS Driver Installation
We would appreciate your support in helping us resolve the ROS driver compatibility issue for our EVK4 stereo camera setup. We are currently working on setting up an event-based stereo vision system using two EVK4 cameras (serial numbers: P50905 and P50898).
Efficiently replaying sync data from sync camera
Dear community, thank you for reading. I am currently working with two EVK4 cameras synchronized with the sync module of the SDK. I recorded some synchronized RAW files and I wanted to play them back in real time, still synchronized. I cannot use the sync module
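One generic workaround when the SDK's own real-time playback can't be used is to pace the recorded stream manually: sleep until each batch's recorded timestamp lines up with the wall clock. A sketch of that pacing logic with synthetic batches (the I/O and the batch contents are placeholders; the same loop can drive two file readers to keep them aligned):

```python
import time

def replay_realtime(batches, speed=1.0):
    """Yield (t_us, batch) pairs no earlier than their recorded time."""
    start_wall = time.monotonic()
    for t_us, batch in batches:
        # Wall-clock instant at which this batch is "due".
        target = start_wall + (t_us / 1e6) / speed
        delay = target - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        yield t_us, batch

# Synthetic stand-in: three batches stamped 0, 50 ms and 100 ms.
recorded = [(0, "a"), (50_000, "b"), (100_000, "c")]
t0 = time.monotonic()
out = [b for _, b in replay_realtime(recorded)]
elapsed = time.monotonic() - t0  # should be close to 0.1 s
```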
Interfacing Prophesee USB EVK with RT Linux CRIO
Is it possible to use a USB EVK* with a Linux-based NI cRIO? Ideally I would like to get an event stream into my control hardware, a National Instruments cRIO-9049. The cRIO uses Linux RT (not Ubuntu) and I am not sure if there is a simple way of doing this
How to play back data from raw files in real time?
When playing back data from a file using OpenEB (5.0.0) with the code below (simplified), the data packets are not delivered in real time (they usually arrive much faster). Is there anything I'm doing wrong? ``` const auto cfg = Metavision::FileConfigHints().real_time_playback(true); ```
Extrinsic Calibration Reference Point
Hello everyone, I am currently performing an extrinsic calibration with the EVK4 HD and would like to validate the results of my translation. I have already been able to measure the X (horizontal) and Y (vertical) directions in a CAD model or directly
EventsIterator does not work correctly
Dear Community, I am working on a project involving an event camera and have created a Python class to manage it. One of the class attributes, self.cam, is an initialized HAL device. The class includes a method, continuous_acquire_events, designed to
KV260 GenX320 to Raspberry Pi connection
Can we use the KV260 – GenX320 starter kit directly on a Raspberry Pi using an official Raspberry Pi standard-to-mini camera cable? Thank you very much for your answer !
kv260 + IMX636 not reading any event?
Dear Prophesee community, I have been doing extensive tests (a lot of trial and error) to set up my KV260 with the IMX636, following the quickstart guide line by line... everything seems fine, but when metavision_viewer is launched the screen is black
Regarding embedded linux image for RDK2
Hi all, for an application, my Linux image size exceeds the flash partition range per the default settings on RDK2. In this regard, has anyone tried modifying the partition and dumping the image again? Thanks, Shankar
EVT2.1 format
Hello, I'm trying to understand how the EVT 2.1 format works on GENX320. In order to do so, I generated 2 light edges (positive then negative) with a contrast well above the threshold value. The generated .RAW file should contain data with a sequence
metavision_viewer unable to read camera_config.json correctly in KV260+imx636
This is the printout and the camera_config file. metavision_viewer (4.6.2) is able to work well with the camera_config.json, but after the upgrade to 5.0.0 it became like this. Has there been a change to the format of the camera_config.json file? If so, could
Help Getting Started: Streaming GenX320 Output with STM32F746 (No GUI Needed)
Hi everyone, I'm new to event-based vision and have just started working with a Prophesee GenX320 camera and an STM32F746 microcontroller. My initial goal is simple: Just open the camera and stream its output — no GUI, no visualization — just raw event
Stereo Calibration
Hello, I calibrated my two event cameras in a stereo setup to perform depth mapping, but I am getting poor results, nowhere near the accuracy shown in the sample videos. I was wondering if you could provide the camera setup used for the courtyard stereo
eb-synced-pattern-detection not working in calibration pipeline
Hi, I am trying to calibrate two event cameras to perform depth mapping through a stereo setup. I am passing in my own calibration.json file to the metavision_calibration_pipeline script to extract the intrinsics and extrinsics parameters of each camera.
Camera not detected in Metavision Starter kit KV260 (IMX636)
Dear Prophesee Support Team, I am working with the Prophesee Metavision Starter Kit for AMD Kria KV260 and followed the instructions provided in the Quick Start Guide. After flashing the SD card and booting the board, I attempted to launch Metavision
UTC-Referenced Absolute Event Timing in Metavision Studio/GUI
For my application, I need to reference event times to UTC with microsecond precision. In Metavision Studio, the .raw recording names are only given to the nearest 1 second and the events are referenced to the camera connect time (not the recording start
Missing events in the images
Hi Prophesee Support, I have two problems. The first one is that when I use prophesee_ros_viewer.cpp from the prophesee_ros_wrapper (https://github.com/prophesee-ai/prophesee_ros_wrapper), sometimes I get blank lines in the middle of the image; see the
The specific value of the refractory period
Hello, I would like to know what is the refractory period in microseconds for the IMX636 version of EVK3 when the bias parameter is set to the default value.
Burst of negative event noise on startup
I have a custom setup where I am using a microcontroller to capture event data from the IMX636 via MIPI CSI; I've gotten to a point where data is captured reliably and efficiently. I've noticed that when I first initialize the camera and begin recording
Upper bound of get_illumination hal api function
Hello! I was wondering what the upper bound is of the get_illumination function in the HAL API. Experimentally, I've recorded 300 lux with the function, but I would be interested to know the upper bound of its sensing abilities.
About the temporal resolution of IMX636
I am currently using a high-speed projector with a minimum exposure time of 105 microseconds per frame. In my experiment, I project alternating all-white and all-black patterns continuously, with each frame having an exposure time of 105 µs. Under this
How the Reference Intensity Is Chosen After an Event Detection and on Start-Up
Hello everyone, I recently had some questions regarding the nature of the reference intensity for event detections, specifically how the reference intensity is chosen following an event detection and how it is chosen after a camera is started. I couldn't
Multiple DVS synchronization (more than 1 slave)
Hello everyone, Our team plans to synchronize more than one slave DVS (with one master and three slaves). We've followed the recommendations for multiple DVS synchronization, but during testing, we found that only one slave is precisely synced with the
Error modifying Metavision recipe
Hi! First, I wanted to point out a minor error in the KV260 Starter Kit Manual. In "Edit kria applications", the suggested command is petalinux-devtool modify metavision_x.x.x. However, petalinux-devtool modify takes the root recipe, without the version.
Metavision HAL Exception Error 105000:
Greetings! I wrote an application to configure and record event cameras. In this setup we are using an EVK4 with IMX636. When changing bias_diff_off to over around 120, the camera loses its connection and a Metavision HAL Exception Error 105000 is
How do I know how the FrequencyMapAsyncAlgorithm works
Hello, I'm currently using the event camera to detect frequency information, but the FrequencyMapAsyncAlgorithm doesn't seem to be open to the public. How can I find out how the algorithm implements frequency detection?
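The SDK's implementation isn't public, but the usual idea behind per-pixel frequency maps is to measure the period between successive events of the same polarity at each pixel (e.g. consecutive ON transitions of a blinking source) and convert a stable period into a frequency. A toy sketch of that principle, not the SDK's actual algorithm (field names follow Metavision's event dtype):

```python
import numpy as np

def frequency_map(events, width, height):
    """Estimate per-pixel frequency (Hz) from periods between ON events.

    events: structured array with fields x, y, p, t (t in microseconds).
    """
    last_t = np.full((height, width), -1, dtype=np.int64)
    freq = np.zeros((height, width))
    for e in events[events["p"] == 1]:          # keep positive events only
        x, y, t = e["x"], e["y"], e["t"]
        if last_t[y, x] >= 0:
            period_us = t - last_t[y, x]        # time since previous ON event
            if period_us > 0:
                freq[y, x] = 1e6 / period_us    # period in µs -> frequency in Hz
        last_t[y, x] = t
    return freq

# A pixel at (x=2, y=3) blinking every 10 ms, i.e. 100 Hz:
dtype = np.dtype([("x", "u2"), ("y", "u2"), ("p", "i2"), ("t", "i8")])
evs = np.array([(2, 3, 1, 0), (2, 3, 1, 10_000), (2, 3, 1, 20_000)], dtype=dtype)
fmap = frequency_map(evs, width=10, height=10)
```

The production algorithm is asynchronous and adds validity checks (period stability, timeouts), but the period-to-frequency conversion above is the core idea.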
USB and MIPI camera different bias defaults and ranges
I am seeing different default bias values and allowable ranges between the MIPI and USB versions of the IMX636. The following figure shows the bias values found in this article as well as through testing in the Metavision Studio application. For
Are the Metavision SDK CMake files missing from the KV260 image?
Are the Metavision SDK CMake (.cmake) files missing from the KV260 Petalinux image? Same question for OpenCV. I am attempting to build some of the examples and CMake cannot find the MetavisionSDKConfig.cmake. I have attempted to locate any CMake files