PYNQ driver for the GenX320 camera
Hello, Since I wasn't able to find a driver that allowed the GenX320 camera to work with the PYNQ framework, I ported the existing drivers to Python for this purpose. You can find the code here. The available example is geared towards my current project,
Error with pybind11 when running metavision_model_3d_tracking.py (EVK4, Python)
Hello! I’m using an EVK4 event camera with Python and running the example metavision_model_3d_tracking.py together with the “maker” sample. When I run it, the program fails with an error that looks related to the pybind11 module. Could anyone advise how
Using SDK in a Python venv / micromamba environment
I have an EVK4 and I am trying to use the SDK in a Python virtual environment. - Hardware: Prophesee EVK4 event-based camera - SDK: Prophesee Metavision SDK, compiled specifically for Python 3.10 (the maximum for my Ubuntu version) - Environment: Micromamba
IMX636 - Anti Flicker Filter (AFK) - Problem
Hello, I am currently working with the IMX636 camera and I am trying to suppress the events generated by the pixels of a computer monitor (60 Hz). My goal is to use the AFK (Anti-Flicker) filter to remove this flickering. To do this, I modified the camera
EVK4 HD Calibration result
Hello, I'm new to event cameras and I followed most of the tutorials from your documentation in order to do good bias tuning and an intrinsic calibration. I used the Metavision calibration pipeline with the chessboard pattern and I managed to get result
Establishing a time reference
Hello Prophesee Team, Is it possible to establish a time reference with the API (ideally Python)? For example, if one were to exploit the do_time_shifting functionality, could a timestamp simply be taken as the EventsIterator is called? Or is this too
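(Not an official Prophesee API.) One approach along the lines this question suggests is to snapshot the host wall clock right before iteration starts and offset the camera-relative event timestamps by it; the mapping is only as accurate as that single clock read. A minimal pure-Python sketch:

```python
import time

def make_timestamp_mapper(host_start_us=None):
    """Return a function mapping camera-relative event timestamps (us)
    to host wall-clock timestamps (us since the Unix epoch).

    Assumes event timestamps start near 0 when streaming begins
    (e.g. with do_time_shifting enabled).
    """
    if host_start_us is None:
        host_start_us = time.time_ns() // 1000
    def to_wall_clock(event_ts_us):
        return host_start_us + event_ts_us
    return to_wall_clock

# Hypothetical usage: call make_timestamp_mapper() immediately before
# the first EventsIterator batch, then map each event's 't' field.
mapper = make_timestamp_mapper(host_start_us=1_700_000_000_000_000)
print(mapper(2500))  # 1700000000002500
```

The residual error is dominated by the latency between reading the host clock and the sensor actually starting to stream, so treat the result as approximate.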
Metavision Studio installation
I recently installed Metavision Studio on my Windows system, but I'm encountering two errors when trying to run the application: The first error states: "Windows doesn't find \share\metavision\apps\metavision_studio\internal\client\Metavision studio."
How to change camera settings before calling EventsIterator in python
I need some help with the Python interface. I can read the events from the camera with EventsIterator from metavision_core.event_io. I also know how to change camera settings using Camera from metavision_sdk_stream. However, I am not able to combine both things
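For reference, the pattern Prophesee's documentation describes for this is to open the device through HAL, apply the settings on the device, and then hand that same device to the iterator. A sketch, assuming the Metavision SDK is installed; the bias name and value below are illustrative only:

```python
# Guarded import so the sketch degrades gracefully where the SDK is absent.
try:
    from metavision_hal import DeviceDiscovery
    from metavision_core.event_io import EventsIterator
    sdk_available = True
except ImportError:
    sdk_available = False

if sdk_available:
    device = DeviceDiscovery.open("")                  # first available camera
    device.get_i_ll_biases().set("bias_diff_on", 120)  # illustrative bias/value
    # Build the iterator from the already-configured device.
    mv_iterator = EventsIterator.from_device(device=device)
    for evs in mv_iterator:
        pass  # process event batches here
```

The key point is that settings are applied on the HAL device before the EventsIterator is constructed, so both sides share the same device handle.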
Is it possible to get multiple Entra accounts, for JFrog access, using only one camera's serial code?
I started a project a few months ago using the EVK4, now another individual is looking to take over the project. Is it possible for them to create their own Microsoft Entra ID to access Prophesee resources on JFrog?
Streaming Event data as frames to an external GUI based application using Tkinter
Hello, I am having issues using my EVK4 with a GUI-based application I have designed to stream the event camera and a frame camera (semi-)simultaneously. The script is written in Python and uses queues to put event frames and regular frames in separate
How to build metavision C code only onto KV260
Hello, I am starting with the compilation flow of the embedded version of the Prophesee application on the Kria KV260 with the IMX636. The full compilation flow is OK (FPGA hardware, application code metavision-active-marker-3d-tracking, and PetaLinux build).
SDK for the IMX636 USB camera kit
Is there an SDK for the IMX636 EVK connected over USB? If so, please share it. Thank you.
EVK2 camera not being detected.
Hello, I am trying to use the EVK2 camera with the attached Dockerfile, but I am experiencing some errors when executing metavision_platform_info. The camera is connected to my laptop, but inside the container, when using this command, I am obtaining this
Prebuilt metavision_dense_optical_flow.exe (Metavision 4.6.2, Win11)
Hello, I am trying to launch the prebuilt metavision_dense_optical_flow.exe C++ example binary on Windows 11 with Metavision v4.6.2. In the terminal it prints: Instantiating TripletMatchingFlowAlgorithm with radius=1.5, then the GUI window is shown and quickly
Using the IMX636 + MIPI CSI-2 with Jetson Orin instead of AMD Kria
Does anyone have experience making this work? - Device tree looks like it's available here: https://github.com/prophesee-ai/linux-sensor-drivers - I'm curious if there's a driver available for the Jetson, and if not, if anybody has tips to write one for
RAM usage builds up when EVK4 captures high event rates
Greetings! While building a recording application, I noticed that when the EVK4 faces conditions that result in high event rates being captured, the system memory (RAM) used by the application constantly increases but never decreases
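A common cause of this pattern (an assumption about this particular application, not a diagnosis) is an unbounded producer/consumer queue: at high event rates the camera callback produces batches faster than the consumer drains them, so buffered batches pile up indefinitely. A minimal sketch of a bounded, drop-oldest buffer in plain Python:

```python
from collections import deque
from threading import Lock

class DropOldestQueue:
    """Bounded buffer that discards the oldest batch when full,
    so memory stays capped even if the consumer falls behind."""
    def __init__(self, maxlen):
        self._dq = deque(maxlen=maxlen)  # deque drops from the left when full
        self._lock = Lock()
        self.dropped = 0                 # count of discarded batches

    def put(self, batch):
        with self._lock:
            if len(self._dq) == self._dq.maxlen:
                self.dropped += 1
            self._dq.append(batch)

    def get(self):
        with self._lock:
            return self._dq.popleft() if self._dq else None

q = DropOldestQueue(maxlen=2)
for i in range(5):
    q.put(i)
print(q.get(), q.dropped)  # 3 3
```

Whether dropping data is acceptable depends on the recording use case; the alternative is to block the producer, which caps memory at the cost of back-pressure on the camera thread.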
Missing images in dataset structure for training YOLOV8 model
Hi, I have the following structure for my dataset to train a yolov8 model according to the documentation but I am getting the error: "AssertionError: train: No images found in /home/allen/code/proheese-camera/train_classification_model/train. Supported
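For comparison, the Ultralytics YOLOv8 loader expects images and labels in mirrored sibling directories; it derives each label path by substituting "images" with "labels" in the image path. The tree below is the usual convention, not something specific to this dataset:

```
dataset/
├── images/
│   ├── train/   # .jpg/.png frames
│   └── val/
└── labels/
    ├── train/   # one .txt per image: class x_center y_center w h
    └── val/
```

If the dataset YAML's train: entry points at a folder containing no files in a supported image format (directly or in subfolders), the loader raises an AssertionError of exactly this form.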
Is there a Python version of the Simple Window using C++ Example?
I want to replicate this example using Python because I have very little experience working with C++. https://docs.prophesee.ai/stable/samples/modules/ui/simple_window.html Has someone done this before? Or is this something that Prophesee has provided
Readout Saturation Error - Events Flashing Spontaneously
Hi, I'm currently using the EVK4 Prophesee event camera, and am recording events via ros2 subscription into a rosbag recording. When I try to replay my recording, I notice that spontaneously (for different durations too), the events on the screen appear
train_detection script not training properly (Warning: No boxes were added to the evaluation)
Hi, I am trying to use the train_detection.py script to train a model on the public FRED dataset (https://miccunifi.github.io/FRED/). I put the data in the correct structure, but when I start training, every epoch displays the message: Warning: No
Error trying to use metavision sdk on python
Hello, I have installed Metavision SDK 4.6 and created an Anaconda environment with Python 3.9. I am trying to load the Metavision packages; however, I always get the same error: "ImportError Traceback (most recent call last) ----> 5 from metavision_core.event_io
Metavision Sparse Optical Flow Questions
Hello, I am currently using the sparse optical flow algorithm and I have a few questions. As an object enters the FOV of the camera, its velocity is tracked as near zero. This is because as the object moves into view, its "center" is staying apparently
http 404 error in train_detection.py script
Hello, I am trying to run the train_detection.py script in https://docs.prophesee.ai/stable/samples/modules/ml/train_detection.html#chapter-samples-ml-train-detection and I run it with the "toy_problem" path as shown below: python train_detection.py .
metavision_hal python package
Hi, does anyone know where the metavision_hal Python module is supposed to be located? I went through all the installation steps but I can't find that package. I originally got the error that the metavision_core Python module wasn't found, but then I copied
Vibration estimation seems to have a weird bottleneck
Hi, I've been using the vibration estimation code that is provided, but I'm having some issues. The vibration estimation works fine even for detecting high frequencies; however, I want to run the detection itself at a higher frequency (i.e., how often the algorithm
Stereo Calibration Depth Mapping
Hi, I am trying to use the metavision_stereo_metavision.py script to create a depth map using two synchronized recordings obtained from the metavision_sync.py script, but when I run the script I get a lot of NonMontonicTimeHigh and InvalidVectBase errors
Advice to combine spatter_tracking and vibration_estimation algorithms
Good afternoon, I wanted to ask how best to combine the spatter_tracking and vibration_estimation algorithms in one program, so that when I run it, both features appear in the same window and work flawlessly in parallel. I used the scripts provided
Recording Application for two EVK4's and an RGB Camera
Greetings! In order to do research on event-based vision in the context of autonomous driving, I am currently working on developing an application to record two EVK4s as well as images from an industrial RGB camera. The application is written
How to set MV_FLAGS_EVT3_UNSAFE_DECODER in Python api
I'm currently utilizing the EventsIterator to read EVT3 files, but I've encountered some errors. I'd like to know how to set the MV_FLAGS_EVT3_UNSAFE_DECODER option in Python so that it can ignore these errors.
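To my knowledge, MV_FLAGS_EVT3_UNSAFE_DECODER is read by Metavision HAL as an environment variable, so from Python it can be set via os.environ before the iterator is constructed. A sketch; check the exact flag name and behavior against your SDK version:

```python
import os

# HAL reads the flag from the environment when the decoder is created,
# so it must be set before constructing the EventsIterator.
os.environ["MV_FLAGS_EVT3_UNSAFE_DECODER"] = "1"

# from metavision_core.event_io import EventsIterator  # requires the SDK
# mv_iterator = EventsIterator("recording.raw")        # hypothetical file
print(os.environ["MV_FLAGS_EVT3_UNSAFE_DECODER"])  # 1
```

Note that the "unsafe" decoder skips consistency checks rather than fixing the underlying stream errors, so decoded events after a corrupted region may still be unreliable.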
EVK4 Stereo Camera ROS Driver Installation
We would appreciate your support in helping us resolve the ROS driver compatibility issue for our EVK4 stereo camera setup. We are currently working on setting up an event-based stereo vision system using two EVK4 cameras (serial numbers: P50905 and P50898).
How to play back data from raw files in real time?
When playing back data from a file using OpenEB (5.0.0) with the code below (simplified), the data packets are not delivered in real time (usually much faster). Is there anything I'm doing wrong? `const auto cfg = Metavision::FileConfigHints().real_time_playback(true);`
kv260 + IMX636 not reading any event?
Dear Prophesee community, I have been doing extensive tests (a lot of trial and error) to set up my KV260 with the IMX636, following the quickstart guide line by line... everything seems fine, but when metavision_viewer is launched the screen is black
Help Getting Started: Streaming GenX320 Output with STM32F746 (No GUI Needed)
Hi everyone, I'm new to event-based vision and have just started working with a Prophesee GenX320 camera and an STM32F746 microcontroller. My initial goal is simple: Just open the camera and stream its output — no GUI, no visualization — just raw event
Stereo Calibration
Hello, I calibrated my two event cameras in a stereo setup to perform depth mapping, but I am getting poor results, nowhere near the accuracy shown in the sample videos. I was wondering if you could provide the camera setup used for the courtyard stereo
eb-synced-pattern-detection not working in calibration pipeline
Hi, I am trying to calibrate two event cameras to perform depth mapping through a stereo setup. I am passing in my own calibration.json file to the metavision_calibration_pipeline script to extract the intrinsics and extrinsics parameters of each camera.
UTC-Referenced Absolute Event Timing in Metavision Studio/GUI
For my application, I need to reference event times to UTC with microsecond precision. In Metavision Studio, the .raw recording names are only given to the nearest 1 second and the events are referenced to the camera connect time (not the recording start
Having missing events in the images.
Hi PROPHESEE Support, I have two problems. The first one: when I use prophesee_ros_viewer.cpp from the prophesee_ros_wrapper (https://github.com/prophesee-ai/prophesee_ros_wrapper), I sometimes get blank lines in the middle of the image; see the
Upper bound of get_illumination hal api function
Hello! I was wondering what the upper bound of the get_illumination function in the HAL API is. Experimentally, I've recorded 300 lux with the function, but I would be interested to know the upper bound on the sensing abilities.
Multiple DVS synchronization (more than 1 slave)
Hello everyone, Our team plans to synchronize more than one slave DVS (with one master and three slaves). We've followed the recommendations for multiple DVS synchronization, but during testing, we found that only one slave is precisely synced with the
Error modifying Metavision recipe
Hi! First, I wanted to point out a minor error in the KV260 Starter Kit Manual. In "Edit kria applications", the suggested command is petalinux-devtool modify metavision_x.x.x. However, petalinux-devtool modify takes the root recipe, without the version.