Prebuilt metavision_dense_optical_flow.exe (Metavision 4.6.2, Win11)
Hello, I am trying to launch the prebuilt metavision_dense_optical_flow.exe C++ example binary on Windows 11 with Metavision v4.6.2. In the terminal it prints: Instantiating TripletMatchingFlowAlgorithm with radius= 1.5, then the GUI window is shown and quickly
Missing images in dataset structure for training YOLOv8 model
Hi, I have the following structure for my dataset to train a YOLOv8 model according to the documentation, but I am getting the error: "AssertionError: train: No images found in /home/allen/code/proheese-camera/train_classification_model/train. Supported
RAM usage builds up when EVK4 captures high event rates
Greetings! While building a recording application, I noticed that when the EVK4 faces conditions that result in high event rates being captured, the system memory (RAM) used by the application is constantly increasing but not decreasing
Help Getting Started: Streaming GenX320 Output with STM32F746 (No GUI Needed)
Hi everyone, I'm new to event-based vision and have just started working with a Prophesee GenX320 camera and an STM32F746 microcontroller. My initial goal is simple: Just open the camera and stream its output — no GUI, no visualization — just raw event
KV260 + IMX636 not reading any events?
Dear Prophesee community, I have been doing extensive tests (a lot of trial and error) to set up my KV260 with the IMX636, following the quickstart guide line by line... everything seems fine, but when metavision_viewer is launched the screen is black
Stereo Calibration
Hello, I calibrated my two event cameras in a stereo setup to perform depth mapping, but I am getting poor results, nowhere near the accuracy shown in the sample videos. I was wondering if you could provide the camera setup used for the courtyard stereo
eb-synced-pattern-detection not working in calibration pipeline
Hi, I am trying to calibrate two event cameras to perform depth mapping through a stereo setup. I am passing my own calibration.json file to the metavision_calibration_pipeline script to extract the intrinsic and extrinsic parameters of each camera.
Is there a Python version of the "Simple Window using C++" example?
I want to replicate this example using Python because I have very little experience working with C++. https://docs.prophesee.ai/stable/samples/modules/ui/simple_window.html Has anyone done this before? Or is this something that Prophesee has provided
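For reference, a minimal Python sketch that mirrors the intent of the C++ simple_window sample, assuming the metavision_sdk_ui Python bindings (MTWindow, EventLoop, UIKeyEvent) behave as in the SDK's get-started sample; this is not an official port:

```python
# Sketch only: approximates the C++ simple_window sample with the Python UI bindings.
import numpy as np
from metavision_sdk_ui import EventLoop, BaseWindow, MTWindow, UIAction, UIKeyEvent

width, height = 640, 480
frame = np.full((height, width, 3), 128, dtype=np.uint8)  # plain gray image to display

with MTWindow(title="Simple Window", width=width, height=height,
              mode=BaseWindow.RenderMode.BGR) as window:
    def keyboard_cb(key, scancode, action, mods):
        # Close on ESC or Q, as the C++ sample does
        if action == UIAction.RELEASE and key in (UIKeyEvent.KEY_ESCAPE, UIKeyEvent.KEY_Q):
            window.set_close_flag()

    window.set_keyboard_callback(keyboard_cb)

    while not window.should_close():
        window.show_async(frame)         # display the image
        EventLoop.poll_and_dispatch()    # process pending UI events
```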
UTC-Referenced Absolute Event Timing in Metavision Studio/GUI
For my application, I need to reference event times to UTC with microsecond precision. In Metavision Studio, the .raw recording names only carry a timestamp to the nearest second, and the events are referenced to the camera connection time (not the recording start
EVT2.1 format valid bits meaning
Hello! According to the EVT 2.1 format description (EVT 2.1 Format — Metavision SDK Docs 5.1.1 documentation), how should the valid bits be understood? 1. X and Y coordinates of the current valid event. 2. After receiving data from the image sensor,
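As a hedged illustration of one reading of the spec (the exact 64-bit word layout is deliberately omitted here; refer to the linked format page), the 32 valid bits can be thought of as a mask over 32 consecutive pixels of the same row, where set bit i means an event at column x_base + i:

```python
# Conceptual sketch only; EVT2.1 bit-field offsets are not reproduced here.
# Assumed interpretation: one vectorized event word carries a row (y), a base
# column (x_base) and a 32-bit validity mask, where set bit i means that pixel
# (x_base + i, y) produced an event with this word's timestamp and polarity.
def expand_valid_mask(x_base, y, t, polarity, valid_mask):
    """Yield one (x, y, t, p) tuple per set bit of the 32-bit validity mask."""
    for i in range(32):
        if (valid_mask >> i) & 1:
            yield (x_base + i, y, t, polarity)

# Example: mask 0b101 -> events at columns x_base and x_base + 2
print(list(expand_valid_mask(x_base=64, y=10, t=123456, polarity=1, valid_mask=0b101)))
```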
Camera Time Sync
Hello Prophesee Team, I am using a C++ application to periodically call Metavision::Camera::get_last_timestamp() for the purpose of synchronizing and analyzing drift between system time and camera time. My expectation was that, by recording the value
Missing events in the images
Hi Prophesee Support, I have two problems. The first is that when I use prophesee_ros_viewer.cpp from the prophesee_ros_wrapper (https://github.com/prophesee-ai/prophesee_ros_wrapper), I sometimes get blank lines in the middle of the image, see the
Readout Saturation Error - Events Flashing Spontaneously
Hi, I'm currently using the Prophesee EVK4 event camera and am recording events via a ROS 2 subscription into a rosbag. When I try to replay my recording, I notice that spontaneously (and for varying durations), the events on the screen appear
Upper bound of the get_illumination HAL API function
Hello! I was wondering what the upper bound of the get_illumination function in the HAL API is. Experimentally, I've recorded 300 lux with the function, but I would be interested to know the upper bound of its sensing range.
Stereo Calibration Depth Mapping
Hi, I am trying to use the metavision_stereo_metavision.py script to create a depth map from two synchronized recordings obtained with the metavision_sync.py script, but when I run the script I get a lot of NonMontonicTimeHigh and InvalidVectBase errors
Multiple DVS synchronization (more than 1 slave)
Hello everyone, Our team plans to synchronize more than one slave DVS (with one master and three slaves). We've followed the recommendations for multiple DVS synchronization, but during testing, we found that only one slave is precisely synced with the
Error modifying Metavision recipe
Hi! First, I wanted to point out a minor error in the KV260 Starter Kit Manual. In "Edit kria applications", the suggested command is petalinux-devtool modify metavision_x.x.x. However, petalinux-devtool modify takes the root recipe, without the version.
Metavision HAL Exception Error 105000:
Greetings! I wrote an application to configure and record event cameras. In this setup we are using an EVK4 with an IMX636. When changing bias_diff_off to above roughly 120, the camera loses its connection and a Metavision HAL Exception Error 105000: is
How to set MV_FLAGS_EVT3_UNSAFE_DECODER in the Python API
I'm currently using the EventsIterator to read EVT3 files, but I've encountered some errors. I'd like to know how to set the MV_FLAGS_EVT3_UNSAFE_DECODER option in Python so that it can ignore these errors.
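A minimal sketch of one way this might be done, assuming the flag is read as an environment variable by the HAL EVT3 decoder and therefore has to be set before the decoder is instantiated (the file name is a placeholder):

```python
# Sketch: set the flag before opening anything, on the assumption that the HAL
# reads it from the environment when the EVT3 decoder is created.
import os
os.environ["MV_FLAGS_EVT3_UNSAFE_DECODER"] = "1"

from metavision_core.event_io import EventsIterator

mv_iterator = EventsIterator("recording_evt3.raw")  # placeholder path
for evs in mv_iterator:
    # Process events; the unsafe decoder is expected to skip over the
    # problematic EVT3 words instead of raising errors.
    pass
```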
EVK4 Stereo Camera ROS Driver Installation
We would appreciate your support in helping us resolve the ROS driver compatibility issue for our EVK4 stereo camera setup. We are currently working on setting up an event-based stereo vision system using two EVK4 cameras (serial numbers: P50905 and P50898).
How can I find out how the FrequencyMapAsyncAlgorithm works?
Hello, I'm currently using the event camera to detect frequency information, but the FrequencyMapAsyncAlgorithm does not seem to be open source. How can I find out how the algorithm implements frequency detection
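The SDK implementation is closed source, so only an illustrative stand-in can be sketched here: a common way to approximate a per-pixel frequency map is to measure the period between successive ON events at each pixel. This is not Prophesee's algorithm, just a simple baseline:

```python
# Illustrative only, NOT the SDK's FrequencyMapAsyncAlgorithm: estimate a
# per-pixel frequency from the period between successive ON events.
import numpy as np

def update_frequency_map(events, last_on_ts, freq_map):
    """events: structured array with fields x, y, p, t (t in microseconds)."""
    for e in events:
        if e["p"] == 1:                           # ON event
            prev = last_on_ts[e["y"], e["x"]]
            if prev > 0:
                period_us = e["t"] - prev         # time since previous ON event
                if period_us > 0:
                    freq_map[e["y"], e["x"]] = 1e6 / period_us  # Hz
            last_on_ts[e["y"], e["x"]] = e["t"]

# Usage sketch for a 1280x720 sensor (EVK4 / IMX636 resolution):
height, width = 720, 1280
last_on_ts = np.zeros((height, width), dtype=np.int64)
freq_map = np.zeros((height, width), dtype=np.float32)
# for evs in EventsIterator("recording.raw"): update_frequency_map(evs, last_on_ts, freq_map)
```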
Are the Metavision SDK CMake files missing from the KV260 image?
Are the Metavision SDK CMake (.cmake) files missing from the KV260 Petalinux image? Same question for OpenCV. I am attempting to build some of the examples and CMake cannot find the MetavisionSDKConfig.cmake. I have attempted to locate any CMake files
Slice Condition using Python
Hi, I'm trying to use CameraStreamSlicer() to change the duration of each slice. I've looked through all the documentation, but I can't seem to figure out how to do this with the Python bindings. Based on the information I
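A sketch of how this might look, assuming the SDK 5.x metavision_sdk_stream Python bindings expose SliceCondition.make_n_us / make_n_events and camera.move() the same way the C++ slicer sample does; exact names may differ per SDK version, so check the bundled sample scripts:

```python
# Sketch under assumptions: SliceCondition.make_n_us / make_n_events and
# camera.move() are taken from the C++ slicer sample and assumed to be bound
# identically in Python.
from metavision_sdk_stream import Camera, CameraStreamSlicer, SliceCondition

camera = Camera.from_file("recording.raw")          # placeholder path
slice_condition = SliceCondition.make_n_us(20000)   # 20 ms slices
# slice_condition = SliceCondition.make_n_events(100000)  # or fixed-count slices

slicer = CameraStreamSlicer(camera.move(), slice_condition)
for sl in slicer:
    print("slice end ts:", sl.t, "events in slice:", sl.events.size)
```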
Vibration estimation seems to have a weird bottleneck
Hi, I've been using the provided vibration estimation code, but I'm having some issues. The vibration estimation works fine, even for detecting high frequencies, but I want to run the detection itself at a higher rate (i.e. how often the algorithm
High latency and frame drops during event data capture with accumulation periods below 10 ms
We are using a KV260 and IMX636 MIPI camera setup for our system under Kria Ubuntu, for an event camera application. We are capturing data at a high frame rate by setting an accumulation period of 1-10 ms using OpenEB Metavision SDK 4.6.0. For
Unable to locate the path to necessary DLLs for Metavision Python bindings
I am trying to follow the tutorial 'RAW File Loading using Python' (RAW File Loading using Python — Metavision SDK Docs 4.4.0 documentation (prophesee.ai)), but when I launch the program, it shows the following error: Unable to locate the path to necessary
No module named 'metavision_sdk_base_paths_internal'
Hi! When I try to run the Python tutorials, there is a problem: No module named 'metavision_sdk_base_paths_internal'. Please see the attached photo below. I found some related files, but they are not named exactly the same as the module above: Could
Recording Application for two EVK4s and an RGB Camera
Greetings! To do research on event-based vision in the context of autonomous driving, I am currently developing an application to record two EVK4s as well as images from an industrial RGB camera. The application is written
Advice on combining the spatter_tracking and vibration_estimation algorithms
Good afternoon, I wanted to ask how best to combine the spatter_tracking and vibration_estimation algorithms in one program, so that when I run it, both features appear in the same window and work smoothly in parallel. I used the scripts provided
How to play back data from raw files in real time?
When playing back data from a file with OpenEB (5.0.0) using the code below (simplified), the data packets are not delivered in real time (usually much faster). Is there anything I'm doing wrong? ``` const auto cfg = Metavision::FileConfigHints().real_time_playback(true);
Serial Number Write Failed
We have four EVK3 Gen 4.1 cameras that all report the serial number "ffffffff", and we would like them to be unique. Below is what all four cameras display when using the metavision_platform_info command: ## Prophesee Gen4.1 HD ## # System information
MVTec HALCON Cannot Recognize EVK4 Camera
I followed the guidance at https://support.prophesee.ai/portal/en/kb/articles/mvtec-halcon-acquisition-interface#Installation, but it still cannot recognize the camera. In particular, I am confused about the specific implementation of the interface installation
Raw data captured on the KV260 + IMX636 sensor not decoding with metavision_viewer
Hi, we have an IMX636 MIPI event camera sensor running on a KV260 board and it works with the metavision_viewer app. But when we replay the raw data captured with metavision_viewer, it is at first unable to decode it and throws a missing format error. Next we added
EVK1 VGA is not seen by Metavision Studio 3.1
I have an EVK1 VGA camera, one of the old ones. I also installed Metavision SDK 3.1 because it is the latest version that supports those cameras, but when I click Open Camera I only get "No Prophesee cameras available, please check that a camera is connected
ModuleNotFoundError: No module named 'metavision_sdk_stream'
I have an EVK4 and I am trying to interact with the camera using Python. I have installed the SDK according to the instructions on this page (https://docs.prophesee.ai/stable/installation/windows.html) but when I create/activate the venv, Python can import
Re-extract bias file
Hello, I am writing because I have a problem. I know that when I record, raw files and bias files are produced. But I accidentally deleted the bias file, so only the raw file remains. In this case, is there a way to recover the bias setting values again
Installation of the free Metavision SDK 4.6.2
metavision-sdk-analytics-bin : Depends: libopencv-core4.5d (>= 4.5.4+dfsg) but it is not installable Depends: libopencv-highgui4.5d (>= 4.5.4+dfsg) but it is not installable Depends: libopencv-imgproc4.5d (>= 4.5.4+dfsg) but it is not installable Depends:
ROS node with standard CPU architecture
Hello, I am trying to use the EVK4 to make a ROS node for a larger system. I am looking to build the ROS node on a PC with a traditional x86 Intel-i7 processor. I saw on the ROS node website (GitHub - prophesee-ai/prophesee_ros_wrapper: ROS driver for
Win 11 Metavision 4.6.2 CMake "Could NOT find TIFF" error
Hello, I am trying to install Metavision on Windows 11 and compile metavision_viewer. With Metavision 4.6.2, I got this error after the command: cmake .. -DCMAKE_BUILD_TYPE=Release CMake Error at C:/Program Files/CMake/share/cmake-3.31/Modules/FindPackageHandleStandardArgs.cmake:233
Metavision Studio keeps a blank tab in front of the camera livestream
Whenever I try to use Metavision Studio with the EVK4 camera, the tab shown in the figure stays in front of the main interface, most of the time even when I try to minimize it. Sometimes minimizing works, but even then I am unable to use any of the buttons