Multiple DVS synchronization (more than 1 slave)
Hello everyone, Our team plans to synchronize more than one slave DVS (with one master and three slaves). We've followed the recommendations for multiple DVS synchronization, but during testing, we found that only one slave is precisely synced with the
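For reference, a minimal sketch of how master/slave modes might be configured through the HAL Python bindings, assuming each device exposes the I_CameraSynchronization facility via get_i_camera_synchronization(); the serial numbers are placeholders, and the usual recommendation is to configure and start the slaves before the master.
```python
# Hedged sketch: assumes the HAL Python bindings expose the
# I_CameraSynchronization facility; serial numbers are placeholders.
import metavision_hal

master = metavision_hal.DeviceDiscovery.open("SERIAL_MASTER")
slaves = [metavision_hal.DeviceDiscovery.open(s)
          for s in ("SERIAL_SLAVE_1", "SERIAL_SLAVE_2", "SERIAL_SLAVE_3")]

# Configure the slaves first so they wait for the master's sync signal,
# then switch the remaining camera to master mode.
for dev in slaves:
    dev.get_i_camera_synchronization().set_mode_slave()
master.get_i_camera_synchronization().set_mode_master()
```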
Help Getting Started: Streaming GenX320 Output with STM32F746 (No GUI Needed)
Hi everyone, I'm new to event-based vision and have just started working with a Prophesee GenX320 camera and an STM32F746 microcontroller. My initial goal is simple: Just open the camera and stream its output — no GUI, no visualization — just raw event
Error modifying Metavision recipe
Hi! First, I wanted to point out a minor error in the KV260 Starter Kit Manual. In "Edit kria applications", the suggested command is petalinux-devtool modify metavision_x.x.x. However, petalinux-devtool modify takes the root recipe, without the version.
Metavision HAL Exception Error 105000:
Greetings! I wrote an application to configure and record event cameras. In this setup we are using an EVK4 with an IMX636. When changing bias_diff_off to values above roughly 120, the camera loses its connection and a Metavision HAL Exception Error 105000: is
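As a point of comparison, here is a minimal sketch of reading and adjusting bias_diff_off through the HAL Python bindings; it assumes a single connected camera (opened with an empty serial string) and leaves the default bias range checks in place, and the example value is arbitrary.
```python
# Hedged sketch: adjusting bias_diff_off via the Metavision HAL Python bindings.
import metavision_hal

device = metavision_hal.DeviceDiscovery.open("")  # "" opens the first available camera
ll_biases = device.get_i_ll_biases()              # low-level bias facility

print("current biases:", ll_biases.get_all_biases())

# Raising bias_diff_off increases the OFF contrast threshold; values far outside
# the recommended range may destabilize the sensor or the USB link.
ll_biases.set("bias_diff_off", 100)
print("bias_diff_off is now", ll_biases.get("bias_diff_off"))
```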
Upper bound of get_illumination hal api function
Hello! I was wondering what the upper bound of the get_illumination function in the HAL API is. Experimentally, I've recorded 300 lux with the function, but I would be interested to know the upper bound on the sensing abilities.
How to set MV_FLAGS_EVT3_UNSAFE_DECODER in Python api
I'm currently utilizing the EventsIterator to read EVT3 files, but I've encountered some errors. I'd like to know how to set the MV_FLAGS_EVT3_UNSAFE_DECODER option in Python so that it can ignore these errors.
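One approach, assuming the MV_FLAGS_* options are read from the environment by the EVT3 decoder (as the naming suggests), is to set the variable before the file is opened; the recording path below is a placeholder.
```python
# Hedged sketch: set the flag in the environment before any Metavision
# module opens the EVT3 file, so the unsafe (error-tolerant) decoder is used.
import os
os.environ["MV_FLAGS_EVT3_UNSAFE_DECODER"] = "1"

from metavision_core.event_io import EventsIterator

mv_iterator = EventsIterator("recording.raw", delta_t=10000)  # placeholder file name
for evs in mv_iterator:
    if evs.size > 0:
        print(f"{evs.size} events between {evs['t'][0]} and {evs['t'][-1]} us")
```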
EVK4 Stereo Camera ROS Driver Installation
We would appreciate your support in helping us resolve the ROS driver compatibility issue for our EVK4 stereo camera setup. We are currently working on setting up an event-based stereo vision system using two EVK4 cameras (serial numbers: P50905 and P50898).
How do I know how the FrequencyMapAsyncAlgorithm works
Hello, I'm currently using the event camera to detect frequency information, but the FrequencyMapAsyncAlgorithm doesn't seem to be open to the public. How can I find out how the algorithm implements frequency detection
Are the Metavision SDK CMake files missing from the KV260 image?
Are the Metavision SDK CMake (.cmake) files missing from the KV260 Petalinux image? Same question for OpenCV. I am attempting to build some of the examples and CMake cannot find the MetavisionSDKConfig.cmake. I have attempted to locate any CMake files
Slice Condition using Python
Hi, I'm trying to make use of the CameraStreamSlicer() function to change the duration of each slice. I've tried looking through all the documentation but I can't seem to figure out how to do this using the Python functions. Based on the information I
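A sketch of what this could look like with the SDK 5 Python bindings, assuming they mirror the C++ CameraStreamSlicer API (Camera.from_file, Camera.move, SliceCondition.make_n_us); the attribute names on the slice object and the file path are also assumptions.
```python
# Hedged sketch: slicing a camera stream into fixed-duration slices,
# assuming the Python bindings mirror the C++ CameraStreamSlicer API.
from metavision_sdk_stream import Camera, CameraStreamSlicer, SliceCondition

camera = Camera.from_file("recording.raw")             # placeholder path
slice_cond = SliceCondition.make_n_us(20000)            # 20 ms slices instead of the default
slicer = CameraStreamSlicer(camera.move(), slice_cond)

for sl in slicer:
    # each slice should expose its event buffer; the attribute name is an assumption
    print("slice with", sl.events.size, "events")
```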
Vibration estimation seems to have a weird bottleneck
Hi, I've been using the vibration estimation code that is provided, but I'm having some issues. The vibration estimation works fine even for detecting high frequencies; however, I want to run the detection itself at a higher frequency (so how often the algorithm
High latency and frame drops during event data capture for accumulation periods of less than 10 ms
We are using a KV260 and IMX636 MIPI camera setup for our system under Kria Ubuntu, for an application with an event camera. We are capturing data at a high frame rate by setting the accumulation period to 1-10 ms using OpenEB Metavision SDK 4.6.0. For
KV260 + IMX636 not reading any events?
Dear Prophesee community, I have been doing extensive tests (a lot of trial and error) in order to set up my KV260 with the IMX636, following the quickstart guide line by line... everything seems fine, but when metavision_viewer is launched the screen is black
Unable to locate the path to necessary DLLs for Metavision Python bindings
I am trying to follow the tutorial 'RAW File Loading using Python' (RAW File Loading using Python — Metavision SDK Docs 4.4.0 documentation (prophesee.ai)), but when I launch the program, it shows the following error: Unable to locate the path to necessary
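A commonly suggested workaround on Windows is to put the SDK's native DLL directory on the search path before importing any metavision module; the install path below is a placeholder for your own installation.
```python
# Hedged sketch: make the Metavision native DLLs findable before importing
# the Python bindings (Windows, Python 3.8+).
import os

sdk_bin = r"C:\Program Files\Prophesee\bin"   # placeholder location of the SDK DLLs
os.add_dll_directory(sdk_bin)
os.environ["PATH"] = sdk_bin + os.pathsep + os.environ.get("PATH", "")

from metavision_core.event_io import RawReader   # import only after the DLL path is set
```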
No module named 'metavision_sdk_base_paths_internal'
Hi! When I tried to run the Python tutorials, there was a problem: No module named 'metavision_sdk_base_paths_internal'. Please see the attached photo below. I found some related files, but they are not named exactly the same as the module above: Could
Recording Application for two EVK4's and an RGB Camera
Greetings! In order to do research on event-based vision in the context of autonomous driving, I am currently working on developing an application to record two EVK4s as well as images from an industrial RGB camera. The application is written
Advice to combine spatter_tracking and vibration_estimation algorithms
Good afternoon, I wanted to ask how best to combine the spatter_tracking and vibration_estimation algorithms in one program, so that when I run it, both features appear in the same window and work flawlessly in parallel. I used the scripts provided
How to play back data from raw files in real time?
When playing back data from a file using OpenEB (5.0.0) with the code below (simplified), the data packets are not delivered in real time (usually much faster). Is there anything that I'm doing wrong? ``` const auto cfg = Metavision::FileConfigHints().real_time_playback(true);
Serial Number Write Failed
We have four EVK3 Gen 4.1 cameras that all report the serial number "ffffffff" and would like them to be unique. Below is what all four cameras display when using the metavision_platform_info command: ## Prophesee Gen4.1 HD ## # System information
MVTec HALCON Cannot Recognize Evk4 Camera
I followed the guidance at https://support.prophesee.ai/portal/en/kb/articles/mvtec-halcon-acquisition-interface#Installation, but it still cannot recognize the camera. In particular, I am confused about the specific implementation of the interface installation
Raw data capture in KV260-IMX636 sensor not decoding with metavision-viewer
Hi, We have an IMX636 MIPI event camera sensor running on a KV260 board, and it works with the metavision_viewer app. But when we replay the raw data captured with the metavision_viewer app, it is first not able to decode it and throws a missing format error. Next we added
EVK1 VGA is not seen by Metavision Studio 3.1
I have an EVK1 VGA camera, one of the old ones. I also installed metavision_sdk 3.1 because it is the latest version that supports those cameras, but when I click Open Camera I only get "No Prophesee cameras available, please check that a camera is connected
ModuleNotFoundError: No module named 'metavision_sdk_stream'
I have an EVK4 and I am trying to interact with the camera using Python. I have installed the SDK according to the instructions on this page (https://docs.prophesee.ai/stable/installation/windows.html) but when I create/activate the venv, Python can import
Re-extract bias file
Hello, I'm writing because I have a problem. I know that when I record, a raw file and a bias file are produced. But I accidentally deleted the bias file, so only the raw file remains. At this time, is there a way to find out the bias setting values again
Installation of free Metavision SDK 4.6.2
metavision-sdk-analytics-bin : Depends: libopencv-core4.5d (>= 4.5.4+dfsg) but it is not installable Depends: libopencv-highgui4.5d (>= 4.5.4+dfsg) but it is not installable Depends: libopencv-imgproc4.5d (>= 4.5.4+dfsg) but it is not installable Depends:
ROS node with standard CPU architecture
Hello, I am trying to use the EVK4 to make a ROS node for a larger system. I am looking to build the ROS node on a PC with a traditional x86 Intel-i7 processor. I saw on the ROS node website (GitHub - prophesee-ai/prophesee_ros_wrapper: ROS driver for
Win 11 Metavision 4.6.2 cmake "Could NOT find TIFF" error
Hello, I am trying to install Metavision on Windows 11 and compile the metavision_viewer. With the Metavision 4.6.2, I got this error after the command: cmake .. -DCMAKE_BUILD_TYPE=Release CMake Error at C:/Program Files/CMake/share/cmake-3.31/Modules/FindPackageHandleStandardArgs.cmake:233
Metavision Studio keeps a blank tab in front of the camera livestream
Whenever I try to use Metavision Studio with the EVK4 camera, the tab shown in the figure stays in front of the main interface, and most of the time it stays even when I try to minimize it. Sometimes minimizing works, but even then I am unable to use any of the buttons
KeyError: np.uint32(50000)
Hello, when I was using the Python inference pipeline for detection and tracking, an error "KeyError: np.uint32(50000)" occurred. The specific situation is shown in the screenshot below. This is the same whether I'm using the driving_sample dataset or
How to Improve Video Clarity and Edge Definition for Hand Movement with EVK4 in Indoor Lighting?
Hello, I'm currently working with the EVK4 and I'm trying to improve the clarity of the videos it captures, specifically making the edges of moving objects, like fingers, more defined. I've been experimenting with various parameters, but haven't been
LIBUSB_TRANSFER_ERROR after running for sometime
Camera: EVK4 Environment: Ubuntu 22.04, OpenEB 4.6.2, libusb 1.0.0 Problem: We are running the USB camera for a long time, but sometimes we get the following messages. [HAL][ERROR] ErrTransfert [HAL][ERROR] LIBUSB_TRANSFER_ERROR [HAL][ERROR] ErrTransfert [HAL][ERROR]
Load intrinsic and extrinsic parameters into the EVK4 using Python
Hello everyone, I have recently been working on a project where I use the EVK4 event-based camera along with a regular RGB camera to capture two types of data simultaneously. To ensure that the view and surface of the objects we shoot remain consistent,
Xorg blank screen issue: Kria + IMX636
I am trying to run the kv260-psee application by using the HDMI connected between Kria and a display monitor. Following the instructions at https://docs.prophesee.ai/amd-kria-starter-kit/application/app_deployment.html I could power on the sensor but
Timestamp of triggers
Dear all, I followed your documentation page to successfully record a RAW file with external trigger events using a wave generator. My question is about the exact function call from which the time is calculated in your example. When the time counter is
get_pixel_dead_time() erroring out on Gen4.1 EVK3-HD
I get the following error when trying to get pixel dead time for my EVK3-HD Gen4.1: Metavision HAL exception Error 104000. Has anyone experienced this or alternatively knows the dead time with default parameters?
Question about Python sample
I want to use the Python sample "metavision_simple_recorder.py" to control the acquisition of the event stream. The first problem is that when I want to use the 'r' key to start and stop the collection of event streams, I must ensure that the visualization interface
Inference Pipeline of Detection and Tracking using C++ not working when using live camera
I tried running the C++ detection and tracking sample as described here while using the live camera instead of a prerecorded file, but the camera doesn't seem to detect or classify any moving vehicles or pedestrians. I will only see bounding boxes get
Redirect all Metavision Error messages
Hello, I am trying to redirect all Metavision error messages from the console to an spdlog logger, which is set up to log both to console and to file. Metavision::Camera has an add_runtime_error_callback, but that only works for camera errors. camera_.add_runtime_error_callback([](const
Fixed number events per callback
Hello, I have expanded upon the HAL example functionality for custom applications. From my own characterization of the code, it seems that, with a sufficient amount of event activity, there is a fixed count of 320 events per callback (i_cddecoder callback).
Questions about 3d model tracking
Hello, I'm learning to use the metavision_model_3d_tracking program. Because the 3d model should be in JSON format, I have questions about this data format. The program needs an "object_init_pose.json" file and an "object.json" file. However, if I convert