Use the trigger interface to collect data, and open the collected RAW files with Metavision software. There is a freeze on the screen. How can I eliminate it? Thank you!
External trigger source: 1 MHz. Logic analyzer test results; RAW file trigger information. 3. When opening the RAW files through Metavision software, the screen keeps getting stuck, with the effect shown in the attachment. 4. Problem: 4.1. Want to know how to
LIBUSB_TRANSFER_ERROR after running for some time
Camera: EVK4 Environment: Ubuntu 22.04, OpenEB 4.6.2, libusb 1.0.0 Problem: We run the USB camera for a long time but sometimes get the following messages. [HAL][ERROR] ErrTransfert [HAL][ERROR] LIBUSB_TRANSFER_ERROR [HAL][ERROR] ErrTransfert [HAL][ERROR]
Data Collection Issues
I want to use a Python script to simultaneously control the real-time display and acquisition of event data and frame camera data from the same scene. When I use dual threading, the event stream thread displays and saves normally, but the frame camera
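One way to structure this — a minimal sketch, not the official recipe, assuming the OpenEB Python bindings and a placeholder `grab_frame()` standing in for your frame-camera SDK call — is to keep acquisition in worker threads and do all GUI display in the main thread, since OpenCV HighGUI is generally not reliable when called from worker threads:

```python
import threading
import queue

import cv2
import numpy as np

from metavision_core.event_io import EventsIterator
from metavision_sdk_core import PeriodicFrameGenerationAlgorithm

event_frames = queue.Queue(maxsize=4)    # rendered event frames handed to the main thread
stop = threading.Event()


def event_worker():
    """Stream events from the first available camera and push rendered frames to a queue."""
    mv_iterator = EventsIterator(input_path="", delta_t=10000)   # "" = live device
    height, width = mv_iterator.get_size()
    frame_gen = PeriodicFrameGenerationAlgorithm(sensor_width=width, sensor_height=height,
                                                 fps=25, accumulation_time_us=10000)
    frame_gen.set_output_callback(lambda ts, frame: event_frames.put((ts, frame.copy())))
    for evs in mv_iterator:
        if stop.is_set():
            break
        frame_gen.process_events(evs)


def grab_frame():
    """Stub for the frame camera: replace with your frame-camera SDK call."""
    return np.zeros((480, 640, 3), dtype=np.uint8)


def frame_camera_worker(writer):
    # Save only in the worker; keep imshow() calls in the main thread.
    while not stop.is_set():
        writer.write(grab_frame())


threading.Thread(target=event_worker, daemon=True).start()
writer = cv2.VideoWriter("frames.avi", cv2.VideoWriter_fourcc(*"MJPG"), 25, (640, 480))
threading.Thread(target=frame_camera_worker, args=(writer,), daemon=True).start()

# All display happens here, in the main thread.
while True:
    ts, ev_img = event_frames.get()
    cv2.imshow("events", ev_img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        stop.set()
        break
cv2.destroyAllWindows()
```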
Load intrinsic and extrinsic parameters into the EVK4 using Python
Hello everyone, I have recently been working on a project where I use the EVK4 event-based camera along with a regular RGB camera to capture two types of data simultaneously. To ensure that the view and surface of the objects we shoot remain consistent,
Pre-trained Model
I want to implement event-stream-based object detection and tracking. I noticed that the official documentation mentions pre-trained models. What algorithms do they use, what datasets were they trained on, and where can I download the pre-trained models?
Contrast threshold IMX 636
Hello, I've been watching this video (Metavision Training Videos | Introduction to Event-Based Vision Sensor) to understand the definition of contrast threshold. They mentioned that IMX 636 sensor uses a log definition of contrast but I haven't found
Graphical User Interface for synchronized recording of Events and frames.
Hello everyone, I am currently working on a project in which a hardware trigger between the EVK 4 and an industrial camera is used to record events and images in a synchronized manner. The focus here is on ease of use. The aim is to create a GUI that
adding a hardware IR cutoff filter in EVK4
Hello Prophesee and community, we would like to record in an environment which is contaminated by "IR flickering light". So we plan to add an IR cutoff filter into EVK4 (HD). Similar to the description for EVK1 here: https://support.prophesee.ai/portal/en/kb/articles/inserting-filter
Xorg blank screen issue: Kria + IMX636
I am trying to run the kv260-psee application by using the HDMI connected between Kria and a display monitor. Following the instructions at https://docs.prophesee.ai/amd-kria-starter-kit/application/app_deployment.html I could power on the sensor but
Frequency / bias value matching table
Hi, I just found a table that relates bias_hpf and bias_fo to the high-pass and low-pass cut-off frequencies, but it is for Gen 3.1. Can you please share the table or the graphs (https://support.prophesee.ai/portal/en/kb/articles/bias-tuning-flow)
Timestamp of triggers
Dear all, I followed your documentation page to successfully record a RAW file with external trigger events using a wave generator. My question is about the exact function call from which the time is calculated in your example. When the time counter is
get_pixel_dead_time() erroring out on Gen4.1 EVK3-HD
I get the following error when trying to get pixel dead time for my EVK3-HD Gen4.1: Metavision HAL exception Error 104000. Has anyone experienced this or alternatively knows the dead time with default parameters?
RotateEventsAlgorithm
Hello, Where can I get the description/example of this please?
Camera sensor check
Hi Prophesee team, I want to find a sensor data sheet for our camera RDK2 IMX636. Could you please give me some information about it? Best regards, Yugang
question about python sample
I want to use the Python sample "metavision_simple_recorder.py" to control the acquisition of the event stream. The first problem is that when I want to use the 'r' key to start and stop collection of event streams, I must ensure that the visualization interface
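For reference, a minimal sketch of the pattern that sample is built around, assuming the OpenEB Python bindings — note that key presses are only delivered while the visualization window has keyboard focus, which is likely what you are running into:

```python
from metavision_core.event_io import EventsIterator
from metavision_hal import DeviceDiscovery
from metavision_sdk_core import PeriodicFrameGenerationAlgorithm
from metavision_sdk_ui import EventLoop, BaseWindow, MTWindow, UIAction, UIKeyEvent

device = DeviceDiscovery.open("")                 # first available live camera
mv_iterator = EventsIterator.from_device(device=device, delta_t=10000)
height, width = mv_iterator.get_size()

frame_gen = PeriodicFrameGenerationAlgorithm(sensor_width=width, sensor_height=height,
                                             fps=25, accumulation_time_us=10000)
recording = False

with MTWindow(title="recorder", width=width, height=height,
              mode=BaseWindow.RenderMode.BGR) as window:
    frame_gen.set_output_callback(lambda ts, frame: window.show_async(frame))

    def keyboard_cb(key, scancode, action, mods):
        global recording
        if action != UIAction.RELEASE:
            return
        if key == UIKeyEvent.KEY_R:               # toggle RAW logging on/off
            es = device.get_i_events_stream()
            if not recording:
                es.log_raw_data("out.raw")
            else:
                es.stop_log_raw_data()
            recording = not recording
        elif key in (UIKeyEvent.KEY_ESCAPE, UIKeyEvent.KEY_Q):
            window.set_close_flag()

    window.set_keyboard_callback(keyboard_cb)

    for evs in mv_iterator:
        EventLoop.poll_and_dispatch()             # key events arrive only while the window has focus
        frame_gen.process_events(evs)
        if window.should_close():
            break
```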
Inference Pipeline of Detection and Tracking using C++ not working when using live camera
I tried running the C++ detection and tracking sample as described here while using the live camera instead of a prerecorded file, but the camera doesn't seem to detect or classify any moving vehicles or pedestrians. I will only see bounding boxes get
Redirect all Metavision Error messages
Hello, I am trying to redirect all Metavision error messages from the console to an spdlog logger, which is set up to log both to the console and to a file. Metavision::Camera has add_runtime_error_callback, but that only covers camera errors. camera_.add_runtime_error_callback([](const
Understanding the relationship between bias settings and intensity
Greetings, I have an IMX636 sensor. I understand from this question (https://support.prophesee.ai/portal/en/community/topic/questions-about-biases) that the specific relation between the change in intensity of the incident light needed to trigger an event
IMX 636 conference proceedings
Hello, Is this conference proceedings document ''5.10 A 1280×720 Back-Illuminated Stacked Temporal Contrast Event-Based Vision Sensor with 4.86µm Pixels, 1.066GEPS Readout, Programmable Event-Rate Controller and Compressive Data-Formatting Pipeline |
Fixed number events per callback
Hello, I have expanded on the HAL example functionality for custom applications. From my own characterization of the code, it seems that, with a sufficient amount of event activity, there is a fixed count of 320 events per callback (i_cddecoder callback).
Why does the frequency have to vary to measure LLCO and HLCO?
Hello, I'm trying to measure the dynamic range of the GENX320 sensor. In the datasheet v1, the test conditions of the measurement are as follows: I was wondering why the frequency has to vary between 2.5 Hz and 80 Hz? Cheers, Michael
S-Curve: why does pixel response probability go above 100% for GENX320 and not for IMX636?
Hello, I measured S-curves for the IMX636 and the GENX320 and found that above a certain contrast value, the pixel response probability goes above 100% for the GENX320 but saturates at 100% for the IMX636. I saw you have a note on hot pixels for the GENX320.
Questions about 3d model tracking
Hello, I'm learning to use the metavision_model_3d_tracking program. Because the 3d model should be in JSON format, I have questions about this data format. The program needs an "object_init_pose.json" file and an "object.json" file. However, if I convert
Can I record only OFF events from hardware?
Hi, There are some hardware variables that we can control with Metavision Studio, for example the biases and the ROI. Can I control the type of events that I want to record? In my specific case I only need to process the OFF events, but the
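I'm not aware of a switch for this in Metavision Studio, but dropping ON events in software after decoding is straightforward, since decoded CD events are NumPy structured arrays with a polarity field 'p'. A minimal sketch assuming the OpenEB Python bindings:

```python
from metavision_core.event_io import EventsIterator

# Decoded CD events are NumPy structured arrays with fields 'x', 'y', 'p', 't'.
mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)

for evs in mv_iterator:
    off_evs = evs[evs["p"] == 0]     # keep OFF (negative) events only
    # ... process/store off_evs ...
    print(f"{off_evs.size} OFF events in this slice")
```

You can also raise bias_diff_on to make the ON threshold less sensitive, which reduces (but does not eliminate) ON events at the sensor level.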
Verifying the Output of Extrinsic Calibration
Hello, I followed the steps on this page https://docs.prophesee.ai/stable/samples/modules/calibration/ground_plane_calibration.html?highlight=calibration The output of the position of the camera with respect to the world coordinate frame gave me X: 0.62…
Real-Time Values to MATLAB
Hello, I am using metavision_generic_tracking and was planning to send the pixel-location data to another program, in this case MATLAB. I would like to know if there are any plug-ins or extensions available so that I can send the data in real
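One simple, dependency-free route is to stream the tracker output over a local UDP socket and read it in MATLAB (e.g. with udpport). A minimal sketch of the sending side; the (id, x, y) tuples are placeholders for whatever your tracking callback produces:

```python
import json
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
MATLAB_ADDR = ("127.0.0.1", 5005)        # MATLAB side: open a udpport and parse JSON datagrams


def send_tracks(timestamp_us, tracks):
    """tracks: list of (track_id, x, y) tuples taken from the tracking callback."""
    payload = {"t": timestamp_us,
               "tracks": [{"id": i, "x": float(x), "y": float(y)} for i, x, y in tracks]}
    sock.sendto(json.dumps(payload).encode("utf-8"), MATLAB_ADDR)


# Example usage inside your tracking loop / callback:
send_tracks(123456, [(1, 320.5, 240.0), (2, 100.0, 80.0)])
```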
Readout saturation definition
Hi, is it correct to say that readout saturation is a combined effect of event rate and the number of active rows in the image? I have done an experiment with different lighting conditions and speeds, and I find that when improving the lighting conditions, the
Cutting the RAW videos
Hello, I want to cut RAW videos into shorter samples. How can I do it using Python? I could find this link https://docs.prophesee.ai/stable/samples/modules/hal/hal_raw_cutter.html but it is in C++. Is there any Python code that does the same?
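A minimal Python sketch, assuming the OpenEB Python bindings: EventsIterator can open a RAW file with a start time and a maximum duration, so you can extract a time slice and save it (here as a NumPy file rather than a new RAW file; for a true RAW-to-RAW cut the C++ HAL sample you linked is still the reference tool):

```python
import numpy as np
from metavision_core.event_io import EventsIterator

# Extract events between 2 s and 5 s from a RAW file and save them as a NumPy file.
START_TS = 2_000_000          # start of the cut, in microseconds
DURATION = 3_000_000          # length of the cut, in microseconds

mv_iterator = EventsIterator(input_path="recording.raw", delta_t=100_000,
                             start_ts=START_TS, max_duration=DURATION)

chunks = [evs.copy() for evs in mv_iterator if evs.size > 0]
cut = np.concatenate(chunks) if chunks else np.empty(0)
np.save("recording_cut.npy", cut)
print(f"Saved {cut.size} events covering {DURATION / 1e6:.1f} s")
```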
How to put the sensor into ULP mode with the GENX320 EVK?
I read the datasheet and found that ULP_ENB and ULP_RST set the sensor to ULP mode, but I don't know where these pins are on the EVK. Can you tell me if there is any way to enter ULP mode?
Generate Event Frame with sliding moving window
Hi, Is there a function that can create a sliding window for event generation? I understand that using OnDemandFrameGenerator only allows for a fixed time window, which cannot be set to overlap. Is there another way to achieve this? Thanks!
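One way to get overlapping windows — a minimal sketch, assuming the OpenEB Python bindings — is to keep a rolling buffer of events yourself and render it with BaseFrameGenerationAlgorithm.generate_frame every stride, with the stride shorter than the window length:

```python
import numpy as np
from metavision_core.event_io import EventsIterator
from metavision_sdk_core import BaseFrameGenerationAlgorithm

WINDOW_US = 20_000      # accumulation window length
STRIDE_US = 5_000       # step between successive frames (< WINDOW_US => overlap)

mv_iterator = EventsIterator(input_path="recording.raw", delta_t=STRIDE_US)
height, width = mv_iterator.get_size()

buffer = []                               # event chunks covering (at least) the current window
frame = np.zeros((height, width, 3), dtype=np.uint8)

for evs in mv_iterator:                   # each iteration advances time by STRIDE_US
    if evs.size > 0:
        buffer.append(evs.copy())
    if not buffer:
        continue
    t_now = buffer[-1]["t"][-1]
    # Drop chunks that fall entirely outside the sliding window, then trim the rest.
    buffer = [c for c in buffer if c["t"][-1] >= t_now - WINDOW_US]
    window_evs = np.concatenate(buffer)
    window_evs = window_evs[window_evs["t"] >= t_now - WINDOW_US]
    BaseFrameGenerationAlgorithm.generate_frame(window_evs, frame)
    # ... display or save `frame` here ...
```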
Using the IMX636 + MIPI CSI-2 with Jetson Orin instead of AMD Kria
Does anyone have experience making this work? - Device tree looks like it's available here: https://github.com/prophesee-ai/linux-sensor-drivers - I'm curious if there's a driver available for the Jetson, and if not, if anybody has tips to write one for
fisheye lens for EVK4 camera
Hi, We have procured an EVK4 camera and are looking for a fisheye lens that is compatible with it. Can you please recommend/suggest a few?
Extrinsic Calibration of the Camera
Hello, I need to convert the units of the measurements output by this code https://docs.prophesee.ai/stable/samples/modules/cv/sparse_flow_py.html?highlight=sparse from pixels to meters and from pixels/sec to m/s. To do that, I need to calibrate
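For reference, once the intrinsics (fx, fy in pixels) are known and the observed motion can be approximated as lying on a plane at roughly constant depth Z from the camera, the pixel-to-meter conversion is just a scale factor. A minimal sketch; FX, FY and Z are placeholder values you would take from your own intrinsic and extrinsic calibration:

```python
# Pinhole-model conversion, assuming motion on a plane at (roughly) constant depth Z.
FX = 1700.0        # focal length in pixels (from intrinsic calibration) - placeholder
FY = 1700.0
Z = 1.5            # camera-to-plane distance in meters (from extrinsic calibration) - placeholder


def px_to_m(dx_px, dy_px):
    """Displacement in pixels -> displacement in meters on the plane at depth Z."""
    return dx_px * Z / FX, dy_px * Z / FY


def pxps_to_mps(vx_px_s, vy_px_s):
    """Velocity in pixels/s (e.g. sparse optical flow output) -> m/s on that plane."""
    return vx_px_s * Z / FX, vy_px_s * Z / FY


print(px_to_m(100, 50))      # e.g. 100 px horizontally is about 0.088 m at Z = 1.5 m
```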
Color temperature of light source for electro-optical characteristics of GENX320
Hello, I'm trying to measure the electro-optical characteristics of my GENX320 sensor under the same conditions as the datasheet. For the light source, the datasheet indicates: When you look at the wavelength associated with this color temperature
metavision_time_surface: delay of about 3 seconds that increases over time
The delay is about 3 seconds, and it increases over time. https://github.com/prophesee-ai/openeb/blob/main/sdk/modules/core/python/samples/metavision_time_surface/metavision_time_surface.py There is a huge delay in displaying the captured images.
ROI of the raw event data
Dear Sir or Madam, Greetings! I am trying to use the event camera to build a dataset, and I want to reduce the memory size of the event data by restricting the ROI. I use the following command to set the ROI. The problem is that it seems the area outside the
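In case it helps, events can always be cropped in software after decoding, independently of what the hardware ROI does. A minimal sketch assuming the OpenEB Python bindings; the ROI coordinates are placeholders:

```python
from metavision_core.event_io import EventsIterator

# Software ROI: keep only events inside [X0, X0+W) x [Y0, Y0+H) before storing them.
X0, Y0, W, H = 200, 100, 320, 240     # placeholder ROI

mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)
for evs in mv_iterator:
    mask = ((evs["x"] >= X0) & (evs["x"] < X0 + W) &
            (evs["y"] >= Y0) & (evs["y"] < Y0 + H))
    roi_evs = evs[mask]
    # ... store roi_evs for the dataset ...
```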
Problems using SilkyEvCam VGA on ARM RK3588 development board.
Currently using camera model: SilkyEvCam VGA Software: OpenEB 4.6.0 Plug-in: SilkyEvCam_Plugin_Installer_for_ubuntu 4.6.0 Currently, OpenEB is compiled and the SilkyEvCam_Plugin_Installer_for_ubuntu plug-in is installed. Using the lsusb command, the device
Inquiry about Temporal Resolution and Latency Measurement for EVK4 HD Camera
I was reviewing the technical specifications of the EVK4 HD camera (IMX636 sensor) and noticed that it states the camera has a temporal resolution equivalent to >10k fps. I have a couple of questions regarding this specification: Measurement of Temporal
How to synchronize the event-based camera and the frame-based camera
I am using the IMX646 sensor. How can I synchronize the event camera and the grayscale camera? Looking forward to your reply.
Why do some pixels keep outputting negative events?
Hi, when I lower the bias_diff_off value (to less than 0), some pixels are always outputting negative events in all rendered frames. In other words, these pixels are always displayed as light blue and no longer change. But positive events are displayed