LIBUSB_TRANSFER_ERROR after running for some time
Camera: EVK4. Environment: Ubuntu 22.04, OpenEB 4.6.2, libusb 1.0.0. Problem: We have been running the USB camera for a long time, but sometimes we get the following messages. [HAL][ERROR] ErrTransfert [HAL][ERROR] LIBUSB_TRANSFER_ERROR [HAL][ERROR] ErrTransfert [HAL][ERROR]
Adding a hardware IR cutoff filter in the EVK4
Hello Prophesee and community, we would like to record in an environment that is contaminated by "IR flickering light", so we plan to add an IR cutoff filter into the EVK4 (HD), similar to the description for the EVK1 here: https://support.prophesee.ai/portal/en/kb/articles/inserting-filter
Xorg blank screen issue: Kria + IMX636
I am trying to run the kv260-psee application using the HDMI connection between the Kria and a display monitor. Following the instructions at https://docs.prophesee.ai/amd-kria-starter-kit/application/app_deployment.html I could power on the sensor, but
Frequency / bias value matching table
Hi, I just found this table that relates bias_hpf and bias_fo to the high-pass and low-pass cut-off frequencies, but it is for Gen 3.1. Can you please share the table or the graphs (https://support.prophesee.ai/portal/en/kb/articles/bias-tuning-flow)
Timestamp of triggers
Dear all, I followed your documentation page to successfully record a RAW file with external trigger events using a wave generator. My question is about the exact function call from which the time is calculated in your example. When the time counter is
get_pixel_dead_time() erroring out on Gen4.1 EVK3-HD
I get the following error when trying to get the pixel dead time for my EVK3-HD Gen4.1: Metavision HAL exception Error 104000. Has anyone experienced this, or does anyone know the dead time with the default parameters?
RotateEventsAlgorithm
Hello, where can I find a description or example of this, please?
Camera sensor check
Hi Prophesee team, I am looking for the sensor data sheet for our RDK2 IMX636 camera. Could you please give me some information about it? Best regards, Yugang
Question about Python sample
I want to use the Python sample "metavision_simple_recorder.py" to control the acquisition of the event stream. The first problem is that when I want to use the 'r' key to start and stop event stream collection, I must ensure that the visualization interface
Inference Pipeline of Detection and Tracking using C++ not working when using live camera
I tried running the C++ detection and tracking sample as described here while using the live camera instead of a prerecorded file, but the camera doesn't seem to detect or classify any moving vehicles or pedestrians. I only see bounding boxes get
Redirect all Metavision Error messages
Hello, I am trying to redirect all Metavision error messages from the console to an spdlog logger, which is set up to log both to the console and to a file. Metavision::Camera has an add_runtime_error_callback, but that would only cover camera errors. camera_.add_runtime_error_callback([](const
Understanding the relationship between bias settings and intensity
Greetings, I have an IMX636 sensor. I understand from this question (https://support.prophesee.ai/portal/en/community/topic/questions-about-biases) that the specific relation between the change in intensity of the incident light needed to trigger an event
IMX 636 conference proceedings
Hello, is this conference proceedings document "5.10 A 1280×720 Back-Illuminated Stacked Temporal Contrast Event-Based Vision Sensor with 4.86µm Pixels, 1.066GEPS Readout, Programmable Event-Rate Controller and Compressive Data-Formatting Pipeline |
Fixed number events per callback
Hello, I have expanded upon the HAL example functionality for custom applications. From my own characterization of the code, it seems that, with a sufficient amount of event activity, there is a fixed number of 320 events per callback (the i_cddecoder callback).
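If a fixed number of events per processing step is the goal, one software-side alternative, independent of how the HAL decoder splits its own callbacks, is to let the Python EventsIterator regroup the stream into fixed-size batches. A minimal sketch, assuming the metavision_core Python package is installed:

```python
from metavision_core.event_io import EventsIterator

# "" opens the first available live camera; a RAW file path also works.
mv_iterator = EventsIterator(input_path="", mode="n_events", n_events=10000)

for evs in mv_iterator:
    # evs is a structured numpy array with fields 'x', 'y', 'p', 't';
    # each batch holds (up to) n_events events, regardless of callback size.
    if evs.size > 0:
        print(f"batch of {evs.size} events, t = {evs['t'][0]} .. {evs['t'][-1]} us")
```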
Why does frequency have to vary to measure LLCo and HLCO
Hello, I'm trying to measure the dynamic range of the GENX320 sensor. In the datasheet v1, the test conditions of the measurement are as follows: I was wondering why the frequency has to vary between 2.5 Hz and 80 Hz? Cheers, Michael
S-Curve: why does pixel response probability go above 100% for GENX320 and not for IMX636?
Hello, I measured S-Curves for the IMX636 and the GENX320 and found that above a certain contrast value, pixel response probability values go above 100% for the GENX320 and saturate at 100% for the IMX636. I saw you have a note on hot pixels for the GENX320.
Questions about 3d model tracking
Hello, I'm learning to use the metavision_model_3d_tracking program. Because the 3D model should be in JSON format, I have questions about this data format. The program needs an "object_init_pose.json" file and an "object.json" file. However, if I convert
Can I record only OFF events from hardware?
Hi, there are some hardware variables that we can control with Metavision Studio, for example the biases and the ROI. Can I control the type of events that I want to record? In my specific case I only need to process the OFF events, but the
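Whether a single polarity can be dropped at the hardware level is the open question here; as a software-side workaround, the OFF events can be kept by filtering the decoded batches on their polarity field. A minimal sketch, assuming the metavision_core Python package:

```python
from metavision_core.event_io import EventsIterator

mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)

for evs in mv_iterator:
    off_evs = evs[evs['p'] == 0]   # keep OFF events only (polarity 0)
    # ... process or store off_evs ...
```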
Verifying the Output of Extrinsic Calibration
Hello, I followed the steps on this page: https://docs.prophesee.ai/stable/samples/modules/calibration/ground_plane_calibration.html?highlight=calibration The output for the position of the camera with respect to the world coordinate frame gave me X: 0.62…
Real-Time Values to MATLAB
Hello, I am using metavision_generic_tracking and was planning to send the pixel-location data to another program, in this case MATLAB. I would like to know if there are any plug-ins or extensions available so that I can send the data in real
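Pending an official answer, a generic workaround is to push the tracking results over a socket and read them in MATLAB (e.g. with a UDP receiver such as udpport). The sketch below uses only the Python standard library; the field layout is purely illustrative:

```python
import socket
import struct

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
matlab_addr = ("127.0.0.1", 5005)   # host/port where MATLAB listens

def send_track(track_id, x, y, t_us):
    # pack as little-endian: int32 id, float32 x, float32 y, uint64 timestamp (us)
    payload = struct.pack("<iffQ", track_id, x, y, t_us)
    sock.sendto(payload, matlab_addr)

# example: one tracked object at (412.0, 215.5) px, timestamp 1.23 s
send_track(3, 412.0, 215.5, 1_230_000)
```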
Readout saturation definition
Hi, is it correct to say that readout saturation is a combined effect of the event rate and the number of active rows in the image? I have done an experiment with different lighting conditions and speeds, and I find that when improving the lighting conditions, the
Cutting the RAW videos
Hello, I want to cut the RAW recordings into shorter samples. How can I do this using Python? I found this link, https://docs.prophesee.ai/stable/samples/modules/hal/hal_raw_cutter.html, but it is written in C++. Is there any Python code that does the same?
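The RAW cutter sample is indeed C++ only. A Python workaround, sketched below under the assumption that re-encoding to RAW is not required, is to read only the desired time slice with EventsIterator (start_ts / max_duration) and save it in another format such as NumPy:

```python
import numpy as np
from metavision_core.event_io import EventsIterator

start_ts = 2_000_000       # start of the cut, in microseconds
max_duration = 5_000_000   # length of the cut, in microseconds

mv_iterator = EventsIterator(input_path="input.raw", start_ts=start_ts,
                             max_duration=max_duration, delta_t=100000)

# collect the decoded event batches of the selected slice and save them
chunks = [evs.copy() for evs in mv_iterator if evs.size > 0]
if chunks:
    np.save("cut_events.npy", np.concatenate(chunks))
```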
How to put the sensor into ULP mode with the GENX320 EVK?
I read the datasheet and found ULP_ENB and ULP_RST for setting the sensor to ULP mode, but I don't know where these pins are on the EVK. Can you tell me if there is any way to enter ULP mode?
Generate Event Frame with sliding moving window
Hi, is there a function that can create a sliding window for event frame generation? I understand that OnDemandFrameGenerator only allows a fixed time window, which cannot be set to overlap. Is there another way to achieve this? Thanks!
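One way to obtain overlapping windows is to keep a rolling buffer of the last window_us of events and redraw a frame every step_us. The sketch below does the accumulation by hand with NumPy rather than through a frame-generation class; names and values are illustrative:

```python
import numpy as np
from metavision_core.event_io import EventsIterator

window_us, step_us = 50000, 10000   # 50 ms window, redrawn every 10 ms
mv_iterator = EventsIterator(input_path="input.raw", delta_t=step_us)
height, width = mv_iterator.get_size()

buffer = None
for evs in mv_iterator:
    if evs.size == 0:
        continue
    # append the new slice and drop events older than the sliding window
    buffer = evs.copy() if buffer is None else np.concatenate([buffer, evs])
    t_now = buffer['t'][-1]
    buffer = buffer[buffer['t'] > t_now - window_us]

    # draw a simple polarity frame from everything still inside the window
    frame = np.zeros((height, width), dtype=np.uint8)
    frame[buffer['y'], buffer['x']] = np.where(buffer['p'] == 1, 255, 128)
    # ... display or feed `frame` to downstream processing ...
```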
Using the IMX636 + MIPI CSI-2 with Jetson Orin instead of AMD Kria
Does anyone have experience making this work? - Device tree looks like it's available here: https://github.com/prophesee-ai/linux-sensor-drivers - I'm curious if there's a driver available for the Jetson, and if not, if anybody has tips to write one for
Fisheye lens for EVK4 camera
Hi, we have procured an EVK4 camera and are looking for a fisheye lens that is compatible with it. Can you please recommend/suggest a few?
Extrinsic Calibration of the Camera
Hello, I need to convert the units of the measurements output by this code, https://docs.prophesee.ai/stable/samples/modules/cv/sparse_flow_py.html?highlight=sparse, from pixels to meters and from pixels/s to m/s. To do that, I need to calibrate
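For reference, under a simple pinhole model the conversion only needs the focal lengths in pixels (from intrinsic calibration) and the distance to the observed plane (from extrinsic calibration): dx ≈ Z·du/fx and dy ≈ Z·dv/fy, and the same scale factors turn pixel/s flow into m/s. The helper below is illustrative, not an SDK function; all values are assumptions:

```python
def pixels_to_meters(du_px, dv_px, fx, fy, z_m):
    """Convert a small pixel displacement (or a pixel/s velocity) into meters
    (or m/s), assuming the points lie on a plane at distance z_m from the camera."""
    return z_m * du_px / fx, z_m * dv_px / fy

# example: 120 px/s of horizontal flow, fx = fy = 1700 px, plane at 0.62 m
vx_mps, vy_mps = pixels_to_meters(120.0, 0.0, fx=1700.0, fy=1700.0, z_m=0.62)
```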
Color temperature of light source for electro-optical characteristics of GENX320
Hello, I'm trying to measure the electro-optical characteristics of my GENX320 sensor under the same conditions as the datasheet indicates. For the light source, the datasheet indicates: When you look at the wavelength associated with this color temperature
metavision_time_surface: delay of about 3 seconds that increases over time
The delay is about 3 seconds, and it increases over time. https://github.com/prophesee-ai/openeb/blob/main/sdk/modules/core/python/samples/metavision_time_surface/metavision_time_surface.py There is a huge delay in capturing images.
ROI of the raw event data
Dear Sir or Madam, greetings! I am trying to use the event camera to build a data set, and I want to reduce the memory size of the event data by restricting the ROI. I use the following command to set the ROI. The problem is that it seems the area outside the
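Independently of how the hardware ROI behaves, events outside the rectangle can also be discarded in software before the data are stored. A minimal sketch assuming the metavision_core Python package; the ROI values are illustrative:

```python
from metavision_core.event_io import EventsIterator

x0, y0, w, h = 200, 100, 320, 240   # illustrative ROI (top-left corner, size)
mv_iterator = EventsIterator(input_path="", delta_t=10000)

for evs in mv_iterator:
    inside = (evs['x'] >= x0) & (evs['x'] < x0 + w) & \
             (evs['y'] >= y0) & (evs['y'] < y0 + h)
    roi_evs = evs[inside]
    # ... store roi_evs instead of the full batch ...
```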
Problems using SilkyEvCam VGA on ARM RK3588 development board.
Camera model: SilkyEvCam VGA. Software: OpenEB 4.6.0. Plug-in: SilkyEvCam_Plugin_Installer_for_ubuntu 4.6.0. Currently, OpenEB is compiled and the SilkyEvCam_Plugin_Installer_for_ubuntu plug-in is installed. Using the lsusb command, the device
Inquiry about Temporal Resolution and Latency Measurement for EVK4 HD Camera
I was reviewing the technical specifications of the EVK4 HD camera (IMX636 sensor) and noticed that they state the camera has a temporal resolution equivalent to >10k fps. I have a couple of questions regarding this specification: Measurement of Temporal
How to synchronize the event-based camera and the frame-based camera
I am using the IMX646 sensor. How can I synchronize the event camera and the grayscale camera? Looking forward to your reply.
Why do some pixels keep outputting negative events?
Hi, when I lower the bias_diff_off value (to less than 0), some pixels are always outputting negative events in all rendered frames. In other words, these pixels are always displayed as light blue and no longer change. But positive events are displayed
How to calculate the threshold
Hello, I have a few questions that are a bit confusing. ● What does the "factory default" in the figure mean? Is the positive threshold calculated as factory default + bias_diff_on (%)? ● If not, how can I get the specific positive and negative
How does Bias work?
Hello. My sensor is the IMX646, and I want to know how the comparator's positive and negative threshold values change after adjusting the bias_diff_off and bias_diff_on values.
Questions about biases
Hello, I have learned that bias_diff_off, bias_diff_on, bias_fo, bias_hpf and bias_refr have no physical units; they are just numbers that represent deviations. I can get the absolute deviation values of each parameter from Figure 1, but I still have
IMX636 pixel reset
Hello, I would like to know how to set the reset function to a fixed time; there is an example in the IMX636 product specification of inputting an approximately 3 ms reset pulse from the TDRSTN pin. And here's the code I added in the metavision_vie
EVK4 sparking curve under different ambient light
Do you have similar probability curves for EVK4 cameras? (To be honest, I can't recall which of your web pages I saw this picture on…)
Maximum temporal resolution of the PPS3MVCD sensor
The datasheet for the PPS3MVCD sensor states that the pixel latency is nominally 100 μs. According to support, this latency is due to the processing that occurs on the chip and the transfer speed supported by the USB interface. The pixel delay is nominally