Dataset Structure for YOLOv8 Training Issue:
I'm encountering problems while training a YOLOv8 model on a custom dataset structure. I followed the steps outlined in the Prophesee YOLOv8 tutorial and used the Prophesee ML labeling tool, but I am getting the following error: ValueError: num_samples
How to read .h5 file format in Metavision 4.4.0
Hello, I am trying to read the DSEC dataset (Download – DSEC (uzh.ch)) with Metavision 4.4.0. The dataset is supposed to be from Prophesee Gen3, and the file format provided is .h5. From my understanding, .h5 and .hdf5 have the same properties, and
Metavision SDK version.
Hello, to work with my EVK2 Prophesee Evaluation Kit Camera (Gen41), which version of the Metavision SDK should I install? Can I have the link, please? Thank you. Have a great day.
Multi camera sync connection
I've connected two cameras by aligning the sync out and sync in as shown in the following photo. However, upon consulting the schematic for connecting the devices, I noticed that two sync ins are originating from a single sync out. In such a case, would
FPGA not properly configured
When trying to open my EVK1 with Metavision Studio 3.1 I get the following error message in the console: FPGA not properly configured. I have never changed any settings or sent any FPGA commands. Does anybody have an idea what happened or how I can
Capturing events between event triggers
I need a way to generate frames using only events between 2 consecutive event triggers. For example, consider the diagram below, where dashes represent CD events, and pipes represent trigger events. I need to be able to ignore events labeled "x" and use
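One way to do this after decoding is a plain timestamp mask, sketched here in numpy. This assumes CD events come as a structured array with a "t" field (as the Metavision Python EventsIterator yields) and that trigger timestamps are available as a sorted array; the helper name is hypothetical.

```python
import numpy as np

def events_between_triggers(cd_events, trigger_ts, slice_index):
    """Keep only CD events whose timestamps fall between two consecutive
    trigger timestamps: [trigger_ts[slice_index], trigger_ts[slice_index + 1])."""
    t0 = trigger_ts[slice_index]
    t1 = trigger_ts[slice_index + 1]
    mask = (cd_events["t"] >= t0) & (cd_events["t"] < t1)
    return cd_events[mask]

# Toy data: timestamps in microseconds, triggers at 1000 and 2000 us.
dtype = [("x", "u2"), ("y", "u2"), ("p", "i2"), ("t", "i8")]
cd = np.array([(0, 0, 1, 500), (1, 1, 0, 1500), (2, 2, 1, 2500)], dtype=dtype)
sel = events_between_triggers(cd, np.array([1000, 2000]), 0)
print(sel["t"])  # only the event at t=1500 survives
```

Events before the first trigger or after the last one (the "x" events) are simply never selected by any slice index.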
Whether new events are generated during readout latency
Hello. I use the EVK4 event camera for some high activity cases, such as recording the rapid movement of many particles. In this case, due to the high number of particles, the timestamps of the generated events will have a maximum delay of 60 microseconds.
Recommendation for Frame based Camera for Integration with DVS
Hi, I plan to sync a frame-based camera with a DVS. According to the guideline, the frame-based camera should output a sync-out signal to the DVS Trig In. My question is: do you have any recommendation for a frame-based camera that has a dedicated port for the sync-out signal?
PyTorch version of detection_and_tracking_pipeline.py.
Hi, I have encountered an issue while running detection_and_tracking_pipeline.py. Could you please advise which version of PyTorch I should install? It seems that the version I have installed is incorrect. My graphics card is NVIDIA 4060.
Request for Information on Video Deblurring Technique from Prophesee YouTube Demo
Hi everyone, I recently watched a video on Prophesee's official YouTube channel (link) demonstrating video deblurring. It showcased a method combining a CMOS camera's output with an event sensor's stream. Unfortunately, I couldn't find any related samples
get_illumination returning -1
I'm trying to compare the sensor lux readout to the events/sec at different bias settings, but the get_illumination readout from the I_monitoring interface only returns -1. Am I missing something about how to get this information? Thanks, Rich
How to calibrate?
Hey, how do I use the calibration command? I am running `metavision_mono_calibration_recording` with the blinking checkerboard HTML displayed on my iPad. The right-side window (black screen) of the application is empty. I am not even seeing any pattern
Quantification of active pixels/s.
Hi, I would like to know if there is any code from Metavision to calculate the active pixels/second in a recording. I was thinking of converting my recording to frames by using the Frame Generator and then, having frames with a certain accumulation
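A rough per-window count of distinct active pixels can also be computed directly from the decoded events rather than from generated frames. A minimal numpy sketch (hypothetical helper name, timestamps assumed in microseconds, toy data):

```python
import numpy as np

def active_pixels_per_second(x, y, t, window_us=1_000_000):
    """Count the number of distinct (x, y) pixels that fired at least one
    event in each fixed-length time window (default 1 s, in microseconds)."""
    t = np.asarray(t)
    bins = (t - t.min()) // window_us
    counts = []
    for b in np.unique(bins):
        sel = bins == b
        # Encode (x, y) pairs as a single integer to count unique pixels.
        pix = np.asarray(x)[sel].astype(np.int64) * 100_000 + np.asarray(y)[sel]
        counts.append(len(np.unique(pix)))
    return counts

# Toy example: 3 events in the first second, 2 of them on the same pixel.
x = [10, 10, 20, 30]
y = [5, 5, 6, 7]
t = [0, 100, 200, 1_500_000]  # microseconds
print(active_pixels_per_second(x, y, t))  # -> [2, 1]
```

This avoids the dependence on the accumulation time that a frame-based count would introduce.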
OPENEB 4.3.0 on ubuntu 20.04, compiling cpp examples
Hello Prophesee Team, I have compiled OpenEB 4.3.0 on arm64 Ubuntu 20.04 using the guide https://docs.prophesee.ai/4.3.0/installation/linux_openeb.html and used Option 1 (working from the build folder). The test suite (ctest -C Release) runs without errors.
MVTec HALCON Camera Not recognized evk4
Following https://support.prophesee.ai/portal/en/kb/articles/mvtec-halcon-acquisition-interface, I downloaded HALCON version 22.11, but the camera is not recognized. I am using SDK 4.1 (MV SDK 4.1); the tests pass, but the software still does not recognize the EVK4 camera. Do I need to install additional plugins?
Could not find a configuration file for package "MetavisionSDK"
When compiling the C++ sample "metavision_sdk_get_started.cpp" with the cmake command, the following error occurs: -- Selecting Windows SDK version
EVK4 measurement error information
I want to track an object using a Kalman filter. Among the required variables is the variance of the sensor's measurement error, which can be provided by the manufacturer. Where can I request it? My model is the EVK4.
OpenEB 4.3.0 Compiling Problems
I was compiling openeb-4.3.0 on my Jetson and was at the 'cmake .. -DCMAKE_BUILD_TYPE=Release -DBUILD_TESTING=OFF' step when this error occurred: CMake Warning at sdk/modules/core/cpp/samples/metavision_sdk_get_started/CMakeLists.txt:20 (add_executable):
After cutting the RAW/CSV text, the timestamps are inconsistent. What is the cause? Thanks.
https://docs.prophesee.ai/stable/samples/modules/driver/file_info.html#chapter-samples-driver-file-info Attached is a recording with the same timestamps as the previous one; the original RAW text has many points, and the more you have written, the more
Reset timestamps for multi-cam sync
Hello - I am utilizing 4 SilkyEvCam modules in a stereo setup. I have configured one of the cameras as the master and the other three as slaves in software. I then connect the sync out of the master to the sync in of each slave. Distances
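Once a shared sync edge has been identified in each stream, the slave timelines can be shifted onto the master's by a constant offset. A minimal numpy sketch (hypothetical helper name; timestamps in microseconds):

```python
import numpy as np

def align_to_master(slave_ts, master_t0, slave_t0):
    """Shift slave timestamps so a shared sync edge, seen at master_t0 on the
    master clock and at slave_t0 on the slave clock, maps to the same instant."""
    return np.asarray(slave_ts) + (master_t0 - slave_t0)

# The slave saw the sync edge at t=100 us, the master at t=1000 us.
aligned = align_to_master([100, 200, 300], master_t0=1_000, slave_t0=100)
print(aligned.tolist())  # -> [1000, 1100, 1200]
```

This only corrects the constant offset; clock drift between unsynchronized oscillators would need the hardware sync line, which is what the master/slave wiring provides.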
Background information/reference on analytics tracking schema
Hello, I am looking for information or literature describing the tracking algorithms used in the Analytics toolbox: Generic Tracking using Python — Metavision SDK Docs 4.4.0 documentation (prophesee.ai). I am specifically interested in the approaches
Frequency Histogram
Is there any existing algorithm for determining the frequency component at every x,y location? Similar to the histogram, except instead of time bins, they are divided into frequency bins. If not, can anyone advise on how I might adjust an existing algorithm
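Absent a ready-made algorithm, one rough approach is to estimate a dominant frequency per pixel from the median interval between successive events, then histogram those per-pixel frequencies. A numpy sketch under those assumptions (hypothetical helper name, timestamps in microseconds):

```python
import numpy as np
from collections import defaultdict

def per_pixel_frequency(events, freq_bins):
    """Estimate a dominant frequency per pixel from the median interval
    between successive events at that pixel, then histogram those
    frequencies into the supplied bin edges (Hz)."""
    by_pixel = defaultdict(list)
    for x, y, p, t in events:
        by_pixel[(x, y)].append(t)
    freqs = {}
    for pix, ts in by_pixel.items():
        ts = np.sort(np.asarray(ts, dtype=np.float64))
        if len(ts) < 2:
            continue  # cannot estimate a period from a single event
        period_us = np.median(np.diff(ts))
        freqs[pix] = 1e6 / period_us  # timestamps assumed in microseconds
    hist, _ = np.histogram(list(freqs.values()), bins=freq_bins)
    return freqs, hist

# A single pixel blinking every 10_000 us -> 100 Hz.
evts = [(0, 0, 1, t) for t in range(0, 50_000, 10_000)]
freqs, hist = per_pixel_frequency(evts, freq_bins=[0, 50, 150, 500])
print(freqs[(0, 0)], hist)  # -> 100.0 [0 1 0]
```

Splitting events by polarity first (one period estimate per polarity) is usually more robust, since ON and OFF events of one blink cycle interleave.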
Negative bias_diff_off and bias_diff_on values of IMX636. What do they mean?
I have set the following biases in the IMX636 Sensor. bias_diff_off = -10 and bias_diff_on = -50. I have read the documentation and would expect to get more OFF events with these settings. However I am getting more ON Events. Why is that? I am still not
Using OpenEB expelliarmus to decode data
Hi Community, I was trying to decode the raw files from the camera using expelliarmus and put the data into a pandas DataFrame. What is the timestamp scale from it? When I use the raw-to-CSV Python scripts I get different results, and the timescale
Converting Raw Files to Video - Questions regarding FPS
In the Metavision Studio application, you can export videos at varying frame rates. Regardless of the frame rate chosen, the resulting video is rendered at 30 FPS. This results in a slow/fast motion effect depending on whether the FPS is more or less
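The slow/fast effect follows from simple arithmetic: the exporter samples the recording timeline at the chosen FPS, but the container is played back at 30 FPS. A tiny sketch of the resulting speed factor (hypothetical helper name):

```python
def playback_speed(export_fps, container_fps=30):
    """If frames are generated at `export_fps` from the recording timeline but
    the video container plays them back at `container_fps`, the apparent speed
    factor is container_fps / export_fps (>1 means fast motion)."""
    return container_fps / export_fps

print(playback_speed(60))  # 0.5 -> slow motion (half speed)
print(playback_speed(15))  # 2.0 -> fast motion (double speed)
```

So exporting at exactly 30 FPS is the only setting that yields real-time playback from a 30 FPS container.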
Higher Bias_off more OFF events?
Hi, I would like to know why, if we increase the value of the OFF contrast threshold (bias_off), we end up having more OFF events instead of fewer. My understanding is that if we increase the threshold it is going to be harder to trigger an event, but in this
Active pixels behind the moving object - Pixel latency
Hi, I'm doing an experiment with a rotating black dot on a white background, backlit to maximize contrast. In all cases I get ON events trailing the movement of the dot, which is rotating clockwise; this actually looks like a tail
Assistance Needed in Creating Dataset from Golf Shot Recordings
Hello, I'm currently working on a university project where I've encountered difficulties in creating a dataset. My project involves analyzing golf shots, for which I've recorded a large quantity of strikes, specifically focusing on the moment of impact
noise filtering post recording
I have an object that has an intrinsic blinking that can be captured with the EVK3, but the blinking is excited by a vibrating light. The vibrations have a known frequency, and I'm wondering if there is any way to filter this frequency after recording
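One offline option is to estimate each pixel's dominant inter-event period and drop the pixels whose period matches the known vibration frequency. A rough numpy sketch, not an SDK feature (hypothetical helper, timestamps in microseconds, naive tolerance test):

```python
import numpy as np
from collections import defaultdict

def drop_frequency(events, freq_hz, tol=0.1):
    """Discard all events from pixels whose median inter-event period matches
    the known vibration frequency within a relative tolerance."""
    period_us = 1e6 / freq_hz
    by_pixel = defaultdict(list)
    for i, (x, y, p, t) in enumerate(events):
        by_pixel[(x, y)].append((t, i))
    drop = set()
    for pix, rec in by_pixel.items():
        ts = np.sort([t for t, _ in rec]).astype(np.float64)
        if len(ts) >= 2:
            med = np.median(np.diff(ts))
            if abs(med - period_us) / period_us < tol:
                drop.update(i for _, i in rec)
    return [e for i, e in enumerate(events) if i not in drop]

# Pixel (0, 0) flickers at ~100 Hz (the vibration); pixel (5, 5) does not.
evts = [(0, 0, 1, t) for t in range(0, 50_000, 10_000)]
evts += [(5, 5, 1, 0), (5, 5, 1, 3_000)]
kept = drop_frequency(evts, freq_hz=100)
print(sorted({(x, y) for x, y, p, t in kept}))  # -> [(5, 5)]
```

If the intrinsic blinking and the vibration overlap on the same pixels, a per-pixel notch on the event-rate signal (FFT, zero the known bin, inverse FFT on a binned rate) would be needed instead of discarding whole pixels.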
The quickest recording of a range of events
Hello Prophesee Team, I want to write a certain amount of events (200 ms, several pipeline slices) before a dynamically calculated point in time. I am always saving the last 5 pointers to event buffers in a deque, and use them to write events when needed. WriteCSV()
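The buffering scheme described above can be sketched with a plain stdlib deque (hypothetical class and method names; event tuples are (t, x, y, p) with timestamps in microseconds):

```python
from collections import deque

class SliceBuffer:
    """Keep the most recent event slices covering at least `window_us` of
    recording time, so they can be flushed when a point of interest occurs."""
    def __init__(self, window_us=200_000, max_slices=5):
        self.window_us = window_us
        self.slices = deque(maxlen=max_slices)  # old slices drop automatically

    def push(self, t_begin, t_end, events):
        self.slices.append((t_begin, t_end, events))

    def flush_before(self, t_point):
        """Return all buffered events with t_point - window_us <= t < t_point."""
        out = []
        for t_begin, t_end, events in self.slices:
            if t_end <= t_point - self.window_us or t_begin >= t_point:
                continue  # slice entirely outside the window
            out.extend(e for e in events
                       if t_point - self.window_us <= e[0] < t_point)
        return out

buf = SliceBuffer()
buf.push(0, 100_000, [(50_000, 1, 2, 1)])
buf.push(100_000, 300_000, [(250_000, 3, 4, 0)])
print(buf.flush_before(300_000))  # only events in [100_000, 300_000) remain
```

Copying each buffer into the deque (rather than storing SDK-owned pointers) avoids the buffers being recycled by the driver before the flush happens.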
Pipelines Supported Algorithms
Hello, I was wondering what algorithms are supported as part of the data pipeline systems? We are looking at using this to test multi-threading by converting the Event Frame Generation sample into one using a pipeline approach. From what I have been
Unable to locate the path to necessary DLLs for Metavision Python bindings
I am trying to follow the tutorial 'RAW File Loading using Python' (RAW File Loading using Python — Metavision SDK Docs 4.4.0 documentation (prophesee.ai)), but when I launch the program, it shows the following error: Unable to locate the path to necessary
How to read camera data using an FPGA
We hope to use an FPGA to read the camera data instead of the SDK. Is there any relevant information or documentation?
Getting GPU device error when running the detection pipeline
Hi team, I am trying to run the detection pipeline on my system. The command I am entering is python3 sdk/ml/python_samples/detection_and_tracking_pipeline/detection_and_tracking_pipeline.py --object_detector_dir ~/Documents/pre-trained_models/metavision_sdk_3.x/red_event_cube_05_2020/
Data transmission & Processor
Hello, According to the data announced on your site, the equivalent temporal precision of the event-technology is over 10 000 fps. Is this value fixed? Or can it decrease again in the most unfavorable cases? For example, we have a VGA EVK1 with a resolution
No plugin found error on Jetson orin nano
Hi, I successfully compiled OpenEB 4.3.0 on my Jetson Orin Nano (arm64 Cortex, Ubuntu 20.04) and launched 'metavision_viewer' from the 'build' folder. I plugged an EVK3 into my Jetson, but I get the warnings 'no plugin found' and 'Metavision
LIBUSB_ERROR_TIMEOUT in Ubuntu (VMware Virtual Machine)
Hi! When I launch this example in VMware Virtual Machine - Ubuntu OS, I receive this error: "[HAL][ERROR] Error while opening from plugin: hal_plugin_gen31_evk3 [HAL][ERROR] Integrator: Prophesee [HAL][ERROR] Device discovery: Metavision::TzCameraDiscovery
python SDK samples problem
Dear All, I'm trying to interface with a vision camera EB by Imago (linux image 1.3b) using python and MetaVision 2.3.1 on Windows 10. As far as I understand, the HAL plugin provided by Imago is linked against MetaVision 2.3.1, so I believe I'm stuck
Mv_iter stuck when no events occurred
Hello, I am using the Metavision recorder (https://support.prophesee.ai/portal/en/kb/articles/how-to-record-data-with-metavision-recorder), and notice that when no events are received from the camera (for example, after using a small ROI and strict biases),
Metavision interface for Mvtec Halcon
There is a problem installing the MVTec HALCON acquisition interface for the Prophesee camera. The latest version of your SDK does not match your DLL file for the interface. These attachments work only for version 4.1 and not for 4.3. Is it possible to