Build your Event-based Application 1/5: Introduction (Full PDF Version attached)

All rights reserved © 2024 PROPHESEE S.A.

This document contains proprietary information of PROPHESEE S.A. It is not to be copied, distributed or otherwise disseminated to any third party without the prior written acceptance of a duly authorized representative of PROPHESEE S.A.

It is referred to as “the technical specification of the Product” in Prophesee’s “General Terms and Conditions of Sale” (“T&C’s”). It forms part of and is subject to such T&C’s. Information furnished in this document is believed to be accurate and reliable. However, Prophesee disclaims any liability for consequences which may result from its use.

Information in this document is subject to change without prior notice.

Revision

Table 1. Document revision history

Release | Date       | Description
--------|------------|------------------
1.0     | 2024-06-19 | Initial version.

1. Introduction

This document provides the information needed to develop a setup suited to your specific application using Prophesee technology. It is intended to supplement the shared and exclusive documentation available online.

To fulfill this purpose, we will walk you through the steps that will allow you to get the most out of your event-based camera.

1.1. What is event-based sensing?

As a reminder, our event-based technology detects positive and negative light variations at the pixel level, with very high temporal resolution (on the order of microseconds).

At a macro level, this translates into detecting:

  • Movement

  • Light variation

  • Noise

Movement detection is often the first feature that comes to mind, but our technology also detects extremely fast light variations, such as blinking LEDs. This can be extremely useful for some applications, such as active marker tracking.
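
To make this concrete, here is a minimal sketch in plain NumPy (with an illustrative hand-built event array, not real sensor output) showing the usual per-event representation of pixel coordinates, polarity and microsecond timestamp, and how the blink frequency of an LED at a known pixel could be estimated from its positive events:

    import numpy as np

    # Illustrative, hand-built event stream: one record per event, with pixel
    # coordinates, polarity (1 = positive, 0 = negative) and a timestamp in µs.
    events = np.zeros(8, dtype=[("x", "u2"), ("y", "u2"), ("p", "u1"), ("t", "u8")])
    events["x"], events["y"] = 10, 20                            # a single LED pixel
    events["p"] = [1, 0, 1, 0, 1, 0, 1, 0]                       # LED switching on/off
    events["t"] = [0, 500, 1000, 1500, 2000, 2500, 3000, 3500]   # one edge every 500 µs

    # Keep the positive (light turning ON) events of the pixel of interest.
    on = (events["x"] == 10) & (events["y"] == 20) & (events["p"] == 1)
    t_on = events["t"][on].astype(np.float64)

    # One ON event per blink period: frequency = 1 / median inter-event interval.
    period_us = np.median(np.diff(t_on))
    print(f"Estimated blink frequency: {1e6 / period_us:.0f} Hz")   # -> 1000 Hz

This per-event timing is what allows blink frequencies far beyond typical frame rates to be decoded.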

In fact, there is an incredible number of cases where event-based technology has enabling potential, ranging from high-speed particle tracking to multi-scale vibration monitoring, and from very-high-frequency blinking light tracking to motion estimation.

Below are a few illustrations of applications for which event-based vision has a strong potential.

Figure 1. Particle tracking
Figure 2. Vibration monitoring
Figure 3. Active marker tracking XYT view
Figure 4. Active marker live view
Figure 5. Motion estimation XYT view
Figure 6. Motion estimation live view

1.2. General headlines for event-based product development

The workflow for developing a product based on an event-based camera is fairly similar to a classical frame-based product development pipeline.

However, some matters require less coverage (such as the dynamic range, which is pixel-wise variable and "automatically updated"), while other subjects need to be treated with more attention (such as contrast), and some new matters arise (such as the event rate).
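
As a simple illustration of this new concern, the sketch below (plain NumPy; the helper name and window length are assumptions, not a Prophesee API) measures the event rate over fixed time windows, the quantity that both the link bandwidth and the downstream processing must be sized for:

    import numpy as np

    def event_rate(t_us: np.ndarray, window_us: float = 10_000.0) -> np.ndarray:
        """Events per second measured over fixed time windows (illustrative)."""
        bins = np.arange(t_us.min(), t_us.max() + window_us, window_us)
        counts, _ = np.histogram(t_us, bins=bins)
        return counts / (window_us * 1e-6)   # counts per window -> events per second

    # e.g. rates = event_rate(events["t"]); a sudden spike flags a burst of
    # activity that the processing pipeline and bandwidth budget must absorb.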

To provide examples of various possible configurations, we have selected two use-cases which will be detailed at the end of this document.

2. Camera setting

The camera settings article can be found here. That page describes how to better understand and leverage the capabilities and limits of the sensor.

3. Setup tips

The setup tips article can be found here. There you will find tips to get the most out of your event-based camera.

4. Event processing

This article is dedicated to the basics of event-based processing for machine vision.

5. Example use-cases

We describe here two example applications of event-based technology which leverage different properties of the sensor.

6. Conclusion

In summary, vision systems using event-based sensors should be configured for optimal performance in the target application.

First, the sensor itself provides a range of capabilities which need to be evaluated in light of the application, and configured accordingly once the desired results are reached. The camera module might also need tuning, such as focusing. In particular, event filtering has a major impact on both the quality of the data produced by the sensor and the processing performance.
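
To illustrate what such filtering does, here is a minimal sketch of a spatio-temporal activity filter in plain NumPy, a simplified stand-in for the filtering stages available in the SDK rather than Prophesee's actual implementation: an event is kept only if a pixel in its 3x3 neighbourhood fired within the last dt_us.

    import numpy as np

    def activity_filter(ev: np.ndarray, dt_us: int = 10_000,
                        width: int = 640, height: int = 480) -> np.ndarray:
        """Keep events supported by recent activity in their 3x3 neighbourhood."""
        last_t = np.full((height, width), -np.inf)   # last event time per pixel
        keep = np.zeros(len(ev), dtype=bool)
        for i, (x, y, t) in enumerate(zip(ev["x"].astype(int),
                                          ev["y"].astype(int),
                                          ev["t"].astype(np.int64))):
            x0, x1 = max(x - 1, 0), min(x + 2, width)
            y0, y1 = max(y - 1, 0), min(y + 2, height)
            keep[i] = (t - last_t[y0:y1, x0:x1]).min() <= dt_us   # recent neighbour?
            last_t[y, x] = t
        return ev[keep]

    # Isolated noise events have no recent neighbours and are dropped; events
    # belonging to a moving edge support each other and pass through.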

From the processing point of view, many algorithms use a slicing time, which also needs to be chosen to work well with the observed scene. This slicing time can correspond to an integration time into a temporary data structure (a time surface, for instance). A benchmark should be conducted to quickly observe what can be seen across a range of slicing times.
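
As a sketch of what such a temporary data structure might look like (plain NumPy; the exponential decay and sensor resolution are assumptions, not a specific SDK type), the function below builds a time surface from the events of the last slice and can simply be re-run with several slicing times for a quick benchmark:

    import numpy as np

    def time_surface(ev: np.ndarray, t_now: float, slice_us: float,
                     width: int = 640, height: int = 480) -> np.ndarray:
        """Exponentially decayed time surface over the last slice (illustrative)."""
        recent = ev[(ev["t"] > t_now - slice_us) & (ev["t"] <= t_now)]
        surf = np.zeros((height, width))
        # Events are time-ordered, so the most recent event per pixel wins.
        surf[recent["y"], recent["x"]] = np.exp(-(t_now - recent["t"]) / slice_us)
        return surf

    # Quick benchmark over a range of slicing times:
    # for slice_us in (1_000, 5_000, 20_000):
    #     surf = time_surface(events, t_now=events["t"][-1], slice_us=slice_us)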
