SynCity Sensor Types

Drawing on our experience with how devices work mechanically, and how those devices interact with the world, CVEDIA has developed its own sensor modelling method for generating meaningful metrics from collected data. These metrics are used within the simulator to replicate the same noise patterns and conditions observed in the real world.

This takes into consideration several factors, including light, weather, atmospheric effects, distance, and material properties. Data is acquired in our custom lab using tools built on the SynCity framework.

LiDAR
  • Scanning LiDAR
  • Solid-state / flash LiDAR
  • 2D LiDAR

Modelling off-the-shelf and custom LiDAR devices, SynCity mimics scanning behaviour, intensity returns, and data protocols. Configure LiDAR device parameters, including laser count, beam angle, rotation speed, and more, to experiment with LiDAR response in real time. Compare commercial LiDAR models in your simulation and extract returns in the same packet format as physical devices.
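
To make the kind of parameters involved concrete, here is a minimal sketch in Python of a scanning-LiDAR description. The field names and values (channel count, rotation rate, and so on) are illustrative assumptions, not SynCity's actual configuration schema.

    # Illustrative only: a minimal description of a scanning LiDAR.
    # Field names and values are assumptions, not SynCity's actual API.
    from dataclasses import dataclass

    @dataclass
    class ScanningLidarConfig:
        channels: int = 64                # number of laser channels
        vertical_fov_deg: float = 26.9    # total vertical field of view
        rotation_rate_hz: float = 10.0    # spins per second
        samples_per_rotation: int = 2048  # horizontal resolution
        max_range_m: float = 120.0        # maximum usable return distance

        def points_per_second(self) -> int:
            # Upper bound on returns, assuming every pulse produces one return.
            return int(self.channels * self.samples_per_rotation * self.rotation_rate_hz)

    config = ScanningLidarConfig()
    print(config.points_per_second())     # 1,310,720 potential returns per second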

Radar
  • Millimetre wave

CVEDIA models FMCW (frequency-modulated continuous-wave) designs and can also provide pulse and FSK radar sensor simulations. Our radar implementation allows for antenna pattern configuration, enabling SynCity to generate objects with accurate RCS (radar cross-section) signatures. Radar output can be plotted as 2D point clouds, allowing for Doppler-effect analysis alongside distance estimation.
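
For readers unfamiliar with FMCW, the sketch below shows the textbook relationship between beat frequency, range, and radial velocity for a triangular (up/down) chirp. It is generic FMCW maths with made-up chirp parameters, not SynCity-specific code.

    # Generic FMCW (triangular chirp) range/velocity recovery.
    # Chirp parameters below are illustrative, not tied to any SynCity device model.
    C = 3.0e8  # speed of light, m/s

    def range_and_velocity(f_up_hz, f_down_hz, bandwidth_hz, chirp_time_s, carrier_hz):
        """Solve range and radial velocity from up- and down-chirp beat frequencies.

        f_up   = f_range - f_doppler
        f_down = f_range + f_doppler   (sign conventions vary by radar)
        """
        f_range = 0.5 * (f_up_hz + f_down_hz)
        f_doppler = 0.5 * (f_down_hz - f_up_hz)
        rng = C * chirp_time_s * f_range / (2.0 * bandwidth_hz)
        vel = C * f_doppler / (2.0 * carrier_hz)  # positive = approaching target
        return rng, vel

    # 77 GHz automotive-style example (values are illustrative):
    r, v = range_and_velocity(f_up_hz=1.995e6, f_down_hz=2.005e6,
                              bandwidth_hz=300e6, chirp_time_s=60e-6,
                              carrier_hz=77e9)
    print(f"range = {r:.1f} m, radial velocity = {v:.2f} m/s")  # 60.0 m, 9.74 m/s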

RGB Cameras
  • Fisheye
  • Depth cameras
  • Monochromatic
  • High resolution
  • Panoramic cameras (up to 360°)
  • Stereoscopic

CVEDIA supports non-geometric effects such as chromatic aberration, lens flare, blooming, and underexposure. Configure both intrinsic and extrinsic camera parameters with SynCity.
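
"Intrinsic" and "extrinsic" here follow the standard pinhole camera model: intrinsics describe the lens and sensor (focal length, principal point), while extrinsics describe where the camera sits in the world. The sketch below projects a world point into pixel coordinates with made-up parameter values, purely as an illustration; it is not SynCity code.

    # Standard pinhole projection: pixel = K [R | t] X_world (illustrative values).
    import numpy as np

    # Intrinsics: focal lengths and principal point, in pixels.
    K = np.array([[1000.0,    0.0, 960.0],
                  [   0.0, 1000.0, 540.0],
                  [   0.0,    0.0,   1.0]])

    # Extrinsics: camera pose in the world (here: no rotation, camera 1.5 m above the origin).
    R = np.eye(3)
    t = np.array([0.0, -1.5, 0.0])

    def project(point_world):
        p_cam = R @ point_world + t   # world -> camera coordinates
        uvw = K @ p_cam               # camera -> homogeneous pixel coordinates
        return uvw[:2] / uvw[2]       # perspective divide

    print(project(np.array([2.0, 1.5, 10.0])))  # -> pixel (u, v) = (1160, 540)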

GPS / GNSS

SynCity provides longitude, latitude, and altitude readings with configurable noise and drift parameters.
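
As a rough illustration of what noise and drift mean for a simulated GNSS fix, the sketch below perturbs a true track with per-fix white noise plus a slowly accumulating random-walk bias. The noise magnitudes are arbitrary placeholders, not SynCity defaults.

    # Illustrative GNSS error model: white noise + random-walk drift (values are placeholders).
    import random

    def simulate_gnss(true_positions_m, noise_std_m=1.5, drift_step_m=0.05):
        """Yield noisy (east, north) fixes for a sequence of true positions in metres."""
        drift_e, drift_n = 0.0, 0.0
        for east, north in true_positions_m:
            # Drift: a slow random walk that accumulates over time.
            drift_e += random.gauss(0.0, drift_step_m)
            drift_n += random.gauss(0.0, drift_step_m)
            # Noise: independent per-fix jitter.
            yield (east + drift_e + random.gauss(0.0, noise_std_m),
                   north + drift_n + random.gauss(0.0, noise_std_m))

    truth = [(0.0, float(i)) for i in range(10)]  # moving north at 1 m per fix
    for fix in simulate_gnss(truth):
        print(fix)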

Infrared / Thermal

Our partnership with FLIR, the world's leading thermal sensor producer, enables us to generate unmatched, highly realistic thermal imagery. Thermal noise, AGC (automatic gain control) behaviour, and environmental conditions are fully configurable through the SynCity GUI and APIs.
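
AGC maps the raw counts from a thermal sensor onto a displayable brightness range. The sketch below shows a deliberately simple percentile-based version of that idea; it is not FLIR's or SynCity's actual algorithm.

    # A deliberately simple percentile-based AGC, for illustration only.
    import numpy as np

    def simple_agc(raw_counts, low_pct=1.0, high_pct=99.0):
        """Map raw thermal counts (e.g. 14-bit) to an 8-bit display image."""
        lo, hi = np.percentile(raw_counts, [low_pct, high_pct])
        scaled = (raw_counts.astype(np.float64) - lo) / max(hi - lo, 1.0)
        return np.clip(scaled * 255.0, 0, 255).astype(np.uint8)

    frame = np.random.randint(0, 2**14, size=(480, 640))  # fake 14-bit thermal frame
    print(simple_agc(frame).dtype, simple_agc(frame).max())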

SynCity matches real-world sensor performance

  • Benchmarking

    Benchmark synthetic data against real-world data with SynCity sensor modelling. We provide benchmarked intensity returns and noise patterns, all while modelling the physical mechanics and motion of devices.

  • Communications Protocol Development

    SynCity is the only simulator developed down to the communications protocol level, meaning data collected from SynCity's synthetic sensors is nearly indistinguishable from field data.

  • Sensor Fusion

    SynCity provides a custom environment and analytics toolset that allow for complex sensor configurations. Combining data from multiple sensors corrects for the deficiencies of individual sensors and reduces dataset uncertainty. SynCity can natively synchronize and align multi-sensor data regardless of the rate at which each sensor produces it (a minimal time-alignment sketch appears at the end of this section).

  • Sensor Degradation

    Range sensors such as LiDAR, stereo vision, and RGB-D are susceptible to environmental effects, including changes in lighting and the presence of dust, water, or fog. SynCity provides a custom environment to configure, test, and verify sensor algorithms in reduced visibility, as well as to safely recreate edge cases. Use SynCity's GUI or APIs to configure sensor conditions in real time.

  • Sensor Placement Optimization

    SynCity provides the ability to arbitrarily spawn objects and elements in active scenes, allowing for real-time experimentation with sensor positioning. Move sensors from one location to another, change sensor direction, or add or remove sensors as needed.

  • Six Degrees of Freedom

    SynCity provides meaningful 6DoF data for any object in a scene, including position, orientation, and accelerometer readings.
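
As a sketch of the synchronization problem mentioned under Sensor Fusion above, the snippet below resamples a lower-rate sensor stream onto a higher-rate one by linear interpolation. It is a generic Python illustration with made-up rates and values, not SynCity's internal alignment mechanism.

    # Generic time alignment of two sensor streams by linear interpolation (illustration only).
    import numpy as np

    camera_t = np.arange(0.0, 1.0, 1.0 / 30.0)  # 30 Hz camera timestamps (s)
    lidar_t  = np.arange(0.0, 1.0, 1.0 / 10.0)  # 10 Hz LiDAR timestamps (s)
    lidar_range = 20.0 + 0.5 * lidar_t          # some per-scan measurement, e.g. range to a target (m)

    # Resample the 10 Hz measurement onto the 30 Hz camera clock.
    lidar_on_camera_clock = np.interp(camera_t, lidar_t, lidar_range)

    for t, r in zip(camera_t[:5], lidar_on_camera_clock[:5]):
        print(f"t={t:.3f}s  interpolated range={r:.3f} m")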