LeddarSim – Simulation for the Future of Mobility


What is LeddarSim?

LeddarSim™ is an advanced ADAS and autonomous driving simulator designed to reduce the simulation-to-reality gap. It reconstructs real-world driving scenarios using neural reconstruction techniques such as Gaussian splatting, enabling real-time, photorealistic rendering of multi-modal sensor data, including camera, radar and LiDAR outputs.
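
For readers new to the technique, the minimal Python sketch below illustrates the core idea behind Gaussian splatting: the scene is represented as a set of 3D Gaussians that are depth-sorted and alpha-composited along each camera ray. It is a conceptual illustration only; the class and function names are hypothetical and do not describe LeddarSim's actual implementation.

    # Conceptual sketch of Gaussian splatting; not LeddarSim code.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Gaussian3D:
        mean: np.ndarray     # (3,) world-space centre
        cov: np.ndarray      # (3, 3) covariance encoding extent and orientation
        opacity: float       # base opacity in [0, 1]
        color: np.ndarray    # (3,) RGB

    def composite_ray(gaussians, ray_origin, ray_dir):
        """Front-to-back alpha compositing of Gaussians along one camera ray."""
        # Sort splats by depth along the viewing direction (nearest first).
        ordered = sorted(gaussians, key=lambda g: np.dot(g.mean - ray_origin, ray_dir))
        color, transmittance = np.zeros(3), 1.0
        for g in ordered:
            # Evaluate the Gaussian at the ray's closest approach to its centre
            # (a crude stand-in for the projected 2D footprint used in practice).
            t = np.dot(g.mean - ray_origin, ray_dir)
            d = ray_origin + t * ray_dir - g.mean
            weight = np.exp(-0.5 * d @ np.linalg.inv(g.cov) @ d)
            alpha = min(g.opacity * weight, 0.999)
            color += transmittance * alpha * g.color
            transmittance *= 1.0 - alpha
            if transmittance < 1e-4:  # ray is effectively opaque; stop early
                break
        return color

    # Usage: one grey splat one metre in front of a camera looking down +x.
    splat = Gaussian3D(np.array([1.0, 0.0, 0.0]), 0.05 * np.eye(3), 0.8,
                       np.array([0.5, 0.5, 0.5]))
    print(composite_ray([splat], np.zeros(3), np.array([1.0, 0.0, 0.0])))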


Why LeddarSim?

  • Accelerate Time-to-Market
    Test millions of configurable scenarios virtually—reducing development cycles and speeding up validation.

  • Cut Costs, Not Corners
    Lower the cost per simulation frame by over 90% compared to real-world testing, while increasing accuracy.

  • Design Once. Deploy Anywhere.
    Easily adapt sensor setups, vehicle types and regional driving conditions to scale development across platforms.

"Automotive companies face exponential challenges in the cost, complexity and scalability of the data required to deploy safety-compliant and regulation-ready ADAS and AD systems at scale. LeddarSim brings the latest advances in AI-powered, multi-modal sensor dataset generation to recreate real-world driving scenarios with high fidelity."
– Antonio Polo, Sr. Vice-President of Product and Business Development at LeddarTech

The LeddarSim Difference

  • Data-Driven Simulation
    Build realistic scenarios directly from real-world data—not synthetic environments—for greater accuracy and relevance.

  • Multi-Modal Sensor Support
    Simulate LiDAR, radar and camera data together to optimize and validate multi-sensor perception systems (see the sketch after this list).

  • Cloud-to-Edge Workflow
    Run simulations in the cloud or on-prem, with a seamless path from development to edge deployment.

  • Advanced Scenario Editor
    Create and modify complex driving scenarios—quickly testing edge cases and custom conditions at scale.

  • Reusable Data Assets
    Store and adapt scene components for future projects, reducing the need for repeated data collection and labeling.
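
As a concrete, purely hypothetical illustration of multi-modal support, the sketch below shows one plausible shape for a time-synchronized camera/LiDAR/radar sample and how such frames could be replayed through a fusion stack under test. Field names and array layouts are assumptions, not the LeddarSim output format.

    from dataclasses import dataclass
    from typing import Callable, List
    import numpy as np

    @dataclass
    class MultiModalFrame:
        """One time-synchronized simulation sample across all three modalities."""
        timestamp_s: float
        camera_rgb: np.ndarray        # (H, W, 3) rendered image
        lidar_points: np.ndarray      # (N, 4) columns: x, y, z, intensity
        radar_detections: np.ndarray  # (M, 4) columns: range, azimuth, doppler, rcs

    def replay_through_stack(frames: List[MultiModalFrame], perception_fn: Callable):
        """Feed each simulated frame to the perception/fusion function under test."""
        return [perception_fn(f.camera_rgb, f.lidar_points, f.radar_detections)
                for f in frames]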

Key Use Cases

  • Sensor Configuration Testing
    Test different sensor placements, specs and combinations across diverse vehicle types.

  • Actor Simulation
    Insert rare or under-represented road users like animals, cyclists or emergency vehicles.

  • Environmental Variables
    Simulate changing conditions—weather, signage, construction zones and more.

  • Ego-Vehicle Behavior
    Model lane changes, speed shifts and trajectory adjustments in real-world road layouts. All four use cases are illustrated in the sketch after this list.
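
To make the four use cases above concrete, a single scenario variation could be described declaratively along the following lines. The keys and values are illustrative assumptions only, not the LeddarSim scenario schema.

    # Illustrative only: one scenario variation covering sensor configuration,
    # actor insertion, environmental variables and ego-vehicle behavior.
    scenario_variation = {
        "base_scene": "highway_merge_dusk",  # hypothetical ID of a scene reconstructed from real footage
        "sensors": [                         # sensor-configuration testing
            {"type": "camera", "mount": "windshield_center", "fov_deg": 120},
            {"type": "lidar",  "mount": "roof_front",        "channels": 128},
            {"type": "radar",  "mount": "front_bumper",      "max_range_m": 200},
        ],
        "inserted_actors": [                 # rare or under-represented road users
            {"class": "cyclist", "spawn_ahead_m": 35, "behavior": "lane_filtering"},
            {"class": "deer",    "spawn_ahead_m": 80, "behavior": "crossing"},
        ],
        "environment": {"weather": "heavy_rain", "construction_zone": True},
        "ego_behavior": {"maneuver": "lane_change_left", "target_speed_kph": 90},
    }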

The Problem

Real-world data collection remains a major bottleneck in ADAS and autonomous system development. Gathering diverse driving data is time-consuming, expensive and often unsafe—especially when trying to capture rare or high-risk edge cases like low-visibility scenarios, unexpected pedestrian behavior or complex traffic interactions.

Digital twin-based simulations were introduced to address these challenges, but they come with their own limitations. These systems are expensive to build, difficult to scale, and often fail to replicate the full fidelity of real sensor behavior—leading to a gap between virtual testing and real-world performance.

LeddarSim Changes the Equation

By combining real-world data with AI-powered reconstruction techniques like neural radiance fields (NeRF) and Gaussian splatting, LeddarSim generates high-fidelity, fully reconfigurable 3D scenes. Developers can simulate complex scenarios with true-to-life detail—without the delays and costs of physical testing.
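
At a high level, techniques such as NeRF and Gaussian splatting reconstruct a scene by adjusting its parameters until rendered frames match the frames actually captured on the road. The snippet below shows only the objective of that optimization, as a minimal illustration rather than LeddarSim code.

    import numpy as np

    def photometric_loss(rendered: np.ndarray, captured: np.ndarray) -> float:
        """Mean squared error between a rendered frame and the captured frame.
        Reconstruction repeatedly updates the scene representation (e.g. Gaussian
        positions, covariances, opacities and colours) to drive this loss toward
        zero, so the virtual scene converges to the recorded drive."""
        diff = rendered.astype(np.float64) - captured.astype(np.float64)
        return float(np.mean(diff ** 2))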

LeddarSim – A simulator for neural reconstruction of real-world driving scenarios

Most ADAS/AD simulators on the market use synthetic, game-engine-style environments

  • Individual modalities are rendered with no true multi-modal output
  • Large validation gap compared to the real world

LeddarSim uses real-world video footage to reconstruct real-world scenarios

  • Simulation produces coherent multi-modal output
  • Close-to-zero simulation gap during validation


Autonomous-driving developers invariably hit a wall at Level 3 (L3) autonomy and above due to three core challenges:

  • Intractable corner cases: The vast number of rare events encountered in routine driving makes exhaustive field validation impossible.

  • Manual data workflows: Hand-crafted data collection and annotation slow down development cycles and drive up costs.

  • Sensor-configuration silos: Datasets tied to specific vehicle sensor setups cannot be generalized or shared across platforms.

LeddarSim delivers a transformative solution:

  • Close-to-zero simulation gap: Provides end-to-end training and validation in a unified simulated environment, ensuring fidelity to real-world conditions.

  • No 3D environment modeling required: Automatically generates realistic scenarios without the overhead of manual scene reconstruction.

Frequently Asked Questions

What is LeddarSim?
LeddarSim is a next-generation simulation platform that uses AI and neural rendering to reconstruct sensor-realistic driving scenarios from real-world data. It enables ADAS and autonomous driving developers to test, validate, and train perception systems with high fidelity and realism.

How does LeddarSim differ from traditional simulators?
LeddarSim reduces the gap between simulation and reality by leveraging neural reconstruction to generate sensor-accurate data outputs for radar, LiDAR, and camera. Unlike synthetic-only tools, it recreates real-world edge cases with unmatched realism and precision.

What types of sensors does LeddarSim support?
LeddarSim provides multi-modal output for LiDAR, radar, and camera sensors, enabling developers to test full sensor fusion stacks across a wide range of scenarios and lighting conditions.

What are the benefits of using LeddarSim?
LeddarSim helps reduce development time and cost by offering millions of configurable test scenarios, rare edge cases, and a near-zero simulation gap. It enables up to 10× savings in data collection and annotation while improving testing coverage and speed.

Who should use LeddarSim?
LeddarSim is built for ADAS/AD developers, OEMs, and Tier 1 suppliers looking to accelerate perception system development, validation, and deployment with greater realism and cost-efficiency.