3D Cabin Perception Mapping breaks new ground

Seeing Machines Limited
13 January 2026
 

PRESS RELEASE

 

Seeing Machines breaks new ground at CES 2026 with 3D Cabin Perception Mapping

Next-generation architecture delivers real-time, in-cabin understanding and accelerates scalable safety innovation to support future mobility

Canberra, Australia - 13 January 2026: Seeing Machines, a global leader in vision-based safety technology, today announced the successful debut of its next-generation 3D Cabin Perception Mapping platform at CES 2026, marking a significant step forward in real-time, in-cabin monitoring intelligence for future mobility.

While exterior perception mapping in automated driving focuses on a multi-sensor reconstruction of the vehicle's surrounding environment, the Seeing Machines 3D perception mapping solution demonstrated live at CES delivers a comprehensive, real-time digital reconstruction of the vehicle cabin interior, enabling a holistic, accurate and scalable approach to interior sensing. Built on a "clean-sheet" architecture, the platform is designed to support multiple cameras, multiple occupants and a wide range of features, all from a single, high-trust perception layer.

Unlike traditional feature-by-feature approaches, Seeing Machines' 3D Cabin Perception Mapping solves for the entire cabin simultaneously, improving consistency across features and maintaining accuracy even in the presence of intermittent or noisy sensor data.

The architecture provides a powerful abstraction layer that decouples feature development from underlying camera configurations and raw sensing implementations. This allows features to be built once and deployed seamlessly across multiple product configurations, reducing development effort, cost and time to market.
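To illustrate the decoupling described above, the following is a minimal, hypothetical sketch in Python. The interface, class and function names (CabinModel, Occupant3D, out_of_position_alert) and the dash-plane threshold are illustrative assumptions and are not drawn from Seeing Machines' actual architecture or API; the point it demonstrates is that a feature written against a unified cabin model does not need to change when the underlying camera configuration does.

```python
# Illustrative sketch only: hypothetical names, not Seeing Machines' API.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Occupant3D:
    """Hypothetical per-occupant state taken from the unified 3D cabin model."""
    seat_index: int
    pose_keypoints_mm: list[tuple[float, float, float]]  # 3D joints in a cabin-fixed frame
    estimated_height_cm: float


class CabinModel(Protocol):
    """The abstraction layer: any camera configuration whose perception stack
    can populate this interface can run the same downstream features unchanged."""
    def occupants(self) -> list[Occupant3D]: ...


def out_of_position_alert(cabin: CabinModel, dash_plane_x_mm: float = 350.0) -> list[int]:
    """Example feature written once against the cabin model, not the cameras:
    flag any occupant with a tracked joint forward of a nominal dash plane
    (the threshold here is an arbitrary placeholder)."""
    flagged = []
    for occ in cabin.occupants():
        if any(x < dash_plane_x_mm for x, _, _ in occ.pose_keypoints_mm):
            flagged.append(occ.seat_index)
    return flagged
```

Under this assumption, moving from a two-camera to a three-camera configuration would change only how the cabin model is populated, not the feature code built on top of it.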

"This is a fundamental shift in how interior sensing systems are designed and deployed and the feedback from our CES engagements has been overwhelmingly positive," said John Noble, Chief Technology Officer at Seeing Machines. "By moving from feature-specific pipelines to a unified 3D perception of the cabin, we enable higher accuracy, consistency and scalability. Importantly, this platform will allow our customers to evolve their individual feature strategies, while dramatically lowering the cost and complexity of developing new safety and user experience capabilities."

The platform is designed to extend beyond automotive applications, with potential use cases across robotics and other human-machine interaction settings, including human-robot interaction (HRI), where accurate and scalable perception of people and space is critical. The architecture also supports a mix-and-match approach to 3D technologies, enabling deployment flexibility as sensing hardware and use cases evolve.

Seeing Machines' CES 2026 showcase reinforces the company's commitment to advancing integrated, vision-based safety systems that support the next generation of intelligent, human-centred mobility.

Key CES 2026 Demonstration Features

Real-time digital reconstruction of an automotive cabin environment was showcased, including:

  • 3 cameras covering 3 rows of seating with support for up to 7 vehicle occupants;
  • Body size, shape and a full 3D pose solution for all occupants, including height and weight classification;
  • Out-of-position detection for all occupants, for example, occupant reclining, feet on dash, near-airbag detection for driver;
  • Seat configuration, for example, headrest presence, seat position and recline angle;
  • Child seat detection across the entire cabin; and
  • Random object detection across the cabin, for example, phones, bags and boxes.


~ends~

About Seeing Machines (LSE: SEE)

Seeing Machines is a global company founded in 2000, headquartered in Australia, and is an industry leader in vision-based monitoring technology that enables machines to see, understand and assist people. Seeing Machines is revolutionising global transport safety with its technology portfolio of AI algorithms, embedded processing and optics that power products which must deliver reliable, real-time understanding of vehicle operators. The technology spans the critical measurement of where a driver is looking, through to classification of their cognitive state as it applies to accident risk. Reliable "driver state" measurement is the end-goal of Driver and Occupant Monitoring System (DMS/OMS) technology that Seeing Machines develops to drive safety for Automotive, Commercial Fleet, Off-road and Aviation. The company has offices in Australia, USA, Europe and Asia, and supplies technology solutions and services to industry leaders in each market vertical. www.seeingmachines.com

 

Seeing Machines contacts:

-     Media/investor: Sophie Nicoll, sophie.nicoll@seeingmachines.com, +61 419 149 683

-     Sales: Lori Markatos, lori.markatos@seeingmachines.com

This information is provided by Reach, the non-regulatory press release distribution service of RNS, part of the London Stock Exchange. Terms and conditions relating to the use and distribution of this information may apply. For further information, please contact rns@lseg.com or visit www.rns.com.

RNS may use your IP address to confirm compliance with the terms and conditions, to analyse how you engage with the information contained in this communication, and to share such analysis on an anonymised basis with others as part of our commercial services. For further information about how RNS and the London Stock Exchange use the personal data you provide us, please see our Privacy Policy.
 
END
 
 