Hannover Messe

XR visualization in industry: From 3D models to Gaussian Splats

At Hannover Messe 2026, 3D models merged with the real world and Gaussian Splats. We tried out various solutions.

Man wearing VR glasses at a trade fair

(Image: Jan Philipp Wöbbeking / heise medien)


Whether machines, production halls or art exhibitions: at the Hannover Messe, exhibitors such as MR4B, CMC Engineers, and Fraunhofer IAO showed how they envision spatial visualization in 2026. In addition to VR glasses and large-scale holo walls, solutions were also used that combined 3D models with Gaussian Splats or the real external world.

It was particularly exciting to dive into a mixed world of conventional 3D wireframe models and almost photorealistic Gaussian Splats in the “CMC ViewR”.

In the “ViewR Showroom” by CMC Engineers, for example, several people can put on 3D glasses and move together through an animated sawmill or a plant park in front of a large 3D holo wall. Wearing shutter glasses designed for two simultaneous viewers, we floated through the mechanical details of the plants using a Vive controller. CMC's viewer used data that had previously been recorded with the XGrids PortalCam and, after processing, loaded into the Unity scene. The camera, which costs around 4,500 euros, was developed for capturing Gaussian Splats, which allow photorealistic scenes to be displayed relatively resource-efficiently.

Here we immersed ourselves in the spatial scenery with the Vive Pro 2, standing in front of the holo wall behind it.

(Image: Jan Philipp Wöbbeking / heise medien)

In the Gaussian Splat process, ellipsoids with values such as color, opacity, size, and orientation are calculated from the recordings of four cameras and a lidar sensor. Together, the many small spatial color splashes form an authentic 3D image, which frays only slightly at fine details and at the edges. The principle is reminiscent of pop art images from the sixties, which were also composed of many individual points.
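To make the ellipsoid idea concrete, here is a minimal sketch of how a single splat's parameters and the front-to-back compositing of many splats might look. The field names and the blending function are illustrative simplifications, not the actual data layout or renderer of any product shown at the fair:

```python
from dataclasses import dataclass

@dataclass
class GaussianSplat:
    """One ellipsoidal 'color splash' in a splat scene.

    Field names are illustrative; real formats typically store these
    values as packed float arrays (plus spherical-harmonic color terms).
    """
    position: tuple   # (x, y, z) center in scene space
    scale: tuple      # (sx, sy, sz) ellipsoid radii -> size
    rotation: tuple   # quaternion (w, x, y, z) -> orientation
    color: tuple      # (r, g, b) base color
    opacity: float    # 0..1, blended front-to-back when rendering

def blend_front_to_back(splats):
    """Illustrative alpha compositing of depth-sorted splats for one ray."""
    color = [0.0, 0.0, 0.0]
    transmittance = 1.0  # how much light still passes through
    for s in splats:     # assumed sorted near-to-far
        weight = transmittance * s.opacity
        for i in range(3):
            color[i] += weight * s.color[i]
        transmittance *= (1.0 - s.opacity)
    return color, transmittance
```

The front-to-back loop also explains the "fraying" at edges: where few, low-opacity splats cover a ray, the remaining transmittance stays high and the point cloud shows through.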

In a presentation at the CAE Forum stand, CMC Managing Director Max Hirlinger emphasized the speed and low cost of recording: after just ten minutes of capture and about three to four hours of computation, photorealistic scenes can be presented. With an accuracy of about three centimeters, Gaussian Splats do not reach the millimeter precision of other models, but that is sufficient for many presentation purposes, such as sales demos at trade fairs or police safety inspections. Police and other security forces often lack interior data of buildings before an event, Hirlinger added.

In the half of the scene with Gaussian Splats, we could hardly detect any of the blurry “clouds” on the floor or in the corners that simpler splat recordings suffer from. For even more immersion, we later explored the model with a Vive Pro 2 VR headset, although communication with the outside world then became trickier. A line ran through the scenery along which we could move into the area with detailed wireframes of machines – including attached note panels with technical details. A disadvantage of Gaussian Splats, however, is the lack of animation. In the future, this could be solved by “four-dimensional” or dynamic Gaussian Splatting (4DGS), which also captures movement – although it currently requires expensive recording technology.


The federally funded “WIR!” alliance MR4B is also working on various spatial research projects for presentations and training in the Berlin-Brandenburg metropolitan region – for example, the planning and installation of a heat pump (MR4Heatpump). With a Meta Quest 3, we were able to move through the “Kepler” project, which is intended to enable seamless, collaborative pre-engineering of plants and buildings. The multi-user solution brings together, at an early stage, the traditionally separate IFC data of buildings and DEXPI files of plants, explained software developer Markus Gottschalk from X-Visual Technologies. After a joint inspection with VR glasses, the changed data can be exported straight back into both original formats. Integrating Gaussian Splats was not yet possible here, but is planned, according to Gottschalk.


“MR4Safe Operations”, a mixed reality maintenance solution for machines that supports various VR and AR glasses via the OpenXR standard, was more immersive. The project currently works with the Meta Quest 3, which we also put on. In the mixed reality view, we connected hoses to a real machine model and worked through the displayed checklists. Hand tracking functioned intuitively even in the hustle and bustle of the trade fair – and we always had our hands free instead of having to hold a printed list or a tablet.

Fraunhofer IAO from Stuttgart presented two “Industrial Metaverse” projects at the trade fair. On the collaborative Powerwall, two users can move through arbitrary 3D real-time data, for example for engineering reviews, simulations, or virtual showrooms. A projector and self-developed shutter glasses with a refresh rate of 240 Hertz are used; the images are time-multiplexed across the four lenses of the two viewers, so each lens receives an image 60 times per second. The perspective-correct implementation appeared clean, even though the image is inevitably somewhat darker than on the screen because the lenses are constantly closing. An OptiTrack system tracks the glasses; it is similar to Valve's old Lighthouse laser tracking but offers higher precision for business applications.
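The arithmetic behind the time-multiplexing can be sketched in a few lines. The function names are our own; only the numbers (240 Hz projector, two viewers, four lenses, 60 Hz per lens) come from the demo:

```python
def per_lens_rate(projector_hz: float, viewers: int) -> float:
    """Each viewer wears two shutter lenses; the projector's frames
    are divided round-robin across all lenses, so each lens gets
    projector_hz / (viewers * 2) images per second."""
    lenses = viewers * 2
    return projector_hz / lenses

def open_duty_cycle(viewers: int) -> float:
    """Fraction of time any single lens is open.

    This is why the image looks darker through the glasses: with two
    viewers, each lens admits light only a quarter of the time."""
    return 1.0 / (viewers * 2)
```

With a 240 Hz projector and two viewers this yields 60 images per second per lens at a 25 percent open duty cycle, matching the setup described above.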

For better photos and videos, the AI avatars from Fraunhofer IAO next door were also presented on 2D screens.

(Image: Jan Philipp Wöbbeking / heise medien)

On the screen, we could converse with AI avatars such as Joseph von Fraunhofer, the optician and instrument maker who died 200 years ago and became the role model and namesake of today's Fraunhofer Society. Some employees also had a personal avatar (also called a persona) prepared on site with prompts. Currently, the Unreal Engine, Epic's MetaHuman framework, and ChatGPT (via API) are used for this. In Stuttgart, the research institution is also experimenting with local models alongside a locally installed ChatGPT instance. The institute sees potential for avatars in education, administration, tourism, and as digital foremen, among other areas. At the stand, the Fraunhofer avatar provided information about his life, his riding skills, and his technical achievements.

However, the most enjoyable part of the Hannover Messe for us was the mixed reality exhibition by Solid Bytes Interactive, which could be tried out on the sidelines of the HannoVR Meetup. There, we simply put on a Meta Quest 3 and walked across a real themed island about 20 meters long, in which a virtual city model and virtual spheres floated. Sitting down on one of the stools above which the spheres hovered, you were suddenly enveloped by a 360-degree video. In this case, the videos – with pleasantly sharp resolution of around 6K – showed a tour through various districts of Hanover.

Mixed reality CMS by Solid Bytes Interactive (2 images)

Dennis Griethe of Solid Bytes Interactive showed his planned CMS for mixed reality exhibitions.

(Image: Jan Philipp Wöbbeking / heise medien)

Fittingly, Solid Bytes Interactive presented its still unnamed content management system (CMS) for mixed reality applications, which is currently in development. The software aims to make it easier for museums and training projects to get started with mixed reality exhibitions: organizers and lecturers should not have to wrestle with complicated technology, which has often held the XR medium back, explained company founder Dennis Griethe. Instead, in the CMS on the computer, you upload various 2D or 3D media into exhibition slots, which then appear in the real world via the associated Quest app. During our trade fair visit, the developers were experimenting with the planned integration of a freshly recorded Gaussian Splat.
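Since the CMS is unnamed and still in development, its data model is not public; the slot idea could nevertheless be modeled roughly like this. All class, field, and file names below are hypothetical illustrations of the workflow described by Griethe, not the actual software:

```python
from dataclasses import dataclass, field

@dataclass
class ExhibitSlot:
    """Hypothetical slot: a fixed spot in the exhibition space that an
    organizer fills with a medium via the desktop CMS."""
    slot_id: str
    position: tuple            # (x, y, z) relative to the room's anchor
    media_type: str = "none"   # e.g. "image", "3d-model", "360-video"
    media_uri: str = ""

@dataclass
class Exhibition:
    """A set of slots that the headset app renders in the real room."""
    name: str
    slots: dict = field(default_factory=dict)

    def assign(self, slot_id: str, media_type: str, media_uri: str) -> None:
        """Fill a slot; the Quest app would then display it in place."""
        self.slots[slot_id].media_type = media_type
        self.slots[slot_id].media_uri = media_uri
```

The appeal of such a split is that curators only ever touch the slot/media mapping, while all headset-side rendering stays inside the app.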

Since we could see the real world, the “walk” through the exhibition was mostly effortless, and the inside-out tracking worked flawlessly even in the trade fair bustle. At one point, however, a bug caused the objects to sink into the floor; according to Griethe, a patch is already in the works. Thanks to Meta's “Shared Spatial Anchors”, several people can walk through the mixed reality exhibition simultaneously with correct perspective. At the trade fair, the developers used only a single, centrally placed anchor to align the various points of interest (POIs). The studio currently works with the Quest 3, though other mixed reality headsets could be supported in the future.
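The single-anchor approach can be sketched as a coordinate transform: every POI is stored as an offset relative to the shared anchor, and each headset converts those offsets into its own world frame using the anchor pose it resolved. The function below is a simplified, yaw-only illustration of that idea, not Meta's actual API:

```python
import math

def anchor_to_world(anchor_pos, anchor_yaw, poi_offset):
    """Place a point of interest, defined relative to one shared anchor,
    into a headset's world frame.

    Simplification: only rotation about the vertical (y) axis is
    handled; a real implementation would use the anchor's full pose.
    """
    ox, oy, oz = poi_offset
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    # rotate the horizontal offset by the anchor's yaw, then translate
    wx = anchor_pos[0] + c * ox - s * oz
    wz = anchor_pos[2] + s * ox + c * oz
    return (wx, anchor_pos[1] + oy, wz)
```

Because both headsets resolve the same physical anchor, feeding identical offsets through this transform makes the POIs appear at the same real-world spots for everyone.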

At the HannoVR Meetup, XR fans could try out some exciting gadgets. Richard Hoffmann demonstrated the fifth generation of the AR glasses “Spectacles”, available to developers, seen here on the head of heise editor Jan Wöbbeking on the right. On the left is a planned pinball controller by Gunnar Gertzen (locomotionVR) for the Meta Quest: a box with a stick and side buttons, expected to be available as a production model for around 200 euros at the Maker Faire Hannover in August 2026.

(Image: Jan Philipp Wöbbeking / heise medien)


(jpw)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.