
From Reality to Virtual Mastery

  • Writer: Sales
  • Jun 1, 2025
  • 2 min read
3-D Scan → Asset Administration Shell → NVIDIA Omniverse
Building a high-fidelity digital twin that you can simulate and teach is no longer a multi-year R&D project. With the right toolchain you can move from a laser scan of the shop floor to reinforcement learning in NVIDIA Omniverse in weeks, not months.

1. We Capture the Physical World

We start with an accurate point cloud or photogrammetric mesh. Modern handheld LiDAR units or tripod scanners (e.g., Artec 3D, Hexagon) produce sub-millimetre accuracy and colourised clouds that already hint at semantic layers. Yet a raw scan is only geometry – a true twin also needs live and contextual data.

Our Tip ▲ Scan in a single plant coordinate frame and keep the origin consistent; it will later become the USD stage origin in Omniverse.
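One way to follow that tip is to store a single rigid transform per scanner setup and apply it on import, so every capture lands in the shared plant frame. A minimal sketch in plain Python – the matrix values are illustrative, not from a real scanner:

```python
# Minimal sketch: re-express scan points in one shared plant coordinate
# frame so every capture lands on the same USD stage origin.
# The transform values below are illustrative, not from a real scanner.

def apply_transform(points, matrix):
    """Apply a 4x4 homogeneous transform to a list of (x, y, z) points."""
    out = []
    for x, y, z in points:
        tx = matrix[0][0]*x + matrix[0][1]*y + matrix[0][2]*z + matrix[0][3]
        ty = matrix[1][0]*x + matrix[1][1]*y + matrix[1][2]*z + matrix[1][3]
        tz = matrix[2][0]*x + matrix[2][1]*y + matrix[2][2]*z + matrix[2][3]
        out.append((tx, ty, tz))
    return out

# Hypothetical setup: this tripod position sits 2 m along x from the
# agreed plant origin, with no rotation.
SCANNER_TO_PLANT = [
    [1, 0, 0, 2.0],
    [0, 1, 0, 0.0],
    [0, 0, 1, 0.0],
    [0, 0, 0, 1.0],
]

scan = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.5)]
print(apply_transform(scan, SCANNER_TO_PLANT))
```

Keeping one such matrix per setup means a later re-scan drops into the same stage origin without re-aligning everything by hand.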

2. We Clean, Re-mesh & Export to OpenUSD

We bring the point-cloud into a mesh editor (e.g., MeshLab, Blender) to:

  • remove outliers & noise

  • segment important assets (machines, safety fences)

  • reduce polygon count while retaining critical tolerances
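
The outlier pass can be as simple as dropping points that sit far from the rest of the cloud. A toy statistical filter in plain Python – real clean-up would use the mesh editor's built-in neighbourhood filters:

```python
import math

# Toy statistical outlier removal: drop points whose mean distance to the
# rest of the cloud exceeds the global mean by k standard deviations.
# Real pipelines use MeshLab/Blender filters over k-nearest neighbours.

def mean_distance(p, cloud):
    return sum(math.dist(p, q) for q in cloud if q != p) / (len(cloud) - 1)

def remove_outliers(cloud, k=1.0):
    dists = [mean_distance(p, cloud) for p in cloud]
    mean = sum(dists) / len(dists)
    std = math.sqrt(sum((d - mean) ** 2 for d in dists) / len(dists))
    return [p for p, d in zip(cloud, dists) if d <= mean + k * std]

cloud = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.1, 0.1, 0), (50, 50, 50)]
print(remove_outliers(cloud))  # the stray point at (50, 50, 50) is dropped
```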


3. We Enrich with the Asset Administration Shell

The Asset Administration Shell (AAS) is the semantic wrapper that turns mere 3-D models into cyber-physical assets.

  1. Create a Shell (JSON or AASX) with identifiers, lifecycle state, OPC-UA endpoints, etc.

  2. Attach the 3-D sub-model using the IDTA “Provision of 3D Models” template (spec 02026).

  3. Optionally add sub-models for name-plates, performance curves, skills, AI checkpoints, etc.

Example: open-source stacks such as Eclipse BaSyx host & serve these shells over REST/MQTT, giving Omniverse live access to telemetry.
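As a rough illustration, a minimal shell with an identifier, lifecycle state, OPC-UA endpoint and a 3-D sub-model reference could be assembled like this in Python – the field names are simplified stand-ins, not the full AAS metamodel:

```python
import json

# Rough sketch of a minimal Asset Administration Shell document.
# Field names are simplified for illustration; the real AAS metamodel
# (IDTA specifications, AASX packaging) is considerably richer.

def make_shell(asset_id, endpoint, model_path):
    return {
        "id": asset_id,
        "lifecycleState": "InOperation",
        "submodels": [
            {"idShort": "Connectivity",
             "properties": {"opcUaEndpoint": endpoint}},
            {"idShort": "Provision3DModel",   # cf. IDTA template 02026
             "properties": {"usdFile": model_path}},
        ],
    }

shell = make_shell("urn:example:conveyor-07",            # hypothetical asset
                   "opc.tcp://plant.local:4840",         # hypothetical endpoint
                   "omniverse://nucleus/plant/conveyor-07.usd")
print(json.dumps(shell, indent=2))
```

A BaSyx-style server would then expose this document over REST so the Omniverse scene can resolve both the geometry reference and the live endpoint from one place.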
4. We Import into Omniverse & Add Physics

Drag-and-drop the USD files (or connect your DCC tool via a Live Link) into an Omniverse Nucleus server. The platform recognises the SimReady metadata and lets you:

  • assign materials & RTX lighting

  • add rigid-body, articulation or fluid solvers in the Physics Core

  • script behaviours or PLC logic via Python → Extension API

The Add Physics workflow exposes sliders for mass, friction, motor limits and more, so your twin obeys real-world laws.
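To see why those sliders matter, the same parameters drive familiar equations of motion. A toy sliding-friction integration in plain Python – the numbers are illustrative and not tied to any Omniverse API; PhysX does the real solving:

```python
# Toy illustration of the parameters exposed in the Add Physics workflow:
# a box sliding on the floor decelerates under kinetic friction.
# Values are illustrative; Omniverse's PhysX solver does the real work.

G = 9.81  # gravitational acceleration, m/s^2

def slide(v0, mu, dt=0.001):
    """Return the distance a sliding box travels before friction stops it."""
    v, x = v0, 0.0
    while v > 0:
        a = -mu * G              # friction deceleration (mass cancels out)
        v = max(0.0, v + a * dt)
        x += v * dt
    return x

# A higher friction coefficient stops the box sooner.
print(round(slide(2.0, mu=0.2), 2), "m at mu = 0.2")
print(round(slide(2.0, mu=0.6), 2), "m at mu = 0.6")
```

Getting these coefficients right is what makes behaviour learned in the twin transfer back to the real asset.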

5. Simulate & Train Intelligent Agents

With physics in place you can now spin up Isaac Gym/Isaac Sim extensions and launch thousands of parallel roll-outs for reinforcement learning. The GPU-accelerated engine streams millions of contacts per second, shortening training cycles dramatically.
Our Tip ▲ Mount the same AAS endpoint inside the training loop so the policy can read live sensor values or write set-points directly back to the shell.
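That tip can be sketched as a training loop that reads and writes shell properties on every step. Here a stubbed in-memory client stands in for the real BaSyx REST endpoint; the property names, policy and reward are hypothetical placeholders:

```python
# Sketch of mounting an AAS endpoint inside an RL training loop.
# StubShell stands in for a real BaSyx REST client; property names
# and the toy reward are hypothetical placeholders.

class StubShell:
    def __init__(self):
        self.props = {"beltSpeed": 0.5}
    def read(self, key):
        return self.props[key]
    def write(self, key, value):
        self.props[key] = value

def rollout(shell, policy, steps=10):
    total = 0.0
    for _ in range(steps):
        speed = shell.read("beltSpeed")    # live sensor value from the shell
        action = policy(speed)
        shell.write("beltSpeed", action)   # set-point written straight back
        total += 1.0 - abs(action - 0.8)   # toy reward: hold 0.8 m/s
    return total

shell = StubShell()
policy = lambda obs: min(1.0, obs + 0.1)   # naive ramp-up policy
print(rollout(shell, policy))
```

Swapping StubShell for an HTTP client against the BaSyx server gives the policy the same interface in simulation and on the shop floor.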

6. Iterate & Deploy

Because the USD stage, AAS metadata and Omniverse scene stay loosely coupled, you can hot-swap:

  • a new scan (e.g., re-scanned conveyor)

  • updated PLC tags in the AAS

  • a fine-tuned policy checkpoint

…and see the effects immediately in the twin or on the real asset through OPC-UA or MQTT bridges.
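The loose coupling above comes down to resolving each part through one level of indirection at run time instead of hard references. A minimal sketch, with illustrative component names:

```python
# Minimal sketch of the loose coupling that makes hot-swapping possible:
# the twin resolves its parts through a registry rather than hard
# references, so a re-scan, new PLC tags or a fresh policy checkpoint
# drop in without touching the rest of the scene.

class Twin:
    def __init__(self):
        self.parts = {}
    def mount(self, name, component):
        self.parts[name] = component       # mounting again = hot-swap

twin = Twin()
twin.mount("geometry", "conveyor_v1.usd")  # initial scan
twin.mount("policy", "ckpt_0100.pt")       # fine-tuned checkpoint
twin.mount("geometry", "conveyor_v2.usd")  # re-scanned conveyor, swapped live
print(twin.parts["geometry"])
```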

 
 
 
