PART 4: Bosch Rexroth Customer Story: Digital Twin – Factory of the Future


In the previous article of the series, we introduced the dataset and explained how we prepared and processed the individual components of this Realtime Experience.
Picking up where we left off, we will now continue with the technical explanation of how to realize the final Experience.


We will cover the following highlights:

  • Designing the scene & interactions
  • Music & Sound
  • Play Optimization
  • Review & Approval
  • Play Experience


Designing the scene & interactions

As with every Experience buildup, we start by structuring and organizing the scene.

The final scene is composed of:

  • The assembly line
  • Virtual Reality preset (camera, controllers, immersive behaviors)
  • A floor to constrain the user's movements
  • A surrounding

The assembly line is composed of:

  • Chips
  • Chip carriers
  • Two robot arms
  • A laser etching compartment with moving doors

Doors

First, let’s identify the parts we can handle with built-in functionality like Library Behaviors. A predefined Behavior called OpenClose is available in the Library.

This Behavior does exactly what you would expect. It animates the door between the open and closed position. It is as simple as attaching this Behavior to the Product representing the door.
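Conceptually, such a Behavior is a small state machine that tweens the door between its two poses. The sketch below is not the real Library Behavior, only a self-contained JavaScript illustration of the idea; the Door class and the linear tween are our own assumptions:

```javascript
// Minimal illustration of an open/close behavior: a door linearly
// interpolated between a closed pose (0) and an open pose (1).
class Door {
  constructor(durationMs) {
    this.durationMs = durationMs; // time for a full open or close sweep
    this.progress = 0;            // 0 = closed, 1 = open
    this.target = 0;              // pose we are animating towards
  }
  open()  { this.target = 1; }
  close() { this.target = 0; }
  // Advance the animation by dt milliseconds, clamping at the target.
  update(dt) {
    const step = dt / this.durationMs;
    if (this.progress < this.target) {
      this.progress = Math.min(this.target, this.progress + step);
    } else if (this.progress > this.target) {
      this.progress = Math.max(this.target, this.progress - step);
    }
  }
}

const door = new Door(500); // 500 ms full sweep
door.open();
door.update(250);           // halfway open
console.log(door.progress); // 0.5
door.update(500);           // clamped at fully open
console.log(door.progress); // 1
```

In the actual scene, all of this is hidden behind the Behavior: you only attach OpenClose to the door Product.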


Chips

The assembly line runs as an endless loop, so in principle there is an infinite number of chips. To represent this constant production, we used Dynamic Actor Instantiation. The scene contains an Actor Template representing a single chip, and a new chip is instantiated from this template every time the robot arm reaches the panel.
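The pattern can be sketched in plain JavaScript: a template is cloned into a live actor each time an arrival event fires. The tiny event bus, the event name, and the template object below are illustrative stand-ins for the platform's Actor Template and Event mechanism, not the real API:

```javascript
// Template describing a single chip; live instances are cloned from it.
const chipTemplate = { type: "chip", position: { x: 0, y: 0 } };

const liveChips = [];
const listeners = {};

// Minimal event bus standing in for the platform's Event mechanism.
function on(event, fn) { (listeners[event] ??= []).push(fn); }
function dispatch(event) { (listeners[event] || []).forEach(fn => fn()); }

// Instantiate a fresh chip from the template on every arrival event.
on("robotReachedPanel", () => {
  liveChips.push({ ...chipTemplate, id: liveChips.length + 1 });
});

dispatch("robotReachedPanel");
dispatch("robotReachedPanel");
console.log(liveChips.length); // 2
```

Because instantiation is driven by events rather than a fixed timeline, the production rate follows the robot arm automatically.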


Carriers

On any assembly line, the number of carriers can vary depending on the tasks they need to complete. In real life, both the carriers and the robot arms move extremely fast, but the point of the Digital Twin is to market the product and make it more accessible to the client, so we decided to drastically slow down the movements. As in any marketing context, you need the ability to tweak the final result. In the traditional CGI world you might simply animate the carriers with a timeline toolkit, and with that lose the flexibility of tweaking. Instead, we used a custom JavaScript Behavior to navigate the carriers between the 19 stations of the assembly line. This custom Behavior utilizes a collection of Point Actors laid out through the scene, marking the location of each station. Whenever a carrier arrives at one of the stations, it announces itself by dispatching an Event. Thanks to the event mechanism in Creative Experience, our Scenario Manager Behavior can listen for these events and trigger the correct part of the scenario at the correct time. This allowed us to vary both the number and the speed of the carriers.
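The carrier/station event flow can be sketched as follows: a carrier steps through the station points in an endless loop and announces each arrival, and a scenario manager reacts to the announcements. The station names, the event bus, and the Carrier class are illustrative assumptions, not the Creative Experience API:

```javascript
// The 19 stations of the line, standing in for the Point Actors.
const stations = Array.from({ length: 19 }, (_, i) => `station-${i + 1}`);

// Minimal event bus standing in for the platform's Event mechanism.
const handlers = {};
function on(event, fn) { (handlers[event] ??= []).push(fn); }
function dispatch(event, payload) {
  (handlers[event] || []).forEach(fn => fn(payload));
}

class Carrier {
  constructor(id) { this.id = id; this.index = 0; }
  // Move to the next station (wrapping around the endless loop)
  // and announce the arrival.
  advance() {
    this.index = (this.index + 1) % stations.length;
    dispatch("carrierArrived", { carrier: this.id, station: stations[this.index] });
  }
}

// Scenario-manager side: react to each arrival announcement.
const log = [];
on("carrierArrived", e => log.push(`${e.carrier} @ ${e.station}`));

const carrier = new Carrier("carrier-1");
carrier.advance(); // arrives at station-2
carrier.advance(); // arrives at station-3
console.log(log);
```

Because carriers only announce themselves and never drive the scenario directly, adding or removing carriers, or changing their speed, requires no change to the scenario logic.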


Robot Arms

Within Creative Experience it is possible to control and consume CATIA Mechanical Commands. Until now, complicated engineering products like this powerful KUKA robot were very hard to animate and represent in a marketing context. With our portfolio you can consume the movements directly through these commands, created by a product expert, which allows for a drastically faster Experience build-up. Our Marketing Experience Artist received a script from a knowledgeable mechanical engineer describing in detail the complicated movement of the arms and carriers. Thanks to the possibility of writing custom Behaviors in JavaScript, this script was converted into a machine-readable format which a custom Behavior understands and uses for scenario control.

Sample script from engineer:

  1. every time a carrier arrives at the KUKA robot, set the horizontal alignment to -32 millimeters
  2. pump up and down from 218 mm to 200 mm
  3. proceed to the next chip in line

Sample script re-written for usage in the scene:

  1. trigger pumpPress - trigger upon the carrier's arrival at the pump station
  2. command PressHorizontalAlignment - take control of the robot arm's horizontal command
  3. absolute 0 -32 550 - with 0 ms delay, animate the value to -32 mm over a period of 550 ms
  4. command PressVerticalAlignment - take control of the robot arm's vertical command
  5. relative 750 218 140 - after 750 ms, animate to 218 mm over a period of 140 ms
  6. relative 200 200 70 - pump the vertical action back after another 200 ms

Note that each step above has its own timing, allowing for fine tweaking of the final Experience.
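The conversion into a machine-readable format can be sketched as a small parser that turns each line of the re-written script into a structured step a custom Behavior could execute. The line format follows the sample above; the parser itself and the field names are our own illustrative assumptions:

```javascript
// Parse re-written script lines ("trigger …", "command …",
// "absolute delayMs valueMm durationMs", "relative …")
// into structured steps for a custom Behavior.
function parseScript(text) {
  return text.trim().split("\n").map(line => {
    const [keyword, ...args] = line.trim().split(/\s+/);
    switch (keyword) {
      case "trigger":  // which event starts this sequence
        return { op: "trigger", event: args[0] };
      case "command":  // which mechanical command to drive next
        return { op: "command", name: args[0] };
      case "absolute": // delay (ms), target value (mm), duration (ms)
      case "relative":
        return {
          op: keyword,
          delayMs: Number(args[0]),
          valueMm: Number(args[1]),
          durationMs: Number(args[2]),
        };
      default:
        throw new Error(`Unknown keyword: ${keyword}`);
    }
  });
}

const steps = parseScript(`
trigger pumpPress
command PressHorizontalAlignment
absolute 0 -32 550
`);
console.log(steps[2]); // { op: "absolute", delayMs: 0, valueMm: -32, durationMs: 550 }
```

A Behavior consuming these steps would wait for the trigger event, select the named Mechanical Command, and then feed the timed values to it one by one.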


VR rendering and controller interaction

Another easy step in the buildup is the integration of VR support. Under the Immersive Reality panel you can find the VR preset provided by the MarketingExperienceScripter (VRS) extension. This preset adds not only the virtual camera and controllers to the scene, but also a couple of useful VR Behaviors. In this scene we use the Teleport Behavior on both controllers.

Our final step is to create a Plane Primitive in order to define the floor area in which the user is allowed to navigate.


Music & Sound

To further enrich the experience, it is also possible to enhance the scenario with multiple layers of custom sound effects and music. These elements tremendously increase the sense of immersion, especially in Virtual Reality. For the Bosch Rexroth Experience, we chose some factory noises and layered them with a very subtle electronic ambient soundtrack.

Creative Experience offers the possibility to tweak playback speed and volume, use spatial 3D sound, or loop any given file.

Adding sound to an experience is very simple:

  • Import custom sound files (.wav) to the Experience resource library
  • Add default SoundPlayer Behavior from the Library to the scene and select a sound file
  • Play sound through simple JS code:

        beScript.onStart = function () {
            this.actor.SoundPlayer.play();
        };


Play Optimization

Now it is time to prepare the Experience with Play Optimization for its specific target platform, in this case Virtual Reality. Driven by the device and data-size requirements, an optimized Experience is created based on a custom-defined Publication Operation: a set of operations describing what, how, and to what extent the different Actors of an Experience should be optimized. After executing the operation, the resulting Experience can be reviewed and, if needed, the user can adjust and tweak the parameters of the optimization until the result is satisfactory.

Play Optimization integrates the model structure, geometry, materials, textures, lighting, and ambiences for publishing in 3DSpace, which also makes it possible to play back the Experience in 3DPlay.

To measure and compare the results, Play Optimization offers some handy scene statistics.

Original Experience:

Optimized Experience:


Review & Approval

Through the MarketingExperienceReviewer (EXV), Immersive Visual Experience (IHD), and Immersive Collaboration Experience (ICE) roles, creative directors, marketers, and stakeholders can review interactive marketing Experiences directly in 3DPlay and provide instant feedback. Early and recurring review loops ensure the on-time delivery of approved Experiences, thanks to the "Digital Continuity" of the 3DEXPERIENCE® platform.

Reviewing an interactive marketing Experience does not require any author role.


Play Experience

The Marketing Experience Reviewer role can also be used to play Experiences after deployment in a controlled kiosk or event environment such as the Hannover Messe.
With the supported VR hardware, it is of course also possible to review and play VR Experiences.


Thank You

Thank you for following this series of posts on "Product Experience Creation".
We hope you enjoyed it; if you have any questions, feel free to use the comment section.