The previous article of this series provided an overview of our "Product Experience Creation" (IPE) offering showcased on the Bosch Rexroth (BR) "Digital Twin".
In this article we will dive deeper into the creation process and look at the pieces necessary to produce an immersive VR experience.
But now, let's get started. We will cover the following highlights:
- Introducing the Dataset
- Optimizing the Data
- Creating the Material Library
- Creating the Ambience
- Modelling the Environment
- Tweaking the Look
Finally, we will see how everything clicks together in the first stage of the experience build-up phase.
Play the video:
Introducing the Dataset
We started out with a shared engineering dataset of the Bosch Rexroth production line in our ENOVIA database. This enabled us to leverage the product life cycle, work in parallel and collaborate throughout the experience build-up.
This was especially helpful since a lot of tasks could be performed in parallel and delivered between colleagues – such as data preparation, material library creation, working on the look & feel and setting up mechanical animations. Everyone had access to up-to-date data at all times.
Note: The 3DEXPERIENCE® platform supports a broad variety of CAD and polygon converters, simplifying the process of importing data and benefiting from the collaborative and managed environment.
The model came with CATIA Mechanics driving the robots and carriers. We benefited from the inverse kinematics on the robot arm for the animation task we had to do next (have a look at the video above).
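CATIA handles the kinematics for us, but the underlying principle is easy to illustrate. The following sketch (not the CATIA implementation, just a minimal standalone example) solves the classic two-link planar case analytically: given a target point, it returns the shoulder and elbow angles that place the end effector there.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic inverse kinematics for a planar two-link arm.

    Given a target (x, y) and link lengths l1, l2, return the
    shoulder and elbow angles (radians) that place the end effector
    on target. Raises ValueError if the target is out of reach.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Drive the end effector to a target, then verify via forward kinematics.
s, e = two_link_ik(1.0, 1.0, 1.0, 1.0)
fx = math.cos(s) + math.cos(s + e)
fy = math.sin(s) + math.sin(s + e)
print(round(fx, 6), round(fy, 6))  # → 1.0 1.0
```

A real robot arm has six or more joints and multiple valid solutions, which is exactly why having the solver built into the managed CATIA Mechanics is so convenient.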
Optimizing the Data
The dataset of the Bosch Rexroth production line is not a lightweight model, due to the sheer number of screws, holes and complex aluminium profiles. Artists who already have experience with VR will know the specific technical requirements that must be met to achieve decent performance.
But don't worry, 3DEXCITE's Product Processor batch process takes over the job of optimizing the data:
State-of-the-art operators will automatically clean up the model through the following methods:
- Defeaturing
- Jacketing (Hidden Surface Removal)
- DELTAGEN's renowned Tessellator
- Polygon Decimation
- Small part removal
Below you can see two example parts where the operation automatically removed the internal faces and holes with ease.
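To give a feel for what one of these operators does conceptually, here is a minimal sketch of small part removal: drop every part whose bounding-box diagonal falls below a threshold. This is an illustrative simplification, not the actual Product Processor algorithm, and the part data is made up.

```python
import math

def small_part_removal(parts, min_diagonal):
    """Drop parts whose bounding-box diagonal is below a threshold.

    `parts` maps a part name to its axis-aligned bounding box
    ((min_x, min_y, min_z), (max_x, max_y, max_z)) in millimetres.
    """
    kept = {}
    for name, (lo, hi) in parts.items():
        diagonal = math.dist(lo, hi)  # length of the box diagonal
        if diagonal >= min_diagonal:
            kept[name] = (lo, hi)
    return kept

parts = {
    "frame_profile": ((0, 0, 0), (40, 40, 2000)),  # keeps: ~2 m long
    "screw_m4":      ((0, 0, 0), (4, 4, 12)),      # drops: ~13 mm
}
print(sorted(small_part_removal(parts, min_diagonal=50)))
# → ['frame_profile']
```

In a production line model with thousands of identical screws, even such a crude filter removes an enormous amount of geometry the viewer would never notice in VR.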
With the data prep task completed, it was time to concentrate on the visualization of the experience.
Creating the Material Library
Best practice is to establish a cross-product material library early in the process, so it can be re-used later whenever needed.
There are several ways to collect materials. One example is to use Substance Source from Allegorithmic, from which materials can be imported in just a few clicks.
Even better: Current 3DEXCITE DELTAGEN material libraries can be imported with the DELTAGEN Converter role, using the 3XF importer.
As soon as a material library is established, populating a product with materials is an easy and fast process. (» stay tuned for our solution to automate material assignment)
Here is an example of how to make the little KUKA robot's material pop; we used a Substance material in this case:
Creating the Ambience
The HDRI lighting was generated using 3DEXCITE's Ambience Studio application by just selecting one of the many standard ambiences shipped with the role.
Define light and reflection by modifying the HDRI texture or by creating additional light sources interactively.
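One of the simplest HDRI edits of this kind is an exposure change. Because HDR images store linear radiance, a change of one f-stop is just a multiplication by a power of two. A minimal sketch (not the Ambience Studio implementation, and with pixel data made up for illustration):

```python
def adjust_exposure(hdr_pixels, stops):
    """Scale linear HDR radiance values by a number of f-stops.

    One stop up doubles the light; one stop down halves it. An edit
    like this changes how brightly the HDRI lights the scene when it
    is used for image-based lighting and reflections.
    """
    factor = 2.0 ** stops
    return [[channel * factor for channel in pixel] for pixel in hdr_pixels]

pixels = [[0.5, 0.5, 0.5], [4.0, 3.0, 2.0]]  # linear RGB radiance samples
print(adjust_exposure(pixels, stops=1))
# → [[1.0, 1.0, 1.0], [8.0, 6.0, 4.0]]
```

Note that HDR values are deliberately left unclamped; a bright sun pixel of radiance 8.0 is exactly what produces crisp highlights and reflections later on.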
Modelling the Environment
The dome environment was modelled using one of the many CAD modelling tools provided within the 3DEXPERIENCE platform. Choose between the SOLIDWORKS and CATIA modelling tools, or just import the geometry using the DELTAGEN Converter role (3XF importer).
Iterating on different shapes for the perfect dome geometry is an easy and fast process. Thanks to the collaborative foundation of the 3DEXPERIENCE platform, all changes are immediately available to your colleagues who are also working on this experience.
Assembling the Scene
All we had to do next was drop everything into the CreativeExperience app:
- The optimized model with CATIA Mechanics and materials
- The Ambience
Only a few final tweaks were needed to complete the staging.
Tweaking the Look
At the end of the staging process, we generated a custom LUT (Look Up Table) in Photoshop to color grade the image to our liking.
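Applying a LUT at render time is straightforward: each channel value indexes into the table, with linear interpolation between entries. The sketch below uses a 1D per-channel LUT for simplicity (grading LUTs are often 3D, and the sample curve is made up, not our actual Photoshop export):

```python
def apply_lut_1d(value, lut):
    """Look up a 0..1 channel value in a 1D LUT with linear interpolation."""
    pos = value * (len(lut) - 1)
    i = min(int(pos), len(lut) - 2)  # clamp so i + 1 stays in range
    t = pos - i
    return lut[i] * (1 - t) + lut[i + 1] * t

# Hypothetical 5-entry "lift the shadows" curve for illustration.
lut = [0.05, 0.3, 0.55, 0.8, 1.0]

graded = [round(apply_lut_1d(v, lut), 3) for v in (0.0, 0.25, 0.5, 1.0)]
print(graded)  # → [0.05, 0.3, 0.55, 1.0]
```

The nice property of this workflow is that the creative decision lives entirely in the table: grade a still in Photoshop, export the LUT, and the real-time renderer reproduces the look on every frame.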
Coming up next
This was part 1 of the technical breakdown from the Bosch Rexroth VR experience. Stay tuned for the upcoming piece focusing on VR rendering, scripting and the final experience.
If you would like to learn more about one of the topics, feel free to leave a comment!
