FROM SKETCH TO 3DEXPERIENCE: 2355 NISSAN CONCEPT.

In May 2020, the Nissan Design Studio Latin America Instagram page announced a competition inviting designers and artists to imagine design in a cooler future, in the year 2355. Any type of design creation was allowed, as long as it expressed Nissan's design language, branding, and cultural identity. Entries were very varied, with proposals from around the world spanning different styles, characteristics, functions, and areas of design.

I joined this competition and, to my surprise, made it to the Top 3 creations, with the best concept for automotive design.

My idea is a service car by Nissan: more precisely, an express delivery vehicle aimed at urban environments and roads, with aerodynamic shapes and packaging dedicated to luggage and the practical transport of cargo. The vehicle is 100% electric and 100% autonomous, with a design focused on accommodating these new technologies.

The creative process behind the car drew inspiration from:

The 2D digital sketches and renderings represent the first stage of this project, which placed among the top 3 creations:

After this achievement, the LATAM tech sales team challenged me to take the project a step further and elevate it into the realm of experience: transforming the 2D project into a 3D project using the tools of the 3DEXPERIENCE platform.

With that, the new challenge began: bringing this project to life using the entire 3DEXPERIENCE Creative Design workflow, from 3D sketch to a real-time experience of the product.

To create the real-time experience and renderings, I used the 3DEXCITE Creative Experience tool, which made it possible to configure environments, lights, and rendering settings.

Using the Stellar renderer, it was possible to adjust light bounces, iterations, image size, caustics, and many other options.
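For readers unfamiliar with why the iteration count matters: progressive path-tracing renderers such as Stellar estimate each pixel's color by averaging many random light samples, so the image starts noisy and converges as iterations accumulate (noise falls roughly with the square root of the sample count). The sketch below is a generic Monte Carlo illustration of that principle in Python; it is not Stellar's API, and the integral chosen is purely for demonstration.

```python
import random

def mc_estimate(n_samples, seed=0):
    # Monte Carlo estimate of the integral of x^2 over [0, 1].
    # The true value is 1/3; more samples ("iterations") means less noise.
    rng = random.Random(seed)
    return sum(rng.random() ** 2 for _ in range(n_samples)) / n_samples

coarse = mc_estimate(100)      # few iterations: noisy estimate
fine = mc_estimate(100_000)    # many iterations: converges toward 1/3
```

The same trade-off applies to renderer settings: more bounces and iterations give a cleaner, more physically complete image at the cost of render time.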

In Creative Experience it is also possible to create cameras, camera animations, model animations, and behaviors for real-time scenes, and the entire scene can be configured, simply and quickly, to work as a high-quality VR experience.

The examples below show the car controller behavior applied, along with videos of the camera animations and the car moving around.

What do you think? Any comments?