Editorial

“Fast and easy previsualisation for creative industries” is an H2020 project funded by the European Commission, planned with a duration of 36 months and carried out by 8 partners from 5 countries.

During the last three decades, information technology has facilitated a paradigm shift in almost all areas of human life, from manufacturing, technology and science to sociology and art. Cutting-edge computer animation has become a staple of almost all high-budget, triple-A productions. But information technology has yet to extend its reach beyond the movie blockbusters to smaller productions and to media that are only slowly adapting to new technologies. One of the areas in which information technology would be very helpful, but in which no efficient solutions are available, is previsualisation. Previs, for short, allows directors and other creative people to develop ideas, break boundaries and convert their artistic vision into reality. This is not to say that there are no previs tools available at the moment; their main drawback is the high degree of technical expertise required to operate them. In effect, this means that a production needs to hire an expert specifically for this purpose. This is often not feasible due to monetary constraints, and it negates one of the major benefits of previs: saving valuable time. The first.stage project has taken up the challenge of developing new solutions that allow creative people to utilise previs for their projects.

With this first newsletter we would like to give you a deeper insight into our project work and the different tasks of our partners.

Requirements for previsualisation software: first.stage requirements and lessons learnt

Visual effects (VFX) are a collection of content production techniques used mainly by the film, TV and animation industries to enrich the look of audio-visual content, normally as a substitute for practical effects. VFX are also used in stage productions and video games, which often contain cinematics (that is, short film pieces that introduce the player to the game's backstory and drive its narrative forward).

Regarding VFX, the stakeholders contacted within the framework of the first.stage project insisted on the necessity of strong pipeline integration, in which the outcome of the project is fully compatible with other content production platforms and in which the libraries are platform-agnostic and can easily be integrated into third-party software.

They have also pointed out that photorealism is a must for any audiovisual content that aims to be “immersive”. This is particularly important in Augmented and Virtual Reality experiences, as well as for stage production, in which real-world elements must coexist with digitally-produced VFX.

New formats and media will be produced in game engines. VR and AR experiences require interactivity, which can easily be programmed in game engines; moreover, even traditional animation films are now being produced on these platforms.

Computer simulation is mentioned as a key to speeding up physically-based animation and to achieving an excellent level of realism even at early stages of the development of digital content. VFX have gradually replaced practical effects as they have become more and more realistic. This realism, especially when creating VFX that relate to real-life phenomena (explosions, water, cloth, smoke…), requires computer simulation; fluid simulation, for example, allows producers to generate digitally any liquid that will be shown on screen.
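To give a rough idea of what such a simulation involves, the minimal Python sketch below (illustrative only, not first.stage code) advances a single particle under gravity using semi-implicit Euler integration, the basic building block behind many physically-based effects; the time step and starting values are assumptions chosen for the example.

```python
# Minimal sketch of a physically-based simulation step (illustrative only).
# A single particle falls under gravity; semi-implicit Euler advances its state.

DT = 1.0 / 24.0              # assumed time step: one frame at 24 fps
GRAVITY = (0.0, -9.81, 0.0)  # gravitational acceleration in m/s^2

def step(position, velocity, dt=DT, gravity=GRAVITY):
    """Advance the particle by one time step using semi-implicit Euler."""
    # Update the velocity first, then the position with the *new* velocity.
    velocity = tuple(v + g * dt for v, g in zip(velocity, gravity))
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity

if __name__ == "__main__":
    pos, vel = (0.0, 5.0, 0.0), (2.0, 0.0, 0.0)  # start 5 m up, moving sideways
    for frame in range(5):
        pos, vel = step(pos, vel)
        print(f"frame {frame}: y = {pos[1]:.3f} m")
```

Production-grade effects such as fluid or cloth simulation apply the same stepping idea to many thousands of interacting particles or mesh vertices, which is why simulation speed matters so much for previsualisation.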

Utilising Virtual Reality for Previsualisation in film and theatre productions

Visualising scenes before the actual production of a film or a play starts is often a time-consuming, expensive process. Virtual Reality can help to make this task a lot more efficient, allowing directors to try many different variations in a short period of time. It can also make previsualisation technologies accessible to users with very little technical expertise and allow members of a production team to collaborate on a scene while being thousands of kilometres apart. The University of Bremen’s Center for Computing and Communication Technologies (TZI) is developing such a system within the project “first.stage”. The first prototype has recently been finished.

Wearing a VR headset, users can observe a scene within a 3D environment. Two controllers allow them to rearrange objects or characters simply by clicking on them and dragging them to a new position. Users can also change their perspective within the 3D scene by moving their (non-virtual) bodies or by using the controllers. The process of adapting a scene to find the best possible set-up therefore becomes intuitive. This is in sharp contrast to the previsualisation software that is currently available. In many cases, a team of five or six people works for weeks at a time on the scenes, then travels to meet the director and receive feedback, and finally heads back to its offices to start the next cycle of this iterative, expensive process. “The currently used solutions maybe allow users to create one scene a day,” explains Thomas Münder, one of two TZI scientists working on “first.stage”. “Our VR system will speed up the process significantly. And directors can do it themselves.”
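As a rough, engine-agnostic illustration of the interaction logic (not the project's actual code), the Python sketch below mocks the grab-and-drag behaviour: while a controller's trigger is held, the selected object follows the controller at a constant offset. The class names, fields and single-controller setup are assumptions made for the example; a real system would query the VR runtime for controller poses.

```python
# Illustrative sketch of a grab-and-drag interaction with a tracked VR controller.
# Controller state and scene objects are mocked with plain data.

from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class SceneObject:
    name: str
    position: Vec3 = (0.0, 0.0, 0.0)

@dataclass
class ControllerState:
    position: Vec3 = (0.0, 0.0, 0.0)
    trigger_pressed: bool = False

class GrabInteraction:
    """While the trigger is held, the grabbed object keeps its initial offset to the controller."""

    def __init__(self) -> None:
        self.grabbed: Optional[SceneObject] = None
        self.offset: Vec3 = (0.0, 0.0, 0.0)

    def update(self, controller: ControllerState, hovered: Optional[SceneObject]) -> None:
        if controller.trigger_pressed and self.grabbed is None and hovered is not None:
            # Grab: remember the object and its offset from the controller.
            self.grabbed = hovered
            self.offset = tuple(o - c for o, c in zip(hovered.position, controller.position))
        elif controller.trigger_pressed and self.grabbed is not None:
            # Drag: the object follows the controller, preserving the grab offset.
            self.grabbed.position = tuple(c + o for c, o in zip(controller.position, self.offset))
        else:
            # Release: drop the object at its current position.
            self.grabbed = None
```

Called once per frame, update() covers the three phases of the interaction: grab on trigger press, follow while the trigger is held, and release when it is let go.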

Before a scene can be adjusted according to the director’s ideas, the necessary objects and characters need to be put in place. This is easy, too: about 1000 assets are readily available from the start, e.g. architectural structures, furniture, and various types of people and animals. These can be adapted using “legacy technologies” such as mouse and keyboard. Assets can also be imported from other sources or created from scratch.

Since the project’s goal is to make previsualisation technology available to small businesses in the creative industries, the TZI team is making sure that its software runs on standard consumer hardware. The total cost of an adequate VR headset such as the HTC Vive (around 800 euros), controllers and a desktop PC remains below 2000 euros. However, integrating the hardware components creates its own challenges, as they are relatively new. Accurate motion tracking is a particularly difficult problem to solve. Another one is usability: interacting with software in 3D is a new concept, and there are no standards or best practices to follow yet. What users perceive as intuitive interaction in a virtual-reality setting is still open for discovery. “We are testing the prototype together with our project partners,” says TZI scientist Thomas Fröhlich, who works with Thomas Münder on the VR system. “We use that feedback to improve the technology and design it in a way that makes it as easy as possible to use.”

In addition, the system will be equipped with many new features over the course of the project. One of the next steps is to add a function that enables users to set people or objects in motion within the VR world. In the longer term, users will even be able to record the movements of their own bodies and display them in the scene. They will also be able to interact with other characters in the scene, e.g. by shaking hands. Other options include testing a variety of camera angles or watching a scene from different perspectives within a theatre audience. New, unforeseen uses may also emerge once the system is applied in real life. A road of discovery lies ahead.

Download the first.stage newsletter here.