Production Quality in Real Time

A Thesis Submitted to the Faculty of the Visual Effects Department in Partial Fulfillment of the Requirements for the Degree of Master of Fine Arts in Visual Effects at Savannah College of Art and Design

Khushnuma Percy Savai
Savannah, GA
© September 2015

Bridget Gaynor, Committee Chair
Deborah Fowler, Committee Member
Manuel Prada, Committee Member

ACKNOWLEDGEMENTS

I would like to thank the members of my committee: chair Bridget Gaynor and committee members Deborah Fowler and Manuel Prada. I would not have been able to complete this thesis without their patience and their willingness to help and encourage. I really appreciate their help not only because they served on my committee, but also because I learned so much from them while studying at Savannah College of Art and Design. I would also like to thank Nick Bartone and Mohamded Sinbawy for taking time out of their busy schedules to answer my questions. I would likewise like to express my gratitude to Stephan EHL for modelling all the assets for my visuals. And I really appreciate all my friends, who are too many to be listed, for their warm support and endless encouragement; I have learned so much just from working on projects right beside them. Lastly, I would like to thank my family, especially my mother Behroz Savai, my father Percy Savai, and my sister Mehernaz Savai. Without their understanding and support I would not have been able to even start pursuing my dream.

Index

List of Figures ……… 1
Abstract ……… 3
1.0 Introduction ……… 4
1.1 Thesis Statement ……… 5
2.0 Limitations of the current 3D Production pipeline with CPU rendering ……… 6
3.0 Industry Advancements
3.1 Background of 3D Animation in Film and Television ……… 7
3.2 Steps taken by the Industry towards Real-Time ……… 10
4.0 Technical Discussions
4.1 Proposed Solutions ……… 14
5.0 Methodology ……… 20
5.1 Familiarizing with the new workflows and look development using Unreal ……… 20
5.2 Parallel research in a project that includes users proficient in both approaches, rendering in Maya vs Unreal ……… 24
5.3 Real-time engines in Visual Effects and 3D industry ……… 27
5.4 Wabi-Sabi ……… 28
5.5 Artist Statement ……… 31
6.0 Conclusion ……… 33
Appendix - Conversation with Nick Bartone, Lighting TD, PIXAR ……… 34
Bibliography ……… 40

List of Figures

1. William Fetter, Cockpit Drawings, 1960, digital wireframe ……… 7
   ("An Historical Timeline of Computer Graphics and Animation". Wayne E. Carlson, 2003. Web)
2. Still from Toy Story 3, Shot example, PIXAR ……… 10
3. Indirect Illumination, Toy Story 3, PIXAR ……… 11
   ("Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. SIGGRAPH 2013)
4. Indirect Illumination, Toy Story 3, PIXAR ……… 12
   ("Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. SIGGRAPH 2013)
5. New Light Rig with Physically Based Model, PIXAR ……… 12
   ("Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. SIGGRAPH 2013)
6. Test Render by Otoy, rendered using the Octane renderer ……… 14
   (Pete. The Octane RealTime Renderer – Physically Based Rendering on the GPU. 15th Jan 2014)
7. Comparison of render times between Mental Ray and FurryBall, by FurryBall ……… 15
   ("Furry Ball." 3D-test, 3D Interactive Et Nouvelles Technologies. Carlos Ortega. 1st Mar 2010)
8. Comparison Test by FurryBall, Test Scene 1 – FurryBall ……… 16
   ("GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web.)
9. Comparison Test by FurryBall, Test Scene 1 – Arnold ……… 16
   ("GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web.)
10. Comparison Table by FurryBall, Test Scene 1 ……… 17
   ("GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web.)
11. Benoit Dureau's UNREAL Paris Virtual Tour, using Unreal 4 ……… 18
   ("Tour a Virtual Paris Apartment in Stunning Unreal Engine 4 Tech Demo." Andy Chalk. 26th Jan 2015)
12. Epic Games, The Boy and His Kite, March 2015, rendered using Unreal 4 ……… 19
13. Material editor in Unreal 4, Tatami material, Wabi-Sabi ……… 21
14. Lights used in the scene, Top view Wireframe, Wabi-Sabi ……… 22
15. IES Profiles, Wabi-Sabi ……… 23
16. Light Shafts, Wabi-Sabi ……… 23
17.
Gulfstream Aircraft Interior - Day time, SCAD ……… 25
18. Gulfstream Aircraft Interior - Day time, SCAD ……… 25
19. Gulfstream Aircraft Interior - Night time, SCAD ……… 26
20. The Japanese House. Noburu Murata and Alexander Black ……… 28
21. The Japanese House. Noburu Murata and Alexander Black ……… 28
22. Wabi-Sabi, Corridor, Unreal 4 ……… 29
23. Wabi-Sabi, Room1, Unreal 4 ……… 29
24. Wabi-Sabi, Room2, Unreal 4 ……… 30
25. Wabi-Sabi, Room3, Unreal 4 ……… 30

Production Quality in Real-time
Khushnuma Percy Savai
September 2015

Current tools and standards in 3-D digital production for film do not allow for realistic results without a substantial impact on production time and quality. This thesis focuses on the utilization of presently accessible software with real-time rendering feedback, with which a 3-D digital project can be produced with high quality production renders at significantly improved efficiency.

Keywords: Production Quality, Real-time, Unreal 4, GPU Rendering, Efficiency

1.0 Introduction:

"A good picture is equivalent to a good deed." ‐
Vincent van Gogh

In the current production process for computer generated imagery used in film and 3D productions, three dimensional models must convey both accurate geometric structure and a desirable visual appearance. In many cases the desired appearance is photorealistic, attempting to copy the look and physical properties of comparable objects in the real world. Under current production pipelines, look development artists must wait for a render to see the result of every change to an image attribute. However, due to advances in graphics processing hardware, real-time rendering is quickly becoming more precise.

The GPU (Graphics Processing Unit) is a specialized circuit within a computer designed to accelerate the creation of the image in a frame buffer intended for output to a display such as a computer screen. To a degree, GPU rendering can offer an efficient and speedy option for the visual feedback loop in look development: an image appears on the screen, the artist assesses it and responds accordingly within the production's aesthetic requirements. PIXAR implemented a real-time feedback workflow on Monsters University and The Blue Umbrella. Comparing the GPU and CPU processes, Nick Bartone, a Lighting TD at PIXAR, said, "For example, on 'Brave' a lighter might average 3-4 shot approvals on a weekly basis where on Monsters University it was possible for lighters to approve as many as 10 shots."1

In digital film production, a specific scenic environment is often re-used for several story scenes, portraying variations in time of day, weather, mood and other narrative needs. Making the changes required to achieve this variation is often time consuming and cumbersome, and creates a gap in the creative process. Through a workflow that incorporates a real-time renderer, several variations of an environment can be created far more rapidly, improving the production process and allowing the focus to stay on the narrative.

1.1 Thesis Statement:

Current tools and standards in 3-D digital production for film do not allow for realistic results without a substantial impact on production time and quality. Through the utilization of presently accessible software with real-time rendering feedback, a 3-D digital project can be produced with high quality production renders at significantly improved efficiency.

1
Interview with Nick Bartone, Lighting TD, PIXAR - 15th February 2014

2.0 Limitations of the current 3D Production pipeline with CPU rendering:

For most production scenes, Graphics Processing Unit (GPU) rendering is faster than traditional Central Processing Unit (CPU) rendering, and the gains in control, precision and productivity that come with that increase in rendering speed can benefit the production pipeline as a whole. In the current workflow, the complexity of a scene, and its impact on render time, depends on the assets, texture map resolution, depth of field, sampling, motion blur and any atmospheric effects. Relative to that complexity, the artist must wait for a render after each change, which means more time spent waiting for visual feedback and comparatively less time spent on creative decisions.

This traditional method affects the overall efficiency and productivity of the project. It makes the process time consuming and expensive, and the artists do not get full control over the desired visual: they have to wait for a result after every alteration to an attribute value. Rendering processes such as Final Gather, Global Illumination and Depth of Field make the renders computationally heavy, and it is also difficult to create physically accurate renders with the current workflow. The main limitation is the speed of the renders and the time consumed to achieve the desired output. Because of these constraints of CPU rendering, the Visual Effects and 3D Animation industry has begun looking for solutions that increase productivity and reduce cost.

3.0 Industry Advancements:

3.1 Background of 3D Animation in Film and Television:

3D rendering has found its way into the lives of a large number of people around the world. Whether through a gaming console connected to a TV, animation software on a workstation, an architectural design or the latest visual effects blockbuster at the movies, 3D rendering is used and experienced daily, often without a thought for the technology behind it.

In 1960, William Fetter devised a new process in order to maximize the efficiency of the layout inside Boeing's airplane cockpits. His final output was a computer generated orthographic view of the human form (Fig. 1).2 Fetter coined the term "Computer Graphics" for his human factors cockpit drawings, which led to a chain of events that would eventually modernize the world of entertainment, advertising and media.

Figure 1: William Fetter, Cockpit Drawings, 1960, digital wireframe
("An Historical Timeline of Computer Graphics and Animation". Wayne E. Carlson, 2003. Web)

2
"An Historical Timeline of Computer Graphics and Animation". Wayne E. Carlson, 2003. Web. Accessed: 14th July 2015

In 1972, Ed Catmull and Fred Parke at the University of Utah produced 'A Computer Animated Hand'.3 As one of the earliest examples of computer animation, the film has been hailed as groundbreaking and revolutionary. A plaster cast was made of Catmull's left hand and, using a pen, the surface was divided into 350 triangles and polygons. The model was then digitized, with the data output in the form of lines, and for the final step the hand was animated in a 3D animation program written by Ed Catmull.

Slowly, the idea of 3D graphics spread and was used in many feature films. One of the first extensive uses of 3-D computer animation (or CGI) was in Star Wars Episode IV: A New Hope (1977), where an animated 3D wire-frame graphic was used for the trench run briefing sequence.4 In 1982, Tron became one of the first films to use 3D CGI extensively, with 20 minutes of fully computer generated elements including the popular Light Cycle sequence.5 The movie also includes very early facial animation. While computer animation appeared in Star Wars, it wasn't until Terminator 2 (1991) and Steven Spielberg's Jurassic Park (1993) that movies used computer-generated imagery extensively and mixed it with live action. This further exploration of CGI in the 1990s made Terminator 2: Judgement Day among the first films to use realistic human movement on a CGI character and to morph a partially computer generated character onto the main character. It was also one of the first uses of a personal computer to create 3D effects.

3
“Let’s Give a Hand to the Original 3D Computer Animation from 1972.” Will Fulton. Digital Trends. 1st Feb 2015 Accessed: 28th August 2015 4
“Visual and Special Effects Film Milestones.” Tim Dirks. Filmsite. Web. Accessed: 28th August 2015 5
"Visual and Special Effects Film Milestones." Tim Dirks. Filmsite. Web. Accessed: 28th August 2015

The creation of the CG elements took nearly a year from the conceptual stage to the final output: 10 months of actual production for 35 people, roughly 25 man-years of work, went into that sequence. All that work amounted to less than 5 minutes of running time in the film.6

Jurassic Park was one of the first films to use photorealistic CG creatures. In 1992 and 1993, Spielberg and company crafted 63 visual effects shots realized with CGI in Jurassic Park.7 In 1995, John Lasseter directed Toy Story, the first feature-length CGI movie and the first theatrical film produced by PIXAR. The run time for Toy Story was 1 hr 21 min and the film consisted of 1700 shots.8

Following is a short summarized table:

Film | Year | Amount of CGI used
A Computer Animated Hand | 1972 | One of the earliest examples of computer animation, produced by Ed Catmull and Fred Parke at the University of Utah
Star Wars Episode IV: A New Hope | 1977 | An animated 3D wireframe graphic for the trench run briefing sequence
Tron | 1982 | One of the first films to use CGI extensively; 20 minutes of full CG elements
Terminator 2: Judgement Day | 1991 | Among the first films to use realistic human movement on a CGI character and to morph a partially CG character onto the main character; run time of CGI: 5 min
Jurassic Park | 1993 | One of the first films to use photorealistic CG creatures; 63 visual effects shots realized with CGI
Toy Story | 1995 | First feature-length CGI movie; run time 1 hr 21 min, consisting of 1700 shots

6
"Visual Effects on Terminator 2." Animator Mag Archive. Web. 30th Mar 1993 Accessed: 25th July 2015 7
“Jurassic Park: Still the Best Use of CGI in a Movie”. David Crow. Den of Geek. 9th June 2015 Accessed: 1st Sept 2015
8
"CGW: Feature: Toy Story: A Triumph of Animation." Barbara Robertson. CGW Magazine. August 1995 Accessed: 1st Sept 2015

In the 2000s, The Lord of the Rings series, The Matrix series and the Harry Potter series all helped raise the quality bar for CGI to a more refined level. As the use of CGI increases, the complexity of the desired effects increases with it, which becomes extremely expensive in terms of manpower, resources and time.

3.2 Steps taken by the 3D and Vfx Industry towards Real-Time:

Real-time rendering is not an unknown topic. A few of the major studios in the Vfx and Animation industry have taken steps towards changing their workflows to include real-time feedback, PIXAR being one of them.

Figure 2: Still from Toy Story 3, Shot example, PIXAR
According to Physically Based Lighting at Pixar, a technical report from PIXAR Animation Studios presented at SIGGRAPH 2013, the lighting pipeline at PIXAR was completely rewritten for Monsters University, and more recently for The Blue Umbrella, and switched to physically based lighting and rendering.9 Physically based rendering and lighting essentially means that the artists strive for realistic materials and lighting computations. At SIGGRAPH 2013, Jean-Daniel Nahmias, a PIXAR TD, talked about the new pipeline used in Monsters University, which used interactive lighting in Katana on top of the NVIDIA OptiX framework. In the previously used workflow, a great deal of lighting effects were replicated by using a large number of lights (PIXAR Research Interactive Lighting - SIGGRAPH Talk 2013).10 An example from Toy Story 3 was given (Fig. 2), in which a large number of bounce lights was used to simulate indirect illumination (Fig. 3 and 4).

Figure 3: Indirect Illumination, Toy Story 3, PIXAR
"Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. SIGGRAPH 2013

9
“Physically Based Lighting at PIXAR”. Christophe Hery and Ryusuke Villemin. PIXAR. 2003 Accessed: 17th June 2015 10
"Video: Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. Jim Thacker. CG Channel. Web. 27th Aug 2013 Accessed: 20th June 2015

Figure 4: Indirect Illumination, Toy Story 3, PIXAR
"Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. SIGGRAPH 2013

As seen in Fig. 3 and 4, hundreds of lights were used to light one shot (Fig. 2). Then, to test the new pipeline, the same shot was lit again using a new rig with physically based lighting (Fig. 5), which contained only dozens of lights.

Figure 5: New Light Rig with Physically Based Model, PIXAR
"Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. SIGGRAPH 2013

The number of lights used in the new model dropped considerably, from hundreds to dozens. This resulted in an increase in efficiency, saving time and resources. As mentioned before, Nick Bartone, a Lighting TD from PIXAR, confirmed that in comparison to the lighters on Brave, the artists on Monsters University were able to more than double the number of shots approved per week.11 Using the new workflow, PIXAR was able to double its output and thus save time. After this test proved successful, PIXAR used the model for the first time on Monsters University and The Blue Umbrella, and plans to continue with it on future films.

A quick live demonstration of a Monsters University shot was given by Jean-Daniel, using the same production lights and assets, and Katana's interface and abilities were shown. A special plugin for Katana was written by PIXAR's Research and Development team specifically for these projects. It consisted of an OpenGL view and, in another corner, the OptiX viewer. OpenGL does not give the artists a great deal of visual feedback, but with the OptiX viewer the artist receives a very realistic output. Indirect illumination could be turned on or off, lights could be viewed individually, and the amount and roughness of reflection could be adjusted, all while moving the camera and seeing the visual results instantly. The PIXAR pipeline finally contained real-time feedback, but not real-time rendering. It is nevertheless evident that PIXAR has taken one massive step towards real-time.

"With the introduction of real-time feedback, it will only continue to help serve us so that Lighting can spend more time iterating on creative decisions and less time simply waiting for visual feedback." - Nick Bartone, Lighting TD, PIXAR

11
Interview with Nick Bartone, Lighting TD, PIXAR - 15th February 2014

4.0 Technical Discussions:

4.1 Proposed Solutions in the 3D Production industry:

As Graphics Processing Unit render engines become more popular and feature rich, the VFX and 3D industry has, for the purposes of final frame rendering, started to integrate GPUs into the workflow. Recently there has been a considerable amount of development around the use of GPUs to accelerate the creation of rendered images, and numerous solutions built on GPU renderers have been proposed to the film industry. Otoy's Octane Render, Maxwell and FurryBall have all shown great strides in speed improvement.

Otoy's Octane Renderer:

Fig. 6: Test Render by Otoy – Rendered using the Octane renderer
Pete. The Octane RealTime Renderer – Physically Based Rendering on the GPU. Digital Image. POP-3D Graphics-Site. 15th Jan 2014.

Octane is described by Otoy as the world's first GPU based, unbiased, physically based renderer. In computer graphics, unbiased rendering is a rendering technique that does not introduce any systematic error into its estimate of the rendering equation. The Octane renderer has its own standalone application, which allows artists to export their assets from a host 3D package (e.g. Maya or 3ds Max) as .OBJ files and adjust their lights and materials in Octane at real-time speed. Besides the standalone application, it has fully integrated plugins for various software packages, including Maya (introduced in 2012). The original authors were Refractive Software (2009 - 2012), and the first stable version was released on February 5, 2013. Octane 2.0 added features such as fur, displacement and many other time intensive effects. Octane provides the artist with near-real-time visual feedback: the image is very grainy at first and takes 15-20 seconds to clear, but all changes to lights or shaders can be viewed as the changes are made. However, once the desired output is achieved, the scene still needs to be rendered separately, which is not real-time.
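To make the term "unbiased" slightly more concrete, the following is a brief supplementary sketch (my own addition, not material from the thesis or from Otoy's documentation) of the rendering equation and of the Monte Carlo estimator that path tracers of this kind approximate it with. An estimator is unbiased when its expected value equals the true integral, which is also why the image starts out noisy and converges towards the correct result as samples accumulate.

```latex
% The rendering equation: outgoing radiance at point x in direction w_o
L_o(x,\omega_o) = L_e(x,\omega_o) + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(\omega_i \cdot n)\,\mathrm{d}\omega_i

% A Monte Carlo estimator using N sampled directions w_k drawn with density p(w_k)
\hat{L}_N = L_e(x,\omega_o) + \frac{1}{N}\sum_{k=1}^{N}
            \frac{f_r(x,\omega_k,\omega_o)\, L_i(x,\omega_k)\,(\omega_k \cdot n)}{p(\omega_k)}

% Unbiased: the expectation equals the true radiance for any N,
% while the visible grain (standard deviation) falls off as 1/sqrt(N)
\mathbb{E}[\hat{L}_N] = L_o(x,\omega_o), \qquad \sigma(\hat{L}_N) \propto \tfrac{1}{\sqrt{N}}
```

Biased techniques (irradiance caching, photon mapping and similar) trade away this guarantee for speed, which is the distinction the FurryBall section below draws between its biased and unbiased modes.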
FurryBall renderer:

FurryBall also offers a fast, real-time GPU production quality final frame renderer, with both unbiased and biased modes. FurryBall was made specifically for use on CGI animated films; the feature film The Goat Story 2, a Czech 3D animated film produced by Art and Animation Studio, was rendered with it.

FurryBall: 19 seconds; Mental Ray: 23 minutes
Fig. 7: Comparison of render times between Mental Ray and FurryBall, by FurryBall
"Furry Ball." 3D-test, 3D Interactive Et Nouvelles Technologies. Carlos Ortega. 1st Mar 2010

FurryBall offers advanced rendering techniques and direct support for Maya, 3ds Max and Cinema 4D; a single license can be used with all three 3D applications at no extra cost. One can render pre-visualization or cartoons in real-time with fur, soft shadows, unlimited lights and textures, displacement, sub-surface scattering, ambient occlusion, depth of field and motion blur, and Maya Fluids and Particles can be rendered using its rasterize renderer. A test comparison was done by FurryBall and the results were posted online.12 Following are the results of a comparison test done by FurryBall with their renderer and Arnold.

Fig. 8: Comparison Test by FurryBall, Test Scene 1 – FurryBall
"GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web.

Fig. 9: Comparison Test by FurryBall, Test Scene 1 – Arnold
"GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web.

12
"GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web.

FurryBall rendered the same test scene in other CPU and GPU renderers such as Mental Ray, Vray, Maxwell and Arnold, and published a comparison table (Fig. 10). The render times for the comparison scene were:

Renderer | Comparison scene time
FurryBall (GPU) | 14 min
Mental Ray | 46 min
Vray 2.0 | 41 min
Maxwell | 44 min
Arnold | 47 min

The original table also compares feature support across the renderers (realtime viewport, motion blur, displacement, fur/hair, Maya fluids, particles, Maya procedural textures (only partly supported by some), unlimited textures, image based lighting, raytraced unbiased GI, photon maps, caustics and toon rendering); the individual yes/no entries are legible only in the original image.

Fig. 10: Comparison Table by FurryBall, Test Scene 1
"GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web.
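As a quick sanity check on the numbers above (added here purely for illustration; the calculation is not part of the FurryBall material), the quoted times correspond to roughly a 3x speedup over the CPU renderers on this particular scene, and the Fig. 7 example (19 seconds versus 23 minutes) to roughly 70x:

```python
# Illustrative arithmetic only: speedup factors implied by the quoted render times.
furryball_min = 14.0
cpu_times_min = {"Mental Ray": 46.0, "Vray 2.0": 41.0, "Maxwell": 44.0, "Arnold": 47.0}

for renderer, minutes in cpu_times_min.items():
    print(f"{renderer}: {minutes / furryball_min:.1f}x slower than FurryBall")

# Fig. 7 example: 23 minutes in Mental Ray vs 19 seconds in FurryBall
print(f"Fig. 7 scene: {23 * 60 / 19:.0f}x speedup")
```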
Unreal 4 Engine:

GPU-accelerated ray tracing with renderers such as Octane, FurryBall and the upcoming V-Ray RT has been discussed at length in the 3D production community. But the graphics card in most workstations is also capable of producing quality imagery using the latent power of real-time engines. The image below (Fig. 11), from Benoit Dureau's UNREAL Paris Virtual Tour, was rendered using Unreal Engine 4 and demonstrates what the latest technology is capable of.

Fig. 11: Benoit Dureau's UNREAL Paris Virtual Tour, using Unreal 4
"Tour a Virtual Paris Apartment in Stunning Unreal Engine 4 Tech Demo." Andy Chalk. PC Gamer. 26th Jan 2015

In the last few years, the Unreal 4 engine created by Epic Games has developed to such an extent that the product can no longer be identified as exclusively for game design. At the annual Game Developers Conference in 2015, Tim Sweeney, founder of Epic Games, said:

"We're realizing now that Unreal Engine 4 is a common language between all these common fields. Once all movies are made in a real-time engine and they're experience-able in VR with some amount of interaction, it's not going to be a separate industry. There will be a continuum from storytelling that's mostly linear to user-driven games and everything in between."13

Epic Games created a short film, The Boy and His Kite (Fig. 12), which runs entirely in real-time. It was made with the intention of demonstrating the tool's flexibility, and it is a high quality film with physically based rendering and HDR (High Dynamic Range) reflections throughout.

Fig. 12: Epic Games, The Boy and His Kite, March 2015, rendered using Unreal 4

13
"Why Video Game Engines May Power the Future of Film and Architecture." Chris Plante. The Verge. 4th Mar 2015 Accessed: 31st March 2015

5.0 Methodology:

For this project, rather than using a real-time tool that only provides real-time feedback, a real-time renderer, in the form of the Unreal 4 engine, was chosen. Tim Sweeney, founder of Epic Games, believes that Unreal 4 can be used for various applications besides games, such as VR, films or commercials.

5.1 Familiarizing with the new workflows and look development using Unreal 4:

Unreal 4 does not have the functionality for a user to create intricate 3D models. Instead the models must be created in another tool, such as Autodesk Maya, and imported into Unreal. For this project, Autodesk Maya 2015 was used for the creation of all 3D assets. Before the assets are imported, a few steps must be taken to prepare the models.

Exporting the assets from the host 3D software to Unreal 4

Before the assets are exported for Unreal 4, there are a few steps to keep in mind:

- To have the pivot point in the center of the asset, the model needs to be exported at co-ordinates (0, 0, 0), i.e. in the center of the viewport.
- Unreal 4 is most compatible with the FBX file format; OBJ import is also possible.
- All the models need to have their UVs laid out between 0 and 1.
- If an asset has several materials, the model should be assigned a different shader for each in the 3D software before exporting. In Unreal 4, these shaders give the asset its Material IDs.
- In order to have a smooth mesh, smoothing groups should be enabled while exporting the models.
- In order to move the assets independently in Unreal, it is preferable to export the assets separately.

Materials

Unreal 4 uses Physically Based Shading, which gives the potential for a realistic result using as few parameters as possible. In the paper Real Shading in Unreal Engine 4, Brian Karis of Epic Games writes, "It should be difficult to mistakenly create physically implausible materials."

Fig. 13: Material editor in Unreal 4, Tatami material, Wabi-Sabi
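For readers unfamiliar with what "physically based" means in practice, the following is a small illustrative sketch (my own, not code from Unreal or from Karis's paper) of the kind of specular model the paper describes: a microfacet BRDF driven only by base color, metallic and roughness. The function name, constants and example values are illustrative approximations, not the engine's implementation.

```python
import math

def ue4_style_specular(n_dot_l, n_dot_v, n_dot_h, v_dot_h,
                       base_color, metallic, roughness):
    """Illustrative microfacet specular term in the spirit of the UE4 shading
    model (GGX distribution, Schlick Fresnel, Schlick-GGX geometry).
    Inputs are clamped cosines; base_color is an (r, g, b) tuple in [0, 1]."""
    a = roughness * roughness                      # perceptual roughness -> alpha

    # GGX / Trowbridge-Reitz normal distribution
    denom = n_dot_h * n_dot_h * (a * a - 1.0) + 1.0
    d = (a * a) / (math.pi * denom * denom)

    # Schlick-GGX geometry term (k remapped for analytic lights)
    k = (roughness + 1.0) ** 2 / 8.0
    g1 = lambda c: c / (c * (1.0 - k) + k)
    g = g1(n_dot_l) * g1(n_dot_v)

    # Schlick Fresnel; dielectrics reflect ~4 percent, metals tint F0 by base color
    f0 = tuple(0.04 * (1.0 - metallic) + c * metallic for c in base_color)
    fresnel = tuple(f + (1.0 - f) * (1.0 - v_dot_h) ** 5 for f in f0)

    scale = (d * g) / max(4.0 * n_dot_l * n_dot_v, 1e-4)
    return tuple(f * scale for f in fresnel)

# Example: a fairly rough, non-metallic, tatami-like material lit almost head-on
print(ue4_style_specular(0.9, 0.9, 0.95, 0.9,
                         base_color=(0.55, 0.45, 0.30),
                         metallic=0.0, roughness=0.7))
```

The point of the exercise is the one Karis makes: with only metallic and roughness exposed, it is hard for an artist to author a material that reflects more energy than it receives.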
The material editor is node based, which makes it straightforward to understand because it is visual (Fig. 13). Any change made to a material can be seen instantaneously in the viewport; going from building a material to using it was real-time.

Lighting

In the 3D environment titled 'Wabi-Sabi', only direct light sources were placed in the scene; the rest of the lighting was created with Global Illumination (GI), so no fill lights or bounce lights were used. One light was used for the sunlight and two for the lamps, for a total of three lights (Fig. 14). If the objects are static, one gets the choice to bake the shadows, and this is where one could say that Unreal 4 is not entirely real-time.

Fig. 14: Lights used in the scene, Top view Wireframe, Wabi-Sabi

Depending on the complexity of the scene and the desired visual quality, the amount of time needed to 'build' the lighting varies. When a scene is built, the shadows from all static light sources are baked onto the objects, and after the build the artist can move freely through the environment. But that is a one-time cost, incurred once the direct sources are finalized: it is preferable to wait 30 minutes once than to wait that long for every frame. The amount of indirect illumination and the intensity of a light remain adjustable, with the results visible immediately. Light effects such as the color of a light, IES profiles where required (Fig. 15), light shafts (Fig. 16) and volumetric lights can all be used in real-time.

Fig. 15: IES Profiles, Wabi-Sabi
Fig. 16: Light Shafts, Wabi‐Sabi
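To put the "wait 30 minutes once" point in perspective, here is a small, purely illustrative calculation (the frame count and the per-frame offline time are hypothetical, not measurements from the thesis): a one-time lighting build is amortized across every frame and every iteration afterwards, whereas an offline renderer pays its cost on every frame.

```python
# Hypothetical numbers, for illustration only.
build_minutes = 45               # one-time light build (upper end quoted in the text)
offline_minutes_per_frame = 30   # an offline render of comparable quality, per frame
frames = 1500                    # roughly a one-minute sequence at about 25 fps

realtime_total = build_minutes   # playback afterwards is effectively free
offline_total = offline_minutes_per_frame * frames

print(f"one-time build: {realtime_total} min")
print(f"per-frame offline rendering: {offline_total / 60:.0f} hours")
```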
Due to the rapid feedback, the efficiency of the Wabi-Sabi project's progress increased; results that would normally take longer to achieve with the traditional workflow (CPU rendering) could be accomplished in half that time.

Post Processing and Rendering

Effects such as depth of field and light shafts, which would make the project computationally heavy in a CPU renderer like Mental Ray or would require a separate package for post processing, can be achieved in real-time. Unreal 4 also allows color corrections and effects such as vignette and bloom to be applied instantly. Once the desired look is achieved, it takes at most 5-6 minutes to export a camera move as an MOV or an image sequence. Because of this, one can render various camera moves and choose between the cameras later in the edit, which gives a vast amount of freedom to achieve cinematic results.

5.2 Parallel research in a project that includes users proficient in both approaches, rendering in Maya vs Unreal:

Unreal 4 was also used in a collaborative project where the team was a mix of users proficient with the traditional tools who would be using Unreal for the first time. The project was for Gulfstream: Savannah College of Art and Design collaborated with Gulfstream Aerospace to develop creative concepts for a next-generation floorplan configuration tool, which would provide customers a real-time, interactive and immersive experience when designing their aircraft interior. The Unreal 4 engine was used to create the visual aspect of the concept, and an interior of a Gulfstream aircraft was designed. In the past, the look development process for a project of this scale would take 8-10 weeks of work, including texturing, shading, lighting and compositing. Once the assets were successfully imported, it took our shading artist one day to create the shaders for the assets and one more day to light the scene and add post-process effects; so after the assets were imported, it took two days to get the following output (Fig. 17 & 18).

Fig. 17: Gulfstream Aircraft Interior - Day time, SCAD
Fig. 18: Gulfstream Aircraft Interior‐ Day time, SCAD
The project also required the team to convert the scene in Fig. 17 and 18 from a daytime to a night-time scene, which took 4 hours to create because all the visuals were real-time. Considering typical render times in a traditional workflow, it would take at least 2 or 3 weeks to create the following output:

Fig. 19: Gulfstream Aircraft Interior - Night time, SCAD
5.3 Real-time engines in Visual Effects and 3D industry:

Technology has advanced leaps and bounds with time. There was a stage when it took 10 months of actual production with 35 people to create a 5 minute sequence for Terminator 2 (1991), and in 1992 and 1993 Jurassic Park incorporated 63 shots with visual effects. Then, in 2013, Gravity took a leap in the Vfx industry with roughly 80% of the film containing CGI elements. In 2014, Guardians of the Galaxy had 2750 shots in which CGI was incorporated in some capacity, and in 2015 Avengers: Age of Ultron broke that record with more than 3000 CGI shots. As the technology advanced, the demand for and complexity of CGI elements also increased, and different paths were explored to make the workflow more efficient in resources and manpower. PIXAR took a giant step in the CGI industry by achieving real-time visual feedback, which doubled their output, saving time and precious resources.14

In some areas, game engines like Unreal 4 are already being used. A few independent artists are using a real-time engine to create short films and projects, and The Third Floor, a studio specializing in pre-visualization, has started using the Unreal 4 engine to get real-time results.15 Lucasfilm's Kim Libreri, chief technology strategy officer, said at the Technology Strategy Board event at BAFTA in London, 2013:

"If you combine video games with film-making techniques, you can start to have these real deep, multi-user experiences. Being able to animate, edit and compose live is going to change the way we work and it's really going to bring back the creative experience in digital effects."16

14 Interview with Nick Bartone, Lighting TD, PIXAR - 15th February 2014
15 "Get Connected!" Chris Edwards, CEO, The Third Floor, SIGGRAPH 2015, 9th August 2015
16 "Lucasfilm Will Combine Video Games and Movies to Axe Post-production Process." The Inquirer. 20th Sept 2013
27 5.4 Wabi‐Sabi: The aesthetics for the project was inspired by the Japanese view centered on the acceptance of imperfection with beauty. The concept was to keep the visual simple and yet beautiful and serene at the same time, with a touch of modernism. Noburu Murata and Alexander Black’s book The Japanese House, was used as reference for the visuals. Following are a few examples: Fig. 20: The Japanese House. Noburu Murata and Alexander Black. Pg 83 Fig. 21: The Japanese House. Noburu Murata and Alexander Black. Pg 54 28 All the assets for the Wabi‐Sabi project were created in Autodesk Maya 2015. All the look development was done in Unreal 4, including the post‐process effects. The environment consisted of a Japanese inspired house with a corridor and three different rooms. Following are the final visuals of the Wabi‐ Sabi project: Fig. 22: Wabi ‐ Sabi, Corridor, Unreal 4
Fig. 23: Wabi ‐ Sabi, Room1, Unreal 4
29 Fig. 24: Wabi ‐ Sabi, Room2, Unreal 4
Fig. 25: Wabi ‐ Sabi, Room3, Unreal 4
30 The number of triangles in the scene was 1,831,476. There is no triangle count limit in Unreal4. But the build time is relative to the complexity of the scene. The Wabi‐Sabi environment had a number of high tri‐count trees which made the environment computationally heavy. In the final scene, the final built time was 30‐45 min. The Real‐time rendering was demonstrated by presenting the visuals for the Wabi‐ Sabi project though the Oculus Rift (Development Kit 2). As the environment was rendered in real‐
time, the surroundings were virtually simulated to give the viewer a virtual experience. In order to create a perfect experience, the scene needed to be simulated at 65‐75 fps which gives a smooth transition when the viewer is navigating through the scene. To get that amount of fps, a strong graphics card is needed. For the Wabi‐ Sabi project, a GTX 980 Ti was used and a hydra cooling system was installed. With the intention of creating a good simulation, sometimes more than one sense comes into play. A background sound effect like birds chirping, water flowing in a distance, atmospheric noise etc. was added to the scene and was experienced through a noise cancellation head‐phone. 5.5 Artist Statement: CGI has evolved and come a long way in the Vfx and 3D industry. With the help of CGI, there is no limitation for the artists to portray their creative imagination on to the screen. As an artist specializing in lighting and compositing, a tool like Unreal4 engine which has real‐time rendering and feedback is very refreshing. It is extremely gratifying to get instant visual feedback for computationally heavy effects like reflection, depth of field, refractions, dynamics, particles 31 etc. After the achieving the visual results for Wabi‐ Sabi, I believe that real‐time rendering will production quality renders can be accomplished with significantly high efficiency and reducing the render times. Gaming engines like Unreal4 and Unity have advanced to a level where they can be used to achieve cinematic results which can be used in other non‐gaming industries such as commercials, pre‐visualization, animation, architecture and medical visualizations. Few artists have already begun exploring the great featured of Unreal4 engine and have created photo‐
realistic, production quality renders. Major studios like PIXAR and ILM have taken a big step in exploring the idea of implementing a real-time tool in their existing pipelines. During an interesting conversation, an industry professional, a Lighting TD at a major animation film studio, made a very good point: at this point, all major production houses have their own workflows, to which their seniors and supervisors are accustomed, and it is unlikely the studios would change them, as it is less economically feasible to train the artists in a new tool. I agree that it is not practical to change an ongoing workflow in a major studio. But if students and junior artists start using an easily accessible real-time tool like Unreal, there is a good possibility that, slowly but surely, major production houses will adopt a similar tool. In conclusion, I believe that real-time is the future of the Vfx and Animation industry. The only question is, when?

"In my opinion real time rending in film is inevitable. It is going to take over it is a matter of time. It is very early for to take over just yet...." – Industry Professional, Lighting TD

6.0 Conclusion:

Ed Catmull's A Computer Animated Hand (1972) was significant and instrumental, and helped to influence and evolve CGI. In films from the '70s forward, computer-generated imagery (CGI) has become a steady tool for visual effects artists. Today, the multi-billion dollar CGI industry can create characters and crowds, whole sets, and the explosions that destroy them. Current tools and standards in 3-D digital production for film do not allow for realistic results without a substantial impact on production time and quality. The project Wabi‐
Sabi confirms that through the utilization of presently accessible software with real-time rendering feedback, a 3-D digital project can be produced with high quality production renders at significantly improved efficiency. As technology moves forward, computer-generated imagery effects are sure to improve further; with computer power constantly advancing, CGI will do the same.

Appendix

Conversation with Nick Bartone - Lighting TD, PIXAR

15th February

Khushnuma Savai: Hello Nick, I hope you are enjoying your weekend (and I hope I am not disturbing you). My name is Khushnuma Savai. I am currently pursuing my MFA degree in Visual Effects at SCAD, Savannah. Your colleague Kiki Poh, gave me your reference to ask few queries about my thesis research. I am researching on the lines of real time lighting. Last year, I was a student volunteer at SIGGRAPH and I heard this talk from Pixar TD Jean-Daniel Nahmias, about the lighting used in Monsters University, and I found that so interesting that I am researching more on that topic. I hoping you could shed some more light on Katana and the plugins/ Graphic card used in that production. What are your views about it? I would be really grateful for any help and advice. Regards, Khushnuma Savai

Nick Bartone: Hi Khushnuma, Here is a YouTube Video of that talk that Danny gave at SIGGRAPH. The Katana‐
based NVidia Optix viewer is something that's actually still in development and isn't currently being used in Production. As far as I know, as Danny explains in the video, it's using the NVidia Optix graphics card platform and for Katana they wrote a special viewer plugin specifically for this purpose. As far as how I feel about it, it's really quite exciting. Being a Lighting Artist, I feel like real‐time lighting could be a hugely beneficial tool for increasing our productivity. With the introduction of more Physically Plausible Lights and Physically Plausible Shading on Monsters University, we saw a huge increase in our usual production workflow when compared to previous shows, like Brave. The way we generally measure that is by shot through output. So, for example, on Brave a lighter might average 3‐4 shot approvals on a weekly basis where on Monsters University it was possible for lighters to approve as many as 10 shots. This means you can either have team sizes that are half as small as they used to be, or your regularly sized team could blow through almost twice as much inventory. With the introduction of real‐time feedback, it will only continue to help serve us so that Lighting can spend more time iterating on creative decisions and less time simply waiting for visual feedback. Although, I have heard that there may 'currently' be some limitations to the real‐time render viewer. For example, I head some of the shading models, brdfs, displacement, etc have been either greatly simplified or removed for the sake of time. Don't quote me on that, since I'm not really qualified to say one way or another, but I'm sure in time these issues will become less and less prevalent as the technology matures. 34 There's a lot of great work being done by people in the Animation/VSFX industry as well as the games industry to get real‐time lighting feedback on full‐feature production quality art. Since I'm not an engineer, I don't really know a whole lot about the specifics of how these technology is built, but I can tell you that the implications are pretty fantastic. I'm looking forward to the day when we can get our hands on this kind of tech ‐ glad to hear there are plenty of young bright minds working on the problem! Was that helpful at all? Khushnuma Savai: Hi Nick, Thank you so much for taking time out of your weekend to share your views. It was definitely helpful. I was wondering, for the viewer, did you use a specific customized machine which supported the graphic card? So in order to use it, one needs to write a special plugin. Obviously, that plugin won't be available. So if someone like me with limited resources, would like to venture forth in this direction, what would you recommend or advise my next step? Or how would I go around it? Being a lighter, I completely understand when you explained about the efficiency. My main reason to pursue this research is to cut down the render time, just to see a preview of how a particular light or shade is working in the scene. Once again I really appreciate your feedback. It really helps to get views from an industrialist. Regards, Khushnuma Savai Nick Bartone: I'm not entirely sure, it looks like they have some recommended settings on their website regarding the Optix platform. https://developer.nvidia.com/optix. My guess is that they were probably running the demo on a pretty powerful machine. Well, I would recommend doing some research into what types of software already have these sort of 'real time' render viewers in off‐the‐shelf applications. 
I'm sure there are plenty of Maya‐compatible rendering packages that support these types of real time calculations. If your intention is to design your own real‐time rendering engine, then I'm afraid I won't be very useful in that respect as I'm not an engineer. But if SCAD still has a library of SIGGRAPH papers from previous years (or if you can find them online), I might go digging through there to find as many different papers on Real‐time rendering as possible and see if any of them contain information on implementing real‐time rendering. There are also lots of YouTube videos from these companies/researches building these real‐time renderers showcasing their work which might be worth having a look at. If you were to find a real time rendering tool that would be useful for research purposes, you may be able to appeal to the department chair to see if the school would be willing to acquire a license for lab purposes. 35 Also, if your goal is to eventually find a way to cut render times it would be good to understand fundamentally what makes renders 'expensive'. There are a lot of different rendering methods all with their own pros/cons and resources required by these different methods will change. Some engines render very quickly but use crazy amounts of RAM while others take a longer time but are very memory conservative, which makes running many renders on one machine in parallel ideal. Doing more research on rendering in general to determine what specific goal/problem you're trying to solve might help steer you in a direction. For example, if your goal is to try and find a way to more easily/quickly represent true raytraced subsurface scattering in a real time rendering engine then you could specifically research SSS, different scattering methods, and prototype some shaders/algorithms that might help to better approximate an accurate SSS effect but maybe by firing fewer rays, or importance sampling rays, or using a different method entirely. Displacement is also a tough problem in real time rendering from what I understand ‐ so maybe that could be another thing worth investigating? Also, it's worth mentioning that at the moment the prototype real time rendering viewer is primarily just an interactive viewer. That is to say, when we're going to produce a 'final' film frame image for the movie, we still submit a render to renderman like we have in the past. The viewer is designed to help give us faster interactive feedback and not necessarily change the way that we render our final images fundamentally. At least for now. Khushnuma Savai: I see your point. Actually even I am not an engineer. My intension is definitely not to create an engine. I understand that the rendering process remains the same. I am looking to search some ways to have a quick feedback when the lighters start to light the scene. Interactive viewer gives a nice idea what to expect from the render. Currently, the students while lighting, add a light or tweak a value and then wait for the render to see the change (so interactive viewer sounds exciting). I will take your advice and start my research. I have already started going through previous papers. Thank you very much for your feedback. I am really grateful for that. If it’s not too much to ask, is it ok with you if I could connect with you on LinkedIn and ask your advice on any future queries? Nick Bartone: Sure, that'd be fine. I'm happy to help how I can. 36 8th April Khushnuma Savai: Hello Nick, I hope you are doing well. My name is Khushnuma Savai. 
I am currently pursuing my MFA degree in Visual Effects at SCAD, Savannah. We talked couple of months back about my thesis research. I really appreciate all your help and advice. I took your advice and have been researching real time renderers. I came upon OCTANE Render (www.otoy.com). It looks promising and it is free for students to download. I am talking to the faculty if they could install it in one of the computers. I just wanted your advice and your feedback on it at your convenience. I would really appreciate any feedback, it would really help me with my review this quarter. Regards, Khushnuma Savai Nick Bartone: Howdy! Yeah, getting down a version of Octane to play with would be cool! I would be sure to check their FAQs to make sure they're loading it down on a machine that's compatible hardware/drivers wise. Also, if you had a 'game plan' for how you planned to use the tool, that might make convincing them to go through the effort easier. For example, if you had a clear goal for what you wanted to get 'out' of playing with Octane or if you had a specific project in mind you wanted to work on 'using' Octane, etc. Or, it could be, 'Research how existing Real‐time engines work and come up with realistic workflow possibilities using real‐time lighting engines'. The industry, as a whole, is moving towards these real‐time lighting preview/rendering tools; so learning more about it ~now~ would put SCAD students at a distinct advantage to other Vfx/animation programs that haven't invested in learning about real‐time rendering. In addition, there's plenty that Game Design students could learn from it as well ‐ since a real‐time lighting engine is similar in concept to a video game engine. Also, related to all this, NAB is going on right now ‐ and Octane just announced some new features forthcoming in a press release yesterday. Might be worth having a look to see what's coming down the pipe, just in case the 'next' version of Octane is actually worth waiting for and is actually the version that you'd want to work with! Good luck! If you have any questions, gimmie a shout! I don't really know much of anything about Octane itself but I'm sure there are plenty of resources out there for information/help regarding the engine itself! 37 Khushnuma Savai: Hi, Thank you so much for your reply. I hope I am not disturbing you. After seeing all the renders by this renderer, I am very excited to play with the software.As of now, my plan is to create 2‐3 shots of an interior environment (maybe a living room, kitchen, and bedroom) by using octane to its limit. And at the end create a small UI which is linked to all the lights and then control all the lights and change the time of day maybe, and see the results in front of them instantly. Maybe create a small installation where the viewers can make the changes themselves. What are views on it till now? Nick Bartone: I'm not sure I understand your question. What are views on it till now? Oh ‐ one thing you may want to have a look at is 'Maxwell Render' and their 'Multilight' tool. It's a really savvy feature of their Rendering toolkit which allows you to manipulate lights like you mention, post‐render in real time. Khushnuma Savai: I am sorry. I meant about my plan till now. As in about my plan about the environment and the installation plan. And thanks for the link. It looks interesting. I'll definitely look into it more. 38 14th October Nick Bartone: Hi Nick, I hope you are doing well. I am a student at SCAD. 
We chatted a bit few months back about my thesis. I am pleased to let you know that I passed my review in Spring quarter. I took your advice and pitched my idea for using octane as a real time renderer. After my review, I requested if SCAD could install Octane, but unfortunately the request could not go through as the student version (though free) was computer locked and hence could not be installed. And as I don’t own a computer powerful enough, I had to change my idea about Octane. After more research and talking to my chair and other faculty, I landed on UNREAL 4. I realize it’s a gaming engine, but the new unreal has, physically plausible materials and lighting. I have seen couple of renders from Unreal 4 online and it looks promising (example: http://www.youtube.com/watch?v=VwpjZ‐JGXE4 ). I have started importing my models in unreal and have started to texture. So far, I have had no problems and seeing the real time rendering and the lighting I am looking forward to go along with it. I was wondering what your views on it were. Do you think it would work? I am planning to use the demo version of octane, just as an experiment (and hopefully Maxwell too). I am trying to keep a thesis blog www.khushnumasavaithesis.blogspot.com . Feel free to look through it. (Though I have to update it soon) Your feedback would really help me with my research for my paper. Looking forward to hear from you. I really hope I am not bothering you. Regards, Khushnuma Savai www.khushnumasavai.com 39 Bibliography ‐ “An Historical Timeline of Computer Graphics and Animation”. Wayne E. Carlson, 2003. Web. https://design.osu.edu/carlson/history/timeline.html ‐
"CGW: Feature: Toy Story: A Triumph of Animation." Barbara Robertson. CGW Magazine. August 1995. https://design.osu.edu/carlson/history/tree/related%20materials/08stry1.html
- "Get Connected!" Chris Edwards, CEO, The Third Floor. SIGGRAPH 2015, 9th August 2015.
- "GPU vs CPU – GPU Rendering vs Softwre CPU Rendering." Speed Comparison. FurryBall. Web. http://furryball.aaa-studio.eu/aboutFurryBall/compare.html
- "Jurassic Park: Still the Best Use of CGI in a Movie." David Crow. Den of Geek. 9th June 2015. http://www.denofgeek.us/movies/jurassic-park/246791/jurassic-park-still-the-best-use-of-cgi-in-a-movie-steven-spielberg
- "Let's Give a Hand to the Original 3D Computer Animation from 1972." Will Fulton. Digital Trends. 1st Feb 2015. http://www.digitaltrends.com/gaming/give-hand-first-3d-computer-animation-1972/
- "Lucasfilm Will Combine Video Games and Movies to Axe Post-production Process." The Inquirer. 20th Sept 2013. http://www.theinquirer.net/inquirer/news/2295956/lucasfilm-will-combine-video-games-and-movies-to-axe-post-production-process
- "Physically Based Lighting at PIXAR". Christophe Hery and Ryusuke Villemin. PIXAR. 2003. http://graphics.pixar.com/library/PhysicallyBasedLighting/paper.pdf
- "Video: Pixar's New GPU-Based Lighting Workflow." Jean-Daniel. Jim Thacker. CG Channel. Web. 27th Aug 2013. http://www.cgchannel.com/2013/08/see-pixars-gpu-lighting-workflow-on-monsters-university/
- "Visual and Special Effects Film Milestones." Tim Dirks. Filmsite. Web. http://www.filmsite.org/visualeffects10.html
- "Visual Effects on Terminator 2." Animator Mag Archive. Web. 30th Mar 1993. http://www.animatormag.com/archive/issue-30/issue-30-page-14/
- "Why Video Game Engines May Power the Future of Film and Architecture." Chris Plante. The Verge. 4th Mar 2015. http://www.theverge.com/2015/3/4/8150057/unreal-engine-4-epic-games-tim-sweeney-gdc-2015

Visual Reference

- Noburu Murata and Alexander Black. The Japanese House. Book.
- Hibi, Sadao. Japanese Detail: Architecture. Book.