Course Description

Modern video games employ a variety of sophisticated algorithms to produce groundbreaking 3D rendering, pushing the visual boundaries and interactive experience of rich environments. This course presents state-of-the-art, production-proven techniques for fast, interactive rendering of the complex and engaging virtual worlds of video games.

 

This year the course includes speakers from the makers of several innovative games and game engines, including HypeHype, Striking Distance Studios, Intel, Guerrilla, Activision, and Epic Games. The course will cover a variety of topics relevant to practitioners of real-time rendering in games and other real-time 3D applications, spanning diverse subjects such as efficient design of modern rendering architectures for a wide range of platforms, layered material systems, realistic rendering of volumetric cloud formations that can be flown through at speed, scalable real-time path tracing, large-scale terrain rendering, and more!

 

This is the course to attend if you are in the game development industry or want to learn the latest and greatest techniques in the real-time rendering domain!

 

Note: All parts of the Advances will be recorded, whether virtual or live, pending speaker permissions. The recordings will be posted on SIGGRAPH’s virtual on-demand program until September 9th, 2023, and then on the Advances in Real-Time Rendering YouTube channel.

Previous years’ Advances course slides: go here

Syllabus

Advances in Real-Time Rendering in Games: Part I
Tuesday, August 8, 9 am – 12 pm
Room 403 AB

Advances in Real-Time Rendering in Games: Part II
Tuesday, August 8, 2 pm - 5 pm
Room 403 AB

 

Prerequisites:  

 

Working knowledge of modern real-time graphics APIs such as DirectX, Vulkan, or Metal, and a solid grounding in commonly used graphics algorithms. Familiarity with the concepts of programmable shading and shading languages. Familiarity with the hardware and software capabilities of shipping game consoles is a plus, but not required.

 

Intended Audience

 

Technical practitioners and developers of graphics engines for visualization, games, or effects rendering who are interested in interactive rendering.

 

Advances in Real-Time Rendering in Games: Part I
Tuesday, August 8, 9 am – 12 pm
Room 403 AB

Welcome and Introduction
Natalya Tatarchuk (Unity Technologies)

 

HypeHype Mobile Rendering Architecture

Sebastian Aaltonen (HypeHype)

Nubis3: Methods (and madness) to model and render immersive real-time voxel-based clouds

Andrew Schneider (Guerrilla)

Closing Notes

Natalya Tatarchuk (Unity Technologies)

 

 

Advances in Real-Time Rendering in Games: Part II
Tuesday, August 8, 2 pm - 5 pm
Room 403 AB

Welcome and Introduction
Natalya Tatarchuk (Unity Technologies)

 

Large-Scale Terrain Rendering in Call of Duty

Stephane Etienne (Activision)


Authoring Materials That Matters - Substrate in Unreal Engine 5

Sébastien Hillaire (Epic Games)

Charles de Rousiers (Epic Games)

The Rendering of The Callisto Protocol

Jorge Jimenez (Striking Distance Studios Spain)

Miguel Petersen (Striking Distance Studios Spain)

 

Closing Notes

Natalya Tatarchuk (Unity Technologies)

 

 

Course Organizer

 

Natalya Tatarchuk (@mirror2mask) is a graphics engineer and a rendering enthusiast at heart, currently focusing on driving state-of-the-art rendering technology, graphics performance, and character content creation in her role as Distinguished Technical Fellow and Chief Architect, VP, Wētā Tools at Unity. Prior to that, she led the Graphics team at Unity as VP of Graphics for the Unity Editor and Engine. Before that, she was an AAA games developer, working on the innovative cross-platform rendering engine and game graphics for Bungie’s Destiny franchise as well as the Halo series, including Halo: ODST and Halo: Reach, and before that she worked in AMD’s Graphics Products Group, where she pushed parallel computing boundaries investigating advanced real-time graphics techniques, graphics hardware design, and APIs. Natalya has been encouraging sharing in the games graphics community for several decades, largely by organizing a popular series of courses such as Advances in Real-Time Rendering and Open Problems in Real-Time Rendering at SIGGRAPH and, more recently, the virtual Rendering Engine Architecture conference, and by convincing people to speak there. It seems to be working.

 

HypeHype Mobile Rendering Architecture

 

 

Abstract: Sebastian Aaltonen joined HypeHype one year ago with a mission to rebuild their mobile rendering architecture from scratch. The goals of the new rendering architecture are to reach state-of-the-art performance and power efficiency and to improve graphics programmer productivity. The new renderer is designed from the ground up for Vulkan, Metal, and WebGPU. This presentation discusses the graphics API abstraction, performance-oriented architecture design, and various optimizations.
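To make the idea of a graphics API abstraction concrete, below is a minimal, purely illustrative C++ sketch (not HypeHype's actual interfaces, which this abstract does not describe): the engine records plain-data draw packets, and a per-API backend for Vulkan, Metal, or WebGPU translates them in one place.

#include <cstdint>
#include <vector>

// Plain-data draw packet: no API handles leak above the backend boundary.
// All fields are indices into backend-owned tables (names are hypothetical).
struct DrawPacket {
    uint32_t pipeline;       // index into the backend's pipeline table
    uint32_t bindGroup;      // index into the backend's descriptor/bind table
    uint32_t vertexBuffer;
    uint32_t indexBuffer;
    uint32_t indexCount;
    uint32_t instanceCount;
};

// One backend implementation per graphics API (Vulkan, Metal, WebGPU).
class RenderBackend {
public:
    virtual ~RenderBackend() = default;
    // Translate the packet stream into native command buffers in one tight loop.
    virtual void submit(const std::vector<DrawPacket>& packets) = 0;
};

// Engine-side code stays identical across all backends.
void renderFrame(RenderBackend& backend, const std::vector<DrawPacket>& frame) {
    backend.submit(frame);
}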

 

Speaker Bio: 

Sebastian Aaltonen has over 20 years of experience in graphics rendering technology. His main focus areas are engine architecture, low-level rendering APIs, performance optimization, and GPU compute. Sebastian pioneered GPU-driven rendering development at Ubisoft and distance-field ray tracing at Second Order (Claybook). He led Unity’s DOTS rendering team until he joined HypeHype to rebuild their mobile rendering technology.

Materials: PPTX (25 MB), PDF (3 MB) Updated September 6th, 2023

 

 

 

Nubis3: Methods (and madness) to model and render immersive real-time voxel-based clouds

 

 

Abstract: In less than 6 months, we developed a highly detailed and immersive voxel-based cloud renderer and modeling approach. The Nubis voxel clouds act as traditional volumetric skyboxes, which support a time-of-day cycle when viewed from the ground, while also supporting high frame rates and atmospheric gameplay when explored on the back of a flying mount in the sky. To achieve these goals, we solved or mitigated several open problems in immersive volumetric cloud rendering with solutions such as ray march acceleration using compressed signed distance fields, fluid simulation-based modeling of clouds, a method to up-rez dense voxel data that avoids memory access bottlenecks, light sampling acceleration, and new methods to approximate cloud-specific lighting features like dark edges and inner glow. This talk covers these subjects in detail and offers a glimpse into our reasoning for leaving behind the now widely adopted 2.5D methods of real-time volumetric cloud rendering, which we introduced in our 2015 and 2017 Advances in Real Time Rendering Course Talks, in favor of the benefits and opportunities presented by voxel-based clouds.
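As a rough illustration of one of the ideas above, the sketch below shows how a signed distance field can accelerate a volumetric ray march: outside the cloud the ray advances by the full SDF distance, and small density-sampling steps are only taken near or inside the medium. This is a generic sketch under assumed parameters, not Guerrilla's Nubis3 implementation; sdf and density are hypothetical sampling callbacks.

#include <cmath>
#include <functional>

struct Vec3 { float x, y, z; };

static Vec3 advance(Vec3 o, Vec3 d, float t) {
    return { o.x + d.x * t, o.y + d.y * t, o.z + d.z * t };
}

// Returns the transmittance along a ray through a voxel cloud.
// sdf(p): signed distance to the cloud surface (negative inside) -- hypothetical callback.
// density(p): cloud density at p in [0, 1] -- hypothetical callback.
float marchTransmittance(Vec3 origin, Vec3 dir, float maxDist,
                         const std::function<float(Vec3)>& sdf,
                         const std::function<float(Vec3)>& density) {
    const float insideStep = 0.25f;  // assumed fixed step inside the medium
    const float sigmaT     = 0.8f;   // assumed extinction coefficient
    float t = 0.0f;
    float transmittance = 1.0f;
    while (t < maxDist && transmittance > 0.01f) {
        Vec3 p = advance(origin, dir, t);
        float d = sdf(p);
        if (d > insideStep) { t += d; continue; }  // skip empty space in one jump
        transmittance *= std::exp(-density(p) * sigmaT * insideStep);
        t += insideStep;
    }
    return transmittance;
}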

 

Speaker Bio: 

Andrew Schneider is the Atmospherics Lead at Guerrilla. He spends his time developing the Nubis volumetric cloud system and driving cohesive solutions for all atmospherics in the Horizon franchise games. He has presented three talks about real-time cloud rendering in the Advances in Real-Time Rendering course and, before that, three SIGGRAPH talks on cloud and volumetric rendering for feature animation.

 

Materials: 1080p videos PPTX (1.4GB), 4K videos PPTX (3.5GB), PDF (47 MB), Nubis Voxel Cloud Pack (Supplementary materials) (Updated September 6th, 2023)

 

 

 

Large-Scale Terrain Rendering in Call of Duty

 

 

Abstract: In this talk, we will describe how large-scale terrain is authored and rendered in Call of Duty. Our system is inspired by and builds on earlier work presented at GDC 2015. We will discuss how we extended the idea of virtualization well beyond terrain textures. We will also discuss how we made our system scale from low-end mobile devices to high-end PCs.
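For readers unfamiliar with virtualization, the sketch below shows the basic indirection that virtual texturing relies on: a virtual UV is translated through a page table into a physical atlas location, so only resident pages consume memory. It is a generic illustration with assumed page sizes and field names, not the Call of Duty system described in the talk.

#include <algorithm>
#include <cstdint>
#include <vector>

struct PageEntry {
    uint16_t physPageX = 0;  // location of the page in the physical atlas
    uint16_t physPageY = 0;
    bool     resident  = false;
};

struct VirtualTexture {
    int pagesWide = 0, pagesHigh = 0;  // virtual page grid dimensions
    int pageSizeTexels = 128;          // assumed page size
    std::vector<PageEntry> pageTable;  // one entry per virtual page

    // Translate a virtual UV into physical atlas texel coordinates.
    // Returns false on a page miss, so the caller can issue a streaming request.
    bool translate(float u, float v, int& physX, int& physY) const {
        int px = std::min(int(u * pagesWide), pagesWide - 1);
        int py = std::min(int(v * pagesHigh), pagesHigh - 1);
        const PageEntry& e = pageTable[py * pagesWide + px];
        if (!e.resident) return false;
        float fu = u * pagesWide - px;  // position within the page
        float fv = v * pagesHigh - py;
        physX = e.physPageX * pageSizeTexels + int(fu * pageSizeTexels);
        physY = e.physPageY * pageSizeTexels + int(fv * pageSizeTexels);
        return true;
    }
};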

 

Speaker Bio: 

Stephane Etienne is co-CTO of High Moon Studios, an Activision-owned studio. Stephane joined High Moon Studios over 20 years ago and over that span has worked on 14 released games. A jack of all trades, Stephane has worked on virtually all game systems, including tools and pipeline, with the exception of online and UI. Most recently, Stephane developed the virtual texturing system that has been used in Call of Duty since Call of Duty: Vanguard.

Materials: PPTX (285 MB), PDF (3 MB) Updated September 6th, 2023

 

 

Authoring Materials That Matters - Substrate in Unreal Engine 5

 

 

Abstract: Game engines are typically restricted to a fixed number of material types, e.g. shading models, with a limited set of parameters. This constrains artists to a limited range of material appearances. The movie industry has moved away from such a monolithic approach, using graph-based mixing and layering of BSDFs, thus freeing artists’ expressivity and increasing the range of visual possibilities. Our goal with Unreal Engine 5 is to enable rendering the same level of visual complexity in real time on the GPU. This system can achieve visuals and performance ranging from path-traced movie quality to high-frame-rate games on consoles, and it is able to scale down to mobile platforms.

Our proposed framework relies on the concept of a slab of matter: a principled representation parameterized by physical quantities with well-defined units. Based on this core building block, artists create materials by assembling slabs of matter expressed as a graph of closures on which operations, e.g. mixing or layering, are applied. The resulting material graph is quantized, packed, and stored in place of a regular GBuffer. This adaptive storage scheme enables per-pixel topology and feature scalability. In addition, by defining transformation rules for closures, the graph can be retargeted from high cinematic visual quality to high performance for lower-end platforms, while preserving the materials’ overall appearance.
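The toy sketch below illustrates the graph-of-closures idea from the abstract: leaf slabs carry principled parameters, interior nodes mix or layer their children, and a simple retargeting rule flattens the graph to a single slab for a lower-end platform. Parameter names and the flattening rule are illustrative assumptions, not Substrate's actual representation.

#include <memory>

// A leaf "slab of matter": a principled parameter set (names are illustrative).
struct Slab {
    float baseColor[3] = {0.5f, 0.5f, 0.5f};
    float roughness    = 0.5f;
    float f0           = 0.04f;  // specular reflectance at normal incidence
};

// A node in the material graph: either a leaf slab or an operation over children.
struct MaterialNode {
    enum class Op { Leaf, Mix, Layer } op = Op::Leaf;
    Slab slab;                            // used when op == Leaf
    float weight = 0.5f;                  // mix factor, or top-layer coverage
    std::shared_ptr<MaterialNode> a, b;   // children for Mix / Layer

    // One possible retargeting rule for low-end platforms: collapse the graph
    // to a single slab by blending parameters (a crude stand-in for the
    // transformation rules mentioned in the abstract).
    Slab flatten() const {
        if (op == Op::Leaf) return slab;
        Slab sa = a->flatten();
        Slab sb = b->flatten();
        Slab out;
        float w = weight;
        for (int i = 0; i < 3; ++i)
            out.baseColor[i] = sa.baseColor[i] * (1.0f - w) + sb.baseColor[i] * w;
        out.roughness = sa.roughness * (1.0f - w) + sb.roughness * w;
        out.f0        = sa.f0        * (1.0f - w) + sb.f0        * w;
        return out;
    }
};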

 

 

Speaker Bios: 

Sébastien Hillaire is a Principal Rendering Engineer at Epic Games, focusing on the Unreal Engine renderer. He is pushing for visual quality, performance, and innovation in many areas, such as physically based shading, volumetrics, and visual effects. Before joining Epic Games, he worked at Dynamixyz, then Criterion Games and Frostbite at Electronic Arts.

Charles de Rousiers is a Principal Rendering Engineer at Epic Games. He helps to drive advancement in lighting and materials. He developed a real-time hair and fur system for Unreal Engine. Previously, he worked within the Frostbite engine team at Electronic Arts, where he helped to move the Frostbite engine onto physically-based rendering principles.

 

Materials: PPTX (250 MB), PDF (33 MB) Updated September 6th, 2023

 

 

The Rendering of The Callisto Protocol

 

 

Abstract: This session covers the rendering choices made in the pursuit of photorealism for The Callisto Protocol.

 

In this talk, we dive into our digital-double workflow and its methodologies, including those used for the main protagonist, Jacob Lee. We propose a novel approach for BRDF authoring against photographic references through an intuitive and methodical process, including the tooling built to accelerate authoring and validation through direct comparison with captured data. Finally, we will introduce “Realis”, a technology built to cross the boundary between renders and photographic reference.

 

This talk will also cover the various challenges of shipping multiple raytracing features on consoles, namely raytraced shadows, raytraced reflections, and raytraced transmission, on practically all lights and surfaces. We will propose novel techniques to accelerate raytracing work while preserving quality, such as partial precomputation of ray visibility, spatiotemporal and visual-perception-driven variable-rate raytracing, and tiled classification optimized for raytracing workloads. We will discuss how we raytraced reflections on practically all surfaces, including transparent surfaces such as glass and surfaces with a high roughness ceiling. In addition, we cover how we handled integration with volumetrics, performant dynamic shadows in reflections, and world-space radiance gathering. Finally, we will cover platform-specific optimizations such as async raytracing, mixed inline tracing, the "tail" problem, and heavily pipelining work for reduced register pressure.
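As a rough illustration of tiled classification, the CPU-side sketch below bins 8x8 screen tiles by whether any pixel needs a ray and by a simple roughness heuristic, producing compact dispatch lists so the raytracing pass only runs where it matters, and at a reduced rate where full quality is less perceptible. The tile size, heuristic, and data layout are assumptions for illustration, not the shipped implementation.

#include <algorithm>
#include <cstdint>
#include <vector>

enum class TileRate : uint8_t { Skip, Quarter, Full };  // ray budget per tile

struct TileClassification {
    int tilesX = 0, tilesY = 0;
    std::vector<TileRate> rate;                    // one entry per 8x8 tile
    std::vector<uint32_t> fullTiles, coarseTiles;  // compacted dispatch lists

    void classify(const std::vector<float>& roughness,   // per-pixel roughness
                  const std::vector<uint8_t>& needsRay,  // per-pixel mask
                  int width, int height) {
        tilesX = (width + 7) / 8;
        tilesY = (height + 7) / 8;
        rate.assign(tilesX * tilesY, TileRate::Skip);
        fullTiles.clear();
        coarseTiles.clear();
        for (int ty = 0; ty < tilesY; ++ty)
        for (int tx = 0; tx < tilesX; ++tx) {
            bool  any      = false;
            float maxRough = 0.0f;
            for (int y = ty * 8; y < std::min(ty * 8 + 8, height); ++y)
            for (int x = tx * 8; x < std::min(tx * 8 + 8, width); ++x) {
                int i = y * width + x;
                any = any || (needsRay[i] != 0);
                maxRough = std::max(maxRough, roughness[i]);
            }
            if (!any) continue;  // tile skipped entirely by the raytracing pass
            // Assumed heuristic: very rough tiles trace at a reduced rate.
            TileRate r = (maxRough > 0.6f) ? TileRate::Quarter : TileRate::Full;
            rate[ty * tilesX + tx] = r;
            (r == TileRate::Full ? fullTiles : coarseTiles).push_back(ty * tilesX + tx);
        }
    }
};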

 

This talk showcases how unified art-direction, next-generation hardware and technical design can yield innovation within budget.

 

 

Speaker Bios: 

Jorge Jimenez is the Director of Creative Engineering & General Manager at Striking Distance Studios Spain, where he pushes the boundaries of what is possible at the intersection of art and technology. Jorge is a passionate real-time graphics researcher with over 14 years of experience, focusing on real-time photorealistic rendering, digital humans and creatures, special effects, photography, and more. He received his Ph.D. degree in real-time graphics from Universidad de Zaragoza in 2012. Before joining Striking Distance Studios, Jorge worked at Activision Blizzard as a Graphics R&D Technical Director, where he contributed to the Call of Duty franchise on Call of Duty: Ghosts, Advanced Warfare, Black Ops 3, and more. He has also contributed to multiple publications, including Transactions on Graphics, Game Developer Magazine, and the GPU Pro series.

Miguel Petersen is a Senior Rendering Engineer at Striking Distance Studios Spain, where he leads the raytraced lighting solutions for The Callisto Protocol, additionally working on character rendering, VFX, and UI. Prior to Striking Distance Studios, Miguel worked at Avalanche Studios as a Graphics Programmer on Central Tech, pushing the rendering architecture, next-generation hardware support, and engine R&D. He has actively contributed to Rage 2, Contraband, Generation Zero, and Second Extinction.

 

Materials: PPTX (397 MB), Slides only PDF (46 MB), Slides with speaker notes PDF (11 MB) Updated November 27th, 2023

Contact: