Modern video games employ a variety of sophisticated algorithms to produce groundbreaking 3D rendering, pushing the visual boundaries and interactive experience of rich environments. This course presents state-of-the-art, production-proven rendering techniques for fast, interactive rendering of the complex and engaging virtual worlds of video games.
This year the course includes speakers from the makers of several innovative games and game engines, including Sucker Punch Productions, Epic Games, Activision, SEED at Electronic Arts, and Unity Technologies. The course covers a variety of topics relevant to practitioners of real-time rendering in games and other real-time 3D applications, spanning real-time global illumination, atmospheric rendering and dynamic time-of-day management, advances in physically based rendering, a novel skylight model, improvements to spatial upscaling, and several approaches for handling large geometric complexity in real-time scenarios.
This is the course to attend if you are in the game development
industry or want to learn the latest and greatest techniques in the real-time
rendering domain!
Advances in Real-Time Rendering in Games: Part I
Wednesday, 11 August 2021, 9 am - 12 pm PDT | Virtual Conference
Prerequisites: Working knowledge of modern real-time graphics APIs such as DirectX, Vulkan, or Metal, and a solid grounding in commonly used graphics algorithms. Familiarity with the concepts of programmable shading and shading languages. Familiarity with the hardware and software capabilities of shipping game consoles is a plus but not required.
Intended audience: Technical practitioners and developers of graphics engines for visualization, games, or effects rendering who are interested in interactive rendering.
9:00 am PDT
Welcome and Introduction
Natalya Tatarchuk (Unity Technologies)
9:15 am PDT
Improved Spatial Upscaling through FidelityFX Super Resolution for Real-Time Game Engines
Timothy Lottes (Unity Technologies)
Kleber Garcia (Unity Technologies)
9:40 am PDT
Experimenting with Concurrent Binary Trees for Large Scale Terrain Rendering
Thomas Deliot (Unity Technologies)
Jonathan Dupuy (Unity Technologies)
Kees Rijnen (Unity Technologies)
Xiaoling Yao (Unity Technologies)
10:10 am PDT
A Deep Dive into Nanite Virtualized Geometry
Brian Karis (Epic Games)
Rune Stubbe (Epic Games)
Graham Wihlidal (Epic Games)
11:20 am PDT
Large-Scale Global Illumination at Activision
Ari Silvennoinen (Activision Publishing)
12:15 pm PDT
Part I Closing Remarks
Natalya Tatarchuk (Unity Technologies)
12:30 pm PDT
Live Q&A
All Speakers
Advances in Real-Time Rendering in Games: Part II
Wednesday, 11 August 2021, 9 am - 12 pm PDT | Virtual Conference
9:00 am PDT
Welcome to Part II
Natalya Tatarchuk (Unity Technologies)
9:10 am PDT
Real-Time Samurai Cinema: Lighting, Atmosphere, and Tone Mapping in Ghost of Tsushima
Jasmin Patry (Sucker Punch Productions)
10:10 am PDT
Radiance Caching for Real-Time Global Illumination
Daniel Wright (Epic Games)
10:45 am PDT
Global Illumination Based on Surfels
Henrik Halen (SEED at Electronic Arts),
Andreas Brinck (Ripple Effect Studios at Electronic Arts),
Kyle Hayward (Frostbite at Electronic Arts),
Xiangshun Bei (Ripple Effect Studios at Electronic Arts)
11:35 am PDT
Part II Closing Remarks
Natalya Tatarchuk (Unity Technologies)
12:00 pm PDT
Live Q&A
All Speakers
Course Organizer
Natalya Tatarchuk (@mirror2mask) is a graphics engineer and a rendering enthusiast at heart, currently focused on driving state-of-the-art rendering technology and graphics performance for the Unity engine as a Distinguished Technical Fellow and VP, AAA and Graphics Innovation; prior to that, she led the Graphics team at Unity. Before Unity she was an AAA games developer, working on an innovative cross-platform rendering engine and game graphics for Bungie’s Destiny franchise as well as the Halo series, including Halo 3: ODST and Halo: Reach, and before that worked in AMD’s Graphics Products Group, where she pushed parallel computing boundaries investigating advanced real-time graphics techniques, graphics hardware design, and APIs. Natalya has been encouraging sharing in the games graphics community for several decades, largely by organizing the popular Advances in Real-Time Rendering and Open Problems in Real-Time Rendering courses at SIGGRAPH, and by convincing people to speak there. It seems to be working.
Abstract: A two-part talk, illustrated with diagrams, images, and performance numbers. The first part dives into the details and inner workings of AMD's FSR1 scaling algorithm, with a focus on teaching the optimization principles used in its design and leaving the viewer with some creative thoughts on image processing. The second part covers integration into a modern physically based deferred and forward rendering pipeline.
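As a companion to the optimization discussion, the sketch below illustrates the core idea of an RCAS-style adaptive sharpening pass (the second stage of FSR1) on the CPU: a negative-lobe cross filter whose strength is limited by the local neighborhood so sharpening never pushes pixels into clipping. This is a simplified illustration under assumed names (Image, sharpenPixel) and a [0,1] luma range, not AMD's actual FidelityFX implementation.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Image {
    int w, h;
    std::vector<float> luma; // single channel for brevity
    float at(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return luma[std::size_t(y) * std::size_t(w) + std::size_t(x)];
    }
};

// Sharpen one pixel with a 4-neighbor cross. The negative lobe is scaled
// down where the neighborhood is already near black or white, which limits
// ringing and clipping; this is the key idea behind RCAS-style sharpening.
float sharpenPixel(const Image& img, int x, int y, float sharpness) {
    float c = img.at(x, y);
    float n = img.at(x, y - 1), s = img.at(x, y + 1);
    float w = img.at(x - 1, y), e = img.at(x + 1, y);
    float mn = std::min({n, s, w, e, c});
    float mx = std::max({n, s, w, e, c});
    // Remaining headroom before the output would clip (luma assumed in [0,1]).
    float headroom = std::clamp(std::min(mn, 1.0f - mx), 0.0f, 1.0f);
    // Negative lobe, capped so the normalization below stays well-defined.
    float lobe = std::max(-0.1875f, -0.25f * sharpness * headroom);
    return (c + lobe * (n + s + w + e)) / (1.0f + 4.0f * lobe);
}
```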
Bios:
Timothy Lottes specializes in GPU algorithms and optimization, with a background in photography. He currently focuses on new graphics technology at Unity as part of the Graphics Innovation Group. His prior work includes authoring FidelityFX shaders such as CAS and LPM at AMD, working on TAA and mobile post-processing at Epic, and authoring FXAA and TXAA at NVIDIA.
Kleber Garcia holds an M.Sc. in Computer Science from the Florida Institute of Technology. He led the rendering team for the Madden franchise during the launch window of the Xbox One and PS4. He later joined the Frostbite rendering team in Stockholm, where he implemented shadow systems, character material and lighting models, and post-process effects. He architected and led the development of an open-world GPU probe system currently shipping in Battlefield 6 and Need for Speed HEAT. Kleber has since joined the frontlines at Unity Technologies, where he has delivered the DLSS integration. He is an avid coffee drinker and foodie, with a passion for all things GPU and work pranks.
Materials (Updated August 10th, 2021): Slides (PPTX – 23 MB, PDF – 3.5 MB)
Abstract: In this talk, we share results of our novel technique using concurrent binary trees for large-scale terrain rendering. First, we review the foundations of concurrent binary trees and provide intuition about their benefits for computing adaptive tessellations of large-scale terrain geometry. Next, we present the results of the original 2020 paper, followed by a deep dive into the latest efforts to integrate the technique into the Unity game engine. The speakers share further optimizations over the original implementation and showcase early integration results.
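To make the data structure concrete, here is a minimal CPU sketch of a concurrent binary tree in the spirit of the 2020 paper: a sum-reduction heap over a leaf bitfield, which lets many threads map leaf indices to nodes and split bisection leaves independently (merging is analogous and omitted). The class shape is an illustrative assumption; the production version keeps the tree in a GPU buffer, packs the bitfield, and runs the reduction as parallel compute passes.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

class CBT {
public:
    explicit CBT(int maxDepth) : D(maxDepth), heap(2u << maxDepth, 0) {
        setBit(1); // one bit per subdivision leaf; start with the root leaf
        reduce();
    }

    uint32_t leafCount() const { return heap[1]; }

    // Map a leaf index in [0, leafCount()) to its heap node by descending the
    // sum tree. Interior nodes of a bisection tree always hold >= 2 leaves,
    // so the first node whose sum is 1 is the leaf itself.
    uint32_t leafToNode(uint32_t leaf) const {
        uint32_t node = 1;
        while (heap[node] > 1) {
            uint32_t left = heap[2 * node];
            if (leaf < left) { node = 2 * node; }
            else             { leaf -= left; node = 2 * node + 1; }
        }
        return node;
    }

    // Split a leaf into its two children. Only one new bit is needed because
    // the left child shares the parent's leftmost max-depth descendant.
    void split(uint32_t node) {
        assert(node < (1u << D)); // cannot split beyond maxDepth
        setBit(2 * node + 1);
        reduce();
    }

private:
    // A leaf at node k is encoded as a 1 at k's leftmost max-depth descendant.
    void setBit(uint32_t node) {
        while (node < (1u << D)) node *= 2;
        heap[node] = 1;
    }
    // Recompute interior sums bottom-up (log-depth parallel passes on GPU).
    void reduce() {
        for (uint32_t n = (1u << D) - 1; n >= 1; --n)
            heap[n] = heap[2 * n] + heap[2 * n + 1];
    }

    int D;
    std::vector<uint32_t> heap; // nodes 1..2^(D+1)-1; bottom level is the bitfield
};
```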
Bios:
Thomas Deliot is a senior research engineer at Unity Technologies. His work focuses on computer graphics topics and improving real-time rendering of 3D content by bringing new papers into rendering engines and bridging the gap from research to production. This includes GPU/parallel programming, post-processing, level-of-detail, materials, lighting, and machine learning.
Jonathan Dupuy is a senior research scientist at Unity Technologies. His research interests are primarily oriented towards high-quality real-time rendering. This encompasses a wide range of topics, including antialiasing, level-of-detail, analytic models for both materials and lighting, and GPU/parallel programming.
Kees Rijnen is an engineering manager at Unity Technologies. His work focuses on helping his team research and develop new environment technologies and workflows.
Xiaoling Yao is a senior graphics programmer at Unity Technologies. His work mainly focuses on improving the terrain system, integrating it with new artist workflows, and adding graphics features such as instancing and virtual texturing.
Materials (Updated August 10th, 2021): Slides (PDF – 5 MB)
Abstract: Nanite, Unreal Engine 5's new virtualized geometry system, enables the rendering of trillion-triangle scenes at real-time framerates. This lecture
will take a deep dive into how Nanite works, from mesh import all the way to
final rendered pixels. We will explain how the mesh-based data structure is
built, streamed, decompressed, culled, rasterized, and finally shaded.
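As a taste of why such a system can select level of detail in parallel, the sketch below shows the parent/child screen-space error test commonly described in public material on Nanite-style cluster hierarchies. The struct layout, names, and error projection are simplified assumptions for illustration, not Epic's code.

```cpp
#include <cmath>

struct Cluster {
    float selfError;   // object-space simplification error of this cluster
    float parentError; // error of the coarser parent group; selfError < parentError
};

// Project an object-space error to screen pixels for a viewer at 'distance',
// given the vertical field of view and viewport height in pixels.
float projectError(float error, float distance, float fovY, int viewportH) {
    return error * float(viewportH) / (2.0f * distance * std::tan(fovY * 0.5f));
}

// A cluster is drawn iff it is fine enough while its parent is not. Because
// selfError < parentError holds for every cluster, exactly one cut through
// the LOD hierarchy passes this test, so clusters can be evaluated
// independently and in parallel without cracks between neighbors.
bool selectCluster(const Cluster& c, float distance, float fovY,
                   int viewportH, float pixelThreshold = 1.0f) {
    float self   = projectError(c.selfError,   distance, fovY, viewportH);
    float parent = projectError(c.parentError, distance, fovY, viewportH);
    return self <= pixelThreshold && parent > pixelThreshold;
}
```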
Bios:
Brian Karis is an Engineering Fellow in graphics at Epic Games. Most recently he has led the development of Nanite for UE5. He is best known for his work on physically based shading and temporal anti-aliasing, although he has touched most areas of real-time computer graphics throughout his career. Prior to Epic, he worked at Human Head Studios.
Rune Stubbe is a Principal Rendering Programmer at Epic Games, where he focuses on Nanite development and optimization. He has previously worked on rendering technology at IO Interactive and Unity. Rune has also been active in the demoscene (as Mentor/TBC), where he has contributed widely used compression tools and several award-winning releases.
Graham Wihlidal is a Principal Rendering Programmer at Epic Games, primarily working on Nanite and other UE5 initiatives. Previously, Graham worked at Electronic Arts (SEED, Frostbite, BioWare), implementing and supporting technology used in many hit games such as Battlefield, Dragon Age: Inquisition, Plants vs. Zombies, FIFA, Star Wars: Battlefront, and others. While at BioWare, Graham shipped numerous titles including the Mass Effect and Dragon Age trilogies, and Star Wars: The Old Republic. Graham is also a published author and has presented at several conferences.
Materials (Updated August 10th, 2021): Slides (PDF – 16 MB)
Abstract: In this talk, we'll describe the key techniques behind the large-scale global illumination system at Activision.
We present a new precomputed lighting compression technique that enables
high-performance and seamless reconstruction directly from the compressed
lighting data. In addition, we’ll discuss visibility-based sampling of
precomputed volumetric lighting and describe a practical method for computing
constrained spherical harmonics representations.
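For context on the representation being compressed: volumetric lighting of this kind is commonly stored as low-order spherical harmonics per probe, and irradiance is reconstructed in closed form at shading time. Below is the textbook order-3 (9-coefficient) SH irradiance evaluation after Ramamoorthi and Hanrahan, included as background; it is standard practice, not Activision's compression or constrained-SH technique itself.

```cpp
#include <array>

struct Vec3 { float x, y, z; };

// L[0..8]: SH radiance coefficients for one color channel, in band order
// (l,m) = (0,0),(1,-1),(1,0),(1,1),(2,-2),(2,-1),(2,0),(2,1),(2,2).
// Returns irradiance for unit surface normal n.
float shIrradiance(const std::array<float, 9>& L, Vec3 n) {
    // Cosine-lobe convolution constants, precombined with SH normalization
    // (Ramamoorthi & Hanrahan 2001).
    const float c1 = 0.429043f, c2 = 0.511664f,
                c3 = 0.743125f, c4 = 0.886227f, c5 = 0.247708f;
    return c4 * L[0]
         + 2.0f * c2 * (L[1] * n.y + L[2] * n.z + L[3] * n.x)
         + 2.0f * c1 * (L[4] * n.x * n.y + L[5] * n.y * n.z + L[7] * n.x * n.z)
         + c3 * L[6] * n.z * n.z - c5 * L[6]
         + c1 * L[8] * (n.x * n.x - n.y * n.y);
}
```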
Bio:
Ari Silvennoinen is a Fellow Software Engineer at Activision, where he works on graphics technology research and development. Prior to Activision, he obtained a master’s degree from the University of Helsinki and worked on graphics technology at Umbra Software and Remedy Entertainment. His main interests are global illumination, visibility algorithms, and real-time rendering, and he has contributed to graphics conferences and journals including SIGGRAPH, I3D, CGF, and EGSR.
Materials (Updated August 10th, 2021): Slides (PPTX – 237 MB)
Abstract: In this talk, we describe some of the graphics techniques used in the production of Ghost of Tsushima. Set in 13th-century Japan, Ghost of Tsushima pays homage to classic samurai cinema with dramatic lighting, wind, clouds, haze, and fog, and features a beautiful open-world version of the island of Tsushima during the first Mongol invasion. We cover the diffuse and specular indirect lighting techniques used in the game, including methods of computing SH irradiance probes from sky visibility data, with plausible sky and sun/moon bounce light. Next, we present our atmospheric lighting and rendering techniques, including how we lit our haze, clouds, and particles with multiple scattering. We show how to improve the accuracy of Rayleigh scattering to approach that of spectral rendering by using a custom color space for atmospheric lighting calculations. Finally, we discuss some of the tone mapping techniques we used to recreate the samurai cinema experience in-game.
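As generic background for the tone-mapping portion, a common starting point for cinematic curves in games is John Hable's filmic operator from Uncharted 2, sketched below with Hable's published constants. This is a well-known baseline, not Sucker Punch's actual curve or color pipeline.

```cpp
// Hable's "Uncharted 2" filmic curve for one channel or luminance.
float hablePartial(float x) {
    const float A = 0.15f; // shoulder strength
    const float B = 0.50f; // linear strength
    const float C = 0.10f; // linear angle
    const float D = 0.20f; // toe strength
    const float E = 0.02f; // toe numerator
    const float F = 0.30f; // toe denominator
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
}

// Map scene-referred values to display [0,1], normalized so that
// 'whitePoint' maps exactly to 1.0.
float filmicTonemap(float x, float exposure = 2.0f, float whitePoint = 11.2f) {
    return hablePartial(x * exposure) / hablePartial(whitePoint);
}
```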
Bio:
Jasmin Patry is a Lead Rendering Engineer at Sucker Punch Productions, where he has worked on Infamous 2, Infamous Second Son, Infamous First Light, and Ghost of Tsushima. Prior to that, he was at Radical Entertainment, where he contributed to their Hulk, Scarface, and Prototype titles. As a graduate student in the Computer Graphics Lab at the University of Waterloo, he created the popular Linux game Tux Racer, which was named “Best Free Software” by PC Magazine and has downloads numbering in the millions. His interests include physically based rendering, scientific computing, performance optimization, and anything that makes games look better and run faster.
Materials (Updated August 11th, 2021): Slides (PDF – 66 MB, HTML)
Abstract: This talk will present an efficient and high-quality Final Gather for fully dynamic Global Illumination with ray tracing, targeted at next-generation consoles and shipping in Unreal Engine 5. Hardware ray tracing provides a new and powerful tool for real-time graphics, but current hardware can barely afford one ray per pixel for diffuse indirect lighting, while Global Illumination needs hundreds of effective samples for high-quality indoor lighting. Existing approaches that rely on irradiance fields cannot scale up in quality, while approaches relying on a screen-space denoiser have exorbitant costs at high resolutions. This talk will present practical applications of Radiance Caching along with effective techniques to reduce noise and leaking.
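Radiance caches of this kind commonly store each probe's incoming radiance in a small octahedral map. As one concrete building block (a standard technique, not Lumen source code), here is the usual mapping between unit directions and the [0,1]^2 octahedral texel domain:

```cpp
#include <cmath>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

static float signNotZero(float v) { return v >= 0.0f ? 1.0f : -1.0f; }

// Unit direction -> octahedral UV in [0,1]^2.
Vec2 octEncode(Vec3 d) {
    float invL1 = 1.0f / (std::fabs(d.x) + std::fabs(d.y) + std::fabs(d.z));
    float u = d.x * invL1, v = d.y * invL1;
    if (d.z < 0.0f) { // fold the lower hemisphere outward
        float fu = (1.0f - std::fabs(v)) * signNotZero(u);
        float fv = (1.0f - std::fabs(u)) * signNotZero(v);
        u = fu; v = fv;
    }
    return { u * 0.5f + 0.5f, v * 0.5f + 0.5f };
}

// Octahedral UV in [0,1]^2 -> unit direction.
Vec3 octDecode(Vec2 uv) {
    float u = uv.x * 2.0f - 1.0f, v = uv.y * 2.0f - 1.0f;
    float z = 1.0f - std::fabs(u) - std::fabs(v);
    if (z < 0.0f) { // unfold the lower hemisphere
        float fu = (1.0f - std::fabs(v)) * signNotZero(u);
        float fv = (1.0f - std::fabs(u)) * signNotZero(v);
        u = fu; v = fv;
    }
    float len = std::sqrt(u * u + v * v + z * z);
    return { u / len, v / len, z / len };
}
```

Each probe texel then accumulates radiance from rays whose directions decode to it, which is what makes such a cache compact, filterable, and temporally stable.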
Bio:
Daniel Wright is an Engineering Fellow in graphics at Epic Games, and Technical Director of the 'Lumen' dynamic Global Illumination and Reflections system in Unreal Engine 5. Prior to that, he developed lighting and shadowing techniques for Unreal Engine 3 and 4, which shipped in Gears of War, Fortnite, and a multitude of games licensing Unreal Engine technology. Daniel's main passion is real-time Global Illumination.
Materials (Updated August 18th, 2021): Slides (PPTX – 122 MB)
Abstract: Global Illumination Based on Surfels (GIBS) is a solution for calculating indirect diffuse illumination in real time. The solution combines hardware ray tracing with a discretization of scene geometry to cache and amortize lighting calculations across time and space. It requires no precomputation, no special meshes, and no special UV sets, freeing artists from the tedious and time-consuming processes required by traditional solutions. GIBS enables new possibilities at runtime, allowing for high-fidelity lighting in dynamic environments and for user-created content, while accommodating content of arbitrary scale. The algorithm is part of the suite of tools available to developers and teams throughout EA as part of the Frostbite engine.
This talk will detail the GIBS algorithm and how surfels are used to enable real-time ray-traced global illumination. We will describe how the scene is discretized into surfels on the fly, and why we think this discretization is a good fit for caching lighting operations. The talk will describe the acceleration structure used to enable efficient access to surfel data, and how this structure allows us to cover environments of arbitrary size while keeping a predictable performance and memory footprint. We will detail how the algorithm handles dynamic objects, skinned characters, and transparency. Several techniques have been developed to efficiently integrate irradiance on surfels; we will describe our use of ray guiding, ray binning, and spatial filters, and how we handle scenes with large numbers of lights.
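To ground two of the ideas above, caching across time and scale-independent spatial lookup, here is a small CPU sketch of a surfel cache: an exponential moving average that amortizes noisy per-frame samples, plus a hashed uniform grid for O(1) neighborhood queries. All names and constants are illustrative assumptions, not Frostbite's GIBS implementation.

```cpp
#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <vector>

struct Vec3 { float x, y, z; };

struct Surfel {
    Vec3 position;
    Vec3 normal;
    Vec3 irradiance; // cached lighting, integrated over many frames
};

// Temporal amortization: blend this frame's noisy ray-traced sample into the
// cached value so the full cost is spread across frames.
inline Vec3 accumulate(Vec3 cached, Vec3 sample, float alpha = 0.05f) {
    return { cached.x + alpha * (sample.x - cached.x),
             cached.y + alpha * (sample.y - cached.y),
             cached.z + alpha * (sample.z - cached.z) };
}

// Spatial acceleration: hash world positions into uniform grid cells so a
// shading point can find nearby surfels in O(1), independent of scene size.
struct SurfelGrid {
    float cellSize = 0.5f; // meters; a production system would vary this
    std::unordered_map<uint64_t, std::vector<uint32_t>> cells;

    uint64_t key(Vec3 p) const {
        auto q = [&](float v) {
            return uint64_t(int64_t(std::floor(v / cellSize)) & 0x1FFFFF);
        };
        return (q(p.x) << 42) | (q(p.y) << 21) | q(p.z);
    }
    void insert(uint32_t surfelId, Vec3 p) { cells[key(p)].push_back(surfelId); }
    const std::vector<uint32_t>* find(Vec3 p) const {
        auto it = cells.find(key(p));
        return it == cells.end() ? nullptr : &it->second;
    }
};
```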
Bios:
Henrik Halen joined Electronic Arts' SEED research division as a Senior Rendering Engineer in 2017. His work at SEED focuses on real-time graphics algorithms, lighting, and characters. Henrik's experience as a rendering engineer prior to joining SEED includes a decade of contributions to franchises such as Gears of War, Battlefield, Medal of Honor, and Mirror's Edge.
Andreas Brinck has worked as a rendering engineer for more than two decades. He joined Electronic Arts in 2011 to help start Ghost Games and was later the rendering lead on NFS Rivals, NFS 2015, NFS Payback, and NFS Heat. In 2019 he joined DICE LA, where he is currently working on the Battlefield franchise.
Kyle Hayward has worked as a rendering engineer since 2010. He has focused on multiple areas in graphics, from animation compression to global illumination, working on both offline and real-time solutions. He joined EA in 2012 and became the NBA rendering lead from 2014 onwards. In 2019 he joined Frostbite, where he has been working on global illumination and ray tracing.
Xiangshun Bei has been a rendering engineer within DICE LA at EA since 2019, focusing on real-time rendering and ray tracing. He currently works on the Battlefield franchise. Prior to DICE, he contributed to graphics drivers for the Adreno GPU on Snapdragon SoCs at Qualcomm. He received his master’s degree in computer science from the University of Southern California in 2017.
Materials (Updated August 11th, 2021): Slides (PPTX – 315 MB, PDF – 16 MB)