Digital Art Renaissance

This past decade has seen the birth of several key technologies and applications that have empowered digital artists to author work at an unprecedented level of realism.  Effects that once required deep knowledge spread across teams of specialists can now be created by generalists who handle all aspects of their creations.  Innovations like physically-based shading, GPU advancements, and workflow enhancements all played a part.  In this article I’d like to highlight, and celebrate, how we arrived at where we are today.

Specialists -> Generalists

Before 2010, high-end digital imagery required a complex pipeline involving modelers, texture artists or surfacers, look development artists, lighters, and compositors.  While these roles still exist today, more artists are now able to wield tools that allow them to be largely self-sufficient.  A clear example of such work can be seen here: ZBrushCentral Gallery

Not to downplay the amazing craft these artists bring to the table, but I’d like to highlight a few developments that removed some of the hurdles that previously stood between inception and final image. One of these developments is physically-based shading.

SIGGRAPH 2012: Physically-Based Shading at Disney

A lot of us saw Wreck-It Ralph in theaters, but not everyone realizes what that movie signified for digital imagery.  Up to that point, a wide array of specialized algorithms existed to describe a material’s appearance within a rendering simulation.  What the team at Disney working on Wreck-It Ralph accomplished, and shared with the community at SIGGRAPH 2012, was to take a set of common materials and create a unified way to represent them, while still allowing a high degree of artistic control over their appearance.  This effort also moved us closer to using real-world lighting properties to describe how those materials are illuminated.  Things that were previously arduous, such as chipped paint on a metal surface, became much more accessible to artists without requiring a high degree of specialized knowledge.
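
To make that idea a bit more concrete, here is a minimal sketch in Python of the kind of parameterization the 2012 course notes describe: a single microfacet model driven by a handful of intuitive sliders (base color, metallic, roughness) that covers both painted dielectrics and bare metal.  The function name, the constants, and the geometry-term remapping below are my own simplified illustration, not Disney’s reference implementation.

import math

def shade(base_color, metallic, roughness, n_dot_l, n_dot_v, n_dot_h, v_dot_h):
    # Dielectrics reflect roughly 4% at normal incidence; metals tint the
    # reflection with their base color.
    f0 = tuple(0.04 * (1 - metallic) + c * metallic for c in base_color)

    # GGX / Trowbridge-Reitz normal distribution (the artist's roughness
    # slider is remapped to alpha = roughness^2).
    a2 = (roughness * roughness) ** 2
    d = a2 / (math.pi * ((n_dot_h * n_dot_h) * (a2 - 1) + 1) ** 2)

    # Schlick approximation of the Fresnel reflectance.
    f = tuple(c + (1 - c) * (1 - v_dot_h) ** 5 for c in f0)

    # Smith shadowing/masking in the Schlick-GGX form common in real-time engines.
    k = (roughness + 1) ** 2 / 8
    g = (n_dot_l / (n_dot_l * (1 - k) + k)) * (n_dot_v / (n_dot_v * (1 - k) + k))

    specular = tuple(fi * d * g / (4 * n_dot_l * n_dot_v) for fi in f)

    # Metals have no diffuse term; dielectrics keep a Lambertian base color.
    diffuse = tuple(c * (1 - metallic) / math.pi for c in base_color)

    return tuple((dc + sc) * n_dot_l for dc, sc in zip(diffuse, specular))

# The same function describes glossy red paint and rough bare metal;
# only the slider values change.
paint = shade((0.60, 0.05, 0.05), metallic=0.0, roughness=0.3,
              n_dot_l=0.8, n_dot_v=0.9, n_dot_h=0.95, v_dot_h=0.7)
metal = shade((0.90, 0.60, 0.20), metallic=1.0, roughness=0.6,
              n_dot_l=0.8, n_dot_v=0.9, n_dot_h=0.95, v_dot_h=0.7)

The point is not the specific math, but that one small set of artist-friendly parameters drives every material, which is exactly what made effects like layered, chipped paint approachable for a single artist.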

Several technology companies started embracing these advancements, and for the first time we saw offline renderers such as Arnold and V-Ray begin to converge with real-time solutions such as Unreal and Unity. On the asset authoring side, newcomers such as Allegorithmic (now Adobe), with Substance Painter and Substance Designer, embraced these concepts as well.  So film, games, and the tools along the pipeline converged on an approach that lets artists achieve a higher level of realism while reducing complexity. This next level of functionality, however, demands enormous graphics processing performance.

GPU Advancements

While there were advancements on the software side, there was also increased activity on the hardware side, particularly with graphics processors.  The hardware traditionally required to create high-end digital imagery was not cheap, owing to the enormous amount of computation involved.  As graphics processor manufacturers such as NVIDIA and AMD grew in prominence, more powerful and affordable graphics cards could be used to create high-end imagery.  This obviously meant a great deal for real-time game engines that rely heavily on GPU processing, such as Unreal and Unity, but we also saw more GPU renderers enter the arena, such as Redshift, Octane, Corona, and many more. Software and hardware improvements alone, however, would not have brought about the sweeping changes we see today; artists also had to change how they worked.

Workflow Improvements

While advances in physically-based shading and GPUs created strong potential to evolve, we also had entrenched ways of working that were not easy to change overnight.  No single solution can take complete credit for changing the way artists work, but a few standouts are worth noting along the way:

ZBrush + KeyShot: one of the digital sculpting tools most favored by artists partners with an advanced renderer, allowing sculptors to preview their work in breathtaking fidelity.

(Image by Marco Plouffe)

Allegorithmic’s Substance Painter + Real-time + Iray: Having embraced physically-based shading very early on, Substance Painter allows texture artists to preview their work under lighting conditions resembling real-world scenarios, using HDR captures.

(Image by Jonathan Benainous)

Marmoset Toolbag: While still representing a somewhat stand-alone stage of the pipeline, Marmoset Toolbag is a convenient way to assign materials and textures to a model and preview the asset with a real-time rendering solution.

(Image by Marmoset Toolbag)

Unity / Unreal: As the image fidelity of real-time game engines improved, they also provided a more unified environment in which artists could author their creations.
YouTube: Overgrown Ruins by Maverick

These examples show the potential for a single artist, working in a self-sufficient capacity, to take an idea from inception all the way to a polished visualization of the final product.

Gallery

So here we are!  There is no better way to illustrate where we are today than to share one of my favorite art galleries:
ARTSTATION.COM
