Jessai Flores

According to Yale professor of computer science Theodore Kim, representation in animation is limited by the racially biased legacy of computer graphics technology.

Kim focuses on racial bias in the research behind computer-generated humans. From his previous work as a senior research scientist at Pixar to his current work as a computer science professor and co-lead of the Yale Computer Graphics Group, Kim has substantial experience in this field. By raising awareness of racial bias in the history of computer graphics technology, Kim hopes to build a community dedicated to open discussion and confrontation of this systemic issue of representation.

“The idea that math and science provide an objective way to understand the world and ourselves is not an ironclad guarantee,” Kim said, “but rather an ideal to strive for.”

While interning at Rhythm & Hues Studios in 2001, Kim watched subsurface scattering become synonymous with “skin.” Subsurface scattering creates a translucency meant to simulate the effect of light penetrating skin. However, this glowing effect is a dominant visual feature only in young, white skin; the translucency adds far less realism to darker skin tones. According to Kim, the field effectively carved out the piece of physics most important to white skin, in an effort to emulate the types of skin that dominated ads, magazines and movies in the late 1990s. He discussed this common lighting technique in a talk at the SIGGRAPH 2021 conference.
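Kim’s point can be illustrated with a toy shading sketch. The snippet below is not the model used in film production and not Kim’s code; it uses “wrap lighting,” a common cheap approximation of subsurface scattering, with hypothetical albedo values. It shows that the soft glow the technique adds is, in absolute terms, far more visible on high-albedo, lighter skin.

```python
# Illustrative only: "wrap lighting" is a cheap stand-in often used to
# approximate subsurface scattering. The albedo values are hypothetical.

def lambert(n_dot_l: float) -> float:
    """Plain diffuse term: black past the shadow terminator."""
    return max(n_dot_l, 0.0)

def wrap_diffuse(n_dot_l: float, wrap: float = 0.5) -> float:
    """Wrap-lighting term: light 'bleeds' past the terminator,
    producing the translucent glow the article describes."""
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

def shade(albedo: float, n_dot_l: float) -> float:
    """Reflected intensity for a surface patch of the given albedo."""
    return albedo * wrap_diffuse(n_dot_l)

# Just past the terminator, plain diffuse is zero but the wrap term
# still glows -- and the absolute size of that glow scales with albedo,
# so the effect dominates on light skin and is faint on darker skin.
grazing = -0.1
glow_light = shade(albedo=0.9, n_dot_l=grazing)
glow_dark = shade(albedo=0.2, n_dot_l=grazing)
```

In this toy model, the entire visual payoff of the technique scales with the skin’s brightness, which is the asymmetry Kim identified.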

To Kim, classifying this algorithm as a method of generating “human skin” was akin to calling pink Band-Aids “flesh-colored.” He found that most journal articles on “skin rendering” included a single computer-generated white person to exemplify techniques meant to depict “humans.” Kim further raised the concern that such complex algorithms could realistically be disputed only by specialists of a certain standing in the community: effectively, what had been established as “skin” could not easily be rewritten.

“We need to consider the multiple dimensions of diversity,” professor of computer science Holly Rushmeier said, “and should not lump people into groups with one group somehow holding a privileged standing over others.”

Rushmeier and fellow professor of computer science Julie Dorsey joined Kim in efforts to emphasize diversity in the modeling of humans. This team of computer graphics experts submitted an extended abstract to SIGGRAPH 2021. Kim reflected on the controversy that ensued. 

While five of the seven reviews were “extremely positive,” one was neutral and the last was “virulently negative,” containing what Kim described as coded racist messaging. Ultimately, that review forced the abstract’s rejection.

“Technical topics are usually taught as objective and existing outside of history,” Kim said. “When you start talking about how these topics can also encode racist assumptions, people can become very unhappy, especially those who benefited most from this perception of objectivity. The sciences are often presented as a safe haven from the prejudices of the world, and if you’re the messenger saying that it’s not actually safe, you can be painting a target on yourself.”

Kim had no issue being a messenger. In an opinion piece published in Scientific American, he called attention to the use of white people’s skin and hair as the “default” in the development of technology for computer-generated humans. Kim referenced John Alton’s 1949 book “Painting with Light,” whose film-lighting guidelines were designed specifically for white skin. These techniques for lighting “human faces” were then carried into the digital era and reused in computer graphics lighting studies.

He further noted that “technological white supremacy” extended to human hair. The Marschner model became the standard for hair rendering, yet it was designed specifically to capture how light interacts with flat, straight hair. The model was treated as a “good-enough hand-me-down” for depicting “human hair,” with no equivalent model developed for kinky, afro-textured hair. In the same vein, algorithms for simulating hair motion were built on the assumption that hair consists of straight or wavy fibers rather than kinky ones.
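The straight-fiber assumption shows up even in the Marschner model’s simpler predecessor, the Kajiya-Kay shading model, sketched below. This is an illustrative sketch, not Kim’s code or the full Marschner model: each strand is treated as a smooth cylinder whose highlight depends only on the fiber’s tangent direction, an idealization that fits straight hair far better than tightly coiled fibers.

```python
import math

# Illustrative sketch of the classic Kajiya-Kay hair specular term.
# Each strand is modeled as a smooth cylinder: the highlight depends
# only on the angle between the fiber tangent T and the light/view
# directions -- an idealization built around straight fibers.

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _normalize(v):
    m = math.sqrt(_dot(v, v))
    return tuple(x / m for x in v)

def kajiya_kay_specular(tangent, light, view, exponent=32.0):
    """Kajiya-Kay term: ((T.L)(T.V) + sin(T,L)sin(T,V))^exponent."""
    t, l, v = _normalize(tangent), _normalize(light), _normalize(view)
    sin_tl = math.sqrt(max(1.0 - _dot(t, l) ** 2, 0.0))
    sin_tv = math.sqrt(max(1.0 - _dot(t, v) ** 2, 0.0))
    spec = _dot(t, l) * _dot(t, v) + sin_tl * sin_tv
    return max(spec, 0.0) ** exponent
```

A tightly coiled fiber’s tangent changes direction at a very fine scale, so a single smooth-cylinder highlight like this cannot capture how light actually travels through afro-textured hair, which is the gap Kim describes.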

“We are currently exploring methods for simulating human hair that encompass all types of human hair,” Kim said. “Not just those that adhere to a specific, historically biased, standard of beauty.”

Rushmeier remarked that research must be done to give everyone the tools they need to tell their stories. A.M. Darke, another collaborator, developed the Open Source Afro Hair Library to counter the bias in computer graphics technology that limits the ability to represent Black characters. However, addressing all of the racial bias in computer graphics will require a large community effort, Rushmeier emphasized.

Kim and Rushmeier led a session titled “Countering Racial Bias in Computer Graphics Requires Structural Change” at SIGGRAPH 2021 with the goal of convincing others to join them in submitting extended abstracts for SIGGRAPH 2022. Rushmeier hopes for a future in which this is “not some niche topic” and the computer graphics community undergoes structural change in how it identifies important research topics and appropriate ways to conduct research.

“Scientists are not automatically imbued with objectivity just because they participate in these disciplines,” Kim said. “The sciences still hold the promise of providing a safe haven against bias and prejudice, but that promise can only be fulfilled if we specifically decide to live up to it.”

Kim’s work from his time at Pixar has appeared in films including “Cars 3,” “Coco,” “Incredibles 2” and “Toy Story 4.”

KAYLA YUP
Kayla Yup covers Science & Social Justice with an interest in the intersections of the humanities and STEM. She is majoring in Molecular, Cellular & Developmental Biology and History of Science, Medicine & Public Health.