With nearby Silicon Valley driving the development of artificial intelligence, or AI, Uncanny Valley: Being Human in the Age of AI arrives as the first major West Coast museum exhibition to deepen the discourse around AI through the lens of artistic practice. Rather than trafficking in speculative fantasies, the exhibition looks at current applications of AI to redefine the coordinates of the uncanny valley. It builds on contemporary metaphors from the technological imagination—the digital alter ego, the model of swarm intelligence, the data-mining algorithm—to propose a new visual vocabulary for describing the relationship between humans and machines.
Several works in the exhibition concern forms of collective intelligence—in both nature and culture—that inform or define the design and mechanisms of AI. Ian Cheng’s BOB (Bag of Beliefs) (2018–2019) translates this idea into a virtual life-form in the shape of a snake that responds both to its internal programming and to input from its audience. Agnieszka Kurant’s works in this exhibition incorporate crowdsourced information and labor that together point to the new economy of “ghost work” fueling many AI platforms and devices.
Simon Denny’s sculpture and relief works critique the humanitarian and ecological costs of today’s data economy, particularly the destructive environmental practices used to create sleek AI-based objects like Amazon’s Echo. The Zairja Collective’s intricate collages also concern mining—both of physical resources and of data. Overlaying images of firing neurons with open-pit designs sourced from mining corporations, their works in this exhibition invite viewers to consider the relationship between the two subjects.
The independent research agency Forensic Architecture uses spatial methods and machine learning to investigate human rights violations around the world. Their work in this exhibition includes a group of algorithmic models developed to scan online images for objects used in such crimes. Invested in humanitarian causes, the group is equally concerned with exposing the ways that machine learning can be exploited and misused, particularly in the types of offenses they investigate. Trevor Paglen also takes on the myth of neutrality in machine learning. His They Took the Faces from the Accused and the Dead… (SD18) (2020) consists of a large gridded installation of more than three thousand mugshots from the archives of the National Institute of Standards and Technology. The institute’s collections of such images were used to train early facial-recognition technologies—without the consent of those pictured.
Hito Steyerl’s video installation addresses applications of AI that reinforce social and economic inequality and the potential counterforce offered by communal and artistic acts of resistance. Martine Syms’s interactive, video-based work presents an avatar of the artist, named Teeny, with whom viewers can communicate via text message. Teeny’s assertiveness and self-absorption position her as an “anti-Siri,” says Syms, and upend the expectation of gendered obedience perpetuated by AI-based assistants.
Lynn Hershman Leeson’s Shadow Stalker (2019) gives viewers the opportunity to glimpse their digital alter ego—the trail of data they have created online—while urging them to take control of their data profiles. The intersection of identity, subjectivity, and technology is also the subject of Stephanie Dinkins’s ongoing series of conversations with Bina48, a robot created by Hanson Robotics that lacks any understanding of race and gender. Christopher Kulendran Thomas’s film Being Human (2019), created in collaboration with Annika Kuhlmann, questions the philosophical foundations of humanism in the age of AI simulations known as deepfakes. Lawrence Lek similarly considers the ways that AI technology is rendering the distinction between humans and machines more permeable in his sci-fi-inflected film AIDOL (2019).
Finally, Zach Blas’s video installation serves as a portal, both literally and metaphorically, into Silicon Valley’s corporate culture—the other uncanny valley of AI. Considering the widespread use of nootropics, or “smart drugs,” he highlights the ways that tech companies invoke counterculture ideas to support corporate goals.