In a New Yorker cartoon that ran last April in the magazine’s “Daily Shouts” humor column, a woman brushes her teeth in the bathroom while her AI assistant, a smiley-faced contraption with an upright loudspeaker, offers her a human-ish idea for a popular internet series. “O.K.,” the device says, “what about a show where, every week, someone bakes a cake and Marie Kondo throws it out?” Marie Kondo is a famous organizing consultant, so the assistant has used divergent thinking to solve its owner’s quandary. Get it?
Except that AI’s intrusion into our daily lives is no laughing matter, and the New Yorker — like all media outlets — has boosted its coverage of AI’s downside, as in the magazine’s July 2019 article, “The Hidden Costs of Automated Thinking,” which warned that AI systems misrecognize faces and make inaccurate health judgments, and that these systems “will remain susceptible to hijacking.”
AI’s downside is also a major, disturbing theme in “Uncanny Valley: Being Human in the Age of AI,” the de Young Museum’s latest modern-art exhibit, but like the New Yorker, the de Young offers more than downside in this timely show. Among the featured artists: Ian Cheng, whose BOB (Bag of Beliefs) is an AI-produced animated creature that responds to — or chooses to ignore — art-goers’ requests; Agnieszka Kurant, whose “Conversions #1” is a liquid-crystal painting that uses AI to change its dimensions through social-media posts of activist organizations around the world, which makes the painting colorfully abstract one minute and moody and intense the next; and Zach Blas, whose “The Doors” is a psychedelic video/audio/quasi-garden work that uses AI to generate odd poetry influenced by singer Jim Morrison and The Doors’ era-defining music of the 1960s. Each of these works makes important, underlying points about AI and social networks, but even grade-school kids could enjoy the works’ basic operation and set-up. They’re fun!
Other artworks in “Uncanny Valley,” however, are like a sinister clown whose smile belies the trouble that could emerge at any minute. Works such as Trevor Paglen’s “They Took the Faces from the Accused and the Dead . . . (SD18)” are in-your-face ruminations on how artificial intelligence can be used for dangerous ends. And one video at the exhibit, Forensic Architecture’s “Model Zoo,” is an investigation that uses AI for the public good, uncovering connections between a wealthy art-world figure, Warren Kanders, and inhumane practices that have hurt and even killed protesters around the world.
Made in collaboration with Praxis Films, Model Zoo is one of the exhibit’s most stunning and disturbing works. Forensic Architecture is a British agency based at a research university, Goldsmiths, University of London, where its founder, Eyal Weizman, is Professor of Spatial and Visual Cultures. Forensic Architecture does what it calls “advanced spatial and media investigations into cases of human rights violations.” Model Zoo investigates the law-enforcement-product manufacturer Defense Technology, whose tear-gas grenades have been used by U.S. authorities against migrants from Mexico and by other governments, including Egypt, Iraq, Israel, Peru, and Venezuela, to suppress citizen movements.
Forensic Architecture was commissioned to do the artwork by the Whitney Museum, whose Board of Trustees’ vice chair at the time was Kanders, the head of a group (Safariland) that makes Defense Technology products. Kanders was forced to resign in the wake of Forensic Architecture’s reporting, which also connected Kanders to a U.S. company, Sierra Bullets, that makes ammunition that the Israeli military apparently used to shoot, injure, and kill Palestinian protesters.
Forensic Architecture alerted the European Center for Constitutional and Human Rights to Sierra Bullets’ connection to Israel’s shootings of Palestinians, and the center wrote Sierra Bullets a letter saying it was considering legal action for “possibly aiding and abetting war crimes.”
That’s how Model Zoo ends: with an image of the letter, and the film’s narrator (musician David Byrne) voicing the words “war crimes.” The video is a damning, dot-connecting artwork worthy of its own attention. Weizman, who holds both British and Israeli citizenship, was scheduled to attend the press opening for “Uncanny Valley,” and a major Forensic Architecture exhibit in Miami, but the U.S. embassy in London used an algorithm to vet Weizman’s background and revoked his visa waiver.
Weizman then asked the London embassy why, and an officer told him they weren’t sure what had tripped the algorithm and triggered the revocation of his visa, but that it could have to do with people he knew or places he had traveled, according to an account Weizman wrote on Forensic Architecture’s website. He said the embassy then asked him for 15 years of travel history, among other things, and that he refused, because giving the U.S. embassy the names of people he knew could jeopardize their safety. It’s a case study in how sophisticated algorithms let governments, corporations, and anyone with the means to employ them reach into someone’s life, and into the lives of people with whom they’re even remotely connected.
“Uncanny Valley” is a ground-breaking exhibit of sorts, the first major U.S. exhibition “to explore the relationship between humans and intelligent machines through an artistic lens,” the de Young says. More art museums, curators, and artists are recognizing that relationship, which is why Miami Dade College’s Museum of Art and Design is now exhibiting “Forensic Architecture: True to Scale,” the first major U.S. survey of the agency’s work. And it’s why “Uncanny Valley” has such a wide spectrum of artworks, including Stephanie Dinkins’ video conversations with Bina48, a humanoid robot modeled after an African-American woman yet possessing little knowledge of African-American history; Lynn Hershman Leeson’s “Shadow Stalker,” which invites art-goers to share their email address and then watch the results of internet searches appear next to their actual outline on the same screen; and Paglen’s “They Took the Faces from the Accused and the Dead . . . (SD18),” a giant amalgamation of mugshots from a database used by the American National Standards Institute, a private U.S. nonprofit that says it’s “committed to enhancing the global competitiveness of U.S. business and quality of life.” But the institute, Paglen says, let developers test facial-recognition algorithms on its mugshot archive without the consent of those photographed, a moral crime if not a legal one, and one that situates the AI work within a long line of judgmental practices.
The excellent “Uncanny Valley” catalogue connects Paglen’s art (which whites out every set of eyes) to the practices of discredited figures like Cesare Lombroso, the 19th-century Italian criminologist who believed that physical traits could identify criminals. But some art-goers will also see a connection to even more disturbing periods of history, like Belgium’s colonization of Rwanda, which led the colonial power to impose classifications dividing Hutus from Tutsis and to issue photographs and identity cards that hardened those classifications into something deeper and cultural, helping stoke, scholars have said, the 1994 Rwandan genocide. Misused in pernicious ways, photos have the ability to instill layers of manufactured division. AI multiplies that potential perniciousness to grossly obscene levels. Oppressive governments are already using AI and facial recognition to track people’s movements and imprison people they deem dangerous.
Around the world, people are debating the use of AI and its reach into everyday life. “Uncanny Valley” is a chance to join that debate, and to become better informed as you take in art that says “being human” means being vigilant about the impact of things like personal assistants. Lampooned in the New Yorker, these assistants seem to offer greater freedoms (and even fun!) but are now the foundation of so much habit and intrusion that it’s difficult to go back to the way things used to be.
The Gray Area Foundation for the Arts specializes in projects that use art and technology to “create social and civic impact,” and its exhibit “The End of You” takes on a monumental goal: give art-goers a more intricate and intimate connection with Earth’s natural environments. Like “Uncanny Valley,” “The End of You” features art that lets visitors actively participate. Stephanie Andrews’ “An Immersive Game of Life,” for example, follows visitors as they walk on a gridded floor, their movements changing the giant simulated landscape on the wall ahead, which grows more active (with giant fungi and the like) or less active depending on the number of people present.
Each artwork inside the Gray Area’s Grand Theater isolates experiences that make you more conscious of how trees, rocks, the human footprint, or even the passage of time have a cumulative, interconnected effect on the Earth’s health. Gray Area says the exhibit takes about an hour, but even a few minutes can change your perspective if you get into the exhibit’s rhythms of sound, sight, and spectacle. If you don’t let go and bring a sense of curiosity, there’s little point, as is true of any art exhibit, but especially of one that offers scores of small insights everywhere you look.
“Uncanny Valley: Being Human in the Age of AI” — Through Oct. 25 at the de Young Museum, 50 Hagiwara Tea Garden Drive (Golden Gate Park), $12-$15, 415-750-3600, deyoung.famsf.org
“The End of You” — Through March 1 at Gray Area’s Grand Theater, 2665 Mission, $15-$25, endofyou.io