A space oddity

Trevor Paglen, who makes art about surveillance, is sending a sculpture into orbit and making images with artificial intelligence. Simon Willis visits him in Berlin to find out why

By Simon Willis

Later this year a rocket will take off from Vandenberg Air Force Base in California carrying an unusual cargo contained in a CubeSat, a miniature satellite about the size of a brick and normally used for scientific experiments. Once the rocket reaches an altitude of 575km, the brick will be jettisoned and will unfold, and a small canister of carbon dioxide will inflate the work of art inside. The piece is called “Orbital Reflector” and takes the form of a Mylar balloon about the length of a football field, the height of a person and the shape of a diamond. Once fully inflated, it will orbit the planet every 92 minutes and will be visible from Earth as a bright star streaking across the sky.

“Orbital Reflector” is the work of Trevor Paglen, an artist best known for photographing America’s surveillance state. In the years after the September 11th 2001 attacks, he began documenting the listening stations used by the NSA and the airstrips and secret prisons employed by the CIA in its rendition programme. He has also spent years on a project called “The Other Night Sky”, in which he photographs the classified spy satellites in orbit – visible in the heavens but missing from the UN’s official log of spacecraft – their passages appearing as long white scratches against the blackness of space.

As he studied the skies, Paglen began to realise quite how many of the objects in orbit are dedicated to defence and surveillance. So he set about designing a sculpture which, in its beauty and brightness, in its fabulous uselessness, would encourage people to look up into space and think about how much of its traffic is looking back at them.


Darkness visible ABOVE “Shadow” and BELOW “Octopus”, both made by AIs. MAIN IMAGE “Prototype for a Nonfunctional Satellite”, an early version of Paglen’s “Orbital Reflector”, being polished

One wet winter weekend I visited Paglen at his studio in an early-20th-century apartment building in Berlin. Hanging from the high ceiling in front of the window was a small version of the balloon, like a silver ghost. And on the wall were diagrams of discarded prototypes for “Orbital Reflector” – one a floppy disc like a pancake, another a cloud of gas ejected like the tail of a comet, a third a series of orbs like giant bubbles – which have been exhibited at the Nevada Museum of Art, one of the funders of the project. “The comet was not super-viable,” said Paglen, who has a short blond beard and a shaved head. “Having energy bottled up on the launch platform, with explosive potential, is generally frowned upon.”

But the studio, where Paglen has worked since 2015, is dedicated to another project besides “Orbital Reflector”. Arrayed on desks around the edge of the room are banks of computers. During the week they are manned by software developers helping Paglen explore a different form of surveillance which, in its ubiquity and power, is both more ordinary and, in his view, more nefarious. He has been learning how machines see.

“This is an octopus,” he said, hunched over his laptop in a grey T-shirt, black jeans and biker boots. It didn’t look much like an octopus to me. The image on the screen was a gloomy composition in dark greens and reds with a pale, gaseous mass at its centre, which resembled a faceless spectre on a misty night more than a cephalopod. But how it appeared to human eyes wasn’t the point. More interesting was how it appeared to the artificial intelligence (AI) that made it.


Shepherd’s delight? “STSS-1 and Two Unidentified Spacecraft over Carson City” (2010), from Paglen’s series “The Other Night Sky”

As image-recognition technology becomes ever more pervasive, the ethics surrounding its use become murkier. Not only is it being used to enable cars to drive themselves and to help manufacturers spot defective products rolling out of factories, it is also peering into our personal lives. There are now AIs that can tell with remarkable accuracy whether someone is gay simply by looking at their picture, that can read your lips as you walk down the street, that can interpret the emotions flickering across your face. Coca-Cola uses AIs to identify pictures on social media of people who are using rival products, and then targets them with advertising. Paglen is not alone in thinking that, as well as doing useful things, AI represents a threat. Michal Kosinski, the Stanford computer scientist who developed the AI with gaydar, says it constitutes “the end of privacy”. Paglen’s project, called “Invisible Images”, is an exploration of its inner workings and ethical implications.

This is where the octopus comes in. An AI learns to recognise images incrementally. First it has to be shown millions or even billions of pictures, tagged as depicting a certain kind of thing – be they forks, cats or fish. As it sees more and more, it begins to figure out what these images have in common. Eventually it produces a kind of template, its idea of what images of a certain kind look like, by which it judges whether or not a new picture shows the object in question. Curious about what machines are looking at when they’re looking at the world, Paglen began training AIs himself.
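In software terms, this is standard supervised learning. A minimal sketch of the process in PyTorch – not Paglen’s actual code, and with the dataset path, model choice and training settings as illustrative assumptions – might look like this:

```python
# A minimal sketch of supervised image-recognition training (PyTorch).
# Assumptions: labelled images live at "data/train", one class per
# subdirectory (e.g. octopus/, fork/, cat/). Not Paglen's actual code.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Each image is tagged by its folder name; the network sees many examples
# of each tag and gradually distils a template for the class.
train_set = datasets.ImageFolder("data/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(num_classes=len(train_set.classes))
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for images, labels in loader:
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)  # how far off the guesses are
        loss.backward()                        # nudge the template toward the tags
        optimiser.step()
```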

He presented his pupils with a strange syllabus. He trained one to see what he called “omens and portents” – things like comets and black cats and bolts of lightning – and another to recognise items from Freud’s “The Interpretation of Dreams”, including false teeth and puffy faces. A third he taught to identify sea creatures. Whatever they looked at, this is what they would look for. Once the AIs had become adept at recognising, say, octopuses, he would ask them to produce their idea of one. He was a tough task-master. They passed this exam only if what they drew fooled another AI.
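That pass-or-fail exam is the defining move of a generative adversarial network, or GAN: one network draws, a second judges, and the first improves only by deceiving the second. The article doesn’t name Paglen’s architecture, but assuming a GAN-style setup, the adversarial loop can be sketched as follows (the network sizes are toy placeholders):

```python
# A stripped-down GAN training loop (PyTorch): the "pupil" (generator)
# passes only when its drawings fool the "examiner" (discriminator).
# Purely illustrative; the article doesn't specify Paglen's architecture.
import torch
import torch.nn as nn

latent_dim, img_dim = 100, 64 * 64

generator = nn.Sequential(          # noise in, fake image out
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)
discriminator = nn.Sequential(      # image in, probability it is real out
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

def train_step(real_images):
    """One round of the exam. real_images: (batch, img_dim) tensor."""
    batch = real_images.size(0)
    real, fake = torch.ones(batch, 1), torch.zeros(batch, 1)

    # The examiner learns to tell real octopus pictures from the fakes.
    d_opt.zero_grad()
    fakes = generator(torch.randn(batch, latent_dim))
    d_loss = (bce(discriminator(real_images), real)
              + bce(discriminator(fakes.detach()), fake))
    d_loss.backward()
    d_opt.step()

    # The pupil is rewarded only when the examiner mistakes fakes for real.
    g_opt.zero_grad()
    g_loss = bce(discriminator(fakes), real)
    g_loss.backward()
    g_opt.step()
```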

The results delighted Paglen. In addition to the octopus that looked like a ghost, there were deformed and barely recognisable body parts that could have come from the gory paintings of Francis Bacon, and incomprehensible forms surrounded by fields of sickening colour, as in a canvas by an abstract expressionist. As well as being arresting in their own right, they reflect Paglen’s biggest worry about the rise of artificial intelligence: if AIs can be trained to recognise and respond to the world in any way their developers choose, we should be concerned about who is doing the training and where the machines’ gaze is being turned. “I wanted to make it irrational,” he says, “so that the technology was producing a commentary on itself rather than just a demonstration of itself.”

Looking at these gothic visions, it is obvious what he thinks of the prospect of governments and companies having the power to know someone’s sexual desires more intimately than their friends, or to monitor people because a computer thinks they look suspicious. The octopus feels like a central image in the series about a technology which, according to Paglen, “has its tentacles deeply interwoven into our everyday lives”. Just to drive home his point, he trained an AI to see what he calls “American predators”: carnivorous plants, drones and Mark Zuckerberg.


Open secret Paglen’s photograph of a classified listening station in West Virginia

Paglen was born in 1974 at Andrews Air Force Base near Washington, DC, where his father worked as an ophthalmologist. The family moved with the job, and eventually settled at Wiesbaden Army Air Field, a base in Germany. One of the things that struck Paglen when he was growing up was the pervasiveness of American power. “You realise that there’s a geography to it that you wouldn’t know unless you were a part of it. You are conscious of there being bases in Korea, in the Middle East. It’s part of your conception of the world, this ring of bases.”

In many ways he had a conventional reaction to the strictures of a military boyhood: he embraced the counterculture. In the late 1990s Paglen enrolled at the University of California, Berkeley, where he played in a thrash-metal band called Noisegate. After a stint at art school in Chicago he went back to California to do a PhD in geography. There he found a set of photographs, taken by the US Geological Survey, from which chunks had been redacted. Intrigued by what lay behind these blank spots, he began making trips to places like the Nevada desert, home to Area 51, where the US Air Force develops and tests classified aircraft.

In the desert he began photographing the secret bases and hangars, which were identifiable simply because there were lights and buildings where there shouldn’t have been. These being restricted zones, proximity was a problem. Having tried and failed to use conventional cameras, he adopted equipment designed for photographing space. Setting himself up on high points overlooking his subject – sometimes as much as 50 miles away – he’d shoot these secret places through the heat haze rising off the desert floor. His pictures combined the urgency of a political pamphlet with the cloudy gorgeousness of an impressionist painting. From long range, the bunkers and cabins appeared to melt or vaporise. As well as documenting their existence, his images illustrated the desire to keep them hidden.

It was Edward Snowden who put him on the path towards AI. Among the people who helped Snowden in his publication of classified material from the NSA was Laura Poitras, a film-maker and a friend of Paglen. When Poitras made a documentary about the Snowden affair, “Citizenfour”, which went on to win an Oscar, Paglen worked for her as a cinematographer. While they were producing the film, Paglen became interested in the physical manifestations of the virtual world: the server farms and undersea cables that store and carry information, and which the NSA tapped to obtain data. Gradually he came to see the NSA as a minnow in the world of personal information. The bigger fish were companies like Google and Facebook.

Around this time Paglen moved to Berlin, where Poitras lived. When she left he took her apartment and turned it into his studio. Drawn to the city by “its wonderful anarchist hacking scene”, he found developers, including members of an association called the Chaos Computer Club, who could help him experiment with AI.

One evening in January 2017, a crowd gathered on San Francisco’s waterfront to watch the Kronos Quartet, an experimental group of string players whose programme for the evening ran from Bach to the blues by way of African folk. As enticing as the repertoire was, the audience wasn’t there for the music. Above the stage was a screen and as the players bowed and plucked it showed a video of the concert as seen by AI.

The performance was part of another strand of “Invisible Images”. Paglen and his team built a program that incorporated image-recognition algorithms used by tech companies and created visual representations of what those algorithms were doing in real time. He then fed images and video into the program so that he could look at the world through its eyes.
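The article doesn’t detail the program’s internals, but the effect it describes – boxes and read-outs drawn over live video – can be approximated with off-the-shelf tools. In the sketch below, the face detector is a stock OpenCV cascade, while `emotion_model` and the input file `concert.mp4` are hypothetical stand-ins for the commercial classifiers and footage Paglen’s team used:

```python
# A rough sketch of a "seen by AI" video overlay using OpenCV.
# `emotion_model` and "concert.mp4" are hypothetical stand-ins; the
# real system wrapped proprietary image-recognition algorithms.
import cv2

face_finder = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def annotate(frame, emotion_model):
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_finder.detectMultiScale(grey, 1.3, 5):
        # Classify the cropped face; returns e.g. ("fearful", 0.48).
        label, score = emotion_model(frame[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, f"{label}: {score:.0%}", (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return frame

cap = cv2.VideoCapture("concert.mp4")  # assumed input footage
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Dummy classifier standing in for the real model.
    shown = annotate(frame, emotion_model=lambda face: ("fearful", 0.48))
    cv2.imshow("as seen by AI", shown)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```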

At one point the face of Sunny Yang, the quartet’s cellist, appeared covered in multicoloured circles. They were monitoring changes in her expression and using them to infer her mood. A read-out below said, “Sunny is 48% fearful”. But the system was also assessing the players’ gender, and it produced some wonky results. While John Sherba, a bald, tubby violinist, was judged to be “99.01% male”, Hank Dutt, a middle-aged viola player with thinning grey hair and sharp cheekbones, was “68.01% female”.

The AI had fixed ideas about what gender looked like, and Dutt did not conform. This goes to the heart of Paglen’s project. In our online lives we give data to companies who then use it to draw conclusions about who we are, what we like and how we behave. And yet the conclusions drawn by AI depend on the ideas it is taught and will reflect the biases of the person who created it. “The bigger philosophical question”, Paglen says, “is who decides what things mean? Who is 100% female? Is it Barbie or is it Grace Jones?”

Images: courtesy of the artist, Metro Pictures, New York, and the Nevada Museum of Art
