Late on an afternoon in April, Lisa Park stood in the middle of a cavernous room at Mana Contemporary, a factory-turned-gallery in Jersey City, New Jersey, holding hands with her intern. Beneath their feet, three pothole-sized metal plates were nestled into a patch of fake grass, with wires running from them to a series of sensors that measure electricity. In front of the women, a 19-by-12-foot, semi-translucent screen stretched across the room, shielding the tangle of wires, computers, and lights hidden behind it.
Park, a multimedia artist known for turning brainwaves and heartbeats into performance art, gripped the woman’s hand, and in tandem, they glanced up at the screen where a 3-D rendering of a leafless cherry blossom tree glowed in the dark. “It’s supposed to bloom,” Park said with a hint of frustration. The two women held each other tighter and waited. Nothing happened.
“You have to take off your shoes,” a voice called from the back of the room.
“Oh, right,” Park said with a laugh.
Park and her intern let go of each other and removed their boots. Barefooted, they stepped back onto the plates and wrapped each other in a stilted embrace. Within seconds, the computers in the back of the room registered the slight uptick in electrical conductivity between the two women, and light pink flowers began to fill the barren tree branches before falling off and tumbling through the digital air to the ground.
A few months before the demo, Park was sitting in her studio at Nokia Bell Labs, the famed New Jersey research park, showing off a small prototype of the tree on her laptop screen. Pinned to the wall were pictures of people holding hands and spooning. Her work table was filled with gel patch sensors and wires. Park describes her piece, called “Blooming,” as a meditation on the future of inter-human connection. “The idea is that the cherry blossom will bloom based on your emotional connection with another person,” she said, clicking through slides on her computer. “Conductivity, capacitance, resistance— they’re all proxies for intimacy.”
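The mapping Park describes, from raw electrical readings to a visual of blooming, can be sketched in a few lines. Here is a minimal, hypothetical version (the sensor range, units, and function name are assumptions for illustration, not the installation's actual code) that normalizes a skin-conductance reading into a bloom level between 0 and 1:

```python
def bloom_intensity(conductance_us, baseline_us=2.0, full_bloom_us=10.0):
    """Map a skin-conductance reading (in microsiemens) to a bloom level.

    Below the baseline the tree stays bare (0.0); at full_bloom_us and
    above, the branches fill completely (1.0). Values in between scale
    linearly, a stand-in for "strength of connection."
    """
    level = (conductance_us - baseline_us) / (full_bloom_us - baseline_us)
    return max(0.0, min(1.0, level))
```

In the piece itself, readings like this would presumably be smoothed over time before driving the animation; the thresholds here are purely illustrative.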
For the last year, Park, along with the artist Sougwen Chung and dancers Jason Oremus and Garrett Coleman of the dance collective Hammerstep, have been working out of Bell Labs as part of a residency called Experiments in Art and Technology. The year-long residency, a collaboration between Bell Labs and the New Museum’s incubator, New Inc, culminated in “Only Human,” a recently-opened exhibition at Mana where the artists’ pieces will be on display through the end of May.
“Only Human” is a homecoming of sorts for Bell Labs, which has a rich history of collaborating with artists that dates back to the 1960s. It was then that the first iteration of E.A.T. was started by Bell Labs engineers Billy Kluver and Fred Waldhauer and the artists Robert Rauschenberg and Robert Whitman. The E.A.T. of the ‘60s was famous for a performance called “9 Evenings,” in which the artists and Bell Labs engineers crafted a series of techno-artistic experiments that defined interactive artwork for decades to come. At the time, E.A.T. was a foreign concept—artists, for the most part, didn’t use technology. And technologists certainly didn’t make a habit of sharing their work with artists.
Fifty years later, E.A.T. signifies something different for the artists and the researchers involved. Technology is no longer a novelty—it’s a given. And artists, who might have in the past approached technological advancement with a hint of idealistic curiosity, now question the impact it’s had on the way humans interact with one another.
This tension is ripe territory for artists, who are often more interested in creating provocations around technology than they are in building practical applications. The rebirth of E.A.T. is a chance for them to explore big questions (How can we make technology more human? Should we make technology more human?) alongside Bell Labs engineers—the very people who are building the networks, cameras, and cables the artists use in their works, says Julia Kaganskiy, director of New Inc. “My hope is that we don’t fetishize the technology,” Kaganskiy said during a recent visit to Bell Labs. “We’re really trying to understand how it’s shaping culture and shaping our understanding of ourselves and our relationships to other people.”
Understanding that question—how do we, as humans, best communicate with one another?—is what Bell Labs refers to as a “BHAG”: a big, hairy, audacious goal. Over the years, Bell Labs has approached the challenge with the scientific rigor one might expect from a company made up of 1,000 specialized engineers with PhDs. Foundational research is key to Bell Labs’ identity and business, and its engineers have tackled plenty of nerdy, technical research problems in pursuit of improving communication technology.
Around a year-and-a-half ago, though, Bell Labs’ corporate perspective began to subtly shift. Tensions around the 2016 election uncovered deep communication chasms between groups of people that had been there all along. It became clear that while reducing network latency and improving camera resolution might make it easier for people to talk to each other, it did very little to help them understand what the other was actually thinking.
For Bell Labs’ president, Marcus Weldon, this realization was a turning point in the way he thought about the company’s research. Bell Labs would continue to do the foundational research that pays the bills, but there were also bigger, more pressing issues to think about beyond improving fiber optic cables. Weldon decided that Bell Labs’ next BHAG would focus on developing something called “empathic communication,” a phrase he uses to describe an aspirational state of communication where people can connect on a deeper, more meaningful level. For Weldon, and therefore for the rest of Bell Labs, that means moving past basic audio, video, and text into the realm of technology that captures—and transfers—feeling.
“We’re very interested in the idea of, what’s the right coding of humans that allows us to transfer it over a distance?” Weldon asked one day last fall while pacing in front of a wall-sized screen at Bell Labs headquarters. Behind him were projected images of Star Trek’s Replicator, Holodeck, and Transporter. Weldon views Star Trek as an apt, if obvious, analogy to today’s state of technology. “Conceptually, they got most of these futuristic things roughly right,” he said.
Weldon explained that technologists have already solved for two of these three Star Trek technologies. The Holodeck is more or less augmented and virtual reality, and the Replicator is basically an advanced 3-D printer. The Transporter, a teleportation machine that dematerializes a human into an energy pattern and rematerializes them in a new location, is still an elusive concept for scientists. “It’s foolish to think you can actually dissolve a human and recreate that human without any entropic error,” he said, mulling over the Transporter’s scientific plausibility. “It’s not the right way to solve the problem.”
Easier, he figured, is transferring an impression of a person—an emotion or the essence of what a human is thinking and feeling, but not the person itself. “What we’re trying to do is capture the subtle stuff you can pick up on when you really know a person,” he said. “Things like mood.”
Humans have, of course, done exactly this for thousands of years. First through voice and then through the written word. Today, video and emoji help round out the emotional contours of our digital conversations, but Weldon argues even those modes of communication have done little to convey what the person on the other end of the line is really feeling.
This lack of true emotional connection, he believes, is at the core of our current political climate. It’s the root of misunderstandings, disagreements, lost love and fractured friendships. “We’ve become isolated in little silos of existence; we have no understanding of what it’s like to be other people,” he said. “What’s lacking is state transfer between individuals so you can actually feel how they feel.”
There’s one problem, though: State transfer is incredibly complicated, and not just for technological reasons. Telling someone you’re sorry for their loss or that you’re in love with them is relatively easy. Making them feel your heart sink or your pulse quicken requires more than a battery of smart sensors—it requires a way to meaningfully translate that biometric data into something that another person can intuit. “The big question is, how do you accurately measure the body?” asked Domhnaill Hernon, head of Bell Labs’ new Experiments in Arts and Technology research lab. “And how do you express that in a really compelling way?”
Bell Labs is good at the former, but not so much at the latter. That’s where the artists come in.
Earlier this spring, Hernon was leading a group of artists and engineers through a maze of hallways at Bell Labs’ campus as he explained the importance of E.A.T. “Trained scientists have a very different approach in their thinking,” he said. “We’re reductionist in our thinking—artists are divergent. Bringing together those two modes can be very powerful.”
For Bell Labs, the clichéd left brain-right brain gap is at the center of the company’s investment in resuscitating E.A.T. Hernon believes that pairing artists with engineers can help Bell Labs’ engineers, who’ve traditionally taken a hard-nosed academic approach to their research, start to think about their work with a hint of creativity. “You say to us, ‘I want to think about the ways fiber optic cables can enable us to better communicate,’ and we’ll tell you, ‘I’ll give you a world-record speed in zeros and ones,’” he said. “We can’t answer the right questions in isolation.”
At the beginning of the E.A.T. residency, the artists participated in a form of scientific speed dating, where they met with a handful of Bell Labs researchers to figure out what technologies they might want to use in their projects. The idea was to create mutually beneficial partnerships—engineers would lend their technical expertise; artists would lend their unconventional ideation process.
Each project required something different from Bell Labs’ technology. For the “Blooming” piece, Park wanted to build an artistic representation of biometric data pulled from an array of sensors that read the nuances of motion, heart rate, and capacitance. Sougwen Chung, whose performance art piece “Omnia per Omnia” uses motion vectors from New York City surveillance footage to control a small army of painting robots, was interested in using Bell Labs’ Motion Engine to program the bots’ brush strokes. And Hammerstep’s project, an interactive dance performance called “Indigo Grey: The Micah Grey Experiment,” required motion-tracking technology that would allow people to wirelessly control drones through simple gestures. “It was important that the engineers [we work with] be open-minded,” said Jason Oremus of Hammerstep. “That there was a leniency with the technology they were developing.”
All three projects are wildly different in their final form, though they’re similar in the way they transform hard data into something more poetic. For Chung, uncovering the nuances between human and robot interaction is at the crux of the residency. “I thought it would be really interesting to extract this data and turn it into painterly gestures that robots would articulate,” she said while standing with her fellow artists and a handful of engineers in a hallway at Building Two, a ‘70s-era office complex where most of Bell Labs’ video analytics research happens.
On the wall, cheap cameras sat atop a monitor like a row of ducks, measuring the environment through motion, frequencies, and depth, and displaying the data in real time on the screen. Larry O’Gorman, a research fellow at Bell Labs who developed the Motion Engine technology Chung and Hammerstep use in their pieces, pointed to a camera and explained how, at that very moment, software was looking at the small group of people as a faceless whole, analyzing macro gestures like dwell, density, and direction and translating that data into the squiggly lines displayed on screen.
This was the same basic algorithm Chung would use to power her robots and that Hammerstep would use to track audience members during their performance. “Sougwen came to me and told me what she wanted to do, and I wrote an equation on the board,” he explained.
R(t) = A(t) * P(t) * E(x,t)
“The way that we wrote the equation was to say: Sougwen, the artist at time T, is convolved with the audience at time T, and then the environment. That put her work into mathematical terms that I can understand.”
“It’s been a real process of translation,” Chung said.
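O’Gorman’s formula reads naturally as a chain of convolutions: the artist’s signal combined with the audience’s, then with the environment’s. As a toy illustration (the sample values below are invented, and the actual Motion Engine is far more involved), discrete convolution looks like this:

```python
def convolve(f, g):
    """Discrete linear convolution of two sampled signals."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out

# Invented gesture-intensity samples standing in for A(t), P(t), E(x,t).
artist = [0.0, 1.0, 0.5]
audience = [0.2, 0.8]
environment = [1.0, 0.5]

# R(t) = A(t) * P(t) * E(x,t), reading * as convolution.
response = convolve(convolve(artist, audience), environment)
```

Each point of the output blends nearby moments of all three inputs, which is one way to read O’Gorman’s claim that the artist “is convolved with the audience … and then the environment.”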
Later that morning, as the artists and engineers walked through Building Two’s corridor on the way to Bell Labs’ Emerging Materials, Components and Devices lab, they passed a small room outfitted like a drab family den with a monitor, lamp, and chair. “There are sensors in here that can tell you whether you had a donut for breakfast or not,” said Paul Wilford, a research director at Bell Labs in charge of the video analytics group. Wilford was half-joking; today, Bell Labs’ wireless technology can’t tell exactly what you had for breakfast, but through analyzing data from multiple sources, it can tell that you ate something and that you’re happy (or sad) about it.
“We just got this working yesterday.” He pointed to a small, cheap camera attached to the wall. “We can measure [your heart rate] through slight color changes in two parts of your cheek and one in your forehead,” he explained. “Based on that lousy little camera—and lots of algorithms and lots of filtering and lots of network stuff.”
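The technique Wilford describes is known in the research literature as remote photoplethysmography: tiny, periodic color fluctuations in the skin track the pulse. A heavily simplified sketch gives the flavor; real systems, as Wilford notes, lean on heavy filtering across multiple skin regions, while this toy version just counts zero crossings of a brightness signal:

```python
import math

def estimate_bpm(brightness, fps):
    """Estimate pulse rate from per-frame mean brightness of a skin patch.

    Removes the mean, then counts zero crossings; each full heartbeat
    cycle produces two crossings.
    """
    mean = sum(brightness) / len(brightness)
    centered = [b - mean for b in brightness]
    crossings = sum(
        1 for a, b in zip(centered, centered[1:]) if a <= 0 < b or b <= 0 < a
    )
    duration_s = len(brightness) / fps
    return (crossings / 2) / duration_s * 60

# Synthetic 10-second clip at 30 fps with a 1.3 Hz (78 bpm) "pulse."
fps = 30
signal = [math.cos(2 * math.pi * 1.3 * t / fps) for t in range(10 * fps)]
```

On a noisy real-world signal from a “lousy little camera,” this naive crossing count would fail immediately; bandpass filtering around plausible heart rates is what makes the approach workable.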
Bell Labs believes the ability to “sensorlessly sense” a human through network technology is key to its goal of empathic communication. Once technology is disaggregated from our phones, once it’s subtly pervasive, everywhere but nowhere, we’ll have the infrastructure to really understand what’s happening both in the environment and within ourselves.
This vision becomes a little clearer inside the Emerging Materials lab. The bright room is filled with colorful wires and optical cameras that can peer into the skin at micro-resolution. The day the artists visited, the team was in the middle of developing a prototype of something called The Sleeve, a stretchy piece of fabric embedded with sensors, wires, and haptic motors that can be slipped onto the forearm like an arm warmer.
Hernon led the group of artists into the lab, where Sanjay Patel, the VP of research in the Emerging Materials lab, was standing next to a pedestal covered with a piece of blue fabric. With a flourish, Patel pulled off the fabric and revealed an early prototype of The Sleeve. The tangle of wires, sensors, and screen adhered to a blue 3-D printed arm, making it look half-human, half-robot.
“I think we’re on the verge of a revolution in terms of new devices,” Patel said, gesturing to the arm. In the future, he explained, people will no longer rely on their phones for everything. Instead, we’ll interact with the environment and other people through a series of discrete devices, some worn, some embedded in the world around us. “How do I control my world today?” Patel asked. “I pull out my phone, look at an app, and push some buttons on my screen. What we’d like to do going forward, as we instrument the world, is allow us to have a sixth sense about our surroundings.”
The Sleeve, which is far from production-ready, is able to read biomarkers like heart rate, blood sugar, and stress levels through an optical tomography sensor that peers into the skin. Though Patel and his group view The Sleeve as more a provocation than anything, they also believe it’s a step toward a future where we’re untethered from our little bricks of glass and metal. It’s a future where people will control their environments through gestures and communicate with loved ones through haptic messages that are bolstered by the equivalent of emotional temperature readings.
Patel’s remarks about a “sixth sense” felt familiar a few weeks later, when the artists found themselves back at Mana preparing for the opening of “Only Human,” the culmination of the E.A.T. residency. Traces of Bell Labs technologies were evident at the gallery, though they were masked by the gloss and abstraction of art.
While artist-types wandered around the gallery, mingling with engineers and curiously taking in the works, the questions the artists started out with still loomed: What happens when humans communicate through touch instead of words? Can you imbue robotics with a humanistic sense of collaboration? Is it possible to transfer empathy through music, rhythm, and technology? These are 10-year questions—the artistic equivalent of a BHAG—and they weren’t going to be answered in a single afternoon or even through a year-long residency.
At one point during the opening, a group of people gathered around the patch of fake grass in front of Park’s glowing tree. An older couple removed their shoes and stepped onto the plates. “Let’s see what 37 years of marriage looks like,” the man said, gripping his wife’s hand tightly. A second passed, then another. The tree sprouted its pinkish-white flowers, and then they tumbled to the ground.