Computer scientist Pedro Lopes integrates technology with anatomy to reimagine the role of “human” in human-computer interaction.
The sparse room is almost blindingly bright: white floors, white walls, white exposed ductwork, and a primary-yellow door. Articulated fume extraction pipes (think robot arms with suction cup hands) extend from the ceiling and a Styrofoam human head sits on the bench. Above the white noise of fans and beeping electronics plays the otherworldly whine/hum/squeal of a theremin—the classic B-movie UFO sound effect. If Stanley Kubrick and Ed Wood had teamed up to imagine the year 2019, it might have looked something like Pedro Lopes’s lab in the recently renovated John Crerar Library.
In the center of this futuristic space stands Marco Kaisth, Class of 2021, himself a study in retro styling in a Western yoke shirt, plaid cropped pants, and mod boots. Kaisth waves his arm through the air, his fingers fluttering as if playing invisible strings—and the music fluctuates. That’s how you play a theremin, moving your body through the instrument’s electromagnetic field to produce sound. But it’s what you can’t see that’s most interesting: the theremin, in a sense, is playing Marco. He can feel the music. His fingers are moving involuntarily.
The details of this work aren’t yet published and must remain under wraps for now, but the crux of the project represents a theme in Lopes’s research: reversing the flow of control within human-computer interaction and reimagining the relationship between humans and technology. In his office one floor up, relocated from the lab to give his students “acoustic space,” Lopes writes H-C-I vertically on a scrap of paper, like a ladder with the H on top. “In HCI, a human controls the computer.” What if you put the computer on top? And how would that affect our sense of control? (Some may dread a Matrix-like future, but Lopes qualifies that the computer must still be programmed by a human. Whether that’s better than being controlled by artificial intelligence is debatable.)
The field of human-computer interaction (HCI) is difficult to define, even for experts, except in the most circular way: it’s how humans interact with computers. Before Lopes joined the computer science department as an assistant professor in January, UChicago had no professors dedicated to HCI, though Neubauer Family Assistant Professor Blase Ur and Neubauer Professors Ben Zhao and Heather Zheng all count HCI projects among their work.
Ur, who takes a human-centered approach to studying computer security and privacy, says the field looks at how to design computer systems to better meet users’ needs and habits. Researchers adopt techniques from sociology and psychology to understand how people use computers and other technology in practice, which is often distinct from what the designers intended.
Zhao and Zheng, a married team who operate a joint lab, study human behavior as illuminated by data. Zhao conducts empirical research with a slant toward security, trying to model and predict human behavior, while Zheng focuses on mobile computing and sensor data, gathering information from wearables to better understand the behavior of, for instance, patients and caregivers, which could help make health care more efficient and predictable.
In this vein, some of Lopes’s work follows the conventional HCI venture of developing technology to record, facilitate, or enhance human action. Wearable tech—a booming subset of HCI research—features prominently in Lopes’s lab, including stretchy electronics, ideal for biometric sensors of the type Zheng might use to gather hospital data.
Take, for example, visiting PhD student Steven Nagels, who brought his work on stretchable circuitry when he joined the Lopes Lab in January. The material combines comfort and flexibility for a variety of wearable applications. (Nagels, one of several lab members whose expertise lies outside of computer science, specializes in electrical engineering and materials science. Lopes intentionally assembled his team to include diverse specialties to best suit the many facets of HCI.)
In a lab filled with computers and electronic components, Nagels pulls out a floppy square of silicone with an embedded circuit. He alligator-clips the circuit to a power source, and a green LED lights up inside the patch. He then twists, folds, and stretches the square, and it returns to its original shape, still illuminated. A normal circuit board has rigid electrical connections; such treatment would break the wires. Nagels solved this problem by creating chambers within the silicone and pouring in metal that remains liquid at room temperature. A constant liquid state makes for flexible conductivity.
“Such resilience means engineers could put electronics in places normal circuit boards can’t go,” says Nagels. That durability makes the technology especially promising for wearables, particularly those meant for skin contact. Stretchy electronics could sit on the body for extended periods without hindering your movement. They could be incorporated into any kind of material, even a sweatshirt that can be easily removed. (Laundry involves water, heat, and mechanical force, all enemies of electronics; still, Lopes and Nagels, who has since returned to his home institution, Hasselt University in Belgium, seem confident that this material will eventually be washable.)
Wearable tech is a broad field: it includes the ubiquitous fitness tracker, as well as experimental equipment that scrambles surveillance, like the microphone-jamming bracelet that Lopes, Zheng, and Zhao are developing. But what if the technology (skin-adhered electrodes, for instance) were used to deliver information instead of gathering it, to control the human rather than the human’s environment? This is how some of Lopes’s work differs from that of his HCI colleagues.
Beginning this line of inquiry as a graduate student at the University of Potsdam in Germany, Lopes looked at the ways technology interfaces with anatomy. For biosensors, the human body is the input, transmitting information to a computer. Perhaps he could make the body the output, reacting to computer processing.
In the realm of movement, which is of particular interest to Lopes, there are a couple of ways to flip the flow: devices based on mechanical actuation and those that use electrical muscle stimulation (EMS). Mechanical actuation for wearable technology involves motorized equipment, like robotic exoskeletons—your body is simply along for the ride. In contrast, EMS sends signals to your muscles, making your body a computer peripheral device. You, the human, are the printer, the monitor, the speaker, receiving information and acting on it. This is the technology Kaisth’s theremin project relies on.
Electrical muscle stimulation uses a pair of electrodes stuck to the skin to deliver small jolts of electricity to muscles, making them involuntarily contract. (The concept traces back to 1791, when Luigi Galvani discovered that electric current made an inanimate frog leg twitch, partially inspiring Mary Shelley’s Frankenstein.)
“When a burst of electricity enters the first electrode, it wants to come out of the second,” Lopes explains. “As it travels through the medium that is your body, it crosses your skin, and most people say it tingles because it’s a form of vibration.” The electrical impulses then encounter nerve cells connected to muscles, or the muscles themselves. When the muscles feel current, “they do what they always do, which is to contract.”
Lopes first started experimenting with EMS at the University of Potsdam’s Hasso Plattner Institute, exploring eyes-free interaction for wearables. He focused on proprioception—the ability to sense the orientation and movement of your body within its environment. (Close your eyes, stretch out your arms, and then touch your index finger to your nose. That ability comes from proprioception.)
One of his applications dealt with gesture recognition: Lopes wondered if your smartphone, connected to you by electrodes, could deliver information by moving your body. Perhaps the phone compels a person’s finger to move in the shape of a “5,” as an alert for five unread messages, or in a heart shape to indicate a message from a close friend, he explains on his website.
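A minimal sketch of how such a scheme might be organized, purely as illustration: the EMSDriver class and the channel sequences that trace each shape are invented here, not taken from Lopes’s system.

```python
import time

# Hypothetical driver: each "channel" is one electrode pair placed over a
# forearm muscle that flexes or extends a finger in one direction.
class EMSDriver:
    def pulse(self, channel: int, seconds: float) -> None:
        # Real hardware would gate a calibrated, current-limited signal
        # here; this stand-in just logs the action.
        print(f"stimulate channel {channel} for {seconds:.1f}s")
        time.sleep(seconds)

# Invented mapping: ordered (channel, duration) pairs whose combined
# contractions trace a rough stroke pattern with the finger.
GESTURES = {
    "five_unread": [(0, 0.3), (1, 0.2), (0, 0.3), (1, 0.2)],
    "heart_from_close_friend": [(2, 0.4), (3, 0.4), (2, 0.4)],
}

def notify(driver: EMSDriver, event: str) -> None:
    for channel, duration in GESTURES[event]:
        driver.pulse(channel, duration)

notify(EMSDriver(), "five_unread")
```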
Lopes also realized that EMS could be used for more than notification; it could be used to train or teach the muscles directly, similar to its use in physical therapy for rehabilitating muscle function. For instance, a wholly unfamiliar object could itself teach a person how to use it.
To demonstrate the concept, Lopes developed a prototype that pairs an EMS-wired device worn on the hand and forearm with a multicamera setup that tracks how close the user’s hand comes to an object. A video shows people wearing the contraption trying to pick up a white cube. When one woman nearly grasps the cube, her fingers jerk away. “It doesn’t want to be grabbed,” she laughs. The cube had been programmed to “tell” the user it did not want to be held by stimulating her muscles to pull back. Other participants try to spray paint a design on cardboard, but the spray can “instructs” them to shake it first.
The participants are then given a specialty tool, a magnetic nail sweeper, whose function is not intuitive. The wearable device instructs them how to grasp the tool and gather scattered nails and then how to pull a release bar to drop them into a bin. “In a wild future, you could wire up and learn by doing rather than reading an instruction manual,” says Lopes. He notes that the same could be achieved with an exoskeleton, but “I wanted to see if we could do it without being so instrumented,” with a more seamless human-technology interface.
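The control logic behind these demos can be stated compactly. Below is a minimal sketch, not the lab’s actual code: the tracker feed and the calibrated ems_actuate() pulse are stubbed-out assumptions.

```python
import math
import time

CUBE = (0.0, 0.0, 0.0)   # the object's location in tracker coordinates (meters)
TRIGGER_RADIUS = 0.10    # how close a hand may get before the cube "objects"

def get_hand_position(t: float) -> tuple[float, float, float]:
    # Stand-in for the multicamera tracker: simulate a hand approaching
    # the cube along one axis at 25 cm per second.
    return (0.5 - 0.25 * t, 0.0, 0.0)

def ems_actuate(pattern: str) -> None:
    # Stand-in for a calibrated, current-limited EMS pulse.
    print(f"EMS: {pattern}")

start = time.monotonic()
while True:
    t = time.monotonic() - start
    # Once the hand closes within the trigger radius, stimulate the
    # extensors so the fingers open and the hand jerks away.
    if math.dist(get_hand_position(t), CUBE) < TRIGGER_RADIUS:
        ems_actuate("open_hand_and_retract")
        break
    time.sleep(1 / 120)  # poll at roughly the tracker's frame rate
```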
The learning could skip the brain entirely and go straight to muscle memory, but whether your body actually retains the lesson remains to be seen. Lopes is collaborating with Shinya Fujii, a neuroscientist at Japan’s Keio University, to investigate whether EMS could build drumming skills. (Fujii is a drummer, while Lopes is a turntablist, playing turntables as if they were full-fledged instruments.) After training beginner drummers through “passive action,” or electrical stimulation of the arm muscles, they found that the drummers did improve, but not much more than if they had simply spent the time practicing.
What was notable was the striking improvement in the nondominant hand and the speeds the drummers were able to reach while wired up. “We’re making people drum as fast as the world’s fastest drummer,” says Lopes. Maybe such conditioning will lead to unfair competition advantages—electrical doping of sorts. “Shinya said that if this changes the rules, he’ll be happy. And I agree.”
Lopes emphasizes that he and his collaborators use medically compliant devices in their research that are carefully calibrated for human use and have fail-safes to prevent injury. Still, a small amount of current can be dangerous. “You can kill yourself with a 9-volt battery,” he says, so any projects that involve human subjects go through an institutional review board. Lopes also believes in the value of open source; in 2015 he and a colleague in Hannover, Max Pfeiffer, designed a device that safely leverages standard EMS generators and made the source code and schematics public to help researchers begin exploring EMS.
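Their published code and schematics are the definitive reference. Purely as a hedged illustration of the general pattern (a computer gating a consumer EMS generator rather than producing current itself), host-side control might look something like this; the port name and serial protocol here are invented, not the published ones.

```python
import serial  # pyserial

# The control board sits between the computer and an off-the-shelf EMS
# generator, so the current source remains a certified medical device;
# the computer only gates and attenuates its output. The port name and
# command format below are hypothetical.
board = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1)

def stimulate(channel: int, intensity_pct: int, millis: int) -> None:
    # Clamp intensity defensively; the hardware should enforce its own
    # ceiling regardless of what software requests.
    intensity_pct = max(0, min(100, intensity_pct))
    board.write(f"C{channel} I{intensity_pct} T{millis}\n".encode("ascii"))

stimulate(channel=0, intensity_pct=40, millis=300)  # one mild 300 ms pulse
```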
In his still-new UChicago lab, Lopes and his team continue to experiment with EMS, the work that was his baby for so long, which he “could talk about till dinner.” But he wants the lab to grow beyond that technique to explore other ways humans can bond with technology. “We’re trying to see if we can do that with more human capabilities, human skills, human senses.”
The reversed directionality of Lopes’s work raises questions of agency—both physical and philosophical. First the easier scenario: what it feels like to resist EMS impulses. If you insist on grabbing the “unwilling” white cube, “it feels literally like you’re resisting a force,” explains Lopes. “In other projects, we’ve used that as a trick to create the feeling of weight” in virtual reality. (See “Similitude” below.)
The system is designed to recognize when the user is resisting the stimulation and disconnect. And because an EMS-instructed action can only be as strong as the muscles it stimulates, a person can always overpower it, unlike a mechanized exoskeleton. When asked if someone could hack such a device to compel you to act against your will, Lopes stresses that any technology can be vulnerable to infiltration, but the strength limitation offers protection.
Willingness poses a different set of issues. If technology inspires action (or aids function through nerve-integrated prostheses, for instance), it can “feel like this alien force,” says Lopes—you think and something else acts. This is particularly troublesome with exoskeletons because of the external nature of the hardware. For such technology to feel natural, it needs to “synchronize or harmonize with your intention,” actuating at a speed faster than possible without the device (in the scientists’ terms, achieving “preemption”) but slowly enough that the person feels they initiated the action.
Lopes and Jun Nishida, a postdoc in his lab, collaborated with Shunichi Kasahara, an associate researcher at Sony, to explore how much they could accelerate human reaction without compromising people’s sense of agency. With their ring fingers wired to receive EMS, subjects were asked to tap a touch screen when a target appeared. The researchers delivered the stimulation at different delays, measured reaction time, and asked the subjects to report their sense of agency.
The paper, published in May in the proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, showed that stimulating subjects 160 milliseconds after the target appeared sped up reaction time by 80 milliseconds while preserving the strongest sense of agency. In that short window, the subject feels just a shadow of external impulse, barely distinguishable from their own intention. The finding might help optimize wearable technology for the most natural experience.
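A toy model makes the trade-off concrete. The numbers below (baseline reaction time, jitter, motor lag) and the first-response-wins rule are illustrative assumptions chosen to match the reported arithmetic, not the paper’s model.

```python
import random

BASELINE_MS = 250      # assumed mean unassisted reaction time
JITTER_MS = 30         # assumed trial-to-trial variability
MOTOR_LAG_MS = 10      # assumed delay from EMS pulse to finger contact

def trial(stim_delay_ms: float) -> float:
    """One trial: the tap lands at whichever comes first, the subject's
    own response or the EMS-driven one."""
    own = random.gauss(BASELINE_MS, JITTER_MS)
    driven = stim_delay_ms + MOTOR_LAG_MS
    return min(own, driven)

for delay in (80, 160, 240):
    taps = [trial(delay) for _ in range(10_000)]
    print(f"stim at {delay:3d} ms -> mean tap at {sum(taps) / len(taps):5.1f} ms")

# Earlier stimulation means faster taps but a weaker sense of "I did that";
# the reported sweet spot, roughly 160 ms, cut about 80 ms off reaction
# time while preserving the strongest sense of agency.
```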
While research based on self-reported sense of agency involves metaphysical elements, the experimental design can be quantified. Time frames can be measured; data patterns can be analyzed; conclusions can be formed. But not all of Lopes’s work revolves around an answer. Sometimes the question is what matters, and “art seems to be a good way to not just ask the question but also inject it into people’s minds. It’s like an Inception thing.”
In 2015 four musicians—one on bass, one on guitar, one on vibraphone, and Lopes on turntable percussion—played in front of 80 people in the now-shuttered media space Spektrum Berlin. In a video posted on Lopes’s website, the music is a din of discordant sounds, and the players jerk and lurch like marionettes, which in a way they are. The audience is full of puppeteers controlling the quartet through a web app. They can make the musicians play or stop playing by tapping the performer’s name on their smartphone screen, which sends a stream of electrical muscle stimulation.
(The instruments, even the guitar and bass, are treated mostly as percussion tools. EMS isn’t yet precise enough for sophisticated, targeted movement because of the layered nature of muscles. Using EMS to teach writing or violin is quite a ways off. Dancing is probably not as distant because it requires coarser motion, says Lopes—a welcome prediction for the pathologically unrhythmic.)
At the end of the roughly 12-minute performance, the audience members’ screens appear to glitch, and a closing message poses the question: Were they in control, or were they being “played”?
The “conductive ensemble,” as Lopes calls it, was an art piece designed to provoke thought about the notion of control in relation to social networks and their pervasiveness.
Between 2016 and 2018, one of Lopes’s art installations, called Ad Infinitum, was staged at five exhibition spaces around the world, including Austria’s Ars Electronica. The piece, created with Patrick Baudisch (Lopes’s PhD adviser at the time) and several fellow graduate students, comprises a box into which visitors insert their arms. Cuffs close down on their forearms and deliver mild electric shocks that cause their wrists to involuntarily pivot, forcing them to turn a crank; the device is harvesting kinetic energy to power itself. The only way to be released is to convince another participant to take your place at the other end. The installation likely hosted 200,000 arms over its exhibition life.
The creators call the installation a parasite that lives off human energy. It “reverses the dominant role that mankind has with respect to technologies: the parasite shifts humans from ‘users’ to ‘used.’” As described on its website, Ad Infinitum is a “critical take on the canonical HCI configuration, in which a human is always in control.”
Lopes’s art doesn’t always provide an answer to the question; that’s not the point, after all. But he “started valuing art as a mechanism to do research.” Sometimes art works better than empirical methodology to explore an idea, so he takes “that route rather than the hypothesis-testing approach.” When his team brainstorms, often there’s an outlier that they realize would make a great art project. (Several lab members are also artists.) “It’s such an interesting space to probe people’s imaginations.”
Questions of flow, directionality, and control matter only so long as there is a disconnect, however minuscule, between humans and computers. Lopes’s overarching goal is to achieve complete fluidity between technology and us. In fact, his research group is called the Human Computer Integration Lab, emphasizing the expanded scope beyond simple interaction.
“A good interface matches your expectations,” says Lopes. It feels natural and does exactly “what it afforded you to do.” By moving certain interfaces to the body itself, he’s removing instrumentation, making for a more organic experience. You shouldn’t know where you end and the tech begins.
“Good HCI has no I—it just happens,” says Lopes. “So my secret objective for the field is to destroy it.”
Similitude
Pedro Lopes’s research concerns our experiences interacting with technology. His team is also exploring whether HCI (human-computer interaction) can be used to strengthen empathy by placing us in other people’s experiences through technology.
Virtual and augmented reality offer one such path. Virtual reality (VR) has come a long way since the 1980s, when it first emerged in the form most closely related to current technology. Today VR visuals are more refined; programs can run on phones and “look incredible now. People in graphics would be horrified with what I just said,” Lopes notes, asserting that virtual displays can always be improved, “but the visual clarity of VR has certainly stabilized.”
A convincing virtual world must engage more senses than vision, though. When what you see in VR doesn’t sync with what you feel (for instance, if you bump into a virtual table but feel no thud, or feel something not represented in your headset), it’s a reminder that you’re in a simulation. One of Lopes’s graduate projects was to develop haptic feedback for virtual objects based on electrical muscle stimulation (EMS). Inside the simulation, you could pick up a book and feel its weight, or feel physically impeded by doors and walls.
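A minimal sketch of how such a weight effect might be driven, under stated assumptions: the ems_set_intensity() call, the muscle group, and the mass-to-intensity mapping are all hypothetical.

```python
# Hypothetical per-frame haptics hook for a VR scene: the heavier the
# virtual object being held, the harder EMS drives the muscles opposing
# the lift, so raising it takes real effort.

MAX_MASS_KG = 5.0      # heaviest object the effect is calibrated for
MAX_LEVEL = 0.6        # fraction of the user's calibrated comfort ceiling

def ems_set_intensity(muscle_group: str, level: float) -> None:
    print(f"{muscle_group}: {level:.2f}")  # stand-in for the stimulator API

def on_frame(held_mass_kg: float | None) -> None:
    if held_mass_kg is None:
        ems_set_intensity("arm_extensors", 0.0)  # nothing held: no force
        return
    # Scale stimulation with mass, clamped to the calibrated ceiling.
    level = min(held_mass_kg / MAX_MASS_KG, 1.0) * MAX_LEVEL
    ems_set_intensity("arm_extensors", level)

on_frame(1.2)   # e.g., gripping a virtual book
on_frame(None)  # released
```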
Lopes is also interested in exploring tactile feedback, such as the feeling of liquid, but the complexity of human anatomy and physiology makes imitating certain sensations complicated. Stimulating a muscle to contract, mimicking the force required to lift a weight, is fairly simple because motor neurons are relatively easy to isolate and activate. Pressure, pain, temperature, olfaction—these senses rely on a combination of sensory neurons and exist on a spectrum.
Graduate student Jas Brooks, LAB’12, SB’16, is exploiting that complexity to create “thermal illusions for virtual reality,” developing technology that will make someone in a VR simulation perceive a change in temperature without actually changing the ambient temperature of the space. Both a computer scientist and an artist, Brooks is testing ways to manipulate the trigeminal nerve, found in the nose and mouth, which responds to both smell and thermal changes.
A full-sensory VR experience will improve immersion and make gaming more thrilling, but Lopes is “not so excited about the infinite mimicry of VR because we already have this beautiful reality. I don’t want to create an escaping method.” Rather, he wants to create opportunities for solving problems that can’t easily be solved physically, such as overcoming social anxiety or treating body dysmorphia and related eating disorders.
Postdoctoral fellow Jun Nishida, who specializes in designing experiences and technologies that allow people to “maximize and share their physical and cognitive capabilities to support each other,” began working on empathy building as a grad student at the University of Tsukuba, Japan. One of his projects helps people better understand the visual perspective of children and smaller people: a camera affixed to the wearer’s waist lets them see from that height through a headset. Getting that perspective can be useful for teachers and caregivers, as well as industrial and interior designers, who might realize that counters or doorknobs are positioned too high for some people. The experiment also revealed that from that point of view, participants felt more vulnerable, requiring more personal space for a comfortable, nonthreatening interaction.
Nishida has also developed ways to replicate others’ experiences using EMS to simulate medical conditions. As part of his master of engineering thesis, he stimulated the muscles of spoon designers so they could feel the tremors experienced by patients with Parkinson’s disease. Compared with a control group, those designers then created spoons that were easier to pick up and hold, with minimal tension and vibration.
This type of technology could transform the medical industry as well. Physicians could experience a specific type of pain or impairment, possibly altering their treatment. Male doctors could experience female conditions, ultimately leading to reduced gender disparity in medicine. And of course, bedside manner—which has received renewed attention in medical schools—would benefit from a strong dose of empathy.