Students in a robotics class spent the semester thinking about and building devices guided by human-centered design. How, for example, would a remote telepresence user manage to push the button on an elevator? Students are demonstrating their work at an open house Friday, Dec. 12.

PROVIDENCE, R.I. [Brown University] — Robotic telepresence is a technology on the rise. Telepresence devices combine Skype-like video conferencing with wheeled mobility, enabling users to roam remotely through hallways and offices, interacting with people in distant locations. Several companies now make the devices, which have found uses in education, healthcare, and elsewhere.

During the fall semester, a group of Brown students has been working to make the experience of telepresence richer and more useful. In a class called “Designing Humanity Centered Robots,” the students brainstormed new ideas, assessed possible new applications, and used the tools in the Brown Design Workshop to build testable prototype devices.

The students will show off their creations at an open house on Friday, Dec. 12, 2014, from 5 to 7 p.m. in the Design Workshop in Prince Lab.

“We started the semester with this big idea: telepresence,” said Ian Gonsher, adjunct lecturer in Brown’s School of Engineering, who co-taught the course with Michael Littman, professor of computer science. “We had conversations about what telepresence is and what are the applications. The conversations led to writing and sketches, and then we started to prototype and work out the functionality of things.”

Students split into three groups, each focusing on different aspects of telepresence. One group focused on the limitations of telepresence devices currently on the market. Most are basically computer monitors perched on chassis that look like Segway scooters. The devices have cameras, microphones, and speakers that give users a sense of the remote surroundings. They don’t have arms or hands, though. That’s a problem when it comes to simple tasks like pressing an elevator button.

So the students built their own telepresence bot starting with the electronic guts of a Roomba, the vacuum-cleaner robot. They equipped the machine with a robotic arm that can be remotely raised to push elevator buttons. The design also includes a head-tracking device that transmits signals to the robot. As a user moves her head, the computer monitor atop the robot moves accordingly.
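The article doesn't describe the control software behind that head-tracking feature, but the basic mapping it implies — head motion in, monitor orientation out — can be sketched roughly as below. This is a minimal illustration, not the students' actual code; the angle ranges and the stand-in tracker readings are assumptions made for the example.

```python
# Minimal sketch (not the students' code): map head-tracker yaw/pitch
# readings to pan/tilt angles for the monitor mount on the robot.
# The angle ranges and the fake readings below are assumptions.

def clamp(value, low, high):
    """Keep a value inside [low, high]."""
    return max(low, min(high, value))

def head_to_mount(yaw_deg, pitch_deg):
    """Convert head orientation (degrees) to pan/tilt angles (degrees).

    Assumes the pan axis covers -90..90 degrees and the tilt axis
    covers -45..45 degrees; anything beyond is clamped.
    """
    pan = clamp(yaw_deg, -90.0, 90.0)
    tilt = clamp(pitch_deg, -45.0, 45.0)
    return pan, tilt

if __name__ == "__main__":
    # Stand-in for a stream of head-tracker readings (yaw, pitch).
    fake_readings = [(0.0, 0.0), (30.0, -10.0), (120.0, 60.0)]
    for yaw, pitch in fake_readings:
        pan, tilt = head_to_mount(yaw, pitch)
        print(f"head yaw={yaw:6.1f}, pitch={pitch:6.1f} "
              f"-> mount pan={pan:6.1f}, tilt={tilt:6.1f}")
```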

Another group looked at possible applications for telepresence for the elderly. Inspired by a trip to a local assisted-living facility, the team set to work designing a self-propelled walker that converts into a remotely operated telepresence device.

“When it’s in walker mode, it can do things like helping give directions [to the walker’s user] because it has an onboard computer screen,” Littman said. “But there are also motors that can raise the screen, so it becomes the telepresence screen when it’s in telepresence mode.”

A third group worked to develop devices that remotely transmit sensory information involved in human interaction.

“The human experience of presence is really complicated,” said Emma Funk, a student in the class. “We’re trying to break down the essence of someone ‘being there’ into isolated component parts that we can actually recreate.”

The idea is to create a fuller experience than telepresence can offer today.

“Standard telepresence robots give you stuff you can get via Skype — sight and sound,” Littman said. “This is trying to capture more of the complete experience.”

Among the devices from that group is one designed to transmit the sensation of heat remotely. When a sensor on one end registers a rise in ambient temperature, a heating pad worn on the user’s arm warms accordingly. Another project is a headset system for phone conversations. Each headset tracks the position of its wearer’s head in space, and those movements determine where the sound seems to come from in the other headset. “So it would feel like we’re walking around each other, even though we’re not in the same place,” Littman said.
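The article gives no implementation details for the heat-relay device, but the control loop it describes — a remote temperature reading mapped to heating-pad output — might look something like the sketch below. The temperature range, the linear mapping, and the safety cap are all assumptions for illustration, not the students' design.

```python
# Rough sketch (not the students' code) of the heat-relay idea:
# a remote ambient-temperature reading is mapped to a duty cycle
# for the heating pad worn on the user's arm.
# The 15-35 C range, linear mapping, and 0.8 cap are assumptions.

MIN_TEMP_C = 15.0   # at or below this, the pad stays off
MAX_TEMP_C = 35.0   # at or above this, the pad runs at capped power
MAX_DUTY = 0.8      # safety cap: never drive the pad at full power

def pad_duty_cycle(remote_temp_c):
    """Map a remote ambient temperature (Celsius) to a heating-pad duty cycle."""
    fraction = (remote_temp_c - MIN_TEMP_C) / (MAX_TEMP_C - MIN_TEMP_C)
    fraction = max(0.0, min(1.0, fraction))
    return fraction * MAX_DUTY

if __name__ == "__main__":
    # Stand-in for readings streamed from the remote temperature sensor.
    for temp in (12.0, 20.0, 28.0, 40.0):
        print(f"remote ambient {temp:5.1f} C -> pad duty {pad_duty_cycle(temp):.2f}")
```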

While the students were able to create working prototypes, the idea behind the class wasn’t necessarily to create commercially viable products. Rather, the goal was to get students thinking and building.

“What’s great about the class is it’s really cross-disciplinary,” said Gonsher, who also has an appointment in computer science. “What we’ve discovered is that people who write a lot of code tend not to make things with their hands. People who make things with their hands don’t always know how to do code. So it’s a nice mix of different people’s skills.”

Another goal of the class, which was sponsored by Brown’s Humanity Centered Robotics Initiative, was to get students familiar with the idea of human-centered design.

“The basic principle of human-centered design is you’ve got to design with user needs in mind,” Gonsher said. “It involves a lot of up-front research — ethnographic work, surveys, and observations of what people do.”

Funk, a junior concentrating in computer science and history, said the class was a rewarding experience.

“I spend a lot of time programming things and working on screens,” she said. “This was a way to get off of the screen and see how things happen in real life. I’m also in the history department, so I’m interested in human behavior and ... why people act a certain way. This has been a great way to connect the technical side of computer science with the human experience.”

The class proved to be a new experience for the instructors as well.

“This is the first time I’ve taught something involving sawdust,” Littman said.