News

UC researchers continue to blaze new trails in science, medicine, business, education, engineering and the arts — transforming the way we live, work and learn.

Your New Robot Co-Workers

To demonstrate the stakes of her robotics research, Catharine McGhan plays a video showing the many ways industrial machines can kill you.

In a series of increasingly violent interactions, robotic arms deliver spine-snapping blows to crash test dummies that are stand-ins for factory workers. Often, these robots are massive machines that exert enough force to cause serious bodily injury despite their slow speed.

“It’s the difference between being hit by a car going slowly or a bike traveling fast,” said McGhan, an assistant professor of aerospace engineering at the University of Cincinnati’s College of Engineering and Applied Science. “The more capable these systems are, the more able they are to hurt someone.”

McGhan is one of several researchers at UC studying ways to improve the safety of human-robot interactions.

It’s a timely topic: Industry experts say robots are coming soon to a workplace near you.

Service robots are a $5 billion global business, according to the International Federation of Robotics. The trade group predicts the number of robots working in factories will nearly double from 1.8 million in 2016 to more than 3 million by 2020. And if you add drones and self-driving cars to the list, robots are becoming ubiquitous.

“Robots can do a lot more than we let them do because of safety and ethics considerations,” McGhan said.

Her published research has examined productivity in workplaces shared with robots, human-robot collaboration and how to get robots to predict a human’s intent.

“We’re getting there slowly. It’s much easier to work in structured environments with very little uncertainty,” she said. “Once you start getting into uncertainty and you’re dealing with human decision-making, we jump off a cliff quickly. Can we really trust this? Can we guarantee safety?”

Working Side by Side

For industrial purposes, the traditional solution has been to sequester the robot behind safety cages or perimeter tape to prevent accidents.

To minimize the risk of worker injury, the National Institute for Occupational Safety and Health recommends physical barriers and sensors that cut electricity when someone strays too close. The agency found that many accidents occur during unusual conditions such as when workers are performing maintenance, testing or programming and fall victim to an unexpected robot movement.

But industry has found that a segregated workspace is not the most productive arrangement. Manufacturers such as Boston-based Rethink Robotics are increasingly offering interactive features in which the human worker provides input to help the robot understand the desired task. With Rethink Robotics’ Baxter robot, a person guides its arms by hand and uses a touch screen that doubles as the robot’s face to teach it the instructed task.

“Baxter is a good example of where modern robotics is going,” said Dieter Vanderelst, an assistant professor of psychology at UC’s McMicken College of Arts and Sciences.

“They have this humanoid, industrial robot. Anyone should be able to program it to do repetitive tasks,” he said. “The human worker becomes a supervisor of the robot and works alongside it.”

McGhan said Rethink’s Baxter is revolutionary not only because it can sense its environment and avoid potentially harmful impacts during assembly line operations, but also because its digital display features a virtual face that can make a sort of eye contact with people, so they know where the robot’s attention is focused.

“Before, you had no sense of what the robot was ‘thinking,’” she said. “Now the worker sees the robot is focused on a task and understands the robot may not react as quickly under those circumstances. The person has an indication about what to expect.”

Safety First

UC researchers are approaching safety questions from the human end.

Tamara Lorenz, a UC assistant professor of psychology and engineering, and her students are studying people to learn more about what makes them comfortable or anxious while working around robots.

“I’m interested in what humans perceive as safe and what makes them feel safe in their surroundings and in the vicinity of robots,” she said. “What would be required for people to trust robots and rely on technology and feel like interacting with them?”

Companies in the United States have been slower to embrace robots than those in Asia. The top three markets for robots are China, South Korea and Japan. The United States is fourth. Perhaps it is no coincidence that American pop culture is full of killer robots, from Ash and David in the “Alien” franchise to the seductive Ava of “Ex Machina” and the sentient robots of HBO’s “Westworld.” Robots are often presented as dangerous and fraught with malign intent in the popular Netflix series “Black Mirror.”

“In the United States and Europe we treat robots more like a thing. Asian cultures treat it as a being,” Lorenz said. “We give them names, sure. But it’s not like in Japan or South Korea where they treat the robot almost like it has a soul.”

Big Personality

Lorenz, UC postdoctoral fellow Maurice Lamb and graduate student Riley Mayr work in a lab with four robotic arms. One is a small tabletop machine. The lab also has two larger, identical robotic arms operated by a remote joystick. The researchers named the twin robots Hans and Franz.


“I get to play with these toys. It’s really fun,” Mayr said. “Ultimately, I’m studying human behavior. But human behavior when working with and alongside robots.”

The last robot in the lab is a large seven-jointed machine by KUKA Robotics that looks like something Tony Stark would use in the movie “Iron Man.” The robot is bolted to a solid platform and can lift up to 30 pounds. It’s used in all kinds of medical and industrial applications. The arm is encased in form-molded white plastic with a calming blue ring of light at its wrist where users can plug in tools, grippers or other implements.

“We call her Tess. Obviously, it’s a female robot,” Lorenz joked. “As soon as a robot moves, you start anthropomorphizing it.”

Lorenz is studying how to get robots to engage with people the way people do naturally with each other.

“When you talk to someone, you are aware that you’re engaged in an interaction. You want the robot to understand the interaction, too,” she said. “It’s the slight difference between doing something with somebody as opposed to next to somebody. We want to replicate that with the robot.”

Robots in Medicine

New technology is making robots more practical in the workplace. Many of the most dramatic advances have been in medicine.


McGhan’s graduate student, Christopher Korte, is collaborating with UC Health on a robot that can help cut open the human skull for a procedure called a mastoidectomy. As many as 60,000 of these procedures are performed every year, according to the National Institutes of Health.

Opening the skull is exhausting and time-consuming for surgeons, who must then spend hours more at the operating table performing the operation itself once the small section of outer bone is removed, Korte said.

“Safety is definitely a consideration. You’re drilling holes into human bones. You have to avoid critical structures and make sure you don’t do anything that will jeopardize a patient’s health,” Korte said.

Robots show promise for this application because there is far less variability in rigid bone than in soft tissue, he said.

“Soft tissue is pliable. If you put a little pressure on it, the landmarks move. It’s a whole other layer of difficulty,” Korte said.

Korte, a student in UC’s College of Engineering and Applied Science, is particularly interested in one unusual application for surgical robots. They will be a necessity for astronauts undertaking a mission to Mars, Korte said. These long missions will not include medical specialists who can address every conceivable medical need. Even telemedicine, where doctors on Earth could assist with a medical procedure, would be unwieldy and impractical, he said.

“The time lag between Earth and Mars is up to 48 minutes depending on the orbit,” he said. “Trying to operate remotely is simply not possible. The idea that this might be sent into space for extraterrestrial exploration is the driving force for a lot of this.

“It’s a fun project that I enjoy working on,” Korte said.


Economic Disruption


Economists predict that robots will be a disruptive economic force in much the way that computers and the internet have been.


But UC’s Vanderelst said that won’t necessarily be a bad thing. He has a background in engineering, psychology and biology and is working on a project to help robots navigate by sound the way bats do, a skill that would be especially useful for drones.

“There is a fear that robots will take over jobs. What I hope is that robots will take jobs that nobody wants to do anyway,” he said.

Computers have displaced many jobs but created many more that never would have existed without them, he said.

“People will adopt other jobs that are more important and meaningful,” he said. “Instead of focusing on a repetitive task, robots can take over just like technology has been an enabler in the past.”

But McGhan said there are other philosophical considerations for how we’ll use robots as their abilities evolve.

“I definitely want a helper robot to help fold my clothes,” she said. “At the same time, there are some days when I’d like to do the work myself so I’ll have that five minutes to meditate.”

Researchers have found that people can reduce daily stress by performing physical activities such as gardening or household chores, she said.

“It sounds New Agey, but sometimes you just need to wind down. I’ll always vacuum my own floors rather than have a Roomba do it for me because I enjoy doing that,” she said. “It really is about the little things. You don’t want the robots to take over your life completely.”

And engineers are grappling with safety, legal and moral questions about self-driving cars, she said. In one example she gave, a self-driving car wouldn’t budge at an empty four-way intersection because of a bush moving in the breeze.

“The bush was close enough to the road that the self-driving car defined the motion of branches swaying in the breeze as another car. That was the closest match. It was like the car was hallucinating in a sense,” she said. “With its limited vocabulary of obstacles and objects to detect, it really had no other way to classify – to describe – what it was seeing.”

Smart cars struggle with these decisions under the best conditions. Throw in fog, rain, frost or snow and the system’s flaws become more apparent, she said.

“There’s a reason why they test self-driving cars in sunny California,” she said. “Essentially, it’s a student-driver. It’s not autopilot. More than 90 percent of the time, it will be fine. But you had better be ready to hit the brakes when it does something stupid.”

By Michael Miller
513-556-6757
Photos by Joseph Fuqua II/UC Creative Services