Human-robot interaction lab takes steps toward future of socially capable robots
Undergraduate researcher and Nevada Undergraduate Research Award recipient Stosh Peterson helps develop a robot task controller
It is not hard to find popular depictions of robots in the future. What comes to my mind is Rosie the robot maid from the animated sitcom “The Jetsons.” Ever faithful and dependable, she is as much a part of the Jetson family as any human relative. A vision of a future with domestic robots may appeal to our sense of laziness – ‘why should I do chores when a machine could?’ – but we also have an inherently human perspective on robot helpers. The droids in Star Wars are heavily anthropomorphized: even the soccer-ball-like BB-8 droid appears to express emotions, and characters converse with droids like friends.
Obviously, Rosie remains a cartoon imagination, and Star Wars droids are in a galaxy far, far away compared to the robot vacuums on the market today. That said, there exist bleeding-edge prototypes of humanoid robots from Boston Dynamics and Figure AI, and less advanced humanoid robots are already on sale from Unitree. Walking, talking robots exist, but they remain largely in the realm of research labs.
The University of Nevada, Reno is home to several robotics labs, with focuses ranging from aerial robots like drones to agricultural applications to civil-infrastructure inspection. I am a part of the (somewhat vaguely named) Robotics Research Lab in the Department of Computer Science & Engineering, which shares its space with the Socially Assistive Robotics (SAR) Group. Together, we work in human-robot interaction (HRI), a field that seeks to understand and improve how people interact with robots.
HRI sits at the intersection of robots and the people they work with. The motivation for this area of study often follows from a desire for service robots. Our lab has seen examples of robots performing work independently, as well as collaboratively with humans. An ongoing project is “socially aware navigation,” which will soon be validated in an art gallery scenario. Acting as a museum visitor, the robot must appear aware of its social setting: it would be rude to block a patron’s view of the artwork or to go speeding past a large crowd. Other work is more incremental: a recent study looked at legible robot motion in cluttered environments and improved how a robot indicates its intentions while moving to pick up an object. Understanding the robot’s goals and intentions is important in human-robot collaboration, both for safety and for the user’s trust.
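How does “it would be rude” become something a robot can compute? One common approach is to layer extra “social” costs on top of an ordinary path planner, so the cheapest route naturally detours around crowds and viewing zones. The snippet below is a minimal sketch of that idea, not our lab’s actual navigation code; the radii, penalties and function names are illustrative assumptions.

```python
import math

# Hypothetical social-cost sketch: a planner normally minimizes path length,
# but a socially aware robot also penalizes waypoints that intrude on a
# person's personal space or block the viewing zone in front of an artwork.

PERSONAL_SPACE_M = 1.2   # assumed comfort radius around a person, in meters
VIEWING_PENALTY = 5.0    # assumed cost for standing between a patron and art

def social_cost(waypoint, people, viewing_zones):
    """Extra cost a waypoint incurs beyond plain travel distance."""
    cost = 0.0
    for person in people:
        d = math.dist(waypoint, person)
        if d < PERSONAL_SPACE_M:
            # Grows sharply as the robot crowds someone.
            cost += (PERSONAL_SPACE_M - d) ** 2 * 10.0
    for xmin, ymin, xmax, ymax in viewing_zones:
        if xmin <= waypoint[0] <= xmax and ymin <= waypoint[1] <= ymax:
            cost += VIEWING_PENALTY   # rude to block the view
    return cost

# A planner would fold this into its edge weights; here we just score
# three candidate waypoints against one person and one viewing zone.
people = [(2.0, 2.0)]
viewing_zones = [(4.0, 0.0, 5.0, 3.0)]
for wp in [(0.5, 0.5), (2.3, 2.1), (4.5, 1.0)]:
    print(wp, round(social_cost(wp, people, viewing_zones), 2))
```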
The socially assistive component of the lab is motivated by a need for therapy robots that help people not through physical touch (like a physical therapist), but through social communication (like a psychiatrist or coach). Robots could aid in post-stroke rehabilitation, elder care and childhood autism therapy. A project currently in development targets speech-language pathology interventions, and there is also ongoing work on full-body American Sign Language communication. The key is that robots do not have to exist in isolation; they work in environments alongside humans and need the capability to behave socially.
I got started in the lab when I was looking for an accelerated B.S./M.S. advisor. Monica Nicolescu, Ph.D., had a desk next to a helpful grad student, and I jumped right in. I applied for the Nevada Undergraduate Research Award (NURA) the following semester, and I currently have funding for the summer through Dave Feil-Seifer’s National Science Foundation Research Experiences for Undergraduates (NSF REU) site. I’ll be finishing my bachelor’s degree this fall, so I only wish I had started looking for these opportunities sooner. The two professors have made the research process accessible and fun, elevating my undergraduate experience beyond just taking classes. Computer science is more than rote programming. HRI is multidisciplinary; our projects consider not just cold algorithms and software, but the people who interact with them.
I received the NURA in spring 2024 and worked with Nicolescu and graduate student Tyler Becker on developing a task control architecture. Our work deals with how tasks are represented in the robot’s software: it reduces the effort required to give a task to the robot and helps the robot choose its behaviors efficiently. There may be a variety of ways the robot can accomplish a task, but people may prefer it be done a certain way. It’s technically correct if a breakfast robot pours cereal on top of the milk, but I would prefer the milk on top of the cereal. Every user will have different expectations, and we are working on a methodology to teach robots those preferences and use them in our task controller. My focus is on incorporating user preferences into our task representation – that is, how something like ‘do this before that’ is encoded in the robot. Much of our time is spent coding and testing in software, but the best part is when we validate our approach on a real robot, like our PR2 humanoid (pictured).
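To make ‘do this before that’ concrete, here is a toy sketch of the idea: hard ordering constraints come from the task itself, while soft user preferences only break ties among the behaviors currently allowed. The names and data structures are illustrative assumptions, not our lab’s actual architecture.

```python
# Hypothetical sketch of a task representation with user preferences.
# Hard constraints ("must happen before") come from the task; soft
# preferences ("the user would rather it happen before") never block
# progress, they only reorder the behaviors that are already allowed.

HARD_BEFORE = {               # prerequisite -> steps it must precede
    "get_bowl": {"pour_cereal", "pour_milk"},
}
PREFER_BEFORE = {             # user preference: cereal first, milk on top
    "pour_cereal": {"pour_milk"},
}
STEPS = ["get_bowl", "pour_cereal", "pour_milk"]

def next_step(done):
    """Pick the next behavior: any step whose hard prerequisites are met,
    preferring one that no pending preference says should come later."""
    ready = [s for s in STEPS if s not in done
             and all(s not in after or pre in done
                     for pre, after in HARD_BEFORE.items())]
    for s in ready:
        # Skip s if some not-yet-done step is preferred to come before it.
        if not any(s in after and pre not in done
                   for pre, after in PREFER_BEFORE.items()):
            return s
    return ready[0] if ready else None   # fall back; preferences are soft

done, plan = set(), []
while (step := next_step(done)):
    plan.append(step)
    done.add(step)
print(plan)   # ['get_bowl', 'pour_cereal', 'pour_milk']
```

The design point is the separation: if the hard constraints and the preferences conflict, the task still gets done, just not in the user’s favorite order.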
All of our work falls within what the National Science Foundation considers basic research; robotics is an evolving field, and the wide variety of unanswered questions is what drives us as researchers. We are still a long way off, but the work of our lab, and of others worldwide, is a step toward the vision of domestic robots and socially capable droids.
About the author
Stosh Peterson is studying computer science & engineering in the College of Engineering at the University of Nevada, Reno. He works with B.S./M.S. advisor Monica Nicolescu, Ph.D., and graduate mentor Tyler Becker; his research interests include robot systems architectures and behavior-based robotics.