On March 12, Dr. Ben Amor gave a talk on Human-Robot Interactive Collaboration & Communication in the Robotics Institute seminar series at CMU.

Abstract: Autonomous and anthropomorphic robots are poised to play a critical role in manufacturing, healthcare, and the service industry in the near future. However, for this vision to become a reality, robots need to efficiently communicate and interact with their human partners. Rather than traditional remote controls and programming languages, adaptive and transparent techniques for human-robot collaboration are needed. In particular, robots may need to interpret implicit behavioral cues or explicit instructions and, in turn, generate appropriate responses. Dr. Ben Amor presented ongoing work that leverages machine learning (ML), natural language processing, and virtual reality to create different modalities for humans and machines to engage in effortless and natural interactions. To this end, he described Bayesian Interaction Primitives, an approach to motor skill learning and spatio-temporal modeling in physical human-robot collaboration tasks. He also discussed his group's recent work on language-conditioned imitation learning and self-supervised learning in interactive tasks. The talk also covered techniques that enable robots to communicate information back to their human partners via mixed-reality projections. To demonstrate these techniques, Dr. Ben Amor presented applications in prosthetics, social robotics, and collaborative assembly.

Video: https://www.youtube.com/watch?v=eIKAQH6cvi8