Lecture 2 – Robots that Teach by Brian Scassellati

*The lecture was led by the professor who had advised me to send my email to other professors actually working on microbes (during my article for microinsectbots)

Children with autism wanted to play with the robots for 5 hours (perseveration on a topic of interest)

They behaved as if they were interacting normally with these socially engaging robots —> possible therapy?

The child's job was to encourage the robot whenever it got scared (the robot is scared of water)

  • Before the robot: the child almost never makes eye contact with the woman on the right (the therapist)
  • After 5 minutes with the robot (even after it's taken away), he's engaged and makes eye contact with the therapist
  1. The robot was controlled by the woman in the middle, with the remote control taped under her clipboard
  2. He was brought back the next day and the third day with the same response, but on the fourth day the response went away —> not unusual; this is what should've been expected

(= repeating the same joke over and over wouldn't be funny)

You can't just do the same thing over and over and expect the same result; the robot has to adapt to the situation

—> we're going to need to build robots that can teach, model the interaction dynamics, tune to the specific needs of the individual, and grow and change over time

Questions:

  • Why does this work? Why do children respond to robots when they don’t to other people?
    • One hypothesis: robots work because they're simpler to understand (more predictable)
    • Tested with a simple vs. a complex robot —> one was perfectly predictable, the other not —> expected a clear preference for the more predictable robot, but the autistic kids didn't show much of a difference
    • Went through lots of easy explanations —> no perfect answer yet
    • Possible answer: robots are social, but not too social
      • Minimal Language Input in Deaf Infants
        • 2-3 / 1000 children in US are born with hearing loss
        • > 90% of deaf children have hearing parents —> these children show a different developmental course because they don't acquire language at the speed they need to (despite very dedicated parents)
      • Design Challenge:
        • A robot that six-month-old infants could interact with couldn't be designed: the dexterity and expressiveness needed in the face and posture are too complex —> neither safe for infants nor easy
        • Virtual avatars have plenty of dexterity and expressiveness —> but infants don't learn from things on a flat screen
        • Put two technologies together?
          • Something responsive and engaging to the infant that also has the physical dexterity to sign?
  1. The robot closes its eyes and nods its head —> the child mimics what the robot did
  2. When the robot turns to look at a character on the screen —> the infant does too (and even follows the animated character)

Infants normally move both hands symmetrically (linguistic signs use one hand)

  • The infant does the water sign with just one hand
    • Even six-month-old infants want to socially engage with the robots

Teaching English to Native Spanish Speakers in K-2

  • In schools where 80% of kids don't speak English as a first language, they struggle to learn English in the public school system
    • What's the biggest problem teachers have? Kids come in very attentive, but whenever you ask them a question, they say they don't know —> hard to tell whether the teaching is working because you can't tell what the kids know —> can't decide whether to slow down or speed up
  • 15 minutes of interaction with kids
    • Focus on the most common translation problems the kids had (Spanish “hacer” —> English “make” (a sandwich) or “do” (homework))
      • 13 different contextual rules to learn for differentiating which one to use (make or do)
    • The robot gave the kids translation questions (tell a story in English)
      • The robot can recognize which rule a child got wrong and generate more practice for that rule —> practice for 5 minutes
  • When given personalized instruction, kids perform at 85-90% on the test, because the robot can give one-on-one learning
    • Why are they more open to robots? 
      • You'd be embarrassed and not want to make a mistake in front of everyone, but nobody thinks the robot is going to make them feel embarrassed
        • Robot is social but not too social
          • While it engages people socially, it doesn't engage them in a way that makes them feel embarrassed
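The adaptive drilling described here (more practice on whichever rule the child gets wrong) can be sketched roughly as follows. This is my own illustration, not the actual Yale system: the function name, data shapes, and example prompts are all assumptions.

```python
import random

# Hypothetical sketch of adaptive practice selection: drill the rule
# the child has missed most often so far.
def next_practice_item(error_counts, items_by_rule):
    # error_counts: rule name -> number of mistakes so far
    # items_by_rule: rule name -> list of practice prompts
    weakest_rule = max(error_counts, key=error_counts.get)
    return weakest_rule, random.choice(items_by_rule[weakest_rule])

errors = {"hacer -> make (food)": 1, "hacer -> do (tasks)": 4}
prompts = {
    "hacer -> make (food)": ["Yo hago un sándwich. -> I ___ a sandwich."],
    "hacer -> do (tasks)": ["Yo hago la tarea. -> I ___ my homework."],
}
rule, prompt = next_practice_item(errors, prompts)
print(rule)  # hacer -> do (tasks)
```

The point of the design is the feedback loop: every answer updates the error counts, so the five minutes of practice keep steering toward the rules this particular child finds hardest.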

  • Do we really need a robot? Like a cheaper technology like phone?
    • Robots are expensive, fragile, and difficult to mass-produce. Can't we just do this in software and send it to a tablet? IT DOESN'T work as well
      • Having a physical robot can increase compliance (more likely to do it when told vs when a screen asks you to), and increase learning gains (faster learning than screen)
    • Investigating Embodiment (Nico)
      • Took over the office for 6 months
        • Brought the robot into the office and pretended Brian was away
        • Told participants they would be doing some office tasks
      • Half of the participants, when they came into the office, sat next to the robot
      • The other half sat next to a big flat screen (a 2D video feed of the same robot)
        • The robot did the exact same thing whether on the screen or physical: waved, watched them, and so on
        • “We want you to put a few textbooks away” (the robot knows where the textbooks belong)
        • It would point to a nice pile of books and then point to an empty spot on the shelf
      • Everybody did what they were told (this happened 2-3 times), and then the robot pointed to a brand-new textbook ($500 worth of books) and then to the trash can
      • Question: how many undergrads who sat in the office would take the brand new textbooks and throw them away because the robot said so?
        • When sitting next to the real robot, 2/3 (blue on the chart) followed without any hesitation
        • When the robot was on the flat screen, less than 13% took the books and threw them away (some tried to negotiate with the robot, some tried to hide the books behind the trash can)
      • All participants understood the instruction
      • Orange on the chart: the robot showed a picture of the trash can and pointed inside it
      • Screen condition: still wouldn't do it
      • Real robot ==> did it right away, easily and naturally, without hesitation
      • We as humans respond to things that are there physically
      • When physically present, it's easier to convince someone to do something

*Embodiment increases compliance*

Take the most abstract puzzle (no physical manipulation): nonograms

  • A clue of 6 means a run of 6 consecutive filled squares, with the remaining squares blank
  • In an 8-square row, a clue of 7 leaves exactly one blank square (the blank and then the 7, or the other way around)
  • You have to simultaneously apply the rules to columns and rows
  • Solving strategies
    • 7: is the blank square first or last?
      • Either way, you can immediately tell that the middle 6 squares HAVE to be shaded in
    • 100 participants / 4 groups, told how to solve the puzzle

The 1st group got no help. The 2nd and 3rd groups were given hints (the 2nd group heard a disembodied voice, the 3rd group got a video of the robot), and the 4th group got the real physical robot, all with exactly the same information and prerecorded lessons

  • The first and last puzzles were made equally difficult by taking the first puzzle and rotating it 90˚

  • No help: improved by 2 minutes
  • Lessons (disembodied voice or video): improved by 3-4 minutes
  • Exactly the same lesson through the real robot: improved by almost 6 minutes
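The overlap reasoning from the solving strategy can be written down directly: for a single clue in a row, any cell covered by both the leftmost and the rightmost possible placement of the run is forced. A small illustrative sketch (mine, not from the lecture):

```python
def forced_cells(row_len, clue):
    # The leftmost placement of the run covers cells [0, clue);
    # the rightmost covers [row_len - clue, row_len).
    # Cells in both intervals must be shaded in every solution.
    return [i for i in range(row_len) if row_len - clue <= i < clue]

# An 8-cell row with clue 7: whether the single blank comes first or
# last, the middle 6 cells are forced.
print(forced_cells(8, 7))  # [1, 2, 3, 4, 5, 6]
```

When the clue covers at most half the row, the two placements don't overlap and nothing is forced, which is why short clues give you no free squares.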

Why faster with actual robot?

  • You might expect the robot being there to actually be more distracting, yet it helped
  • Therefore, it's worth it to physically sit in class (when you get the option, please go; just by sitting there, you will learn more)

*Embodiment increases learning gains*

Studies in different schools show the same result

Do we really need a robot?

If we want to teach, we want something that's physically there

  • Is there something special that a robot can do that people can’t?
    • Kids become the teachers (in the therapy system) after they learn something [see one, do one, teach one: you're not certified on a procedure until you've seen it done, done it yourself, and taught it to someone else] —> also applied to autistic kids
    • Tell the kids that the robot has problems (the same social problems you have just practiced dealing with, and we want you to teach Ellie the right thing to do)
      • ex. When Ellie is speaking to you, she won't make eye contact (the kids notice that this is wrong)

  • Diagnosis (there's no blood test or imaging test that will tell you someone is autistic, but there is a checklist of 10 items (e.g., fails to maintain eye contact); check 7+ items —> diagnosed with autism)
    • You can get different diagnoses under different conditions (so find ways to quantify the diagnosis and deploy it with lots of kids)
    • Programmed a Sphero robot to act in certain ways
      • happy: rolls around with happy sounds
      • fearful: plays anxious sounds and tries to hide in corners
      • angry: flashes red, rolls at people, and blares
      • sad: turns blue and cries
    • Take kids who are typically developing, have ASD, or have ADHD —> let them play with the robot and do all the sorts of things kids do with robots
      • When the robot is happy: kids run around and play with it
      • When the robot is sad: most typical kids hold the robot and give it to an adult (“there's a problem here”), but kids with ASD don't show those actions; they just nudge the robot with a foot, trying to make it not sad, and don't respond in the same way
        • Use this mechanism not to make a perfect diagnosis but to screen for the kids we should do a complete diagnosis on (measurable, repeatable, and easy to do)
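The emotion programming described here amounts to a simple lookup from an emotion to a bundle of behaviors. A minimal sketch; the field names are my paraphrase of the lecture, not Sphero's real SDK, and fields the lecture didn't mention are left as None:

```python
# Emotion -> behavior bundle, paraphrased from the lecture's descriptions.
BEHAVIORS = {
    "happy":   {"motion": "roll around", "sound": "happy sounds", "color": None},
    "fearful": {"motion": "hide in corners", "sound": "anxious sounds", "color": None},
    "angry":   {"motion": "roll at people", "sound": "blare", "color": "flashing red"},
    "sad":     {"motion": None, "sound": "crying", "color": "blue"},
}

def act(emotion):
    """Return the pieces of behavior the robot should perform for an emotion."""
    b = BEHAVIORS[emotion]
    return [v for v in (b["motion"], b["sound"], b["color"]) if v]

print(act("sad"))  # ['crying', 'blue']
```

Keeping the mapping in data rather than code is what makes the screening tool "measurable, repeatable, and easy to do": every child sees exactly the same behaviors, so their reactions can be compared.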

Can take on new roles, and quantify behavior (things that aren’t easy for us to measure)

  • But can robots really provide therapy?
    • Dinosaur robot: there wasn't enough technology to do more than a 5-minute interaction, and it was controlled by a human —> we'd have to build an autonomous robot system that's there when you want/need it, on demand, in your home
  • Gibo: brought the system (robot, screen, and cameras) into homes for 30 days of fully autonomous social-skills therapy [the robots were completely autonomous, but over the 30 days they were going to change what they were doing and adapt to what the kids needed, so the robot at the end of the experiment would be completely different]
    • The robots were trained to play a variety of different types of games (understanding emotions, maintaining emotions; the same social skills the kids had difficulty with or that would be taught at a clinic)
      • Sometimes the robot was an opponent you played against, sometimes the game master, but it didn't just engage the child one on one (it interacted with the child and parent at the same time)
        • We're in no way trying to train the kids to interact with the robot (no value in that)
          • But instead to change the way they interact with their parent —> immediate value
      • All games were designed to promote interaction and play between parent and child

Differences from prior art

  • Autonomous: 30 days without contact
  • Adaptive: alters curriculum and engagement patterns to match the strengths and preferences of the individual child (ASD is on a spectrum)
  • In-home: a complex and challenging environment

Needed to show that the kids were improving in the games (higher scores) and the kids DID show improvement in scores

Asked parents subjectively how the kids were doing without the robot (after 30 days: more eye contact, more responsiveness to communication, even with other people)

Brought in outside set of clinicians and ran standardized test with kids 

(Tests at four points: 30 days before the robot arrived and the day it arrived (how much would these kids improve in 30 days without the robot? = no difference in growth); 30 days after the robot arrived (during the intervention, drastic improvement in social skills); and 30 days after the robot had gone away (the gains the kids had made degraded once the robot was gone))

ONLY SHOWED IMPROVEMENT when paired with the robot

The fact that the gains went away when the robot did shows that the robot really caused them

—> the first time kids have been documented making verified advances in their social skills

—> the best measure isn't statistics but the impact on the families

One family was willing to send all their possessions away and sleep on the floor just to spend 2 more weeks with the robot

Can robots really provide therapy?

  • Hopefully, yes; but we will need robots that can work for longer durations

Not just using tech for teaching and therapy, but looking at ways it can improve the world.

Six robotics faculty; visit their website to see the things they're doing (they hire undergrads from day one, even students without any experience, so come by and say hello)

Robots can't explain themselves —> a critical skill that we're missing (likely to see more of it in the future, but not just to teach a single subject)

The robot changes its motivation strategies based on what has and hasn't been successful: it takes data about what worked (scores) and uses that to shape its future behavior

Children with Cerebral Palsy don’t have same social deficits as ASD

If these robots/artificial intelligences were to be implemented in schools…

Physical form matters, but…

Can we sense what’s going on with the person well enough that we know how to respond appropriately?

  • Our ability to sense what another person is thinking is one of the hardest problems in computer science.
  • It's even really hard to tell whether a student is engaged with what they're doing

Novelty effect

  • AI systems will become more prevalent, and as we develop this technology, it will change how we think about technology
  • However, the basic facts of human psychology stay the same

__________________________________________________________________________________

My personal thoughts:

The moment I start hearing the first slide, I’m just like, “done.”

This has been possibly the most interesting and engaging lecture/seminar I have had in the entire YYGS Connect experience. I absolutely LOVED how the professor naturally started off with how he applied to Yale as a CS major after graduating as a developmental psychology major.

Nevertheless, the way he explained his process of development and the development of the robots was AMAZING. He went slow and steady, unlike the first lecture, AND used relatable, funny analogies.

Then, after the lecture, there was a QnA time in which I kept raising my hand but did not get picked, so I decided to send an email to the professor myself. Wow, my mouth fell wide open because the lecture felt like a compelling coming-of-age novel.

I LOVED how the professor showed us mini clips of the experiments and was patient enough to explain everything rather than just briefly glossing over the official definitions. From the dinosaur-and-river experiment to Sphero to the 30-day Gibo experiment, the processes were all so interesting to hear, and the presentation was VERY well organized!

I was glad to hear some familiar AP psych terminologies, along with the mention of Sphero — this toy that I always wanted in 2nd grade. Apparently, the researchers programmed Sphero to display various types of signals of a certain emotion to help treat autistic children, and I thought that was a genius approach because 01) the kids are easily fascinated and love to play with robots and being active, 02) autistic patients often struggle to read emotions, and 03) it was quite an engaging experiment that could easily show the patient’s disposition.

My Questions:

If the ASD children were at varying points on the spectrum, how were the operational definitions chosen, since children with "milder" ASD could show weaker symptoms? (From Gibo)

+ If people usually comply less when other people are present than with a non-human object such as a robot, why wouldn't this phenomenon occur with some other non-human object, like a doll?

I WOULD LOVE to study more about Prof. S’s projects as this was such an interesting lecture. I hope to receive his email reply too hahaha

– Joanna Kim, July 9th, 2021, 3:23 AM KST –