Practical Access Podcast

S10 E4: Enhancing Social Skills and Learning through AI and Robotics

Eric Imperiale Season 10 Episode 4

In this episode of Practical Access, Lisa Dieker and Rebecca Hines discuss Project RAISE (Robots and Artificial Intelligence to Improve Social Skills for Elementary Students), a federally funded initiative aimed at integrating AI and robotics in educational settings. The project is a collaboration with UCP of Central Florida, led by computer science expert Charlie Hughes, among others.

Link to website: 

https://www.ucf.edu/research/research-project/raise-robots-and-artificial-intelligence-to-improve-social-skills-for-elementary-students/

Key Points:

1. Purpose of Project RAISE: The project focuses on using robotics and AI to aid the development of social and communication skills in elementary students, particularly those with autism spectrum disorders. The initiative seeks to merge technology, engineering, science, and math skills with targeted social skills training.

2. Engagement with Technology and Peers: The project involves students learning to code robots through an interactive AI agent, fostering both technical and social skills. Students first work with the AI agent alone and then bring a friend to collaborate, promoting social interactions and peer teaching.

3. Use of Avatar in Classroom: An avatar named Zoobee guides students through activities and provides non-judgmental, consistent feedback, reinforcing positive behaviors. This AI component helps students practice communication in a safe, controlled environment.

4. Recording and Analyzing Student Interactions: The project involves recording and analyzing student interactions with the AI and peers to study social reciprocity in conversations. This data helps in understanding and improving social skills among participants.

5. Biometrics for Emotional Recognition: The project also experiments with biometrics, using devices to monitor physiological changes indicative of stress. This aspect aims to understand and respond to the emotional needs of students better.

6. Impact on Students with Autism: The hosts discuss the potential of such AI-driven projects to significantly aid students with autism by providing a safe space to interact, learn, and express emotions.

7. Future Directions and Accessibility: The project aims to make its resources available for broader use after completion. The hosts emphasize the importance of such initiatives in making education more inclusive and tailored to individual student needs.

Lisa Dieker: 
Welcome to Practical Access. I'm Lisa Dieker.

Rebecca Hines: 
And I'm Rebecca Hines. Today, Lisa, I know we're going to be talking about a project that we both love.

Lisa Dieker: 
Yeah, so we're going to be talking about kind of a theme in this season, which seems to be a bit about artificial intelligence. But now we're going to throw some robots in there and do it with little kids. So today we're going to be talking about Project R.A.I.S.E. It's a partnership with UCP of Central Florida and, of course, our dear friend in computer science, Charlie Hughes, and several other key lead people across the campus. And just so that listeners know, this is a federally funded project called the Stepping Up grant. When we're done, everything will be free to use. Project R.A.I.S.E. stands for Robots and Artificial Intelligence to Improve Social Skills for Elementary Students. Now, that's a mouthful. So, Becky, I'm going to let you start with, well, what is Project R.A.I.S.E.?

Rebecca Hines: 
Well, Project R.A.I.S.E. is indeed a mouthful, and it's a handful trying to implement it. This study was kind of a brainchild of you, Lisa, and Charlie, and me. We've always worked together and share similar interests: taking big ideas and turning them into something actionable. So this project began from the ground up, and the initial consideration was, what do we want to do when it comes to creating an atmosphere, a way for kids with autism spectrum disorders to receive some specialized technology training in the area of coding? To give them a hook, something novel to talk about with their peers, as a way of building relationships. And then also providing an opportunity for our robot Zoobee to connect with these young people, so they have a peer support later in the classroom that is familiar and safe and comfortable to help guide them through classroom activities. So that sounds complex.

Lisa Dieker: 
And it gets worse.

Rebecca Hines: 
It gets worse. It gets worse. So when we conceptualized this, we recognized a need for more access to technology, engineering, science, and math skills for kids with disabilities, and that's why we chose the robotics. We recognized a need for social skills, and that's why we have a targeted focus on building connections for these young learners. And we recognized the need for students to have some one-on-one guidance or support when they're actually in the classroom. So as we built out this program, we've learned a lot by video-recording phases of the project and simply analyzing: what are the social interactions in those settings? How are these young children interacting with our robot? How are they interacting with their peers? And what does it look like when they're back in the classroom?

Lisa Dieker: 
Yeah. And I think, Becky, one of the most interesting pieces for me early on, and I think this is a great takeaway for our listeners, is where we had to help, because we were focused on mathematics. Shocking, that I was part of the project and that was our focus. In giving kids a chance to talk, we had to go back to some basic roots with some of our amazing teachers and say, can you do cooperative grouping? Can you do think-pair-share? Because what we were finding is that a lot of our students with autism, or students in this project, didn't necessarily want to talk. And since they weren't given the opportunity to talk, that was the outcome: they weren't talking. So initially, we thought it was about building their social skills and their experience. But we also found, simultaneously, the need to make sure amazing teachers remind themselves of some of those really basic foundational practices, such as think-pair-share, that give kids a chance to turn and talk and really increase communication, because we found they were talking more to the avatar but then stopped talking back in the classroom. And I think that was a really fascinating finding for me, and one that I reflect on often when I'm in a classroom: what was the opportunity, not just for one kid to participate, but for every kid to participate in a conversation?

Rebecca Hines: 
And Lisa, you know, there is a takeaway from that for teachers, and we'll talk a little more broadly about what the project includes. But this idea of purposefully watching and recording reciprocity in conversation has been a key to this. We do take the child into a separate setting for a 10-minute teaching session led by our robot, not by a human, but by a robot who appears on a computer screen and tells the student how to accomplish certain tasks. By recording that 10-minute event, we can go back, analyze the data, and see how many times that student responded to a question posed by our robot, and how many times he or she had multiple instances of reciprocity in a conversation (two or more questions and responses on the same theme) versus what we see so often in kids with social skill deficits, which is that they will talk, but it's not directed at anyone, and it's not a conversation with anyone. It might be a think-aloud. It might be repetitive talk. It might be echolalia, but it's not often the reciprocity that we would find with somebody in a conversation. So as a teacher, think about your own student. Let's take what Lisa just said about think-pair-share, because it is a very specific segment: if you record how many interactions a student with ASD has during a 10-minute or 5-minute activity, and then seek to increase that, that's the goal in this huge project, but it's also the action item any of us could take, especially as we're setting goals for kids. So we included an avatar, you know, a robot, to guide the interactions to start. This will be available to people after next year, and you can try it yourself with our model, but it's something that can be done right now if we zero in on those specific skills.
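For teachers or researchers who want to try that kind of tally themselves, here is a minimal sketch in Python of counting reciprocal exchanges from a coded transcript. The turn format, speaker labels, and on-topic coding are illustrative assumptions for this sketch, not Project R.A.I.S.E.'s actual coding scheme or analysis code.

    # Each turn is (speaker, on_topic), where on_topic means the turn
    # stays on the same theme as the turn before it. This coding is a
    # hypothetical simplification for illustration.
    turns = [
        ("robot", True),    # robot poses a question
        ("student", True),  # student answers on topic: a reciprocal exchange
        ("robot", True),
        ("student", False), # off-topic or self-directed talk, not reciprocal
        ("robot", True),
        ("student", True),
    ]

    def count_reciprocal_exchanges(turns):
        """Count adjacent robot-then-student pairs where the student
        stays on the robot's topic."""
        count = 0
        for (s1, _), (s2, on_topic) in zip(turns, turns[1:]):
            if s1 == "robot" and s2 == "student" and on_topic:
                count += 1
        return count

    print(count_reciprocal_exchanges(turns))  # prints 2 for the sample above

The same tally works for a recorded think-pair-share segment: code each turn, count the reciprocal pairs, and track whether that count grows across sessions.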

Lisa Dieker: 
Yeah, and you know what's interesting, listeners: we're going to take you through it. Just imagine for a moment you're a student in this work. We've actually created multiple buckets of things you can use from this project that are very practical. One, as Becky's been saying, is Zoobee, the avatar who actually helps students learn to code a robot by themselves for 10 minutes, which makes a really great learning station in the classroom. And you don't need to be there, hopefully, if we get it right. Then those students get to bring a friend. And that's what we've really seen as powerful for a student with a disability. I always say they should be givers, not takers: to be the giver, let me give you knowledge on how to code a robot with my friend Zoobee. So we still have the scaffold we've been talking about. AI provides scaffolding. It doesn't do the answering, because it can't have a conversation outside of the realm of what it's been programmed to do, but it can help with a prompt, with a help button, help the student code the robot. So first, they learn to code the robot online by themselves. Then they bring a friend into that. And then, as Becky said, these kids who might be struggling with some social skills have an AI agent come into the classroom with them. And the AI agent gets to do what every teacher in America wishes they could do: reinforce your on-task behavior, your thinking, your executive functioning in a non-judgmental way. I don't know about you, Becky, but if a kid needs to sit down and I've said it 10 times, I'm really good at it, but there is a point where I'm sure my voice changes. The beauty of an AI agent is that it can say it 10 times and not care. At this point, our agent is being very neutral, and it's not acknowledging bad behaviors, because it can't do that yet; we didn't want to put cameras on it. But it is giving statements in a very affirming way. And it never gets tired and it never gets overwhelmed, which we think is a real power.
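As a rough illustration of that never-gets-tired quality, here is a minimal sketch of an agent that cycles through neutral, affirming prompts without its tone ever escalating. The prompt wording and class structure are hypothetical; this is not Zoobee's actual dialogue system.

    # A scaffolding agent that keeps the same calm register on the
    # tenth reminder as on the first. Prompt text is illustrative.
    NEUTRAL_PROMPTS = [
        "Remember, you can press the help button if you get stuck.",
        "Nice focus! What block do you think comes next?",
        "Let's look at your robot's code together.",
    ]

    class ScaffoldingAgent:
        def __init__(self, prompts):
            self.prompts = prompts
            self.times_prompted = 0

        def prompt(self):
            # Cycle through the prompt list; repetition never changes
            # the tone, unlike a human voice after the tenth reminder.
            text = self.prompts[self.times_prompted % len(self.prompts)]
            self.times_prompted += 1
            return text

    agent = ScaffoldingAgent(NEUTRAL_PROMPTS)
    for _ in range(10):
        print(agent.prompt())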

Rebecca Hines: 
Funny you should mention that, because some of the best moments, if I were creating an outtake reel from this project so far, are moments where our participants come into this environment where they're being led by our avatar, and they go off on the avatar. They get mad and angry and frustrated because something won't work, and they'll say mean things to our avatar. But the good news is the avatar does not care. What we have seen in some cases is kids who are not accustomed to change, who are not accustomed to running into obstacles, who can't easily figure out why something's not working; we have had instances of kids who experience extreme frustration. But we've also seen them deescalate in a safe way, going through this novel experience with the guidance of someone, air quotes, because it's just our avatar, who is not going to take it personally. A feature of our avatar is that he has a heart that changes color depending on the emotion he's "feeling" based on how the student is reacting. And the student, even without us pointing it out, notices: wow, his heart has turned purple, his heart has turned green. And then the avatar will have a discussion about it if asked. So there is this calling of attention to emotions, which was also one of our main goals: to help students learn to read the person with whom they are communicating. Lisa, there are tons of issues and glitches that we continue to work through. We have a layer, on the emotional level, that also includes biometrics. Some students, if they opt in, wear a biometric device, and we will literally look at changes in their heart rate, changes in physiology that indicate the student is experiencing stress at certain points in the intervention. It's fascinating when it works, but what we know, because we've now worked with ten schools, is that nobody's technology is the same, and none of it is easy from a technology perspective yet. But what we're finding is that we are able to increase reciprocity, and it is a safer environment, especially for the kids who are a bit, I will say, explosive verbally when they feel frustrated. And I think those are some of the areas where I'm seeing the most potential from this type of research.
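For readers curious about what that biometrics layer might look like computationally, here is a simplified sketch of one common approach: flagging heart-rate samples that rise well above a rolling baseline. The window size and threshold are illustrative assumptions, not the project's calibrated model, and a flag should only prompt a human check-in, never act as a diagnosis.

    from collections import deque

    def stress_flags(heart_rates, window=30, ratio=1.25):
        """Flag samples where heart rate rises well above a rolling
        baseline. window is the number of samples in the baseline and
        ratio is how far above baseline counts as a possible stress
        response; both values are illustrative, not clinically tuned."""
        baseline = deque(maxlen=window)
        flags = []
        for hr in heart_rates:
            if len(baseline) == window and hr > ratio * (sum(baseline) / window):
                flags.append(True)   # possible stress; a human should check in
            else:
                flags.append(False)
            baseline.append(hr)
        return flags

    # A calm stretch followed by a spike during a frustrating task.
    samples = [72] * 30 + [75, 95, 102, 98, 80]
    print(stress_flags(samples))

As Lisa notes below, even a clean flag is only a conversation starter ("It appears you're stressed") that the student can confirm or correct.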

Lisa Dieker: 
Well, and it's funny, because, again, another really practical idea: I shared this with someone who's a board-certified behavior analyst, and she said, "I am going to print out pictures of Zoobee's hearts and hold them up to show kids how they're treating me during sessions." And I was like, "Oh, that's a really practical suggestion." So now we have this AI agent who is non-threatening saying, "Look, right now you're being red toward this human." And, you know, trying to build that reciprocity with AI agents is sometimes very safe. And what we often see is social stories or movies or videos that kids really do use as ways to learn. There's a great story about a kid with ASD for whom Disney movies became very helpful, and we believe that AI agents can do that too. But again, just like everything, you need the guardrail to be sure we're not making assumptions and forcing kids: "You must have an AI agent." One of our kids refused, and we were like, "Of course you did." Yet we had one girl who didn't like him in the beginning, and at the end she sings an "I love Zoobee" song and wants to take him home. There's a fine line there too. But again, it's a good example of AI having an impact. And the last piece I will add is that I think biometrics are fascinating. If I could turn back time 20 years to when Josh was born, I would love to have had a device, whether it's the Oura Ring or the Apple Watch or something, that says, "Wait for it, it's coming." You know, we have those devices. We have dogs that can sense seizures. We now have the Empatica that senses seizures. Wouldn't it be nice to know our student, our child, or whoever is struggling inside, and to get that data without making it public to everybody, but then to say to them, "It appears you're stressed." "Well, I'm not really stressed, I just sneezed." Okay, those are totally different kinds of discussions, because that data is not perfect yet. But it is what happens when you go to the hospital. When you go to the hospital, somebody takes your vital signs. God forbid you're in the ICU, somebody's watching those signs nonstop and decides you're better and moves you out of the ICU. And so I feel like that's where we need to go, within reason, making sure, again, it's safe and it's built upon helping you personally, instead of "let me decide what to do to you or for you." And I think that's where we go back to this person-centered idea. And that's what we're trying to do in this project, which I think is exciting.

Rebecca Hines: 
Well, it's been a fun exploration, Lisa. I know you will put in the link so that others can at least go and see the splash page for Project R.A.I.S.E. and get an understanding of the project. The biometrics part is what's next, and we're all looking forward to seeing where that leads us.

Lisa Dieker: 
Alright, well, thank you for joining us. If you have questions, please send us a tweet at Access Practical or you can send us a question on our Facebook page. Thanks for joining us!