While conducting research for another project, we had a chance to observe the interactions between teachers and children at the Lionheart School for children with autism. We sat in on their therapy sessions, watched how they used technology such as smartboards, and noted what did and didn’t engage the children. Some online research turned up encouraging reports of the Kinect being used as part of therapy. We wanted to take this a step further and create a customized application for children with autism.
Our vision was simple: this would just be another tool in the toolbox that teachers and therapists have – one that leveraged technology to engage rather than alienate children from their teachers and peers.
Being a fly on the wall
We were super lucky that the teachers and therapists at the Lionheart school were so supportive. We went to the school a couple of times every week, and sat in on their classes, activities and therapy sessions. We also conducted a lot of interviews with the teachers and some students to understand what made them tick. The initial response was positive – most students took to technology enthusiastically.
The ethnographic research led us to our main design concept: we would recreate a storybook read-aloud experience, using the Kinect to engage the students and immerse them within the story.
The teachers already used a lot of props, pictures, singing and acting to help the children visualize the story, and we wanted to capitalize on this. Stories were a big part of the daily routine at the school – the teachers used story-time to encourage the students to answer questions, build empathy, listening and language skills, improve attention span, and exercise their curiosity and creativity.
The final application let the teacher narrate the story of Jack and the Beanstalk to a child, who stood in front of a large screen and a Kinect. The screen showed each page of the storybook, and the teacher could flip to the next page using a remote.
- Some pages showed the pictures as black outlines, and the screen offered four colors that the child could “pick up” and use to color in the pictures.
- Some pages had sound effects that the teachers could trigger, which they typically used to encourage the child to answer questions.
- Some pages had objects missing from the picture, and the screen showed a few options. The child could gesturally drag and drop an object onto the page in response to the teacher’s questions.
- On a couple of pages, the child could actually control the character on screen through their movements. For example, when Jack was running away from the Giant, the teacher would encourage the child to make a climbing-down motion, and the faster the child “climbed down”, the faster the little Jack moved down the beanstalk in the picture.
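To give a feel for how the last interaction might work, here is a minimal sketch of the climb-speed mapping: track the vertical velocity of a hand joint between frames and scale the on-screen character’s descent speed accordingly. All names and threshold values here are illustrative assumptions, not code from the actual project, and in the real application the joint coordinates would come from the Kinect SDK’s skeleton stream.

```csharp
using System;

// Hypothetical sketch: the faster the child's hand moves downward,
// the faster Jack descends. Hand Y positions are in meters (up is
// positive), as the Kinect SDK reports them; the constants are
// made-up tuning values for illustration.
public static class ClimbSpeed
{
    const double MaxHandSpeed = 1.5;   // m/s treated as "fastest" climbing
    const double MaxJackSpeed = 200.0; // on-screen pixels per second

    // Map the hand's downward velocity over one frame to Jack's descent speed.
    public static double DescentSpeed(double prevHandY, double currHandY, double dtSeconds)
    {
        if (dtSeconds <= 0) return 0;
        // Positive when the hand is moving down.
        double downwardVelocity = (prevHandY - currHandY) / dtSeconds;
        if (downwardVelocity <= 0) return 0; // ignore upward or no motion
        double normalized = Math.Min(downwardVelocity / MaxHandSpeed, 1.0);
        return normalized * MaxJackSpeed;
    }
}
```

Clamping at `MaxHandSpeed` keeps a single over-enthusiastic swipe from sending the character flying, which matters when the users are excited children.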
We coded everything in C# using the Kinect SDK, which was surprisingly easy to do. The harder parts were figuring out which gestures were intuitive for each action and customizing the gesture recognition to work well with children. We did a lot of iterative design and testing with the teachers and students at the Lionheart school and got a lot of excellent feedback.
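One common way to make gesture recognition more forgiving of jittery, child-sized movements is to smooth the raw joint positions before they reach the detector – for instance with a short moving average. The sketch below is purely illustrative (the post doesn’t show the project’s actual filtering code), but it captures the idea.

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: average each joint coordinate over the last
// few frames so small tremors don't trigger or break a gesture.
public class JointSmoother
{
    private readonly int _windowSize;
    private readonly Queue<double> _window = new Queue<double>();

    public JointSmoother(int windowSize) { _windowSize = windowSize; }

    // Feed in a raw coordinate; get back the mean of the last N samples.
    public double Smooth(double rawValue)
    {
        _window.Enqueue(rawValue);
        if (_window.Count > _windowSize)
            _window.Dequeue();
        return _window.Average();
    }
}
```

A larger window gives steadier output but adds lag, so in practice the window size is something to tune with the users – exactly the kind of parameter our iterative testing sessions were good at settling.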
Simple changes, such as tinting the little “palm” cursor with the currently selected color, made it a lot easier for the children to color in the pictures. We also taped a rectangle onto the carpet within which we wanted the children to stay, so that the Kinect would identify them correctly.
The highlights of the experience
The students all really enjoyed the experience, especially the climbing up and down part. In fact, when we left the system on and running, we observed some of the older children narrating the story to some younger children – a great win!
The iterative design and testing we conducted showed that interactive approaches to storytelling hold a lot of promise. Some further approaches we would like to study include:
1. Adding support for multiple students to interact with the story at the same time – perhaps cooperative activities that influence the story.
2. Creating an application for remote storytelling incorporating video and audio chat.
We won first prize in the “Health” category at Georgia Tech’s Convergence Innovation Competition. We were also covered by 11Alive, a local TV news channel.