MET:Gesture Based Computing in Education

Introduction

Gesture-based computing uses hand, eye, face, and body movements, often in combination with voice, to control different applications. It offers a more natural form of human-computer interaction (HCI). Input devices range from gloves and cameras to touch screens and controllers such as the Wiimote.

Examples of Gesture Based Technology Systems

File:JohnUnderkofflerTED2010.jpg
John Underkoffler of Oblong Industries, science advisor for the movie Minority Report, demonstrating G-Speak.[1]
  • Xbox Kinect: a motion sensor designed by Microsoft for use with the Xbox 360 and with PCs (via Evoluce). It does not require an intermediary device such as a controller.[2]
  • Leap Motion: a small device containing infrared cameras that accurately track movements, allowing users to control their computers in three dimensions using only hand and finger movements.
  • Nintendo Wii: the Wii Remote uses a combination of built-in accelerometers and infrared detection to sense its position, allowing control through gestures (a simple detection sketch follows this list).
  • Sony PlayStation 3 Motion Controller (Move): a handheld wand whose position is tracked by a camera and inertial sensors. On February 20, 2013, Sony announced the PlayStation 4, promising better tracking precision.
  • SixthSense: a wearable gestural interface device.
  • G-Speak: Oblong Industries' gestural interface technology, now a reality following John Underkoffler's advisory work on the film Minority Report.
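To make the Wii example concrete, here is a minimal sketch (in Python) of how a sideways "swing" gesture might be detected from raw accelerometer samples. The axis convention, units, and threshold are illustrative assumptions, not the Wii Remote's actual logic.

    # Minimal sketch: detecting a horizontal "swing" from accelerometer samples,
    # in the spirit of the Wii Remote's built-in accelerometers. The axis
    # convention and the 2 g threshold are illustrative assumptions.
    def detect_swing(samples, threshold=2.0):
        """samples: list of (x, y, z) acceleration tuples in units of g."""
        return any(abs(x) > threshold for x, y, z in samples)

    # Simulated readings: mostly at rest (~1 g of gravity on z), then a sharp
    # sideways movement that a game might interpret as a swing.
    readings = [(0.0, 0.1, 1.0), (0.2, 0.0, 1.1), (2.7, 0.3, 0.9), (0.1, 0.0, 1.0)]
    print(detect_swing(readings))  # True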

Although gesture-based computing is best known for its use in gaming applications, its potential in education is only starting to be explored.

Current Uses of Gesture Based Technology

Sign language recognition

Certain types of gesture recognition software can transcribe sign language into text, and such software can provide valuable and engaging opportunities for students to practice their signing.[3] The Kinect has proven well suited to sign language recognition.[4]
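As a rough illustration of how template-based recognition can work, the sketch below compares a recorded hand trajectory against labelled example trajectories using dynamic time warping (DTW) and returns the closest match. Real sign language recognizers (including Kinect-based ones) track many joints and use far richer models; the trajectories here are invented toy data.

    # Toy template-based sign recognition: compare a hand trajectory against
    # labelled templates with dynamic time warping and return the closest match.
    def dtw_distance(a, b):
        """DTW distance between two sequences of (x, y) hand positions."""
        inf = float("inf")
        n, m = len(a), len(b)
        d = [[inf] * (m + 1) for _ in range(n + 1)]
        d[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = ((a[i - 1][0] - b[j - 1][0]) ** 2
                        + (a[i - 1][1] - b[j - 1][1]) ** 2) ** 0.5
                d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
        return d[n][m]

    def classify(trajectory, templates):
        """templates: dict mapping a sign's label to one example trajectory."""
        return min(templates,
                   key=lambda label: dtw_distance(trajectory, templates[label]))

    templates = {  # hypothetical, drastically simplified trajectories
        "hello": [(0, 0), (1, 1), (2, 2)],
        "thanks": [(0, 0), (1, -1), (2, -2)],
    }
    print(classify([(0, 0), (1.1, 0.9), (2, 2.1)], templates))  # hello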

Rehabilitation therapies and assistive technology

Gesture recognition technologies are already in use to aid people with visual impairments and physical disabilities, and motion tracking provides a range of solutions for physical therapy and rehabilitation. The Kinect has been used for rehabilitation activities with stroke and brain injury patients at the Royal Berkshire Hospital.[2] Teaching tools for children with acquired brain injuries use camera-based gesture recognition, with no need for gloves or controllers. Because learning can occur at a slow rate after a brain injury, many trials may be necessary to acquire a new skill or piece of information. Gesture recognition software provides the opportunity for repetitive practice in a play-and-learn environment designed to reduce boredom and frustration, and the natural interaction it allows can help children with brain injuries overcome their disabilities.[5]

File:Facial Recognition.jpeg
Gesture-based technology can be used in novel ways to recognize and interpret facial expressions.[6]

Immersive game technology

Gestures are used to control interactions within video games to create a more interactive and immersive experience.

Affective computing

Gesture recognition technology can be used to identify emotional expression. Researchers at MIT are using gesture-based technology to provide an 'emotional hearing aid' that helps people with autism interpret facial expressions.[7] The Kinect has been used at the Lakeside Center for Autism to help children overcome various physical and social development difficulties.[2]

Applications for Education

Child Using Gesture Recognition Technology [8]

With the commercial success of systems like the Nintendo Wii and Microsoft Kinect, attention has been drawn to the potential of using gesture-based technology in the classroom.[9] It can be used as a tool for learning and interaction among users.[2] The Kinect is now open to outside development, which unlocks innovation for educational purposes.[9] KinectEDucation, a community of educators developing resources for the classroom (unconnected to Microsoft), is one example of such innovation.

Can action support cognition? Can direct touch support performance? These are questions posed by Ayelet Segal, whose research focused on whether gestures affect thought. She examined whether the physical manipulation of objects could benefit cognition and learning, improving performance in areas such as arithmetic. In her study, students who used a gestural interface to manipulate objects directly, rather than with a mouse, showed higher performance.[10]

Gesture-based technology will enable users to interact with ideas and to share and collaborate in new ways. Potential applications for education include:

Participatory Art

G-Speak from Oblong Industries used gestural technology to involve visitors at the Sundance Film Festival in an interactive film project: the TAMPER installation enabled participants to take elements from a number of different films and create something unique.[11] Similar participatory projects could be employed in a variety of educational settings.

Simulations/Training

Researchers in Sweden have created a virtual autopsy using a multi-touch table that allows CT scans to be manipulated with gestures.[12] Gestural interfaces can allow users to easily perform precise manipulations that are difficult with a mouse.[11] These affordances make precision training easier to replicate and more cost-effective.
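To give a flavour of the kind of manipulation involved, the snippet below computes the zoom factor implied by a two-finger pinch, the gesture typically used to scale an image on a multi-touch table. The coordinates are invented, and this is not taken from the virtual autopsy system itself.

    import math

    def pinch_scale(p1_start, p2_start, p1_end, p2_end):
        """Scale factor implied by a two-finger pinch: ratio of finger spreads."""
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])
        return dist(p1_end, p2_end) / dist(p1_start, p2_start)

    # Two fingers spread from 100 px apart to 200 px apart:
    # the CT slice doubles in size.
    print(pinch_scale((100, 300), (200, 300), (50, 300), (250, 300)))  # 2.0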

Physical Education

Gesture-based games such as Dance Dance Revolution (DDR), when trialed in a school environment, have been shown to increase physical fitness and motivation to participate in physical activity.[13]

Science and Math Education

Gesture-based technology can be used to explore the solar system or the digestive system in science classes. Kinect Math can be used in math education to provide a hands-on representation of concepts.

Music Education

File:Eye music.jpeg
Gestural interfaces can be used in music education, as seen here in EyeMusic.[14]

The EyeMusic project at the University of Oregon uses eye-tracking sensors to compose multimedia productions based on the movements of the user’s eyes.[11]
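A hedged sketch of the underlying idea: map where the eyes are looking on screen to a musical pitch, so that eye movements "play" notes. The mapping below is invented for illustration and is not EyeMusic's actual algorithm.

    # Illustrative mapping from gaze position to a MIDI-style pitch: the higher
    # the eyes look on screen, the higher the note. Invented for illustration.
    SCREEN_HEIGHT = 1080
    LOW_NOTE, HIGH_NOTE = 48, 84  # C3..C6 in MIDI numbering

    def gaze_to_note(gaze_y):
        """Map a vertical gaze coordinate (0 = top of screen) to a MIDI note."""
        fraction = 1.0 - gaze_y / SCREEN_HEIGHT  # top of screen = highest note
        return round(LOW_NOTE + fraction * (HIGH_NOTE - LOW_NOTE))

    for y in (0, 540, 1080):
        print(y, gaze_to_note(y))  # 84, 66, 48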

Special Education

Gesture-based technology can support students with hearing or visual impairments or physical disabilities by providing alternatives to mouse and keyboard input. As mentioned previously, students with autism spectrum disorders are using applications such as MIT's 'emotional hearing aid'.[7]

Implications for Designing Educational Media

Gestural interfaces promote a number of principles involved in successful educational design.

Accessibility

Gestural recognition applications may be more accessible to diverse populations of learners. Aspects of gestural language are universal and not specific to any spoken language.[9] The use of natural human movements removes barriers for learners of all ages, from two-year-olds to ninety-year-olds, and decreasing the reliance on the mouse and keyboard has the potential to increase accessibility.[12] Gesture-based shortcuts for web browsing have been shown to increase navigation efficiency and user enthusiasm.[15] Gestural recognition also has utility for users with different impairments and disabilities; for example, a user with a hearing impairment may prefer touch, gesture, or pen input.[16] Devices such as the Kinect may also provide greater lab-experience accessibility for students and schools with limited lab equipment, and may improve existing virtual labs by adding a gestural component.[17]
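The browsing shortcuts mentioned above were "flick" gestures. A minimal sketch of flick detection from two pointer samples follows; the thresholds and the pixels-per-second units are illustrative assumptions.

    # Minimal flick detector: a fast, mostly horizontal pointer stroke maps to
    # browser back/forward. The speed and direction thresholds are assumptions.
    def detect_flick(start, end, min_speed=1000.0):
        """start/end: (x, y, t) tuples in pixels and seconds."""
        (x0, y0, t0), (x1, y1, t1) = start, end
        dx, dy, dt = x1 - x0, y1 - y0, t1 - t0
        if dt <= 0 or abs(dx) < 2 * abs(dy):  # require a mostly horizontal stroke
            return None
        if abs(dx) / dt < min_speed:          # require flick-like speed (px/s)
            return None
        return "forward" if dx > 0 else "back"

    print(detect_flick((500, 300, 0.00), (380, 305, 0.08)))  # back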

Multimodality

Multimodal interfaces process two or more inputs in combination, for example speech with body movements. They are more flexible than keyboard-and-mouse interfaces, which may allow people with a wide range of communication styles to communicate more effectively. Many users prefer these interfaces, and they are potentially easier to learn.[16]
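A toy illustration of one common approach, "late fusion": a spoken command is paired with the pointing gesture that occurs closest to it in time. The event format and the 1.5-second fusion window are assumptions made for the sketch.

    # Toy late fusion of speech and gesture: each spoken command is resolved
    # using the gesture event nearest to it in time, within a fixed window.
    def fuse(speech_events, gesture_events, window=1.5):
        commands = []
        for text, t_speech in speech_events:
            nearby = [g for g in gesture_events if abs(g[1] - t_speech) <= window]
            if nearby:
                target, _ = min(nearby, key=lambda g: abs(g[1] - t_speech))
                commands.append((text, target))
        return commands

    speech = [("open", 2.0), ("delete", 9.0)]              # (word, time in s)
    gestures = [("folder_icon", 1.6), ("file_icon", 8.8)]  # (target, time in s)
    print(fuse(speech, gestures))
    # [('open', 'folder_icon'), ('delete', 'file_icon')]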

Voice and Personalization

Using a multimodal system that includes gesture recognition can empower users and give them a “voice” when interacting with technology.[16]

Interactivity/Immersion

File:Doll Talk.jpeg
Dolltalk, an example of the interactive and immersive experience offered by gesture-based technology [18]

Gesture recognition technology can immerse users in a variety of interactive activities. For example, Kinect Math involves the physical manipulation of objects such as graphs with gestures, providing learners with a physical and visual connection to the content. Another example is Dolltalk, a project by researchers from the Tangible Media Group that immerses learners in storytelling through gesture and narrative: using a variety of gestural and audio sensors, the system collects and analyzes information from users to interpret their narrative. If a picture is worth a thousand words, is a gesture-based experience worth a thousand pictures? Some believe this may be the case.[19] Supporters of using the Kinect in education suggest that the active component of learning provided by these devices will increase students' academic and social performance.[17]
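To indicate how a tool like Kinect Math might tie gesture to content, the sketch below maps a tracked hand height to the slope of a line, so raising or lowering a hand reshapes the graph. The mapping is invented for illustration; it is not Kinect Math's actual implementation.

    # Illustrative link between a tracked hand and a graph parameter: hand
    # height (0.0 = waist, 1.0 = overhead) sets the slope of y = m * x.
    def hand_height_to_slope(height, min_slope=-5.0, max_slope=5.0):
        height = max(0.0, min(1.0, height))  # clamp noisy sensor values
        return min_slope + height * (max_slope - min_slope)

    for h in (0.0, 0.5, 1.0):
        m = hand_height_to_slope(h)
        print(f"hand at {h:.1f} -> y = {m:+.1f}x")  # -5.0x, +0.0x, +5.0x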

Immediate Feedback

Another benefit of this technology is its ability to provide immediate feedback that shapes the learning process. An example is found in the success of gesture-based sign language games for deaf children: if the computer system does not understand the children's gestural input, they refine their signing until it is correct. Children were willing to make multiple attempts and were enthusiastic about using the game.[3]
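The practice-until-recognized loop described above can be sketched as follows. The recognizer here is a stand-in stub that returns a label with a confidence score, and the 0.8 threshold is an illustrative assumption.

    # Sketch of the feedback loop: the learner retries until the recognizer's
    # confidence clears a threshold. Attempts are simulated (label, confidence).
    def practice_sign(attempts, target, threshold=0.8):
        for i, (label, confidence) in enumerate(attempts, start=1):
            if label == target and confidence >= threshold:
                return f"Recognized '{target}' on attempt {i} - well done!"
            print(f"Attempt {i}: not recognized, try again")
        return f"Keep practicing '{target}'"

    # The child's signing improves with each repetition.
    attempts = [("hello", 0.45), ("hello", 0.62), ("hello", 0.91)]
    print(practice_sign(attempts, "hello"))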

Collaboration/Community Building

Gesture-based technology can aid the collaborative process and encourage group interaction.[12] Systems such as the Kinect may increase classroom participation and create positive interactions and opportunities for discussion.[2] Researchers are exploring the role of cooperative gestures (multi-user gestural interactions) in increasing the sense of teamwork and encouraging equitable participation, playfulness, learning, and collaborative creativity.[20][21]
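As a small illustration of a cooperative gesture, the check below fires only when two different tracked users perform the same gesture within a short time window, the kind of multi-user condition this research explores. The event format and the one-second window are assumptions.

    # Toy cooperative-gesture check: an action triggers only when two different
    # users perform the same gesture within a short window of each other.
    def cooperative_gesture(events, gesture="raise_hand", window=1.0):
        seen = {}  # user -> most recent time they performed the gesture
        for user, name, t in events:
            if name != gesture:
                continue
            for other, t_other in seen.items():
                if other != user and abs(t - t_other) <= window:
                    return True
            seen[user] = t
        return False

    events = [("alice", "raise_hand", 3.2), ("bob", "wave", 3.4),
              ("bob", "raise_hand", 3.9)]
    print(cooperative_gesture(events))  # True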

Integration of Gesture Based Technology

To successfully integrate gestural technology in the classroom, a number of usability challenges and design considerations must be addressed. One of the biggest is the evolution of a new vocabulary of body motions.[9] There are currently many different platforms for gesture-based technology, and although movement is usually quite intuitive, no standard gestural language exists.[22]
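The missing standard can be seen in a simple mapping table: the same physical motion ends up bound to different actions on different platforms. The platforms and bindings below are hypothetical.

    # The same motion, bound differently by two hypothetical platforms -- a
    # small illustration of the absent standard gestural vocabulary.
    PLATFORM_A = {"swipe_left": "next_slide", "circle": "undo"}
    PLATFORM_B = {"swipe_left": "go_back", "circle": "select_all"}

    for gesture in PLATFORM_A:
        print(f"{gesture}: {PLATFORM_A[gesture]} vs {PLATFORM_B[gesture]}")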

Although gestural technology is advancing at a rapid pace, systems like the Kinect that are financially accessible to classrooms generally work well but may have difficulty recognizing high-speed hand signals and input from multiple users at the same time.[9]

Other considerations include the physical setup of the classroom and the availability of devices. The use of a single device in a large-group setting seems somewhat impractical; smaller-scale devices that interact with each other may provide a better experience.[9] If only a single device is available, consideration must be given to how best to arrange the classroom to facilitate maximal use.[23]

Many applications of gesture-based technology are game based. A paradigm shift may be needed before educators further accept games as learning tools or see the utility of using gaming devices such as the Kinect solely as input devices.[23] This may involve professional development to help educators evaluate games for their creative and active learning outcomes.[2]

Another consideration is the assumption that, when using a multimodal system, for example one that accepts both gestural and audio input, users will actually interact multimodally.[16] Design will need to consider the full potential of multimodal systems involving gestural technology and ensure that users understand the affordances available for interacting in a multimodal manner.

Although there are benefits to tangible interaction in learning, situations exist where tangible interaction is less useful and other methods of interaction are more suitable. A hybrid model has been suggested as most appropriate, allowing educators and learners the flexibility to choose the interaction style best suited to the situation.[24]

Cost is a concern for most educational institutions. Although gesture-based technology may seem out of reach for the everyday user, there are many cost-effective options. For example, Leap Motion offers a sensor that tracks simple hand and finger gestures for $70,[25] and Microsoft offers the Kinect at around the same price point.[26] Integrating this technology into an educational environment is a one-time purchase, and more and more applications are being developed by the open source community.[19]

Future

Commonly used input technologies such as the keyboard and mouse can create major bottlenecks in performing tasks; technologies like hand gesture recognition may ameliorate this problem and allow humans to interact with machines more naturally. Considering the relative infancy of research into vision-based gesture recognition, remarkable progress has been made.[27]
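A hedged sketch of one classic vision-based approach: segment skin-coloured pixels and treat the largest contour as the hand. It assumes OpenCV 4 (cv2) with a connected webcam; the HSV bounds are rough assumptions that vary with lighting and skin tone, and modern systems use learned models instead.

    # Classic colour-segmentation sketch of vision-based hand detection with
    # OpenCV: threshold skin tones in HSV, keep the largest contour as the hand.
    import cv2
    import numpy as np

    def find_hand(frame_bgr):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        # Rough skin-tone range; varies with lighting and skin tone.
        mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        return max(contours, key=cv2.contourArea)  # largest blob ~ the hand

    cap = cv2.VideoCapture(0)  # default webcam
    ok, frame = cap.read()
    if ok:
        hand = find_hand(frame)
        print("hand found" if hand is not None else "no hand detected")
    cap.release()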

Many agree that gestural technology is still in its infancy and that the full force of its potential has yet to be felt.[19] Technologies such as eye-tracking devices may become commonplace, with eye movements used to open an email or launch a desktop application.[28] There is even the suggestion that the keyboard and mouse may one day disappear altogether.[29]

As gesture-based technology evolves, and evolves quickly, users have some opportunity to prepare. Several contests are in place to find the best applications for gesture-based learning tools, and gesture-based content is developing across many disciplines, driving innovation.[9] Kinect user groups also recognize the need for more applications, whether user generated or commercial, for successful adoption in educational contexts.[23]

For gesture-based computing to become a full educational reality, innovation will need to involve interdisciplinary collaboration as new ways of teaching, learning, and communicating are explored.[12] The ability of multimodal interfaces to model and interpret human movement and language is improving and will continue to evolve, presenting new opportunities for creating and interacting.[16]

Note: During the process of researching this page, it was difficult to find peer-reviewed material related to the use of gesture-based technology in the classroom. Much of the information available came from educators' informal experiences with systems such as the Kinect. I believe that this page will evolve considerably in future iterations as the technology becomes more commonplace and accessible.

References

  1. Retrieved on March 2, 2013, from http://commons.wikimedia.org/wiki/File:John_Underkoffler,_TED_2010.jpg
  2. Kandroudi, M., & Bratitsis, T. (2012). Exploring the Educational Perspectives of XBOX Kinect Based Video Games. In P. Felicia (Ed.), Proceedings of the 6th European Conference on Games Based Learning, 219-227, Cork, Ireland.
  3. Henderson, V., Lee, S., Brashear, H., Hamilton, H., Starner, T., & Hamilton, S. (2005). Development of an American Sign Language Game for Deaf Children. In Proceedings of the International Conference on Interaction Design and Children (IDC 2005), Boulder, CO, June 2005. doi:10.1145/1109540.1109550
  4. Lang, S. (2011). Sign Language Recognition with Kinect. Unpublished bachelor's thesis, Free University of Berlin. http://page.mi.fu-berlin.de/block/abschlussarbeiten/Bachelor-Lang.pdf
  5. Pirani, E., & Kolte, M. (2010). Gesture Based Educational Software for Children with Acquired Brain Injuries. International Journal on Computer Science & Engineering, 1(3), 790-794.
  6. Retrieved on March 2, 2013, from http://www.xbox.com/en-GB/Kinect/Kinect-Effect
  7. el Kaliouby, R., & Robinson, P. (2005). The Emotional Hearing Aid: An Assistive Tool for Children with Asperger Syndrome. Universal Access in the Information Society, 4(2), 121-134. doi:10.1007/s10209-005-0119-0
  8. Retrieved on March 2, 2013, from http://commons.wikimedia.org/wiki/File:Gesture_Recognition.jpg
  9. Evans, M. (2012). Gestural Interfaces in Learning. In P. Resta (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference 2012 (pp. 3337-3340). Chesapeake, VA: AACE.
  10. Segal, A. (2011). Do Gestural Interfaces Promote Thinking? Embodied Interaction: Congruent Gestures and Direct Touch Promote Performance in Math. ProQuest LLC.
  11. Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon Report. New Media Consortium.
  12. Johnson, L., Adams, S., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition. New Media Consortium.
  13. O'Hanlon, C. (2007). Gaming: Eat Breakfast, Drink Milk, Play Xbox. T.H.E. Journal, 34(4), 34-39.
  14. Retrieved on March 2, 2013, from http://www.cs.uoregon.edu/Research/cm-hci/EyeMusic/
  15. Moyle, M., & Cockburn, A. (2005). A Flick in the Right Direction: A Case Study of Gestural Input. Behaviour & Information Technology, 24(4), 275-288. doi:10.1080/01449290512331321866
  16. Oviatt, S. (2008). Multimodal Interfaces. In A. Sears & J. Jacko (Eds.), The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications (pp. 413-432). New Jersey: Lawrence Erlbaum.
  17. KinectEDucation. (2011). Five benefits of using Kinect in education. Retrieved February 25, 2013, from http://www.kinecteducation.com/blog/2011/07/02/5-benefits-of-using-kinect-in-education/
  18. MIT Tangible Media Group. Image, p. 87. Retrieved on March 2, 2013, from http://tmg-trackr.media.mit.edu:8020/SuperContainer/RawData/Papers/446-Play%20it%20by%20eye/Published/PDF
  19. Blum, M. (2011). After the Kinect, What's Next for Gesture Recognition Technology? Wired. Retrieved February 15, 2013, from http://www.wired.com/geekdad/2011/09/after-the-kinect-whats-next-for-gesture-recognition-technology/
  20. Morris, M.R., Huang, A., Paepcke, A., & Winograd, T. (2006). Cooperative Gestures: Multi-User Gestural Interactions for Co-located Groupware. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Montreal, QC. doi:10.1145/1124772.1124952
  21. Rick, J., Marshall, P., & Yuill, N. (2011). Beyond One-Size-Fits-All: How Interactive Tabletops Support Collaborative Learning. In Proceedings of the 10th International Conference on Interaction Design and Children, 109-117, Ann Arbor, MI. doi:10.1145/1999030.1999043
  22. Elgan, M. (2013). Opinion: Gesture-based interfaces are out of control. Digital Arts Online. Retrieved February 25, 2013, from http://www.digitalartsonline.co.uk/news/interactive-design/gesture-based-interfaces-are-out-of-control
  23. KinectEDucation. (2011). Three things Kinect needs to be successful in education. Retrieved February 25, 2013, from http://www.kinecteducation.com/blog/2011/07/30/3-things-kinect-needs-to-be-successful-in-education/
  24. Horn, M.S., Crouser, R.J., & Bers, M.U. (2012). Tangible Interaction and Learning: The Case for a Hybrid Approach. Personal and Ubiquitous Computing, 16(4), 379-389. doi:10.1007/s00779-011-0404-2
  25. Moore, N. (2013). Leap Motion sensor offers 3D gesture control at an affordable price. Gizmag. Retrieved February 25, 2013, from http://www.gizmag.com/leap-motion-gesture-control-sensor/22644/
  26. Microsoft Store. Kinect product information. Retrieved February 25, 2013, from http://www.microsoftstore.com/store/msstore/pd/Kinect-for-Xbox-360-Refurbished/productID.226908400
  27. Garg, P., Aggarwal, N., & Sofat, S. (2009). Vision Based Hand Gesture Recognition. World Academy of Science, Engineering and Technology, 25, 972-977.
  28. Rubens, P. (2012). The end for keyboards and mice? BBC Future. Retrieved February 25, 2013, from http://www.bbc.com/future/story/20121023-the-end-for-keyboard-and-mice
  29. Firth, N. (2013). Hands on with Leap Motion's gestural interface. New Scientist. Retrieved February 25, 2013, from http://www.newscientist.com/blogs/onepercent/2013/01/hands-on-with-leap-motion.html