Making Pen-Based Interaction Intelligent and Natural:
Papers from the AAAI Fall Symposium
Randall Davis, James Landay, Tom Stahovich, Rob Miller, and Eric Saund, Cochairs
October 21-24, 2004, Arlington, Virginia
Technical Report FS-04-06
174 pp., $35.00
ISBN 978-1-57735-217-4
With the growing interest in and use of PDAs and tablet computers, pen-based interaction has become an area of increasing research interest and practical consequence. To date, however, most pen-based interaction is still done either with traditional mouse motions or with an artificial gesture language like Palm's Graffiti. This symposium aimed to explore what it would take to make intelligent pen-based interaction feel much more like the kind of writing and drawing we routinely do on paper. What would it take to make sketching on a tablet computer, for example, feel as natural as sketching on paper, yet have the computer understand what is being drawn? Can we extend the interaction so that the system also understands the often fragmentary speech and the variety of hand gestures that accompany drawing in environments like collaborative design reviews? How can multimodal input such as pen strokes, speech, and gestures be naturally combined and used for mutual disambiguation? Solving these challenges would provide an enormous advance over traditional tools for tasks like design and brainstorming.

The central goal of the symposium was to provide a focus for the growing community interested in making pen-based computing more natural by making it smarter, and in uses of pen-based computing that go beyond handwriting recognition. It was also an opportunity to cross-fertilize research in AI and HCI, aiming on one hand to make human-computer interaction more natural by making it smarter, and on the other to infuse AI research with the insights and expertise of the HCI community.