Workshop on Rich Multimodal Application Development
The ways we interact with local and distributed devices, and with our environments, are moving beyond the traditional mouse and keyboard at an ever-increasing pace. Touchscreens, speech input, and motion input are now ubiquitous, yet they have been common on smartphones for only a few years. Smartphone cameras are now used for much more than taking pictures: today we take for granted applications like object recognition and QR code readers. Alongside these input technologies, a wide variety of sensors is arriving as well, from common ones such as GPS to a range of specialized medical and environmental sensors. And it's not just smartphones: platforms such as cars, home entertainment systems, games, and ebooks are becoming increasingly capable of sophisticated interaction with users.
All of this technology has the potential to support truly revolutionary applications. To make that possible, however, developers need easy-to-use, interoperable standards for integrating these technologies into systems. Without them, multimodal applications will remain expensive, complex, proprietary, and out of reach for small companies with limited resources. Standards can level the playing field. The upcoming workshop on Rich Multimodal Applications will show how W3C standards such as HTML5 and the W3C Multimodal Architecture can support exciting new forms of interaction on a wide variety of platforms. We'll do this with demos, overviews of current standards, and discussion of new use cases from many industries that will motivate new features of the standards. Please join us and help create the future of interaction!
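To give a flavor of what these standards look like in practice, the W3C Multimodal Interaction work includes EMMA (Extensible MultiModal Annotation), an XML format for representing user input in a uniform way regardless of modality. Below is a minimal sketch of how a speech recognizer might report the spoken word "Boston"; the application-specific element (`destination`) and the particular confidence value are illustrative, not taken from any spec example.

```xml
<!-- Sketch of an EMMA 1.0 result for a spoken utterance.
     The emma:* attributes describe how the input was captured;
     the payload element is defined by the application, not by EMMA. -->
<emma:emma version="1.0"
           xmlns:emma="http://www.w3.org/2003/04/emma">
  <emma:interpretation id="int1"
                       emma:medium="acoustic"
                       emma:mode="voice"
                       emma:confidence="0.75">
    <destination>Boston</destination>
  </emma:interpretation>
</emma:emma>
```

Because the same wrapper can carry input from touch, ink, or keyboard as well as speech, application logic can be written once against EMMA results rather than separately for each input device.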