Semantic Models for Adaptive Interactive Systems: User Interaction Templates for the Design of Lifelogging Systems

Part of the Human–Computer Interaction Series book series
Editors: Hussein, Tim; Paulheim, Heiko; Lukosch, Stephan; Ziegler, Jürgen; Calvary, Gaëlle

Publisher: Springer London
Copyright: © Springer-Verlag London 2013
ISBN: 978-1-4471-5300-9
Pages: 187–204
DOI: 10.1007/978-1-4471-5301-6_10

Abstract

A variety of life-tracking devices are emerging that make it possible to record our daily lives accurately and automatically through sensing technologies. Technology now allows us to record life activities passively and in previously unimaginable detail, in a process called lifelogging. Captured material may include text, photos/video, audio, location, Bluetooth logs and information from many other sensing modalities, all captured automatically by wearable sensors. Experience suggests that manually scanning through the full contents of these lifelogs is overwhelming and impractical. A promising approach is to apply visualization to large-scale, data-driven lifelogs as a means of abstracting and summarizing information. In this chapter, we outline various UI templates that support different visualization schemes.
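
As a rough illustration of the summarization step mentioned in the abstract, the short Python sketch below (not taken from the chapter; the event format, timestamps and modality names are assumed for illustration only) aggregates time-stamped lifelog events into per-hour, per-modality counts, the kind of compact summary that a timeline or radar-graph UI template could then render.

    from collections import Counter
    from datetime import datetime

    # Hypothetical lifelog events: (ISO timestamp, sensing modality).
    # A real lifelog would stream these from wearable sensors.
    events = [
        ("2013-05-14T08:05:00", "photo"),
        ("2013-05-14T08:40:00", "location"),
        ("2013-05-14T09:10:00", "photo"),
        ("2013-05-14T09:15:00", "bluetooth"),
        ("2013-05-14T13:30:00", "audio"),
    ]

    def hourly_summary(events):
        """Count events per (hour, modality) to abstract a day into a small grid."""
        counts = Counter()
        for ts, modality in events:
            hour = datetime.fromisoformat(ts).hour
            counts[(hour, modality)] += 1
        return counts

    for (hour, modality), n in sorted(hourly_summary(events).items()):
        print(f"{hour:02d}:00  {modality:<10} {n}")
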

Published: May 14, 2013

Keywords: Social Network Service; Wearable Sensor; Energy Expenditure Measurement; Radar Graph; Wearable Camera
