Interactive Gesture Chair


dc.contributor.author Arefin, Shamsul
dc.contributor.author Foysol, Muhaiminul Islam
dc.date.accessioned 2021-01-22T10:07:24Z
dc.date.available 2021-01-22T10:07:24Z
dc.date.issued 2015-11-15
dc.identifier.uri http://hdl.handle.net/123456789/793
dc.description Supervised by Dr. Md. Kamrul Hasan, Associate Professor, Islamic University of Technology en_US
dc.description.abstract A computer operator or an office worker nowadays spends most of his time sedentary. Technology is improving day by day and our routines are becoming increasingly mechanised: office employees have to sit in front of a PC for almost the whole day, and this sedentary behaviour is harmful to health, so people of all ages can suffer from health problems as a consequence. Doctors recommend making some movements during office hours to reduce the risk of chronic disease, and many approaches have therefore been proposed to keep people moving during work time. However, for most office workers it is difficult to achieve a considerable reduction of the time spent seated within the office environment. To promote physical activity even in such sedentary situations, this work explores the possibility of using interactive office furniture to smoothly integrate physical activity into the daily working routine. The chair is the piece of office furniture used most frequently, so we have built a system for interacting with the PC through the chair: by equipping a chair with motion sensors, the user's movements can be detected and used as input to the PC. This way, the "Interactive Gesture Chair" becomes an input device that is ubiquitously embedded into the working environment and gives an office worker the possibility of using body movements such as rotating, tilting, or bouncing the chair to intuitively control operations on a desktop computer. In our thesis work we first used thresholding to define the chair gestures/movements. Analysis of the results showed that threshold-based gestures vary with the weight of the user, i.e. people with different weights produce different threshold values for the same gesture. We then tried machine learning techniques so that the defined gestures would work for people of different weights: first the Euclidean distance method, then the Dynamic Time Warping algorithm, and finally a decision tree to find a universal threshold for the gestures. These defined gestures can be used to control many PC applications; we used them to control Windows Media Player. en_US
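To give a concrete sense of the gesture-definition step described in the abstract, the sketch below illustrates template matching of a chair-motion window with Dynamic Time Warping, one of the techniques the abstract names. It is not the thesis implementation: the gesture templates, the live sensor window, and the rejection threshold are all made-up placeholders, and only a single 1-D sensor channel is assumed.

import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(len(a)*len(b)) DTW distance between two 1-D sensor sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

def classify_gesture(window, templates, reject_threshold=15.0):
    """Return the label of the closest template by DTW distance,
    or None if even the best match exceeds the rejection threshold."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        dist = dtw_distance(np.asarray(window, float), np.asarray(template, float))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= reject_threshold else None

# Hypothetical usage: in practice, templates would be recorded per gesture
# (tilt, rotate, bounce) from the chair's motion sensor; these values and the
# rejection threshold are illustrative only.
if __name__ == "__main__":
    templates = {
        "tilt_forward": [0.0, 0.3, 0.8, 1.2, 0.9, 0.4, 0.0],
        "bounce":       [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0],
    }
    live_window = [0.1, 0.4, 0.9, 1.1, 0.8, 0.3, 0.1]
    print(classify_gesture(live_window, templates))

A threshold-based detector, by contrast, would simply compare a single feature (e.g. peak tilt angle) against a fixed cutoff, which is why it is sensitive to user weight as the abstract notes; DTW matches the whole movement trajectory instead.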
dc.language.iso en en_US
dc.publisher Department of Computer Science and Engineering, Islamic University of Technology, Gazipur, Bangladesh en_US
dc.title Interactive Gesture Chair en_US
dc.type Thesis en_US

