dc.identifier.citation |
[1] Alexander Gruenstein, Ian McGraw, Ibrahim Badr, “The WAMI toolkit for developing, deploying, and evaluating web-accessible multimodal interfaces,” in Proceedings of the 10th International Conference on Multimodal Interfaces, Chania, Crete, 2008.
[2] Kevin Christian, Bill Kules, Ben Shneiderman, Adel Youssef, “A comparison of voice controlled and mouse controlled web browsing,” in Proceedings of the Fourth International ACM Conference on Assistive Technologies, Arlington, Virginia, 2000.
[3] Pourang Irani, Sharon Oviatt, Matthew Aylett, Gerald Penn, Shimei Pan, Nikhil Sharma, Frank Rudzicz, Randy Gomez, Ben Cowan, Keisuke Nakamura, “Designing Speech, Acoustic and Multimodal Interactions,” in Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, Denver, Colorado, 2017.
[4] Vicki L. Hanson, John T. Richards, Chin Chin Lee, “Web Access for Older Adults: Voice Browsing?,” in International Conference on Universal Access in Human-Computer Interaction, Springer-Verlag Berlin Heidelberg, 2007.
[5] S. L. Hura, “Usability Testing of Spoken Conversational Systems,” Journal of Usability Studies, vol. 12, no. 4, pp. 155-163, 2017.
[6] Rajeev Agarwal, Yeshwant Muthusamy, Vishu Viswanathan, “Voice Browsing the Web for Information Access,” 1998. [Online]. Available: https://www.w3.org/Voice/1998/Workshop/RajeevAgarwal.html.
[7] Yael Dubinsky, Tiziana Catarci, Stephen Kimani, “A User-Based Method for Speech Interface Development,” in Stephanidis C. (ed.) Universal Access in Human-Computer Interaction. Coping with Diversity. UAHCI 2007. Lecture Notes in Computer Science, Berlin, Springer, 2007, pp. 355-364.
[8] Lewis Karl, Michael Pettey, Ben Shneiderman, “Speech-Activated versus Mouse-Activated Commands for Word Processing Applications: An Empirical Evaluation,” International Journal of Man-Machine Studies, pp. 667-687, 1993.
[9] S. Oviatt, “Interface techniques for minimizing disfluent input to spoken language systems,” in CHI '94 Conference Companion on Human Factors in Computing Systems, Boston, 1994.
[10] E. Protalinski, “VentureBeat.com,” 17 May 2017. [Online]. Available: https://venturebeat.com/2017/05/17/googles-speech-recognition-technology-now-has-a-4-9-word-error-rate/. [Accessed November 2017].
[11] Jhilmil Jain, Arnold Lund, Dennis Wixon, “The future of natural user interfaces,” in CHI '11 Extended Abstracts on Human Factors in Computing Systems, Vancouver, 2011.
[12] Andreia Sias Rodrigues, Vinicius Kruger da Costa, Rafael Cunha Cardoso, Marcio Bender Machado, Marcelo Bender Machado, Tatiana Aires Tavares, “Evaluation of a Head-Tracking Pointing Device for Users with Motor Disabilities,” in Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, Island of Rhodes, 2017.
[13] Pooja Patle, Snehal Waigaonkar, Jyoti Patil, Piyush Anjankar, “A Camera Mouse Application for Disable Person: A Review,” International Journal of Engineering Science and Computing, vol. 7, no. 9, 2017.
[14] João M. S. Martins, João M. F. Rodrigues, Jaime A. C. Martins, “Low-cost natural interface based on head movements,” in International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion (DSAI 2015), 2015.
[15] Matthew R. Williams, Robert F. Kirsch, “Evaluation of Head Orientation and Neck Muscle EMG Signals as Command Inputs to a Human-Computer Interface for Individuals With High Tetraplegia,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 16, no. 5, 2008.
[16] Amer Al-rahayfeh, Miad Faezipour, “Eye Tracking and Head Movement Detection: A State-of-Art Survey,” IEEE Journal of Translational Engineering in Health and Medicine, vol. 1, 2013.
[17] P. C. Anjankar, S. A. Waigaonkar, P. D. Patle, J. D. Patil, “A Camera Mouse - An Application for Disable Person,” International Journal of Computer Sciences and Engineering, vol. 6, no. 3, pp. 133-137, 2018.
[18] Tunhua Wu, Ping Wang, Yezhi Lin, “A Robust Noninvasive Eye Control Approach for Disabled People Based on Kinect 2.0 Sensor,” IEEE Sensors Letters, vol. 2, no. 3, 2017.
[19] I. Scott MacKenzie, Abigail Sellen, William A. S. Buxton, “A comparison of input devices in elemental pointing and dragging tasks,” in SIGCHI Conference on Human Factors in Computing Systems, New Orleans, 1991.
[20] I. S. MacKenzie, Human-Computer Interaction: An Empirical Research Perspective, Elsevier, 2013.
[21] Behnaz Yousefi, Xueliang Huo, Maysam Ghovanloo, “Using Fitts’s Law for Evaluating Tongue Drive System as a Pointing Device for Computer Access,” in Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2010.
[22] Jing Kong, Xiangshi Ren, “Calculation of Effective Target Width and Its Effects on Pointing Tasks,” IPSJ Journal, vol. 45, no. 5, pp. 1570-1572, 2006.
[23] João M. S. Martins, João M. F. Rodrigues, Jaime A. C. Martins, “Low-cost Natural Interface Based on Head Movements,” in Proceedings of the 6th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion, 2015. |
en_US |