Bibliography COGAIN

From COGAIN: Communication by Gaze Interaction (hosted by the COGAIN Association)

COGAIN related publications by COGAIN partners

Aoki, H., Hansen, J.P. and Itoh, K. (2008). Learning to interact with a computer by gaze. Behaviour and Information Technology 27(4), 339-344
http://www.itu.dk/docadm/detail.php?DocID=1696

Aula, A., Majaranta, P. and Räihä, K.-J. (2005). Eye-tracking Reveals the Personal Styles for Search Result Evaluation. Human-Computer Interaction - INTERACT 2005, Lecture Notes in Computer Science 3585, Springer-Verlag, September 2005, 1058-1061
http://www.springerlink.com/openurl.asp?genre=article&id=doi:10.1007/11555261_104

Barth, E., Dorr, M., Böhme, M., Gegenfurtner, K.R. and Martinetz, T. (2006). Guiding the mind's eye: improving communication and vision by external control of the scanpath. In B. Rogowitz and T.V. Papathomas, editors, Human Vision and Electronic Imaging, volume 6057 of Proc. SPIE. Invited contribution for a special session on Eye Movements, Visual Search, and Attention: a Tribute to Larry Stark.
http://www.inb.uni-luebeck.de/publications/pdfs/BaDoBoGeMa06.pdf

Bates, R., Istance, H.O., Donegan, M. and Oosthuizen, L. (2005). Fly Where You Look: Enhancing Gaze Based Interaction in 3D Environments. In Proceedings of HCI International 2005, Volume 7 - Universal Access in HCI: Exploring New Interaction Environments, Caesars Palace, Las Vegas, USA, July 22-27, 2005.

Bates, R., Donegan, M., Istance, H.O., Hansen, J.P. and Räihä, K.-J. (2007). Introducing COGAIN: communication by gaze interaction. Universal Access in the Information Society (UAIS) 6, 2, 159-166
http://www.springerlink.com/content/k62q412852250323/

Bates, R., Donegan, M., Istance, H.O., Hansen, J.P. and Räihä, K.-J. (2006). Introducing COGAIN – Communication by Gaze Interaction. Proceedings of the 3rd Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT), Fitzwilliam College, University of Cambridge, 10th - 12th April 2006

Bates, R. and Istance, H.O. (2005). Towards eye based virtual environment interaction for users with high-level motor disabilities. Special Issue of International Journal of Disability & Human Development: The International Conference Series on Disability, Virtual Reality and Associated Technologies, Vol. 4(3)

Bates, R., Istance, H.O. and Vickers, S. (2008). Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users. Proceedings of the 4th Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT 2008). University of Cambridge, 13-16 April 2008.
http://www.cse.dmu.ac.uk/~svickers/pdf/CWUAAT%202008.pdf

Bonino, D. and Garbo, A. (2006). An Accessible Control Application for Domotic Environments. First International Conference on Ambient Intelligence Developments, September 2006, Sophia-Antipolis, pp.11-27. Ed. Springer-Verlag, ISBN-10: 2-287-47469-2
http://www.cad.polito.it/FullDB/exact/amid06_1.html

Borello, F., Bonino, D. and Corno, F. (2006). Accessing Ambient Intelligence through Devices with Low Computational and Communication Power. First International Conference on Ambient Intelligence Developments, September 2006, Sophia-Antipolis, pp.93-107. Ed. Springer-Verlag, ISBN-10: 2-287-47469-2
http://www.cad.polito.it/FullDB/exact/amid06_2.html

Böhme, M. and Barth, E. (2005). Challenges in Single-Camera Remote Eye-Tracking. In 1st Conference on Communication by Gaze Interaction (COGAIN), Copenhagen, Denmark
http://www.inb.uni-luebeck.de/publications/pdfs/BoBa05.pdf

Böhme, M., Dorr, M., Graw, M., Martinetz, T. and Barth, E. (2008). A software framework for simulating eye trackers. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08). ACM, New York, 251-258.
http://doi.acm.org/10.1145/1344471.1344529

Böhme, M., Dorr, M., Martinetz, T. and Barth, E. (2006). Gaze-contingent temporal filtering of video. In Proceedings of Eye Tracking Research & Applications (ETRA), pages 109-115
http://doi.acm.org/10.1145/1117309.1117353

Böhme, M., Martinetz, T. and Barth, E. (2005). A Gaze-Estimation Algorithm for Single-Camera Remote Eye Tracking. In Proceedings of the BIP Workshop on Bioinspired Information Processing, Lübeck, Germany

Calvo, A., Chiò, A., Castellina, E., Corno, F., Farinetti, L., Ghiglione, P., Pasian, V. and Vignola, A. (2008). Eye Tracking Impact on Quality-of-Life of ALS Patients. In Computers Helping People with Special Needs (ICCHP 2008), LNCS Vol 5105/2008, Berlin: Springer, pp. 70-77.
http://dx.doi.org/10.1007/978-3-540-70540-6_9

Castellina, E., Corno, F. and Pellegrino, P. (2008). Integrated speech and gaze control for realistic desktop environments. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA 2008). ACM, New York, NY, 79-82.
http://doi.acm.org/10.1145/1344471.1344492

Cerrolaza, J. J., Villanueva, A. and Cabeza, R. (2008). Taxonomic study of polynomial regressions applied to the calibration of video-oculographic systems. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08), ACM, New York, NY, 259-266.
http://doi.acm.org/10.1145/1344471.1344530

Corno, F. and Garbo, A. (2005). Multiple Low-cost Cameras for Effective Head and Gaze Tracking. 11th International Conference on Human-Computer Interaction, Las Vegas, USA, July 2005
http://www.cad.polito.it/FullDB/exact/hci05.html

Daunys, G. and Ramanauskas, N. (2004). The accuracy of eye tracking using image processing. In Proceedings of NordiCHI '04, October 23-27, 2004, Tampere, Finland, ACM Press, pp. 377-380
http://portal.acm.org/citation.cfm?id=1028074

Daunys, G. and Ramanauskas, N. (2006). Nonlinear Mapping of Pupil Centre Coordinates from Image Sensor to Screen for Gaze Control Systems. K. Miesenberger et al. (Eds.): ICCHP 2006, LNCS 4061, pp. 962-965
http://dx.doi.org/10.1007/11788713_140

Donegan, M., Oosthuizen, L., Bates, R., Daunys, G., Hansen, J. P., Joos, M., Signorile, I. and Majaranta, P. (2005). Providing eye control for those who need it most - a study on user requirements. Abstracts of the 13th European Conference on Eye Movements, ECEM13, JEMR Vol. 1
http://www.jemr.org/index.php?page=20

Fejtová, M., Novák, P. and Štěpánková, O. (2008). EasyControl - Universal Control System. Computers Helping People with Special Needs. LNCS Vol 5105/2008, Berlin: Springer, pp. 1024-1029
http://dx.doi.org/10.1007/978-3-540-70540-6_153

Graupner, S. T., Velichkovsky, B. M., Pannasch, S. and Marx, J. (2007). Surprise, surprise: Two distinct components in the visually evoked distractor effect. Psychophysiology, 44(2), 251-261
http://rcswww.urz.tu-dresden.de/~cogsci/pdf/graupner2007.pdf

Graupner, S.T., Heubner, M., Pannasch, S. and Velichkovsky, B.M. (2008). Evaluating requirements for gaze-based interaction in a see-through head mounted display. In Proceedings of the 2008 symposium on Eye tracking research & applications (ETRA 2008), ACM, New York, pp. 91-94.
http://doi.acm.org/10.1145/1344471.1344495

Hansen, D. W., Skovsgaard, H. H., Hansen, J. P. and Møllenbach, E. (2008). Noise tolerant selection by gaze-controlled pan and zoom in 3D. Proceedings of the Eye Tracking Research & Applications Symposium (ETRA 2008). ACM, New York, NY, 205-212
http://doi.acm.org/10.1145/1344471.1344521

Hansen, D.W. and Ji, Q. (2009). In the Eye of the Beholder: A Survey of Models for Eyes and Gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, published online 23 Jan. 2009, IEEE Computer Society Digital Library
http://doi.ieeecomputersociety.org/10.1109/TPAMI.2009.30

Hansen, D.W. and Hammoud, R.I. (2007). An improved likelihood model for eye tracking. Computer Vision and Image Understanding 106 (2007), 220-230
http://dx.doi.org/10.1016/j.cviu.2006.06.012

Hansen, J.P., Lund, H., Aoki, H. and Itoh, K. (2006). Gaze communication systems for people with ALS. Presented at the ALS Workshop, in conjunction with the 17th International Symposium on ALS/MND, Yokohama, Japan (File:ALS Workshop Yokohama2006.pdf)

Hansen, J.P., Itoh, K., Aoki, H. and Lund, H. (2007). New usability metrics for the evaluation of eye typing systems. Abstracts of the 14th European Conference on Eye Movements, ECEM2007, JEMR, Vol. 1
http://www.jemr.org/contents/volume_1/ecem2007_special/oral_presentations/usability

Helmert, J. R., Pannasch, S. and Velichkovsky, B. M. (2008). Influences of dwell time and cursor control on the performance in gaze driven typing. Journal of Eye Movement Research, 2(4):3, 1-8.
http://www.jemr.org/online/2/4/3

Helmert, J.R., Pannasch, S. and Velichkovsky, B.M. (2007). From gaze mouse to attentive interfaces: three selected problems. Abstracts of the 14th European Conference on Eye Movements, ECEM2007, JEMR, Vol. 1
http://www.jemr.org/contents/volume_1/ecem2007_special/oral_presentations/usability

Hyrskykari, A., Majaranta, P. and Räihä, K.-J. (2005). From Gaze Control to Attentive Interfaces. Proceedings of HCII 2005, Las Vegas, NV, July 2005
http://www.cs.uta.fi/hci/gaze/docs/Hyrskykari-HCII2005.pdf

Istance, H.O., Bates, R., Hyrskykari, A. and Vickers, S. (2008). Snap clutch, a moded approach to solving the Midas touch problem. Proceedings of the 2008 symposium on Eye Tracking Research & Applications (ETRA 2008), ACM Press, 221-228.
http://doi.acm.org/10.1145/1344471.1344523

Itoh, K., Aoki, H. and Hansen, J. P. (2006). A comparative usability study of two Japanese gaze typing systems. Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, ETRA '06. ACM Press, New York, NY, 59-66
http://doi.acm.org/10.1145/1117309.1117344

Johansen, A.S. and Hansen, J.P. (2006). Augmentative and alternative communication: the future of text on the move. Universal Access in the Information Society (UAIS), 5, 2
http://www.springerlink.com/content/j83t41473p555541/

Laurutis, V. (2008). Channel Information Capacity of the Sensomotor System of the Eye. Electronics and Electrical Engineering 5(85), 85-88.
http://www.ee.ktu.lt/journal/2008/5/21_ISSN_1392-1215_Channel%20Information%20Capacity%20of%20the%20Sensomotor%20System%20of%20the%20Eye.pdf

Majaranta, P., Aula, A. and Räihä, K.-J. (2004). Effects of Feedback on Eye Typing with a Short Dwell Time. Proceedings of the Eye Tracking Research & Applications Symposium, ETRA 2004, New York: ACM Press, pp. 139-146
http://doi.acm.org/10.1145/968363.968390

Majaranta, P., MacKenzie, I.S., Aula, A. and Räihä, K.-J. (2003). Auditory and visual feedback during eye typing. Extended Abstracts on Human Factors in Computing Systems, CHI '03, New York: ACM Press, 766-767
http://doi.acm.org/10.1145/765891.765979

Majaranta, P., MacKenzie, I. S., Aula, A. and Räihä, K.-J. (2006). Effects of feedback and dwell time on eye typing speed and accuracy. Universal Access in the Information Society, 5(2), 199-208
http://dx.doi.org/10.1007/s10209-006-0034-z

Majaranta, P. and Räihä, K.-J. (2005). Communication by gaze interaction - in search of new solutions. Abstracts of the 13th European Conference on Eye Movements, ECEM13, JEMR Vol. 1
http://www.jemr.org/index.php?page=20

Majaranta, P. and Räihä, K.-J. (2007). Text Entry by Gaze: Utilizing Eye-Tracking. In I.S. MacKenzie and K. Tanaka-Ishii (Eds.), Text entry systems: Mobility, accessibility, universality. pp. 175-187. San Francisco: Morgan Kaufmann.
http://www.cs.uta.fi/~curly/publications/Majaranta_and_Raiha_2007--DRAFT.pdf

Majaranta, P. and Räihä, K.-J. (2002). Twenty Years of Eye Typing: Systems and Design Issues. Proceedings of the Eye Tracking Research & Applications Symposium, ETRA 2002, New York: ACM Press, 15-22 (File:Paper 2002 3.pdf)
http://doi.acm.org/10.1145/507072.507076

Mateo, J.C., San Agustin, J. and Hansen, J.P. (2008). Gaze beats mouse: hands-free selection by combining gaze and EMG. In CHI '08 Extended Abstracts on Human Factors in Computing Systems, CHI '08. ACM, New York, NY, 3039-3044
http://doi.acm.org/10.1145/1358628.1358804

Meyer, A., Böhme, M., Martinetz, T. and Barth, E. (2006). A single-camera remote eye tracker. In Perception and Interactive Technologies, volume 4021 of Lecture Notes in Artificial Intelligence, pages 208-211
http://www.inb.uni-luebeck.de/publications/pdfs/MeBoMaBa06.pdf

Pannasch, S., Helmert, J.R., Malischke, S., Storch, A. and Velichkovsky, B.M. (2008). Eye typing in application: A comparison of two systems with ALS patients. Journal of Eye Movement Research, 2(4):6, 1-8.
http://www.jemr.org/online/2/4/6

Pannasch, S., Helmert, J.R. and Velichkovsky, B.M. (2007). On eye tracking and usability research: Introduction to the symposium. Abstracts of the 14th European Conference on Eye Movements, ECEM2007, JEMR, Vol. 1
http://www.jemr.org/contents/volume_1/ecem2007_special/oral_presentations/usability

Pellegrino, P., Bonino, D. and Corno, F. (2006). Domotic House Gateway. SAC 2006, ACM Symposium on Applied Computing, April 23-27, 2006, Dijon, France
http://www.cad.polito.it/FullDB/exact/sac2006.html

Ramanauskas, N. and Daunys, G. (2005). Elimination of Head Shifts in Videooculography. Electronics and Electrical Engineering, Kaunas: Technologija, No. 8 (64), pp. 69-72.
http://www.ktu.lt/lt/mokslas/zurnalai/elektr/z64/Ramanauskas%20str.pdf

Ramanauskas, N., Daunys, G. and Dervinis, D. (2008). Investigation of Calibration Techniques in Video Based Eye Tracking System. In Computers Helping People with Special Needs (ICCHP 2008), LNCS Vol 5105/2008, Berlin: Springer, pp. 1208-1215
http://dx.doi.org/10.1007/978-3-540-70540-6_182

Räihä, K.-J., Aula, A., Majaranta, P., Rantala, H. and Koivunen, K. (2005). Static Visualization of Temporal Eye-Tracking Data. Human-Computer Interaction - INTERACT 2005, Lecture Notes in Computer Science 3585, Springer-Verlag, September 2005, 946-949
http://dx.doi.org/10.1007/11555261_76

San Agustin, J., Villanueva, A. and Cabeza, R. (2006). Pupil brightness variation as a function of gaze direction. Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (ETRA '06), ACM Press, p. 49
http://doi.acm.org/10.1145/1117309.1117334

Shi, F. and Gale, A. (2008). Eye-Operated Assistive Technology for Environmental Control. In Proceedings of the Fourth IASTED International Conference on Telehealth and Assistive Technologies, 16-18 April 2008, Baltimore, USA, pp. 61-65
http://www.actapress.com/PaperInfo.aspx?PaperID=33223&reason=500

Shi, F., Gale, A. and Mollenbach, E. (2008). Eye, Me and the Environment. In Computers Helping People with Special Needs, LNCS Vol 5105/2008, Berlin: Springer, 2008, pp. 1030-1033
http://dx.doi.org/10.1007/978-3-540-70540-6_154

Shi, F., Gale, A. and Purdy, K. (2006). Helping People with ICT Device Control by Eye Gaze. In ICCHP 2006: Computers Helping People with Special Needs, LNCS 4061, Springer, 480-487
http://dspace.lboro.ac.uk/dspace/bitstream/2134/2285/1/PUB426.pdf

Shi, F., Gale, A. and Purdy, K. (2007). A New Gaze-Based Interface for Environmental Control. Universal Access in Human-Computer Interaction. Ambient Interaction. LNCS 4555. pp. 996-1005.
http://dx.doi.org/10.1007/978-3-540-73281-5_109

Shi, F., Gale, A. and Purdy, K. (2007). Exploring eye responsive control - from a head mounted to a remote system. In Bust (ed.), Contemporary Ergonomics. Proceedings of the Ergonomics Society Annual Conference, Nottingham, 17-19 April 2007.
http://hdl.handle.net/2134/2631

Tuisku, O., Majaranta, P., Isokoski, P. and Räihä, K.-J. (2008). Now Dasher! Dash Away! Longitudinal Study of Fast Text Entry by Eye Gaze. In Proceedings of Eye Tracking Research & Applications, ETRA 2008, ACM Press, 19-26. (DOI: 10.1145/1344471.1344476)
http://www.cs.uta.fi/~curly/publications/ETRA2008-Tuisku.pdf

Velichkovsky, B.M. (2007). Towards an evolutionary framework for human cognitive neuroscience. Theoretical Biology. (in press)
http://rcswww.urz.tu-dresden.de/~cogsci/pdf/theoretical_biology_velich.pdf

Velichkovsky, B.M., Joos, M., Helmert, J.R. and Pannasch, S. (2005). Two visual systems and their eye movements: evidence from static and dynamic scene perception. CogSci 2005: Proceedings of the XXVII Conference of the Cognitive Science Society, July 21-23, Stresa, Italy, pp. 2283-2288
http://rcswww.urz.tu-dresden.de/~cogsci/pdf/p2283.pdf

Vickers, S., Bates, R. and Istance, H. (2008). Gazing into a Second Life: Gaze-Driven Adventures, Control Barriers, and the Need for Disability Privacy in an Online Virtual World. Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technologies, ICDVRAT 2008, Maia, Portugal, 8th-10th September 2008.
http://www.icdvrat.reading.ac.uk/2008/abstracts.htm

Villanueva, A. and Cabeza, R. (2007). Models for gaze tracking systems. EURASIP Journal on Image and Video Processing, Volume 2007, Article ID 23570, 16 pages.
http://dx.doi.org/10.1155/2007/23570

Villanueva, A. and Cabeza, R. (2008). A Novel Gaze Estimation System With One Calibration Point. IEEE Transactions on Systems, Man, and Cybernetics, Part B, 38(4), pp. 1123-1138.
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?isnumber=4567535&arnumber=4567549

Villanueva, A., Cerrolaza, J. and Cabeza, R. (2007). Geometry Issues of a Gaze Tracking System. Universal Access in Human-Computer Interaction. Ambient Interaction. LNCS 4555. pp. 1006-1015
http://dx.doi.org/10.1007/978-3-540-73281-5_110

See also

Bibliography Gaze Interaction

Books, Proceedings, Monographs and Theses related to eye tracking, eye movement research, gaze interaction