Contact

Helena M. Mentis, Ph.D.
Assistant Professor
mentis at umbc dot edu

Department of Information Systems
University of Maryland, Baltimore County
1000 Hilltop Circle
Baltimore, Maryland 21250

Office: ITE 431
Phone: 410-455-3687
Fax: 410-455-1073


Helena Mentis

I am an assistant professor in the Department of Information Systems at the University of Maryland, Baltimore County (UMBC) and the director of the Bodies in Motion Lab.

My research interests span the areas of human-computer interaction (HCI), computer supported cooperative work (CSCW), and health informatics. I am most interested in designing for movement-based interaction, specifically using commercially available sensors such as the Microsoft Kinect or Leap Motion device. Because of my interest in health, I am fascinated with the body - how it makes us feel, how we perceive it, and how we use it to express ourselves. I have conducted research on touchless interaction in the operating room, the challenges of training gestural systems, how to sense and perceive movement qualities, the place of gestural interaction in the home, and designing for somaesthetic awareness through movement in the dark. I employ a variety of methods in my work, but primarily I conduct participant observations and interviews throughout a design research process.

Prior to my position at UMBC, I was an ERCIM postdoctoral researcher at Mobile Life in Sweden, held a joint postdoctoral fellowship at Microsoft Research Cambridge and Corpus Christi College at the University of Cambridge, and then served as a research fellow at Harvard Medical School. I received a PhD in Information Sciences and Technology from Penn State, an MS in Communication from Cornell, and a BS in Psychology from Virginia Tech.

Current Areas of Research

Imaging and Interaction in Surgery

Investigating the use and design of touchless systems (e.g., gestural control, voice control, heads-up displays) for seeing, coordinating, and learning around medical images in the operating room. Introducing touchless interaction into this environment not only allows surgeons to maintain a sterile environment while accessing new functionality, but also gives them an opportunity to explore and discuss the imaging data to make informed surgical decisions.

Video explaining how our first Kinect system worked (2013)
Video from BBC Coverage (2013)

Feng, Y., Wong, C., Park, A., & Mentis, H. (to appear). Taxonomy of instructions given to residents in laparoscopic cholecystectomy. Presented at the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) 2015 Meeting, Nashville, TN.

Mentis, H., Rahim, A., & Theodore, P. (to appear). Referencing CT Scans through a Head-mounted Optical Display During Laparoscopic Surgery. Presented at the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) 2015 Meeting, Nashville, TN.

Mentis, H.M., O’Hara, K., Gonzalez, G., Sellen, A., Corish, R., Criminisi, A., Trivedi, R., & Theodore, P. (to appear). Voice or Gesture in the Operating Room. Extended Abstracts of the Conference on Human Factors in Computing Systems (CHI), Seoul, South Korea, (pp. xx-xx), New York: ACM.

Feng, Y., Wong, C., & Mentis, H.M. (to appear). Direction-giving to Residents in Laparoscopic Surgery. Presented at the AMIA Workshop on Interactive Systems in Healthcare (WISH).

O’Hara, K., Gonzalez, G., Penney, G., Sellen, A., Corish, R., Mentis, H., ... & Carrell, T. (2014). Interactional Order and Constructed Ways of Seeing with Touchless Imaging Systems in Surgery. Computer Supported Cooperative Work (CSCW), 23(3), 299-337.

Mentis, H., Chellali, A., & Schwaitzberg, S. (2014). Learning to See the Body: Supporting Instructional Practices in Laparoscopic Surgical Procedures. Proceedings of the Conference on Human Factors in Computing Systems (CHI), Toronto, ON, Canada (pp. 2113-2122), New York: ACM.

O'Hara, K., Harper, R., Mentis, H.M., Sellen, A., & Taylor, A. (2013). On the naturalness of touchless: Putting the "interaction" back into NUI. Special issue on Embodied Interaction, ACM Transactions on Computer-Human Interaction (TOCHI), 20(1).

Mentis, H.M. & Taylor, A. (2013). Imaging the body: Embodied vision in minimally invasive surgery. Proceedings of the Conference on Human Factors in Computing Systems (CHI), Paris, France (pp. 1479-1488), New York: ACM.

Mentis, H.M., O’Hara, K., Sellen, A., & Trivedi, R. (2012). Interaction proxemics and image use in neurosurgery. Proceedings of the Conference on Human Factors in Computing Systems (CHI), Austin, Texas (pp. 927-936), New York: ACM.

Parkinson's Disease Movement Assessment

Investigating how presenting Parkinson’s patients with sensor-based assessments of their movement, shown alongside their self-assessed impairment, affects patient quality of life, patient empowerment, and clinical treatment decision-making. This area of study emphasizes the need to integrate a patient’s subjective assessment of motor impairment with objective motor sensing data in order to provide a more complete view of the patient’s illness.

Mentis, H., Shewbridge, R., Powell, S., Fishman, P., & Shulman, L. (to appear). Being Seen: Co-Interpreting Parkinson’s Patient’s Movement Ability in Deep Brain Stimulation Programming. Proceedings of the Conference on Human Factors in Computing Systems (CHI), Seoul, South Korea (pp. xx-xx), New York: ACM.

Shewbridge, R., Mentis, H.M., Pharr, C., Powell, S., Fishman, P., Armstrong, M., & Shulman, L. (2014). Getting in Sync: Health and Digital Literacy in Patient Deep Brain Stimulation Device Use. Presented at the AMIA Workshop on Interactive Systems in Healthcare (WISH).

Morrison, C., Culmer, P., Mentis, H., & Pincus, T. (2014). Vision-based body tracking: turning Kinect into a clinical tool. Disability and Rehabilitation: Assistive Technology, (0), 1-5.

Seeing, Feeling, and Interpreting Movement

Investigating how movement sensors align with the experience of movement and with the professional vision of movement assessment. Much of this work has occurred outside of health contexts, including dance, museums, and gaming in the home. The lessons learned, though, provide insights into how to use, train, and appropriate movement sensors for specific human-centered contexts.

Mentis, H., Laaksolahti, J., & Höök, K. (2014). My Self and You: Tension in Bodily Sharing of Experience. ACM Transactions on Computer-Human Interaction, 21(4), article 20.

Mentis, H.M. & Johansson, C. (2013). Seeing movement qualities. Proceedings of the Conference on Human Factors in Computing Systems (CHI), Paris, France (pp. 3375-3384), New York: ACM.

Harper, R. & Mentis, H.M. (2013). The mocking gaze: ‘You are a poor controller!’ Proceedings of the Conference on Computer Supported Cooperative Work (CSCW), San Antonio, Texas (pp. 167-180), New York: ACM.

Vongsathorn, L., O’Hara, K., & Mentis, H.M. (2013). Bodily interaction in the dark. Proceedings of the Conference on Human Factors in Computing Systems (CHI), Paris, France (pp. 1275-1278), New York: ACM.

Fothergill, S., Mentis, H.M., Nowozin, S., & Kohli, P. (2012). Instructing people for training gestural interactive systems. Proceedings of the Conference on Human Factors in Computing Systems (CHI), Austin, Texas (pp. 1737-1746), New York: ACM.