Hello! 👋

I am an HCI Research Scientist at Reality Labs Research at Meta, where I work with Ben Lafreniere on applying LLMs to novel augmented-reality experiences.

Previously, I was a postdoctoral researcher at the Shared Reality Lab (SRL) at McGill University, supervised by Dr. Jeremy Cooperstock. I completed my PhD in the Accessible Computing Technologies Research Group (ACT Lab) at McGill University, where I worked with Dr. Karyn Moffatt on Click AAC, the first Augmentative and Alternative Communication (AAC) app to use AI techniques to provide automated language support to people with communication disabilities. I also developed its successor, QuickPic AAC, in collaboration with Howard Shane of Boston Children's Hospital and Harvard University.

My work lies at the intersection of Human-Computer Interaction (HCI) and accessibility, with a particular interest in designing, developing, and evaluating assistive technologies. My research has been published in top-tier venues in HCI (ACM CHI), haptics (IEEE Transactions on Haptics, World Haptics), computational linguistics (ACL), and accessibility (ACM ASSETS), and received the Best Student Paper and Best Artifact awards at ACM ASSETS 2022.


Projects

Research Interests

Human-Computer Interaction

Assistive Technologies

Natural Language Processing

Haptics