Visual Language Models for Visual Assistance

2026

A student team led by Henry Pfister's research group will develop and evaluate an AI-powered "visual assistant" that pairs smart glasses with modern visual language models to assist people with visual impairments. The team will start by benchmarking models on the VisAssistDaily dataset and then prototype a real-time system using open-source multimodal AI software. Students will gain hands-on experience with visual language models, data analysis, and human-centered evaluation. The project focuses on technical development, carried out in consultation with experts in wearable computing, accessibility, and health.
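As a rough illustration of what the benchmarking step might look like, here is a minimal sketch that queries an open-source visual language model with a single image and question. The model choice (LLaVA-1.5 via Hugging Face transformers), the image path, and the question are assumptions for illustration, not project decisions; the actual VisAssistDaily evaluation setup would be defined by the team.

```python
# A minimal sketch of one benchmark query, assuming the open-source
# LLaVA-1.5 checkpoint on Hugging Face. Image paths and questions below
# are hypothetical placeholders, not part of the VisAssistDaily spec.
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"  # assumption: any open VLM could be swapped in
processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

def describe_scene(image_path: str, question: str) -> str:
    """Ask the model one question about one image, as a single benchmark query."""
    image = Image.open(image_path)
    # LLaVA-1.5 expects the <image> placeholder in a USER/ASSISTANT prompt.
    prompt = f"USER: <image>\n{question} ASSISTANT:"
    inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    answer = processor.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    return answer.strip()

# Hypothetical usage on one dataset image:
# print(describe_scene("visassistdaily/kitchen_001.jpg",
#                      "What obstacles are in front of me?"))
```

Scoring the model's answers against the dataset's reference responses, and later streaming frames from the smart glasses instead of loading files, would build directly on a loop like this.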

Project Lead: Henry Pfister

Project Manager: TBD

Contact

Assistant Director of Student Research, Data+ Program Director (Mathematics)

Related People

Co-Assistant Director of Research (Electrical and Computer Engineering)