A student team led by Henry Pfister's research group will develop and evaluate an AI-powered "visual assistant" that pairs smart glasses with modern vision-language models (VLMs) to assist people with visual impairments. The team will begin by benchmarking candidate models on the VisAssistDaily dataset and then prototype a real-time system built on open-source multimodal AI software. Students will gain hands-on experience with VLMs, data analysis, and human-centered evaluation. The project focuses on technical development, carried out in consultation with experts in wearable computing, accessibility, and health.
Project Lead: Henry Pfister
Project Manager: TBD
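
As a rough illustration of the benchmarking step, the sketch below queries an open-source VLM on (image, question) pairs and scores its answers against references. The specific model ID (llava-hf/llava-1.5-7b-hf), the JSON layout assumed for VisAssistDaily examples, and the exact-match scoring are placeholders for illustration, not project decisions.

```python
# Minimal sketch of the benchmarking step: ask a vision-language model
# questions about images and compare its answers against references.
# The dataset layout below is an assumed placeholder; VisAssistDaily's
# actual format may differ.
import json

import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

MODEL_ID = "llava-hf/llava-1.5-7b-hf"  # example open-source VLM

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = LlavaForConditionalGeneration.from_pretrained(
    MODEL_ID, torch_dtype=torch.float16, device_map="auto"
)

def answer(image_path: str, question: str) -> str:
    """Ask the VLM one question about one image."""
    image = Image.open(image_path).convert("RGB")
    # LLaVA-1.5's chat format: the <image> token marks where the
    # visual features are inserted into the prompt.
    prompt = f"USER: <image>\n{question} ASSISTANT:"
    inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    text = processor.decode(out[0], skip_special_tokens=True)
    return text.split("ASSISTANT:")[-1].strip()

# Assumed benchmark file: a JSON list of
# {"image": path, "question": str, "reference": str} records.
with open("visassistdaily_sample.json") as f:
    examples = json.load(f)

correct = 0
for ex in examples:
    pred = answer(ex["image"], ex["question"])
    # Crude exact-match scoring; a real evaluation would use human
    # judgment or a more forgiving automatic metric.
    correct += pred.lower() == ex["reference"].lower()

print(f"Exact-match accuracy: {correct / len(examples):.2%}")
```

For the later real-time prototype, the same question-answering loop would consume frames streamed from the smart glasses' camera rather than a static benchmark file.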


