Everyday objects around us are becoming intelligent. Phones, watches, glasses, and even clothing now sense, interpret, and respond to human behavior. But as we move toward an AI-powered world, the challenge is no longer just building smarter devices; it is designing sensing systems that understand human context while preserving privacy, supporting user agency, and earning trust.

This course explores how sensing and AI come together to create interactive, context-aware systems that responsibly perceive and act on human activity. Students will learn the principles and practice of designing end-to-end sensing systems: from data capture and signal processing to interpretation and user interaction, through the lens of human–AI collaboration. We will discuss how fields such as human–computer interaction, embedded computing, computer vision, distributed systems, machine learning, and security intersect to create systems that are not only powerful but also transparent, ethical, and user-centered.

The course emphasizes a hands-on approach: students will prototype, program, and evaluate sensing systems in domains such as activity recognition, health and wellness, environmental awareness, and gestural interaction. Along the way, we will examine questions about how such systems can perceive human activity while preserving privacy, supporting user agency, and earning trust.

Students will gain practical experience across the full sensing pipeline, from data capture and signal processing to interpretation, machine learning, and user-facing interaction design.

The class will combine lectures, tutorials, and discussions with project-based learning. Assessment will include three mini-projects, weekly readings and reflections, and a final project that integrates technical and human-centered design principles.

Instructors: Yuvraj Agarwal, Mayank Goel (Office hours: Fridays, 1:30–2:30 PM, TCS Hall 235)

TAs: Prasoon Patidar, Riku Arakawa (Office hours: TBD)

Location: Posner Hall 146

Time: Mondays and Wednesdays, 11:00 AM – 12:20 PM

Canvas: https://canvas.cmu.edu/courses/52576

Schedule