Official newspaper of The University of Texas at Austin

The Daily Texan


October 4, 2022

First Person Vision exposes hidden behaviors

Photo: Chelsea Purgahn

In the future, wearable cameras could diagnose autism.

During a talk on March 1 in the computer science department, Georgia Institute of Technology professor Jim Rehg discussed how combining wearable cameras with advancements in visual technology can create new ways of analyzing and observing behavior.

“For me, one of the exciting opportunities in wearable cameras is not just the ability to advance our understanding of vision, but it is really to connect vision to disciplines that historically have not had any contact with us, but where I believe, and others are beginning to believe, we can really have a big impact,” Rehg said.

Rehg discussed clinical applications of wearable cameras in interactions between psychologists and their child patients. Because these young patients are unaware of the therapists’ cameras, Rehg said, their behavior remains natural.

“We identify the core behavior constructs that need to be measured: eye contact,” Rehg said. “We identify the sensors and algorithms capable of measuring those constructs, and we package the whole thing and insert it into the interaction in a way that doesn’t change anything.”

Rehg focused on one skill, maintaining eye contact, as the main behavioral indicator. Children who have trouble sustaining eye contact may be at risk for broader social behavior problems, such as autism.

Rehg said the camera-wearing psychologist can track eye contact, gathering large amounts of data during therapy sessions. The camera records how much eye contact occurs, as well as which specific actions by the therapist elicit more of it.

Danna Gurari, a postdoctoral fellow in computer vision at UT-Austin who attended the lecture, said she enjoyed how Rehg connected the technology to possible advancements in behavior research.

“The most interesting part of this lecture was the big vision,” Gurari said. “[Rehg] certainly touched on a lot of technical details, but I enjoyed that all of that was geared to some sort of bigger picture.”

Rehg said he is hopeful that “First Person Vision” can impact behavior research, even though privacy issues still limit the collection of large amounts of data and the technology cannot yet run in real time.

“We have to go further to really validate this,” Rehg said. “I’m not claiming today that this is ready to roll out in autism clinics across the country. There are lots of challenges, actually, to achieve that goal. But our goal is to get this technology into broader use, and I think we are on a path to get there.”
