Yuksel’s New HCI Lab Applying Machine-Learning to Tutoring, VR to Social Justice

By Daniel Morgan, College of Arts and Sciences Posted Thu, 01/11/2018 - 11:58

The first thing Professor Beste Yuksel did when she came to USF’s computer science department in Fall 2016 was create a Human-Computer Interaction (HCI) lab. Today, the lab is equipped with fully immersive virtual reality (VR) and brain-computer interfacing (BCI), which Yuksel and her students have been putting to use in research on tutoring and social justice.

Yuksel’s primary focus is building systems that detect, analyze, adapt to, or simulate human emotions (that is, human-computer interfaces) by applying principles of physiological and affective computing.

“You can measure physiological signals,” Yuksel explained, “such as heart rate, heart rate variability, and electrodermal activity — or you could look at facial expression recognition or posture detection or gesture detection — and use those to create a computing system that detects and responds to your physiological signals. Which is a part of affective computing.”
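As an illustrative sketch (not the lab’s actual code), two of the signals Yuksel names, heart rate and heart rate variability, can be derived from a stream of inter-beat intervals. The function names, the RMSSD choice of variability measure, and the sample intervals below are assumptions for illustration:

```python
import math

def heart_rate_bpm(ibis_ms):
    """Mean heart rate in beats per minute from inter-beat intervals (ms)."""
    mean_ibi = sum(ibis_ms) / len(ibis_ms)
    return 60000.0 / mean_ibi

def rmssd(ibis_ms):
    """RMSSD: a standard time-domain heart rate variability measure --
    root mean square of successive differences between beats."""
    diffs = [(b - a) ** 2 for a, b in zip(ibis_ms, ibis_ms[1:])]
    return math.sqrt(sum(diffs) / len(diffs))

ibis = [810, 790, 830, 805, 795, 820]  # hypothetical intervals in ms
print(round(heart_rate_bpm(ibis), 1))  # → 74.2
print(round(rmssd(ibis), 1))           # → 25.9
```

A real affective-computing system would compute features like these over sliding windows of live sensor data before feeding them to a classifier.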

The lab is equipped, thanks to an $85,000 National Science Foundation grant, with functional near-infrared spectroscopy (fNIRS) equipment, which is integral to building brain-computer interfaces. The fNIRS equipment enables lab members to detect a person’s cognitive workload (how hard someone is thinking) by measuring oxygen levels in the brain: areas of the brain that are working hard show a higher concentration of oxygenated hemoglobin, because the heart pumps more blood and oxygen to where it’s needed.

“We apply machine-learning techniques so that the system can classify the user’s state — determined by measuring brain and/or physiological signals — and respond in real time as the user carries out the task,” Yuksel said.
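The classification step described above can be sketched in miniature. This is a hedged illustration, not the lab’s method: it uses a simple nearest-centroid rule, and the feature names (oxygenated-hemoglobin change, heart rate), labels, and toy training values are all assumptions:

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Return the label whose class centroid is closest to the sample."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Toy training windows: [oxygenated-hemoglobin change, heart rate (bpm)].
training = {
    "high_workload": [[0.9, 88], [1.1, 92], [1.0, 90]],
    "low_workload":  [[0.2, 70], [0.3, 74], [0.1, 72]],
}
centroids = {label: centroid(vs) for label, vs in training.items()}

print(classify([1.0, 89], centroids))  # → high_workload
```

In a live system, each incoming window of brain and physiological measurements would be classified this way, and the interface would adapt based on the predicted state.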

One real-world example of such a system is BACh (Brain Automated Chorales), which Yuksel developed during her PhD at Tufts University. BACh is a computing system that helps teach people to play the piano: using fNIRS to measure cognitive workload, it offers the learner a new line of music only when it determines that the brain’s workload has fallen below a certain threshold. It’s a clever system, and it earned Yuksel a best paper award at CHI 2016.
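The control loop described above can be sketched as follows. This is a minimal illustration of the idea, not BACh’s implementation; the threshold value, workload scale, and function names are all assumed:

```python
# Assumed: workload is a normalized fNIRS-derived estimate in [0, 1].
WORKLOAD_THRESHOLD = 0.5

def next_line_ready(workload, lines_remaining):
    """Advance only when the learner's brain has spare capacity
    and there is still music left to present."""
    return lines_remaining > 0 and workload < WORKLOAD_THRESHOLD

# Simulated stream of workload readings as the learner practices.
readings = [0.9, 0.8, 0.6, 0.4]
lines_remaining = 3
for w in readings:
    if next_line_ready(w, lines_remaining):
        lines_remaining -= 1
        print(f"workload {w}: presenting next line ({lines_remaining} left)")
    else:
        print(f"workload {w}: holding current line")
```

The design choice is the key idea: instead of advancing on a fixed schedule, the tutor paces itself off a physiological measurement of the learner.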

And these computer-tutors are only going to get smarter, according to Yuksel.

“We have a miserable bandwidth of communication between the human and the computer,” she said. “But if we can provide the computer with more information about the human user, such as their cognitive workload or affective state, then the computer can respond more intelligently in return.”

Getting involved

Students can get involved in the lab at any time, with any level of experience. The lab’s members are a mixture of undergraduate and graduate students. As new, inexperienced students come in, they are matched with more senior or experienced students until they find their sea legs. Once they have a grasp on things, they can branch off and do projects of their own.

“I’m involved in all of the projects, and I make a point to work with students’ interests,” Yuksel said. “So if a student is interested in a particular subject, then I will work with them and guide them in that topic so that they can carry out research and publish in that area.”

That is a huge advantage for students when they graduate. Because both undergrads and grads can publish their research in Yuksel’s lab, they leave with outstanding talking points to take into applications to graduate school or a PhD program, or into an interview for a research position at a company.

A bent toward social justice

At USF, Yuksel is the faculty adviser for the Women in Tech club on campus, and this past fall she accompanied 21 of those students to the Grace Hopper Celebration of Women in Computing, the biggest computing conference for women in the world. She also accompanied students to the Lesbians Who Tech summit in San Francisco last year and will be going again next year.

Yet her social justice sensibilities don’t stop at the community; they extend into her lab as well, where she is taking full advantage of its fully immersive virtual environment, complete with full-body tracking.

“We’re currently doing research in areas of social justice using VR,” she said. “And watch this space for some exciting results coming soon!”