The Privacy Implications of Intel’s Classroom AI

Intel’s AI Software Collects Biometric Information to Determine Emotional State of Students

The coronavirus pandemic has not only reshaped the landscape of working from home; it has also made remote learning commonplace. From kindergarten through graduate school, Zoom and other online platforms have become the new normal for classroom instruction. Responding to this shift, Intel has developed software that purportedly detects the emotional state of children by interpreting their body language via web cameras.

Intel’s AI for Use on Students Raises Questions on Biometric Information Privacy

Intel’s technology raises new privacy concerns because it is unclear what biometric information Intel will store and analyze. Biometric information is already governed differently from state to state, and Intel’s software enters uncharted territory. According to Intel, the software uses artificial intelligence to analyze the body language and faces of students online. Based on this analysis, the artificial intelligence infers the “emotional state” of the student and passes that conclusion to the teacher in real time. Intel claims this will allow teachers to recognize when students are confused, bored, or in need of attention. Accordingly, Intel notes that the technology’s main objective is to improve one-on-one interactions between student and teacher rather than to monitor entire classrooms. The technology is expected to be integrated into Zoom, and, if successful, Intel intends to carry it into other videoconferencing platforms.

Translating Biometric Information

Although Intel’s software apparently has the ability to interpret biometric data, it is unclear what standards Intel would use to draw conclusions about students’ mental states. Studies have demonstrated that human expressions vary from person to person, and students’ cultural backgrounds heavily influence their general demeanor in the classroom. Intel’s accuracy in interpreting facial expressions is unknown, and it is unclear what the technology actually takes into account.

In response to these criticisms, Intel has stated that its software was created in conjunction with a team of psychologists and that an emotion had to be validated by two out of three psychologists before making the cut. At the same time, Intel admitted that the final assessment of the software was based on its utility for teachers rather than its accuracy in judging students’ emotions.

Privacy Advocates Question Intel’s Collection of Student Biometric Information

Beyond questions of accuracy, it is unclear how Intel will collect and store such biometric information. If Intel concludes that a student is “distracted” or “bored,” will that conclusion be stored on a server somewhere? Will it be appealable, or will it be forever recorded in a student’s permanent record? Will the information be categorized as personal information and, as such, be protected under applicable privacy laws? As Intel’s software raises new questions about students’ privacy and sensitive information, privacy attorneys and advocates would do well to follow the technology as it develops and is integrated into student classrooms.

Key Takeaways on Intel’s AI to Interpret Student Emotional States 

Intel is working on artificial intelligence intended to analyze the emotional state of students through the collection of biometric data. The technology raises new legal and biometric-privacy issues, such as:

  • Against what standards will Intel compare students’ expressions?

  • How will Intel store such biometric information?

  • Will conclusions about a student’s emotional state be considered sensitive information?

For more information about technology law, see our Technology Law Services and Industry Focused Legal Solutions pages.