Privacy Is an Important Consideration for AV Professionals

If you attended, followed up on, or heard anything about Zoomtopia, you know that Zoom’s big push is all about AI. No surprise there, as every company is currently making an AI push. For me, this landed in my feed at the same time as the Harvard Meta AI study, and all I have thought about since is how privacy concerns are becoming increasingly prevalent, especially in the context of higher education. Zoom, AI-enabled devices such as smart glasses, and advanced AI technologies are raising serious questions about data collection, surveillance and the security of sensitive information. These concerns are particularly acute in colleges and universities, where students and faculty rely on technology for teaching and learning. Most importantly, students and faculty expect a classroom to be an experimental place where they can ask questions and think aloud without fear of repercussions. A near-constant state of recording and analyzing interactions puts that expectation at risk.

Let’s start with Zoom. I believe Zoom has been responsive and thoughtful about privacy and protecting information. Still, it is important to be cognizant of all the data it can collect and store about what happens on its platform. The AI Companion, for example, knows who is logged in and, quite literally, exactly what people say during a meeting. Of course, it does not always understand the context in which something was said. Not being human, it can misinterpret a statement completely.

For example, during a recent meeting where we were testing the AI Companion, a team member mentioned he had to bring lunch to his daughter at school. We talked for a moment about when he would be leaving work to bring it to her. The AI Companion then gave us a summary of the meeting declaring that two of us had set up a lunch date with the employee’s daughter. On its own, this is no big deal; we knew it was a poor interpretation. But when you consider the possibility of this summary being fed into an AI database, with the exact people associated with it, you begin to see the inherent risks. The system now holds as fact that another employee and I had a lunch date with this person, and that is not true.

I have previously written about the emergence of AI-powered glasses, such as Meta’s Ray-Ban glasses. These glasses come equipped with cameras, microphones and AI algorithms capable of analyzing visual and auditory data in real time. While these devices are useful — they can capture hands-free video or translate spoken words into text — they also pose significant privacy challenges.

One major concern is the ability of AI glasses to record and analyze individuals without their consent. In a classroom setting, a student or professor wearing AI-enabled glasses could record others without their knowledge, violating their privacy. AI algorithms can also enhance the data that is collected: facial recognition embedded in AI glasses can identify and track individuals, analyze their emotions and infer things like mood from visual cues. Depending on how this is used, it could be either an enormous invasion of privacy or a benefit to an instructor. Students in a class could identify people in the room simply by looking at them, or get answers to questions posed by the instructor. On the beneficial end, AI could analyze data from cameras in the room to gauge students’ understanding of a concept, giving a faculty member a non-biased view of whether students are engaging with and understanding a topic or seem confused. It could also delve into questionable areas, such as suggesting whether certain scores should be scaled based on a student’s attentiveness in the classes leading up to an exam. None of this requires a pair of Meta AI glasses; we all have cameras in our classrooms today, and we would only need to feed that data to an AI algorithm.

AI’s ability to mine personal information from seemingly innocent data — like a video recording of a lecture — poses a significant risk to the privacy of students and faculty. If such technologies are integrated into classroom tools or adopted by educational institutions, there is a danger that students’ private information could be harvested, analyzed and potentially misused.

As we in higher education become increasingly reliant on digital platforms like Zoom, with cameras and microphones installed in spaces all over campus, privacy concerns cannot be ignored. It is incumbent upon us as technology leaders to ensure these concerns are understood and explained. We should be a driving force in creating and publicizing policies governing how the technology, and the data it collects, are used.
