Disney’s property rules state that the company is allowed to “photograph, film, videotape, record or otherwise reproduce the image and/or voice of any person who enters.” And while CCTV cameras are commonplace throughout theme parks, this kind of tech would take in-person surveillance to new heights, said Calli Schroeder, senior counsel and global privacy counsel at the Electronic Privacy Information Center.
This tech could be used for security surveillance or for spotting “pressure or frustration points” in the park, Schroeder said, such as places where people struggle to find ride entrances or bathrooms, which could help Disney optimize park layouts. But it still implies a certain level of AI-based emotion recognition, she said, which can often be inaccurate for myriad reasons.
Everyone emotes differently, she noted. And since Disney’s system bases its predictions on a set of behaviors it deems “normal,” it may be set off by the emotional expression of people from different cultures or those who are neurodivergent.
“The use of the word ‘normal’ is a red flag already,” said Schroeder. “There may be some (guest behaviors) that are more common than others, but setting a threshold for normal is a little concerning.”
In addition, a large part of Disney’s clientele consists of children and other minors, said Schroeder, so the potential tracking and use of their data by and for this system’s deep learning models is concerning in and of itself.
“Incorporating AI into these systems, it’s going to be learning and trying to detect patterns from what they pick up,” she said. “You’re using this information in a way that these people very likely are unaware of.”