By Yoonj Kim
A blonde in a ponytail. A teenage brunette wearing eyeliner. A Latina with glitter on her cheeks. All stared unknowingly into a hidden camera recording their facial information as they watched rehearsal videos of a beloved blue-eyed pop star at a concert, their images beamed across the country to a Nashville command post to be cross-referenced against a database of known stalkers.
Such was the scene at Taylor Swift’s Reputation concert last spring at the Rose Bowl, where a kiosk discreetly equipped with facial-recognition technology reportedly recorded fans at her May 18 show without their knowledge or consent. When the news broke seven months later, after a security expert spilled the beans to Rolling Stone, the creepiest part of the dystopian revelation was that fans had essentially been tricked into participating in a secret facial recording.
“I did see [the kiosk] but didn’t go through it,” Bianca Peralta, a Swift fan who was at the concert that weekend, tells MTV News. “I don’t think she should have had it at all. I understand it was to help her out with her security. But, I mean, these are her fans.”
Facial recognition is a form of biometric technology that can identify individuals from live facial data, such as images of passersby at a concert, captured by a special camera. That facial data is compared against an existing database of images to identify who the person is, or, in other cases, it can be gathered to compile a new database. To this day, it remains unclear exactly how the technology at the Swift concert was used.
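In practice, a watchlist-style system boils down to turning each captured face into a numeric "fingerprint" and comparing it against the fingerprints of known faces. The sketch below is only a rough illustration of that idea, not anything used at the Rose Bowl: the embed_face function is a hypothetical stand-in for a trained face-embedding model, and the names, images, and threshold are invented for the example.

```python
# A minimal sketch of watchlist matching, the core idea behind facial-recognition
# kiosks like the one described above. The embedding step is a stand-in, not any
# vendor's actual system: real deployments use a trained face-embedding network.
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a face-embedding model: maps a face image to a feature vector.
    A real system would use a trained neural network; here we just flatten and
    normalize pixels so the example runs end to end."""
    vec = image.astype(np.float32).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)

def match_against_watchlist(live_image, watchlist, threshold=0.9):
    """Compare a live capture against a database of known-face embeddings.
    Returns the best-matching identity if similarity exceeds the threshold."""
    query = embed_face(live_image)
    best_name, best_score = None, -1.0
    for name, ref in watchlist.items():
        score = float(np.dot(query, ref))  # cosine similarity (vectors are unit-norm)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Tiny demo with synthetic "images" (8x8 grayscale arrays).
rng = np.random.default_rng(0)
known_face = rng.random((8, 8))
watchlist = {"watchlist_entry_01": embed_face(known_face)}
# A live capture that closely resembles the enrolled face should match...
print(match_against_watchlist(known_face + 0.01 * rng.random((8, 8)), watchlist))
# ...while an unrelated face should fall below the threshold and return no match.
print(match_against_watchlist(rng.random((8, 8)), watchlist))
```

Whether a system merely checks faces against such a watchlist and discards the rest, or stores every capture in a new database, is exactly the distinction privacy advocates keep asking about.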
“Everybody who went by would stop and stare at it, and the software would start working,” Mike Downing, CPO of Oak View Group and former police officer, told Rolling Stone in December. Downing learned about the secret surveillance as a guest of the kiosk’s manufacturer. Oak View Group has not responded to a request for comment, and neither Swift nor her security team has issued any public explanation. The most pressing questions pertain to what happened to the fans’ facial images: Were they deleted, sent to law enforcement, or saved in a private database?
As any owner of an iPhone X would know, facial recognition technology is entering — or invading, depending on your privacy stance — our lives from many unregulated angles. Apple’s Face ID requires a live image of the owner’s face to unlock the iPhone. Google’s Nest cam is able to detect and recognize human faces at the doorbell. Airports like JFK and LAX have partnered with Homeland Security to conduct facial screenings in lieu of tickets at international flight gates. In other words, you can reasonably expect the existence of face-tracking technologies nearly everywhere you go.
This means if you’re a privacy freak, it may be a good time to invest in an attractive ski mask or massive sunglasses. (Unless you live in Illinois, that is — the only state that requires anyone using biometrics to get the consent of the people from whom they’re collecting information.)
“There’s no national privacy law at all,” says Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation. “Facial recognition has been used in other situations like casinos for years, mainly to track card counters […] But what those systems don’t tend to do is take a picture of everyone and store it in a database. I don’t know if that’s what was happening at Taylor Swift’s concert or if she was just looking for people who’ve been known to harass her.”
When I asked her if she thought the screening at the concert was illegal, she paused and replied, “I think it was ill-advised. It’s sort of a Wild West.”
Taylor Swift greets fans at Wembley Stadium during the Reputation stadium tour.
Taylor Swift has had a disturbing number of stalkers throughout her career. And one could presume that the facial recognition was intended to keep not just her but concert-goers safe as well.
“I think it’s good for safety reasons,” says Stephanie Worth, another Swift fan who attended her Rose Bowl performance that weekend. “If you think about the Ariana Grande concert, had this been rolled out back then maybe something could have prevented that situation, because people are fearing going to concerts.”
Since the vast majority of attendees at the Swift concert were presumably not criminals or persons of interest, most likely had no reason to worry about being singled out by the technology. However, if, for example, you were a Latino man in your early twenties with thick eyebrows, those features could have triggered alarms in the mysterious command post for matching the profile of a known Swift stalker.
This particular man, Roger Alvarado, was recently sentenced to six months in prison for breaking into Swift’s home and taking a nap in her bed, an understandably terrifying scenario. But the example highlights an intrinsic civil rights issue. A common fear among critics is that the technology could perpetuate existing inequalities in the criminal justice system. Several studies have confirmed intersectional bias in facial-recognition algorithms, including one from MIT, which found that lighter-skinned men are subject to an error rate of only 0.8 percent, while darker-skinned women are misclassified up to 34.7 percent of the time.
“We know stores discriminate against people of color when they think they find shoplifters,” says Lynch, referring to the possibility of security cameras utilizing facial recognition to deter theft. “It could be young white women who are shoplifting, but the store is targeting young black men, so if the store accuses you and puts your face in a database and shares that with other stores, it could be you’re not allowed to go into stores anymore.”
It’s also likely that the Swift concert incident was just one cat that got out of a big, fat bag.
“It’s definitely an emerging technology,” says Jason Porter, a vice president at Pinkerton, a private global security agency servicing celebrities and VIPs. “Clients have asked for it.”
Taylor Swift performs onstage at the Rose Bowl on May 18, 2018.
When it comes to concerts and live events, a new wave of facial recognition is already upon us. Madison Square Garden has reportedly been using facial scanning without customers’ knowledge. “As part of our ongoing efforts to protect this world-famous venue, we are doing everything in our power, including using facial recognition, to make it the safest place possible,” says a spokesperson from The Madison Square Garden Company.
Live Nation and its subsidiary Ticketmaster have also invested in Blink Identity, a facial-recognition startup with links to the Department of Defense that is currently running private beta tests ahead of a rollout at venues in the near future. Its immediate purpose would be to speed up the ticketing process and identify customers.
“We are focusing on creating a technology that allows individuals to use face recognition to gain access to locations and services in a convenient manner,” says Mary Haskett, CEO of Blink Identity. “People will be able to enroll by taking a photograph of themselves with their cell phone. That will give them access to a special ‘VIP’ lane at venues.”
When I asked her how the company would ensure the security of fans’ photos, she said it follows information-technology best practices, including encryption, so that venues would not have access to personal data. Since data leaks, hacks, and other breaches of privacy are constant threats, the encryption would offer a layer of protection in the event the database does get compromised. Beyond that lies the question of audience perception, with estimates of Americans’ approval of facial-recognition technology varying widely.
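As a rough illustration of what encryption at rest means here (a generic sketch, not Blink Identity’s actual system), a service could encrypt each enrollment photo before it is ever written to storage, so that anyone who copies the database without the key, whether a venue or an attacker after a breach, sees only ciphertext.

```python
# A generic sketch of encrypting biometric data at rest, using the widely
# available "cryptography" package. It illustrates the idea described above;
# it is not Blink Identity's implementation, and the data here is invented.
from cryptography.fernet import Fernet

# In a real deployment the key would live in a key-management service,
# never alongside the encrypted records.
key = Fernet.generate_key()
cipher = Fernet(key)

# Pretend this is the raw bytes of a fan's enrollment selfie.
enrollment_photo = b"\x89PNG...raw image bytes..."

# Encrypt before writing to storage: whoever obtains the stored record
# without the key gets only ciphertext.
encrypted_record = cipher.encrypt(enrollment_photo)

# Only the service holding the key can recover the original photo.
assert cipher.decrypt(encrypted_record) == enrollment_photo
```

Encryption of this kind protects the data if the database leaks, though it does nothing to answer the separate question of whether the data should have been collected in the first place.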
“We believe strongly that the use of face recognition technology should always be disclosed and should always be voluntary,” she said. “If Taylor Swift had posted signs saying that identity technology was going to be used at her concerts to help protect her from stalkers, I don’t think it would have affected ticket sales at all.”
A September 2018 survey by the Brookings Institution, however, found that only 33 percent of Americans were favorable to facial recognition being used in stadiums.
Regardless, in a time when concert safety has proven to be a critical concern, it’s understandable that venues and artists will try out new methods like facial recognition for enhanced crowd control. But if pleasing the crowd is also a concern, more needs to be done to address issues of consent and privacy.
“I’m not any less of a Swift fan,” Peralta says. “But I also understand this can be discriminating. Where is the line drawn?”
Yoonj Kim is a journalist and writer based in Los Angeles. Find her on Twitter @yoonjkim.