Christopher Pierson thinks facial recognition, as both a topic and a biometric, is “super cool,” and that enthusiasm energized his 2018 CyberCrime Symposium session, “The Privacy and Security Implications of Facial Recognition.” But because he’s worn so many professional hats — from inventor, CISO, CPO, and general counsel, to member of DHS’s Data Privacy and Integrity Advisory Committee and its Cybersecurity Subcommittee — he could exult in facial scanning technology’s cool factor while laying out some of the growing legal, ethical, and privacy concerns surrounding it.
“There are cameras everywhere, integrated in everything,” said Pierson, CEO of Blackcoat, a cybersecurity consultancy. To illustrate, he detailed the trail of devices that captured his image en route to the symposium — outside his house, at tollbooths, on police vehicles, in parking garages, on shuttles, in two airports, and finally, at the venue where he took the stage. Sometimes, he said, cameras and facial recognition technology combine for valid, valuable purposes, and other times, they’re used because social media platform engineers, technology providers, or other entities have decided they can be.
As security and privacy practitioners, Pierson told attendees, “we have to be comfortable with that,” while also accepting that there will be fewer situations “where security automatically wins and privacy gets shoved to the side.” With GDPR flexing its muscle and three US states with biometric privacy laws on the books, he advised CISOs and CPOs to get a handle on their own use of facial scanning technology, and move to formalize policies that ensure they’re compliant with laws governing how they use, secure, transfer, and destroy the corresponding personal data.
Lights, Camera, Capture
Today, entities across the public and private sectors, including federal, state, and local government buildings and outdoor venues, businesses, airports, public transportation systems, and stadiums, employ cameras and facial recognition technology for security and other purposes. Back-end facial recognition software extracts unique facial features from each sample, compares them against images in specific databases, and flags any matches.
Facial recognition has made major advances since its earliest incarnation, which projected different points, or dots, on the face, measured the distance between them, created a unique hash of a sample, and uploaded it to a database for later comparison. “In terms of technology, that’s version 0.1, and we’re now somewhere around version five,” Pierson said. “It’s no longer just capturing those points on the face.” It’s also looking at an individual’s face tilt, hair color, degree of smile, head pitch, roll, and yaw, exposure, skin color, likely gender, and other parameters. Moreover, facial recognition software can now analyze the actual mood of an individual, gauging whether they’re happy, unhappy, or otherwise.
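The “version 0.1” approach Pierson describes — project points on the face, measure the distances between them, and compare the result against a database — can be sketched in a few lines. Everything here is illustrative: the landmark coordinates, the distance threshold, and the function names are assumptions for the sketch, not any real system’s API.

```python
import math

def feature_vector(landmarks):
    """Turn a list of (x, y) landmark points into a vector of
    pairwise distances -- the crude 'measure the dots' signature."""
    vec = []
    for i in range(len(landmarks)):
        for j in range(i + 1, len(landmarks)):
            (x1, y1), (x2, y2) = landmarks[i], landmarks[j]
            vec.append(math.hypot(x2 - x1, y2 - y1))
    return vec

def match(sample, database, threshold=5.0):
    """Compare a sample's signature against every enrolled identity;
    return the closest one, or None if nothing falls under the threshold."""
    best_id, best_dist = None, float("inf")
    sample_vec = feature_vector(sample)
    for identity, enrolled in database.items():
        dist = math.dist(sample_vec, feature_vector(enrolled))
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist <= threshold else None
```

Modern systems replace these hand-measured distances with learned embeddings that also capture the pose, exposure, and expression parameters Pierson lists, but the compare-against-a-database step works the same way.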
Just consider, said Pierson, the facial recognition technology in the most recent releases of the iPhone, the device of choice for so many consumers. Today, it illuminates 30,000 dots on an owner’s face in order to capture an “undeniable” facial sample and use it to secure access.
“This is technology that’s getting better day-by-day, but there are issues,” he said. From a technology standpoint, it’s more difficult to measure depth on faces with darker complexions, for example, creating a higher chance of false positives. Addressing this shortfall requires more testing on a more diverse subject population.
Privacy’s Fine Line
From a privacy standpoint, scenarios involving the possible use — or misuse — of facial recognition and the private data involved run the gamut. When is it applied for positive reasons, and when does it cross the line into “a surveillance state”?
When Pierson posed this question to attendees along with examples, they generally agreed that cameras used by a police department to capture images of everyone entering a city park to check them against their state’s registered sex offender database — provided they’re not storing or transferring data to a third-party — was a reasonable use. But what if an insurance company that’s part of a large healthcare network could broker access to facial recognition data from different hospitals because it wanted to analyze the general health or, say, chromosomal abnormalities of a specific population?
“You might not have a reasonable expectation of privacy when you’re in public, but that doesn’t mean an organization should be able to suck up private data every second of every day, keep it forever, and use it to layer on new parameters as technology advances,” Pierson said. Many institutions would jump at the chance to harness facial recognition data to improve the customer experience, but can they de-identify it and securely store it? What controls need to be in place?
Considering that around 50 class-action lawsuits have already been filed in Illinois — primarily against social media companies charged with violating the state’s Biometric Information Privacy Act because they allow subjects to be tagged without their written consent — Pierson expects there to be frenzied activity surrounding biometrics and its privacy implications as time and technology advance.
However, when Pierson dons his CISO hat, he can home in on facial recognition’s “awesome” security-enhancing applications in the areas of access, authentication, and auditing. It’s particularly effective in employee-security scenarios — whether these involve physical entry of a facility or a network log-in — because the system’s checking a known person against a limited database.
In a company with 1,000 employees, SOC staff might run a facial recognition database with two good images of each. “This isn’t a one-to-many situation — you’re checking one known image against a fixed population,” Pierson said. “That means you know with 99.9999% certainty that the system identified the right person.”
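Pierson’s distinction is between one-to-many identification (searching a whole database for an unknown face) and the one-to-one verification an employee-access system performs: the person claims an identity, and the system checks the live sample only against that person’s enrolled templates. A minimal sketch of that 1:1 check, with illustrative names, embeddings, and a hypothetical similarity threshold:

```python
def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def verify(claimed_id, live_embedding, enrolled, threshold=0.9):
    """1:1 verification: does the live sample match any enrolled
    template for the claimed identity? Never searches other identities."""
    templates = enrolled.get(claimed_id, [])
    return any(cosine_similarity(live_embedding, t) >= threshold
               for t in templates)
```

Because each comparison is against a small, known set (the “two good images” per employee Pierson mentions) rather than an open-ended population, the false-match surface shrinks dramatically — which is the source of the near-certainty he cites.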
An overarching theme he hoped attendees would take away with them, said Pierson, “is that biometrics technology, especially facial scanning, is here to stay.” He advised CISOs to start meeting with CPOs, CTOs, CIOs, and HR, document internal and external use of biometrics, and understand “onward use of data” possibilities. That way they can create security and privacy policies and guidelines that can be updated accordingly.
“It’s up to all of us to read the latest papers and review the data on both sides of this debate,” Pierson said. “That way, we can identify the technology’s positive uses and places where it might go off the rails.”
This is the ninth in our series of posts presenting key takeaways from our 2018 CyberCrime Symposium, held November 1-2, 2018. The program—“The Future of Privacy and Security”— featured an incredible line-up of speakers. If you couldn’t get a seat at the event or want a refresher on various sessions, this is a not-to-be-missed series!