Since academics and investigative journalists first reported last year that Facebook was using people’s two-factor authentication numbers and “shadow” contact information for targeted advertising, Facebook has shown little public interest in fixing this critical problem. Subsequent demands that Facebook stop all non-essential uses of these phone numbers, and public revelations that Facebook’s phone number abuse was even worse than initially reported, failed to move the company to action.
Yesterday, rather than face a lawsuit from the FTC, Facebook agreed to stop the most egregious of these practices.
In one of just a few concrete wins in an overall disappointing settlement, Facebook agreed not to use phone numbers provided for any security feature (like two-factor authentication, account recovery, and login alerts) for targeted advertising purposes.
Until this settlement, Facebook had been using contact information that users explicitly provided for security purposes for targeted advertising, contrary to those users' expectations and Facebook representatives' own previous statements. Revelation of this practice seriously damaged users' trust in a foundational security practice and undermined all the companies and platforms that get two-factor authentication right.
The FTC’s order that Facebook stop using security phone numbers for targeted advertising is, hopefully, a first step toward rebuilding users’ trust in security features on Facebook in particular and on the web in general.
The Loose Ends
But the FTC didn’t go far enough here, and Facebook continues to be able to abuse your phone number in two troubling ways.
First, two-factor authentication numbers are still exposed to reverse-lookup searches. By default, anyone can use the phone number that a user provides for two-factor authentication to find that user’s profile. Problems with this search functionality have been public since at least 2017. Facebook even promised to disable it over a year ago in the wake of the Cambridge Analytica scandal, but left open a loophole in the form of contact uploads. For people who need two-factor authentication to protect their account and stay safe, Facebook’s failure to close this loophole forces them into an unnecessary choice between security and privacy.
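One way to sidestep this choice entirely is app-based two-factor authentication, which derives one-time codes from a shared secret rather than from a phone number. As a rough illustration (not Facebook's implementation), here is a minimal sketch of the standard TOTP algorithm (RFC 6238, built on HOTP from RFC 4226) using only the Python standard library; the secret shown in the usage comment is an RFC test value, not a real credential:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238) from a shared secret.

    secret_b32: base32-encoded shared secret (the string behind a 2FA QR code)
    t:          Unix timestamp to derive the code for (defaults to now)
    """
    if t is None:
        t = time.time()
    key = base64.b32decode(secret_b32.upper())
    counter = int(t) // step                      # 30-second time window
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()  # HMAC-SHA1 per RFC 4226
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret "12345678901234567890" in base32:
# totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59) -> "287082"
```

Because the server and the authenticator app only share this secret, no phone number ever changes hands, so there is nothing for a reverse-lookup search (or an ad system) to latch onto.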
Second, the FTC’s settlement misses a whole additional category of phone numbers: “shadow” contact information, which refers to a phone number you never gave Facebook but which your friends uploaded with their contacts. In other words, even if you never directly handed a particular phone number over to Facebook, advertisers may nevertheless be able to associate it with your account based on your friends’ phone books.
This shadow contact information remains available to advertisers, and inaccessible and opaque to users. You can’t find your “shadow” contact information in the “contact and basic info” section of your profile; users in Europe can’t even get their hands on it despite explicit requirements under the GDPR that a company give users a “right to know” what information it has on them.
Throughout this year, we have been demanding that a handful of companies fix some of their biggest privacy and security problems. For Facebook, we have taken aim at its tendency to use phone numbers for purposes contrary to what users understood or intended. While the FTC’s order may seem like a fix, it does not go far enough for us to consider it a complete victory. Until Facebook takes the initiative to address the reverse-lookup and shadow contact information problems described above, users can expect that its reckless misuse of their phone numbers will continue. And we’ll continue watching and putting pressure on Facebook to fix it already.
Gennie conducts and manages research and advocacy for the Electronic Frontier Foundation on consumer privacy, surveillance, and security issues.
Prior to joining EFF, Gennie earned a Master of Library and Information Science from the University of Washington Information School, where she published on Internet censorship in Thailand and zero-rating in Ghana, as well as investigating mobile access and technology terms in Myanmar (Burma) and public Internet access in Laos. While at the UW, she also co-founded and led a successful initiative for a university Open Access policy.
Outside work, Gennie is a cyclist, avid CouchSurfer, sticker enthusiast, and friend to all cats.
This article was sourced from EFF.org