January 18, 2021

Research Shows iOS Covid Apps Are A Privacy Mess

By Karl Bode
Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, recently released an analysis of 493 COVID-19-related iOS apps across dozens of countries. The results are…not great, and highlight how such apps routinely hoover up far more data than they need, including unnecessary access to cameras and microphones, photo galleries, contacts, and location data. Much of this data then winds up in the adtech ecosystem, where it's sold to third parties for profit.

Only 47 of the apps used Google and Apple's more privacy-friendly exposure-notification system, meaning most developers built their own apps with substandard (in some cases borderline nonexistent) privacy standards. Six out of seven COVID iOS apps worldwide are allowed to request any permissions they'd like. 43 percent of all apps were found to be tracking user location at all times. 44 percent requested access to the user's camera, 22 percent asked for access to the user's microphone, 32 percent asked for access to the user's photos, and 11 percent asked for full access to the user's contact list.
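For context on what "requesting a permission" looks like on iOS: each of these sensitive resources is gated behind a usage-description key the app must declare in its Info.plist before the system will even show the permission prompt. A minimal sketch of the declarations an app requesting all of the permissions above would carry (the keys are Apple's real Info.plist keys; the description strings are hypothetical):

```xml
<!-- Info.plist fragment: each key unlocks the corresponding system
     permission prompt; the string is shown to the user in that prompt. -->
<key>NSLocationAlwaysAndWhenInUseUsageDescription</key>
<string>Used to log your location at all times.</string>
<key>NSCameraUsageDescription</key>
<string>Used to access the device camera.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to access the device microphone.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Used to read your photo library.</string>
<key>NSContactsUsageDescription</key>
<string>Used to read your contact list.</string>
```

Notably, Apple's review process requires these strings to exist but does little to verify that the stated purpose matches what the app actually does with the data, which is part of how apps like these sail through with permission lists this broad.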

Albright told Ars Technica that while many of these app makers may be well intentioned, they're often working at cross purposes while hoovering up far more data than they actually need, data that in many instances is then being sold to unknown third parties:

“It’s hard to justify why a lot of these apps would need your constant location, your microphone, your photo library,” Albright says. He warns that, even for COVID-19-tracking apps built by universities or government agencies—often at the local level—that introduces the risk that private data, sometimes linked with health information, could end up out of users’ control. “We have a bunch of different, smaller public entities that are more or less developing their own apps, sometimes with third parties. And we don’t know where the data’s going.”

Albright’s study focused on iOS, while other studies focused on Android and showed the same problems. Albright notes that he didn’t find any nefarious activity himself, but he also made it pretty clear that once this data starts circulating in the largely unaccountable adtech universe, it’s possible that sensitive data (including your COVID status) could be revealed to third parties:

“some COVID-19 apps he analyzed went beyond direct requests for permission to monitor the user’s location to include advertising analytics, too: while Albright didn’t find any advertising-focused analytic tools built into exposure-notification or contact-tracing apps, he found that, among apps he classifies as “information and updates,” three used Google’s ad network and two used Facebook Audience Network, and many others integrated software development kits for analytics tools including Branch, Adobe Auditude, and Airship. Albright warns that any of those tracking tools could potentially reveal users’ personal information to third-party advertisers, including potentially even users’ COVID-19 status.”

That’s not to say many of these apps aren’t doing good things, but they’re doing them in a way that potentially puts consumer privacy at risk, a particular problem when you can’t opt out of using them due to work or school requirements. That’s not particularly surprising here in the States, where we can’t even pass a baseline privacy law for the internet era, resulting in no real concrete guidance from the top down. The end result is, well, precisely what you’d expect.

Source: https://www.techdirt.com/articles/20201116/08012145714/research-shows-ios-covid-apps-are-privacy-mess.shtml