What We Learned From Apple’s New Privacy Labels



We all know that apps collect our data. Yet one of the few ways to find out what an app does with our information involves reading a privacy policy.

Let’s be real: Nobody does that.

So late last year, Apple introduced a new requirement for all software developers that publish apps through its App Store. Apps must now include so-called privacy labels, which list the types of data being collected in an easily scannable format. The labels resemble the nutrition labels on food packaging.

These labels, which began appearing in the App Store in December, are the latest attempt by tech designers to make data security easier for all of us to understand. You might be familiar with earlier iterations, like the padlock symbol in a web browser. A locked padlock tells us that our connection to a website is encrypted, while an unlocked one warns that the connection is not secure.

The question is whether Apple’s new labels will influence the choices people make. “After they read it or look at it, does it change how they use the app or stop them from downloading the app?” asked Stephanie Nguyen, a research scientist who has studied user experience design and data privacy.

To put the labels to the test, I pored over dozens of apps. Then I focused on the privacy labels for the messaging apps WhatsApp and Signal, the streaming music apps Spotify and Apple Music and, for fun, MyQ, the app I use to open my garage door remotely.

I learned plenty. The privacy labels showed that apps that appear identical in function can vastly differ in how they handle our information. I also found that lots of data gathering is happening when you least expect it, including inside products you pay for.

But while the labels were often illuminating, they sometimes created more confusion.

To find the new labels, iPhone and iPad users with the latest operating system (iOS and iPadOS 14.3) can open the App Store and search for an app. Inside the app’s description, look for “App Privacy.” That’s where a box appears with the label.

Apple has divided the privacy label into three categories so we can get a full picture of the kinds of information that an app collects. They are:

  • Data used to track you: This information is used to follow your activities across apps and websites. For example, your email address can be used to recognize that you are the same person in another app where you entered that address.

  • Data linked to you: This information is tied to your identity, such as your purchase history or contact information. Using this data, a music app can see that your account bought a certain song.

  • Data not linked to you: This information is not directly tied to you or your account. A mapping app might collect data from motion sensors to provide turn-by-turn directions for everyone, for instance. It doesn’t save that information in your account.

Now let’s see what these labels revealed about specific apps.

On the surface, WhatsApp, which is owned by Facebook, appears to be nearly identical to Signal. Both offer encrypted messaging, which scrambles your messages so that only the recipient can decipher them. Both also rely on your phone number to create an account and receive messages.

But their privacy labels immediately reveal how different they are under the hood. Below on the left is the privacy label for WhatsApp. On the right is the one for Signal:

[The privacy labels for WhatsApp (left) and Signal (right).]

The labels immediately made it clear that WhatsApp taps far more of our data than Signal does. When I asked the companies about this, Signal said it made an effort to collect less information.

For group chats, the WhatsApp privacy label showed that the app has access to user content, which includes group chat names and group profile photos. Signal, which does not do this, said it had designed a complex group chat system that encrypts the contents of a conversation, including the people participating in the chat and their avatars.

For people’s contacts, the WhatsApp privacy label showed that the app can get access to our contacts list; Signal does not. With WhatsApp, you have the option to upload your address book to the company’s servers so it can help you find your friends and family who are also using the app. But on Signal, the contacts list is stored on your phone, and the company cannot tap it.

“In some instances it’s more difficult to not collect data,” Moxie Marlinspike, the founder of Signal, said. “We have gone to greater lengths to design and build technology that doesn’t have access.”

A WhatsApp spokeswoman referred to the company’s website explaining its privacy label. The website said WhatsApp could gain access to user content to prevent abuse and to bar people who might have violated laws.

I then took a close look at the privacy label for a seemingly innocuous app: MyQ from Chamberlain, a company that sells garage door openers. The MyQ app works with a $40 hub that connects with a Wi-Fi router so you can open and close your garage door remotely.

Here’s what the label says about the data the app collects. Warning: It’s long.

[The privacy label for MyQ.]

Why would a product I paid for to open my garage door track my name, email address, device identifier and usage data?

The answer: for advertising.

Elizabeth Lindemulder, who oversees connected devices for the Chamberlain Group, said the company collected data to target people with ads across the web. Chamberlain also has partnerships with other companies, such as Amazon, and data is shared with partners when people opt to use their services.

In this case, the label successfully caused me to stop and think: Yuck. Maybe I’ll switch back to my old garage remote, which has no internet connection.

Finally, I compared the privacy labels for two streaming music apps: Spotify and Apple Music. This experiment unfortunately took me down a rabbit hole of confusion.

Just look at the labels. Below on the left is the one for Spotify. On the right is the one for Apple Music.

[Previews of the privacy labels for Spotify (left) and Apple Music (right).]

These look different from the other labels featured in this article because they are just previews — Spotify’s label was so long that we could not display the entirety of it. And when I dug into the labels, both contained such confusing or misleading terminology that I could not immediately connect the dots on what our data was used for.

One piece of jargon in Spotify’s label: The app collects people’s “coarse location” for advertising. What does that mean?

Spotify said this applied to people with free accounts who received ads. The app pulls device information to get approximate locations so it can play ads relevant to where those users are. But most people are unlikely to comprehend this from reading the label.

Apple Music’s privacy label suggested that it linked data to you for advertising purposes — even though the app doesn’t show or play ads. Only on Apple’s website did I find out that Apple Music looks at what you listen to so it can provide information about upcoming releases and new artists who are relevant to your interests.

The privacy labels are especially confusing when it comes to Apple’s own apps. That’s because while some Apple apps appeared in the App Store with privacy labels, others did not.

Apple said only some of its apps — like FaceTime, Mail and Apple Maps — could be deleted and downloaded again in the App Store, so those can be found there with privacy labels. But its Phone and Messages apps cannot be deleted from devices and so do not have privacy labels in the App Store. Instead, the privacy labels for those apps are in hard-to-find support documents.

The result is that the data practices of Apple’s apps are less upfront. If Apple wants to lead the privacy conversation, it can set a better example by making language clearer — and its labeling program less self-serving. When I asked why all apps shouldn’t be held to the same standards, Apple did not address the issue further.

Ms. Nguyen, the researcher, said a lot had to happen for the privacy labels to succeed. In addition to changing people’s behavior, she said, companies have to be honest in describing their data collection. Most important, people have to be able to understand the information.

“I can’t imagine my mother would ever stop to look at a label and say, ‘Let me look at the data linked to me and the data not linked to me,’” she said. “What does that even mean?”