Privacy is emotional — we often value privacy most when we feel vulnerable or powerless in the face of creepy data practices. But in the courts' eyes, emotions don't always constitute harm, or a reason for structural change in how privacy is legally codified.
It might take a material perspective on widening privacy disparities — and their implication in broader social inequality — to catalyze the privacy improvements the U.S. desperately needs.
Apple's leaders announced their plans for the App Tracking Transparency (ATT) update in 2020. In short, iOS users can refuse an app permission to track their activity across other apps and websites. Since the update rolled out, a sweeping three-quarters of iOS users have opted out of cross-app tracking.
With less data available to advertisers looking to build individual profiles for targeted advertising, targeted ads for iOS users look less effective and less appealing to ad agencies. As a result, new findings show that advertisers have cut their ad spending on iOS devices by one-third.
They are redirecting that capital into advertising on Android systems, which account for 42.06% of the mobile OS market share, compared to iOS at 57.62%.
Beyond a vague sense of creepiness, privacy disparities increasingly pose risks of material harm: emotional, reputational, economic, and otherwise. As many tech companies say, privacy belongs to us, so why does it cost so much? Whenever one user base gears up with privacy protections, companies simply redirect their data practices along the path of least resistance toward the populations with fewer resources, legal or technical, to control their data.
More than just ads
As more money goes into Android ads, we can expect advertising techniques to become more sophisticated, or at least more aggressive. It is not illegal for companies to engage in targeted advertising, so long as it is done in compliance with users' legal rights to opt out under relevant laws like the CCPA in California.
This raises two immediate issues. First, residents of every state except California currently lack such opt-out rights. Second, granting some users the right to opt out of targeted advertising strongly implies harms, or at least risks, attached to the practice. And indeed, there are.
Targeted advertising involves third parties building and maintaining behind-the-scenes profiles of users based on their behavior. Gathering data on app activity, such as fitness habits or shopping patterns, could lead to further inferences about sensitive aspects of a user’s life.
At this point, a representation of the user exists in an under-regulated data system containing data — whether correctly or incorrectly inferred — that the user did not consent to share. (Unless the user lives in California; let's suppose they live anywhere else in the U.S.)
Further, research finds that targeted advertising, in building detailed profiles of users, can enact discrimination in housing and employment opportunities, sometimes violating federal law. Targeted advertising can also impede individuals' autonomy, preemptively narrowing their window of purchasing options even when they don't want it narrowed. On the other hand, targeted advertising can support niche or grassroots organizations by connecting them directly with interested audiences. Whatever one's stance on targeted advertising, the underlying problem is that users often have no say in whether they are subject to it.
Targeted advertising is a massive and booming practice, but it is only one practice within a broader web of business activities that do not prioritize respect for users' data. And these practices are not illegal in much of the U.S. Instead of the law, it is your pocketbook that can keep you clear of data disrespect.