SUMMARY - Third-Party Data Sharing

Baker Duck
Submitted by pondadmin on

A person downloads a weather app that shares their location with 47 advertising partners, none of which they have heard of. A fitness tracker sells workout patterns to insurance companies that use the data for risk assessment. A retailer shares purchase histories with data brokers who combine it with information from hundreds of other sources, creating detailed profiles sold to anyone willing to pay. Someone reads a privacy policy claiming data is not sold to third parties, then discovers it is "shared with partners" through arrangements the company insists are not "sales." Third-party data sharing transforms personal information from data about individuals into commodities flowing through ecosystems most people never see. Whether this represents innovative business models enabling free services or surveillance capitalism that treats people as products to be analyzed and sold remains profoundly contested.

The Case for Strict Limits on Third-Party Sharing

Advocates argue that third-party data sharing creates harms that dwarf those from first-party collection because it removes information from contexts where people provided it and distributes it to entities they have no relationship with. From this view, someone who shares location with a navigation app expects that company to use it for directions, not to sell it to advertisers, data brokers, law enforcement, or anyone else. Each data share multiplies privacy risks: more entities storing information means more breach targets, more potential misuses, more analysis creating profiles people never consented to. Moreover, third-party sharing operates through deliberately opaque systems. Privacy policies mention "partners" without naming them. Consent is buried in terms of service users must accept to use services. Data flows through multiple intermediaries: an app shares with an SDK provider, who shares with an ad network, who shares with a data broker, who sells to anyone. Tracking where information goes and who has it becomes impossible. From this perspective, meaningful privacy requires prohibiting or severely restricting third-party sharing: requiring specific, informed consent for each third-party recipient, not blanket authorization; allowing sharing only for purposes directly related to service provision; prohibiting data broker models that aggregate information across sources; establishing data minimization requirements that prevent collection for sharing purposes; and creating rights to know all entities that received someone's data, with the ability to demand deletion from each. The solution recognizes that once data is shared with third parties, original collectors lose control and users lose any meaningful privacy. Countries establishing data sharing restrictions and data broker regulations demonstrate these protections are achievable. The obstacle is business models built entirely on monetizing user data through third-party sharing.

The Case for Allowing Beneficial Data Sharing

Others argue that third-party data sharing enables services people value and that blanket restrictions would eliminate benefits while failing to prevent harms. From this perspective, data sharing serves legitimate purposes: analytics providers helping services understand usage; payment processors enabling transactions; cloud infrastructure storing information; security vendors detecting fraud; advertising supporting free services users enjoy. Prohibiting third-party sharing would make many current business models impossible, forcing everything behind paywalls that exclude people who cannot pay. Moreover, some data sharing improves outcomes. Sharing health data with researchers advances medical knowledge. Sharing fraud patterns across financial institutions protects all customers. Sharing threat intelligence across companies improves security. From this view, the problem is not third-party sharing but lack of transparency and meaningful consent. The solution involves: clear disclosure of what data is shared with whom and why; genuine choice about sharing with specific third parties; contractual requirements that third parties use data only for stated purposes; prohibition of onward sharing without additional consent; and enforcement against misuse rather than prevention of all sharing. Additionally, distinguishing categories matters. Sharing with service providers necessary for operation differs from selling to data brokers. Sharing for fraud prevention differs from sharing for behavioral advertising. Treating all third-party sharing identically misses that some sharing serves users while other sharing exploits them. Risk-based approaches allowing beneficial sharing while restricting harmful uses better balance interests than categorical prohibitions.

The Consent Fiction in Third-Party Sharing

Privacy policies state that data may be shared with third parties, buried in dense legal text that users must accept to use services. From one view, this consent is fiction. No one reads policies. Even those who do cannot assess whether sharing is appropriate because most people do not know who the third parties are, what they will do with the data, or how to evaluate the risks. Clicking "I agree" to access a service is not meaningful consent to share data with dozens or hundreds of unknown entities. The solution requires granular, specific consent for each third party, with plain language explanations and a realistic ability to refuse. From another view, users implicitly consent by choosing free, ad-supported services over paid alternatives. They understand the basic bargain even if they do not read policies. Perfect comprehension is an unrealistic standard that would make digital services impossibly burdensome. Whether implied consent through service use is adequate or whether explicit, specific authorization for sharing is required determines what compliance demands.

The Data Broker Ecosystem

Data brokers aggregate information from thousands of sources—apps, websites, public records, purchase data, location traces—creating profiles sold to advertisers, insurers, employers, landlords, and anyone else willing to pay. Most people have never heard of companies holding detailed information about them. From one perspective, this shadow data economy represents surveillance capitalism's most egregious form: profiting from personal information without people's knowledge or consent, enabling discrimination without accountability, and creating profiles affecting life opportunities that subjects cannot see or correct. The solution requires: prohibiting data broker models entirely or severely restricting them; mandatory registration and transparency about sources and uses; individual rights to access all profiles and demand inaccuracies be corrected; restrictions on sensitive inferences like health conditions or financial status; and prohibition of using brokered data for consequential decisions about employment, housing, or credit. From another perspective, data brokers provide valuable services: risk assessment protecting against fraud, marketing enabling businesses to reach customers efficiently, and aggregated insights informing business and policy decisions. Restricting brokers would harm legitimate commerce while forcing less transparent arrangements that serve the same functions. Whether data brokers should be prohibited, heavily regulated, or simply required to be more transparent determines what reforms are pursued.

The SDK and Advertising Network Problem

Apps often incorporate software development kits (SDKs) and advertising libraries that collect data the app developer never sees. A seemingly simple app may contain dozens of SDKs, each collecting information and sharing it with its own network. From one view, this represents an outsourcing of surveillance: app developers disclaim responsibility for what SDKs do while benefiting from their functionality or revenue. Users interacting with one app become known to dozens of companies. The solution requires developers to be responsible for all data collection and sharing that occurs through their apps, with liability for SDK behavior and requirements to disclose all third parties receiving data. From another view, developers cannot control or even monitor what third-party code does, and making them liable for SDK behavior would render incorporating any third-party code impossibly risky, preventing the code reuse that makes development efficient. Whether developers or SDK providers bear primary responsibility determines how sharing through embedded code is governed.
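The mechanism described above can be sketched in a few lines. This is a simplified illustration, not real SDK code: the class names, endpoints, and event structure are all invented for the example. The point it demonstrates is structural: the host app calls a single SDK hook, and the fan-out to the SDK's own partner network happens inside code the developer does not inspect.

```python
# Hypothetical sketch of how an embedded SDK fans app events out to its
# own partners. All names and endpoints here are invented for illustration.

class AnalyticsSDK:
    """Stand-in for an embedded third-party library."""
    def __init__(self, partner_endpoints):
        self.partner_endpoints = partner_endpoints  # ad networks, brokers, etc.
        self.outbound = []  # records the SDK would transmit

    def on_app_event(self, event):
        # The SDK sees every event the host app passes through it and
        # forwards a copy to each of its own partners.
        for endpoint in self.partner_endpoints:
            self.outbound.append((endpoint, event))

class WeatherApp:
    """The first party: it only ever calls the SDK's public hook."""
    def __init__(self, sdks):
        self.sdks = sdks

    def show_forecast(self, user_id, location):
        event = {"user": user_id, "location": location}
        for sdk in self.sdks:
            sdk.on_app_event(event)  # what happens next is opaque to the developer
        return f"Forecast for {location}"

sdk = AnalyticsSDK(["adnet.example", "broker.example", "attribution.example"])
app = WeatherApp([sdk])
app.show_forecast("u-123", "Toronto")

# A single interaction with one app reached three third parties.
print(len(sdk.outbound))  # 3
```

The governance question in the section maps directly onto this structure: the developer wrote `WeatherApp`, but the disputed data flows all happen inside `AnalyticsSDK`.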

The Cross-Context Tracking Challenge

Third-party sharing enables tracking individuals across different contexts: the same advertising network sees someone's health website browsing, shopping behavior, location patterns, and social media activity. From one perspective, this cross-context tracking enables manipulation, discrimination, and privacy invasion far exceeding what any single first party knows. The solution requires technical and legal restrictions preventing entities from connecting data across contexts: prohibiting cross-site tracking, requiring data isolation between services, and establishing that combining data from multiple sources creates new collection requiring fresh consent. From another perspective, cross-context data enables beneficial personalization and efficiency. Seeing someone's activity across services allows better fraud detection, relevant recommendations, and consistent experiences. Prohibiting data combination would fragment services and prevent legitimate analytics. Whether cross-context integration should be prohibited, restricted, or simply made transparent determines what data combination is permitted.
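The linking step at the heart of cross-context tracking can be sketched as a simple join on a shared identifier. This is an illustrative toy, not any real tracking system: the identifiers, contexts, and events are invented. It shows why proposals for "data isolation" focus on the join key itself, since removing or partitioning the stable identifier is what prevents records from different contexts from being combined into one profile.

```python
# Hypothetical sketch: records from unrelated contexts are joined on a
# shared advertising identifier, producing a cross-context profile that
# no single first party holds. All data below is invented.

from collections import defaultdict

records = [
    {"ad_id": "ad-789", "context": "health site", "event": "searched symptoms"},
    {"ad_id": "ad-789", "context": "retailer",    "event": "bought running shoes"},
    {"ad_id": "ad-789", "context": "maps app",    "event": "visited clinic area"},
    {"ad_id": "ad-456", "context": "retailer",    "event": "bought groceries"},
]

profiles = defaultdict(list)
for r in records:
    # The stable cross-context identifier is the join key. Partitioning
    # identifiers per context would leave these records unlinkable.
    profiles[r["ad_id"]].append((r["context"], r["event"]))

# ad-789's profile now spans three contexts that never shared data directly.
print(len(profiles["ad-789"]))  # 3
```

Each first party here knows one row; only the entity holding the join key sees the combined profile, which is the asymmetry the section's competing proposals are arguing over.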

The Question

If third-party data sharing means personal information flows to dozens or hundreds of entities people have never heard of, who use it for purposes beyond the original context and often resell it further, does that represent a legitimate business model enabling free services or surveillance capitalism treating people as products? When consent to third-party sharing is buried in unreadable policies that users must accept to access services, can it be considered meaningful, or does this reveal that consent frameworks are fundamentally inadequate for governing data ecosystems? And if restricting third-party sharing would eliminate advertising-supported business models and force services behind paywalls, does that justify current practices, or does it demonstrate that business models built on monetizing user data through opaque sharing should not be allowed regardless of their commercial success?
