
Can You Pay For Privacy? Consumer Expectations and the Behavior of Free and Paid Apps

Kenneth A. Bamberger | Berkeley Technology Law Journal


“Paid” digital services have been touted as straightforward alternatives to the ostensibly “free” model, in which users actually pay a high price in the form of personal data, with limited awareness of the real cost incurred and little ability to manage their privacy preferences. Yet the actual privacy behavior of paid services, and consumer expectations about that behavior, remain largely unknown.

This Article addresses that gap. It presents empirical data both comparing the true cost of “paid” services with that of their so-called “free” counterparts and documenting consumer expectations about the relative behaviors of each.

We first present an empirical study that documents and compares the privacy behaviors of 5,877 Android apps offered in both free and paid versions. The analysis tool we employed, AppCensus, allowed us to detect exactly which sensitive user data each app accesses and with whom it shares that data. Our results show that paid apps often share the same implementation characteristics, and thus the same behaviors, as their free counterparts. If users opt to pay for apps to avoid privacy costs, in many instances they do not receive the benefit of the bargain. Worse, we find that there are no obvious cues consumers can use to determine when the paid version of a free app offers better privacy protections than its free counterpart.

We complement this data with a second study surveying 1,000 mobile app users about their perceptions of the privacy behaviors of paid and free app versions. Participants were more likely to expect the free version, rather than the paid version, to share their data with advertisers and law enforcement agencies, and to keep their data on the app’s servers when no longer needed for core app functionality.
By contrast, consumers are more likely to expect the paid version to engage in privacy-protective practices, to be transparent about its data collection and sharing behaviors, and to offer more granular control over the collection of user data.

Together, these studies identify ways in which the actual behavior of apps fails to comport with users’ expectations, and ways in which representing an app as “paid” or “ad-free” can mislead users. They also raise questions about the salience of those expectations for consumer choices.

In light of this combined research, we then explore three sets of ramifications for policy and practice.

First, our finding that paid services often engage in data collection and sale as extensive as that of free ones challenges understandings about how the “pay for privacy” model operates in practice, its promise as a privacy-protective alternative, and the legality of paid app behavior.

Second, by providing an empirical foundation for better understanding both corporate behavior and consumer expectations, our findings support research into how users’ beliefs about technology business models and developer behavior are actually shaped, and into the manipulability of consumer decisions about privacy protection, undermining the legitimacy of legal regimes that rely on fictive user “consent” unreflective of actual market behavior.

Third, our work demonstrates the importance of the kind of technical tools we use in our study: tools that offer transparency about app behaviors, empowering consumers and regulators. Our study demonstrates that, at least in the most dominant example of a free-versus-paid market, mobile apps, there turns out to be no real privacy-protective option.
Yet the lack of transparency and auditability of app behaviors deprives users, regulators, and law enforcement of any means to hold developers accountable, and removes privacy as a salient concern guiding user behavior. Dynamic analysis of the type we performed allows users to go online and test, in real time, an app’s privacy behavior, empowering them as advocates and informing their choices so that expectations better align with reality. The same tools, moreover, can equip regulators, law enforcement, consumer protection organizations, and private parties seeking to remedy undesirable or illegal privacy behavior.

