Apps selling data can monetize in safer ways
|Scott Walsh in Security Wednesday, October 28, 2020|
Scott Walsh, principal threat intelligence researcher at SecurityScorecard, offers insights into how apps selling data can monetize in safer ways, and into the challenges of ending data collection inside of apps without creating a two-tier system of privacy haves and have-nots.
Your data is everywhere. You give it away, knowingly and unknowingly, all of the time. The most ubiquitous and nefarious of the applications that harvest this data are often with you at all times; they live on your smartphone. The wide availability of insights on consumer behavior has given rise to a robust data economy that now subsidizes costs so people can have apps and services for “free”.
While privacy issues have gained mindshare with the general public, moving away from our current models isn’t as simple as it sounds.
Apps selling data can monetize in safer ways
Whether you pay for an app or not, unless it is specifically called out in the “terms and conditions”, do not expect that your data remains only with the application developer. It’s much safer to assume that the application developer will sell as much user data as possible, to as many parties as possible, all in the name of making money.
Advertising-supported apps can be particularly bad, as both the application and the advertisement service get access to your data at the same time, which only amplifies the impact and aggregated value of the data that is then sold. The old adage of “if you’re not paying for it, you are the product” has given way to “even if you are paying for it, you are the product.” Other models, the ones where you pay for the application, are far less clear, perhaps by design, about what they do with your data.
Cost Offsetting / Profit maximizing
When you buy a newspaper, even though you’ve paid for the paper, you’re still pummeled with advertisements. In this model, the purchase price is subsidized by advertising dollars. When purchasing a news application, the usual reward, especially if there is both a free and a paid version, is that advertisements are removed. Since the developer is unable to monetize via ads, data analysis and sale is the only way to offset costs and increase profits.
When the Use Case Is to Leave the Service
Paid online dating apps, like Match.com, face an interesting paradox: they want paying users, while stating that their objective is, effectively, to have those users leave the service. This presents a number of issues that make the business model challenging. The first problem is that of critical mass: having enough users in a particular region for matches to occur. This is directly influenced by the cost of membership, and a higher price will slow adoption. If the company has to lower prices to create an influx of users, it’s likely the shortfall will be made up through sales of aggregate statistics or discrete user data. Once a service is established with a good reputation, it may be able to raise prices and stop selling data, but why wouldn’t it raise prices and continue to sell data? Plus, when you’re looking for someone to spend time with, you’re more apt to be honest about how you live your life than you probably would be with an insurance provider. Discrete data on your lifestyle has many knock-on effects, ranging from targeted advertising to increased insurance premiums and many things in between.
Bait and Switch
Strava, a fitness tracker, needed critical mass before pivoting. This is an extreme case of discounting up front with a plan to monetize at a later date; in this case, the discount was that almost all of the functionality was free. In May of 2020, Strava made the social aspect of its platform, the “Segment Leaderboard”, arguably the best feature of the service, a subscriber-only feature. Financially speaking, Strava would have an expected conversion rate from free users to paid subscribers, but what if that falls short? By the very nature of the application, Strava has mountains of GPS data on all of its users, a rich source of targeted advertising data, which it can combine with other unique insights like general fitness level or exercise frequency.
Unsustainable Business Models When You Can’t Sell Data
Wink, a home sensor and automation platform, had to run infrastructure to deliver functionality and relied on hardware sensor sales to stay profitable. Due to the nature of the product, selling the data collected isn’t a viable option. This strategy suffered from several problems, and in June of 2020 the company pivoted to a subscription model in an attempt to keep the lights on (a deadline later extended to July 2020). Wink delivered a hub that was compliant with several standards, which meant that users were under no obligation to purchase Wink-branded sensors to outfit their homes, and it would appear that consumers were happy to purchase other standards-compliant sensors. Even if consumers bought only Wink-branded sensors, a second issue remains: the sensor market is not one of continual growth. Consumers don’t keep buying sensors; they build out a system and then move into “monitoring and automation only”. Without additional revenue from data sales, this becomes make-or-break for Wink, and if the company fails, the entire installed user base is left with systems that no longer function.
Privacy as a Feature
We’re coming to a point where privacy will be considered a feature, but one that comes at a cost. When companies choose to forgo monetization of data, there is a good chance that the loss of revenue will result in an increase in a one-time purchase price or a transition to a subscription model. This creates yet another digital divide: those who can afford to pay for privacy and those who can’t. Privacy feels like it should be the default setting for all applications and services, not something people must pay a premium for. Unfortunately, people have become so conditioned to getting things for “free” that a switch to subscriptions will be very difficult. Consumers will perceive many subscriptions as a combination of a “money grab” and “being nickel-and-dimed”, which leaves us with a chicken-and-egg problem in making the transition to a “privacy by default” world.
This content is made possible by a guest author, or sponsor; it is not written by and does not necessarily reflect the views of App Developer Magazine's editorial staff.