Fitify’s publicly accessible Google Cloud Storage bucket has exposed hundreds of thousands of files, including progress pictures that users upload to track their body changes over time. After Cybernews contacted the company, the unprotected instance was closed.
Unfortunately, uploading media online always brings some risks, even when the recipient is a trusted vendor. Enter: Fitify, a popular fitness app with over 10 million downloads from the Google Play store and an estimated 25 million total installs across all platforms.
In early May, the Cybernews research team discovered a Fitify-owned and publicly accessible Google Cloud Storage bucket. While most of the files exposed in the unprotected instance were workout plans and instruction videos, researchers also noticed photos that users shared with the app’s “AI coach,” as well as their body scans.
The app’s target audience is users who want to lose weight, get in shape, or otherwise improve their physique. Body scans let users track changes over time as they exercise or diet according to their fitness plans. Fitify’s Google Play store listing clearly states that “data is encrypted in transit,” reassuring users that their private photos won’t be exposed.
However, the Cybernews team, or anyone else for that matter, could access the cloud storage without any passwords or security keys.
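An exposure like this typically means the bucket grants read access to the special “allUsers” principal, so its object listing is reachable over plain HTTPS with no credentials. The sketch below illustrates the idea using Google Cloud Storage’s public JSON API endpoint; the bucket name and object names are made up for illustration and are not Fitify’s actual data.

```python
import json
from urllib.parse import quote

# Hypothetical bucket name for illustration -- not Fitify's actual bucket.
BUCKET = "example-fitness-app-media"

# A bucket readable by "allUsers" answers this JSON API endpoint with no
# credentials at all; listing its contents is one unauthenticated GET.
listing_url = f"https://storage.googleapis.com/storage/v1/b/{quote(BUCKET)}/o"

# Abridged, fabricated sample of what such a listing returns:
sample_response = json.loads("""
{
  "kind": "storage#objects",
  "items": [
    {"name": "progress_pictures/user123/2024-05-01.jpg", "size": "482113"},
    {"name": "body_scans/user123/scan_001.json", "size": "20991"}
  ]
}
""")

for obj in sample_response["items"]:
    # Each listed object is then directly downloadable at a predictable URL.
    print(f"https://storage.googleapis.com/{BUCKET}/{obj['name']}")
```

The point is that once the bucket name is known, no further “hacking” is needed: listing and downloading are ordinary web requests.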
“It is also worthwhile to note that ‘progress pictures’ and ‘body scans’ are often captured with minimal clothing to better showcase the progress of weight loss and muscle growth. Therefore, most of the leaked images might be of the types that users normally would like to keep private and not share with anyone on the internet,” the team said.
Fitify Workouts, the company behind the app, responded after being contacted by Cybernews researchers and closed the exposed instance, removing it from public access.
Cybernews’ journalists have reached out to the company for official comment and will update the article once they receive a reply.
The now-closed Google Cloud Storage bucket contained 373,000 files in total: 206,000 user profile photos, another 138,000 labeled as progress pictures, 13,000 attachments shared via the app’s AI coach messages, and 6,000 labeled as “Body Scan” data, including pictures and AI metadata.
The body scan feature allows users to make a 3D scan of their body, with the app providing a detailed analysis of their lean mass, body fat, posture, and other aspects they may want to improve or track.
“The leak shows that the access controls implemented by the app were insufficient to secure user data, and the fact that this data could be accessed by anyone without any passwords or keys demonstrates that user data was not encrypted at rest,” the team explained.
After discovering the exposed instance, the researchers cross-checked whether Fitify appeared in the randomly selected dataset the team used to investigate how secure Apple App Store apps actually are.
Cybernews researchers downloaded 156,000 iOS apps, around 8% of all apps on the Apple App Store, and discovered that developers often leave plaintext credentials in the application code accessible to anyone.
The findings revealed that 71% of the apps analyzed leak at least one secret, with an average app's code exposing 5.2 secrets. It turns out that Fitify was no different.
“After investigating the exposed secrets, we discovered credentials that could potentially be used to access even more customer data and the application’s backend infrastructure,” the team explained.
“It also shows that the misconfigured cloud bucket access controls weren't the only mistake made by the app’s developers, as numerous API Keys and sensitive endpoint locations were also hardcoded within the app’s front-end.”
Developers hardcode secrets for numerous reasons. While hardcoding is sometimes necessary for an app to function, secrets and keys that grant backend access should not ship in client code, as they let attackers dig deeper into the app and potentially reach private user data.
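Researchers typically surface such secrets by pattern-matching the strings extracted from decompiled app packages. Below is a minimal sketch of that idea; the regex patterns cover a few common Google credential formats, and the “decompiled” input is fabricated, not taken from Fitify’s code.

```python
import re

# Simple patterns for common secret formats (illustrative, not exhaustive).
SECRET_PATTERNS = {
    "google_api_key": re.compile(r"\bAIza[0-9A-Za-z_\-]{35}\b"),
    "oauth_client_id": re.compile(
        r"\b\d{10,12}-[0-9a-z]{32}\.apps\.googleusercontent\.com\b"
    ),
    "gcs_bucket_url": re.compile(r"\bgs://[a-z0-9._\-]{3,63}\b"),
}

def find_secrets(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, match) pairs found in extracted app strings."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        hits.extend((name, match) for match in pattern.findall(text))
    return hits

# Fabricated input resembling strings pulled from an APK's resources.
fake_key = "AIza" + "X" * 35  # obviously fake, correct length for the format
decompiled = f'api_key = "{fake_key}"\nbucket = "gs://example-app-media"'
for kind, value in find_secrets(decompiled):
    print(kind, value)
```

Secret-scanning tools used in practice work on the same principle, with far larger pattern sets and entropy checks to cut false positives.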
The research team noted different types of secrets between development and production environments. Fitify’s development environment had the following hardcoded secrets exposed:
Attackers could use IDs and keys to access Google and Firebase infrastructure components, gather information, and then dig into the app, potentially obtaining sensitive user data.
For example, exposing Google Client ID and Android Client ID could enable malicious actors to impersonate legitimate app instances, potentially gaining access to user accounts. At the same time, the storage bucket could enable attackers to inject malicious files or modify existing content.
Meanwhile, Fitify’s production environment had the following hardcoded secrets:
Coupled with previously leaked information, hardcoded secrets may enable attackers to access users’ social media data through Fitify. Combined with Google credentials, this creates multiple attack vectors for scenarios affecting both fitness data and social media profiles.
Our researchers note that the Algolia API key is one of the less commonly leaked secrets. Algolia provides software and tools that let businesses implement fast search on their websites. The research team did not test the leaked API key, so it is unclear what data the underlying search index stores.
Our researchers believe that to effectively mitigate the issue, it’s best to focus on exposed instances and hardcoded secrets separately.
To fix cloud storage bucket-related issues, the team advises:
Configuring the cloud storage buckets' built-in authentication features to restrict access to only the employees and systems that are meant to access the stored data.
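In Google Cloud Storage terms, that means ensuring no IAM binding on the bucket grants a role to the special allUsers or allAuthenticatedUsers principals. The sketch below shows the core of that cleanup as a pure function over a policy’s bindings represented as plain dicts; the roles and members shown are illustrative, not Fitify’s actual configuration.

```python
# Principals that make a binding world-readable (or readable by any
# Google account), which is what caused this kind of exposure.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}

def remove_public_access(bindings: list[dict]) -> list[dict]:
    """Drop public principals from each role binding; drop emptied bindings."""
    cleaned = []
    for binding in bindings:
        members = [m for m in binding["members"] if m not in PUBLIC_PRINCIPALS]
        if members:
            cleaned.append({"role": binding["role"], "members": members})
    return cleaned

# Illustrative bucket policy: one role is exposed to everyone on the internet.
bindings = [
    {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
    {"role": "roles/storage.admin", "members": ["user:ops@example.com"]},
]
print(remove_public_access(bindings))
# The all-public objectViewer binding disappears; only the named admin remains.
```

In a real deployment, the equivalent change would be applied through the cloud provider’s IAM tooling rather than by editing dicts, but the principle is the same: no role on a bucket of private user media should ever be bound to a public principal.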
Meanwhile, to prevent apps’ secrets from falling into the wrong hands, the team advises the following: