Ethics standards and security protocols app developers should follow

App Developer Magazine

Richard Harris in Security, Monday, September 30, 2019
We chat with Ben Wald, co-founder of Very, about ethics standards and security protocols app developers should follow.

Consumers put a lot of trust in app developers. Sacrificing data for convenience isn't a new concept - FaceApp is only the most recent example of the low bar we set for allowing our private data into the hands of any company that pushes out an app. Yet, even after so many breaches and all-encompassing terms of use, why aren't more developers advocating for and educating the space on security? Founders, project managers, and engineers need to work from clear company ethical guidelines before any product is designed for the market, balancing innovation with the privacy and security of their end users.

Ben Wald, co-founder and Head of Client Strategy at Very, shares insight on best practices developers should follow when dealing with user data. Because Very is a highly specialized Internet of Things (IoT) app and platform design and development firm, Ben and his team need to be cognizant of both software and hardware developments to help their clients build, maintain, and scale their solutions.

We recently sat down with Ben to discuss why so many apps are gaining popularity despite a lack of meeting security and privacy standards. 

ADM: What are the biggest opportunities with app development security?

Wald: We've seen over and over again companies getting hacked and suffering huge data breaches. The stakes are high to get it right; so are the costs of cutting corners. Because of this, companies need to consider how to effectively maximize their security efforts so these features protect the consumer while still allowing them to quickly make updates to their software. Unfortunately, new features, tight timelines, and budget sometimes dictate cutting corners, because more security is rarely simpler or easier. Security needs to be made a priority, and there is no simple fix to make this process easier.

ADM: Why do developers sacrifice security for function or design?

Wald: Building in security isn't inherently difficult, but it does require time and dedication. If security isn't a priority for the stakeholders driving the project, it's easy for other features to be prioritized over it. Often, the priority when building a new product or updating an existing one is new feature development. Every project we've ever encountered has needed to be launched under a tight deadline and under a particular budget. Given those constraints, it can be hard to prioritize work that doesn't obviously provide user or customer value and sometimes detracts from it. When you build additional layers of security into an application, you are almost always compromising convenience and ease of use. Product and design stakeholders will inherently push for the simplest sign-up path, while developers and stakeholders concerned with account security may be lobbying for two-factor authentication. There's always going to be a give and take, and ultimately a balancing act at play. Given the number of times we have seen product vulnerabilities exploited, it's critical for companies to take security seriously and carve out time, energy, and budget to address it. Avoid having the engineering team up against a wall to deliver new features in a fast and furious style, and consider the longer-term life cycle of the product.

ADM: Companies and development teams love having more data to analyze and iterate from, so how can developers set the scope to gather helpful data without scraping too much from users?

Wald: There is a huge difference between collecting user information as it relates to engagement and behavior tracking, and collecting personally identifiable information (PII). To continue iterating on an application, you need to be collecting an enormous amount of data to analyze. If you are building an app that personalizes experiences or makes recommendations, the quality of those is dependent on collecting copious amounts of user information and processing it. With that in mind, there is a huge difference between collecting that type of information and information like first and last name, email, cell phone, home address, social security number, or any other piece of personal information. When that information is collected, it should be collected for a very discrete reason. The scope of data collected around a user's interactions inside the application can and should be extremely broad, but the scope of the PII should be extremely limited.
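The split Wald describes here - broad behavioral data, narrow PII - can be sketched in a few lines. This is a hypothetical illustration (the store names and event fields are invented, not Very's implementation): interaction events carry only a pseudonymous ID, while the single piece of PII lives in a separate record.

```python
import secrets

# Hypothetical sketch: broad behavioral data and narrow PII are kept in
# separate stores, linked only by a pseudonymous key.
pii_store = {}    # minimal PII, collected for a discrete reason
event_store = []  # broad interaction data; no PII ever lands here

def register_user(email: str) -> str:
    """Create a pseudonymous ID; only the ID ever appears in analytics."""
    user_key = secrets.token_hex(16)
    pii_store[user_key] = {"email": email}  # the only PII we keep
    return user_key

def track_event(user_key: str, action: str, context: dict) -> None:
    """Record rich interaction data keyed by the pseudonym only."""
    event_store.append({"user": user_key, "action": action, **context})

key = register_user("ada@example.com")
track_event(key, "viewed_screen", {"screen": "settings", "ms_on_screen": 420})
```

Analysts can query the event store as broadly as they like; only the holder of the linking key can tie behavior back to a person.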

ADM: What ethical considerations should developers be weighing and discussing when they draft the terms around the data they will collect? How can they communicate this effectively to users?

Wald: Developers are also passionate users, and they often view architecting and building an application through the lens of themselves as an end user. Ethically, it's important to tell users how their data is going to be used - there's a big difference between using data to improve the application and adding users to a list that's sold to others. Companies may even have an agreement to share data, which they should be upfront about. The new standard needs to be "only collect what you need to collect," and if you are collecting, be very explicit and transparent about your purpose in doing so.

ADM: FaceApp's reappearance in the broader tech discussion earlier this summer helped spur an even larger conversation around what people were giving developers access to when they use a given app. What are you hearing from customers and companies in the wake of that very public scrutiny? 

Wald: Apps like FaceApp have brought data and privacy under the spotlight, and that's not necessarily a bad thing for consumers or developers. It forces the development community to abide by standards and best practices - no excuses. Companies know they are vulnerable to extreme backlash from their user base if something goes wrong - especially if it was easily preventable. If a company gets hit with a state-sponsored attack, that's one thing. If huge amounts of customer data get exposed because they did something reckless and preventable, that's something else entirely. Bringing the discussion of security to the forefront is critical for users to understand how their data is shared, and for developers to be conscientious about using data in a meaningful and purposeful way.

ADM: Have you seen consumers avoid or abandon applications because of poor security and privacy practices? Are you able to quantify how many potential users are lost because of data collection practices that are too broad or security practices that are too lax? 

Wald: The only customer attrition or suffering conversion rates we have seen in applications come from companies trying to collect too much information. The less information you force users to provide, the fewer the signup steps, and the higher your conversion rates will be overall. This is especially the case when users can't easily understand how the app will return value for the information they provide. If an application is asking for your cell phone number, it should be abundantly clear how that application is going to use it to provide value back to you. If it's unclear, you should expect the friction created there to manifest in suboptimal conversion rates. Developers and product companies have an obligation to be straightforward with users about how their data is being used. Companies should be consistent with their philosophies and policies - transparency is key.

ADM: Do you see longer-term effects for companies that choose sweeping data collection over consumer privacy and security?

Wald: Sweeping data collection is what has launched incredibly successful companies, like Google, into the stratosphere. The more you know about people and their habits, buying patterns, demographics, and psychographics, the more you can build and launch new products with high probabilities of connecting with customers and the latest trends. But, buyer beware. As a company, the more valuable information you collect, the more dangerous a position you put yourself in, because you'll have a target on your back. This means there is a hidden cost to the company for collecting sensitive information: the more PII you collect, the more you will need to spend securing it all. This could mean your company needs to carry higher levels of cyber risk liability insurance, or needs to hire dedicated engineers focused on your cybersecurity strategy. It will definitely mean that launch fast / fail fast - where breakage along the way is acceptable - will not be a viable product strategy. Therefore, it's important that when designing the application, an evaluation is done to determine the risk/reward proposition and cost assessment for collecting different scopes of user information.

Best practices for developers when setting ethics standards and security protocols

Wald: For security protocols, there are a couple of strategies we have found that enable a lot of data to be collected on users while minimizing the threat of that data falling into the wrong hands.

  1. Maintain two separate databases. This is essential: one database will contain PII and sensitive user information, while the other will contain analytics about user behavior within the app. The latter will also have additional contextual information so that reports can run effectively. There is a key that deciphers and connects a specific user to their behavior, but this is a much more secure approach. If one database is breached, the other is not automatically breached too, so siloing them is helpful in the long run.
     
  2. Limit your attack surface, sanitize your files, and anonymize your data. Stay away from services that don't sanitize or anonymize data and log files. While a service like Sentry does this naturally, not all services do, so it's important to recognize what programs you're using and how you're using them. Files get passed around between product stakeholders all the time - developers pass around logs, a marketer or analyst requests a report to perform some analysis, and these files get sent via email, Slack, or Skype - the possibilities are endless. Unfortunately, as soon as that happens, platforms like Slack or email can become an attack vector. Your core systems may be extremely secure, but product stakeholders can easily create new vulnerabilities without knowing it. This is why it's so important to make sanitizing files part of your process instead of an afterthought.
     
  3. Be careful with access points. Tokens should always expire. There should never be any permanent access credentials or tokens in an application. You should be able to rotate how you validate credentials, and be able to invalidate all tokens at once if a bad actor gets access to your systems. By limiting these access points, you can ensure that you're staying on top of your security game.
     
  4. Be wise about where you share and leave your data. You have choices about where to persist data in both software and firmware. Be careful about where you leave important and sensitive data, and don't store personal information where another app may be able to access it. Avoid leaving immutable secrets in source code where possible. Another important best practice is not to bundle keys, passwords, or secrets you can't change into your application source code, because someone could decompile the APK, rebuild the app, and see your secrets. Once your app is in the wild, it can be reverse-engineered - anything valuable inside the code is vulnerable. For example, if you bundled an AWS API key in your application and it was reverse-engineered, that key could now be used to access AWS resources, putting your systems, and your budget, in jeopardy. Or, if you have a secret key that allows you to sign requests, someone could take that key and impersonate you.
     
  5. Don't store things in plain text. Leverage hardware security modules (HSMs; see also Apple's Secure Enclave). Any external communication from a mobile application or a hardware device, IoT/embedded or not, should be encrypted. Credentials should always be stored in a keychain or some other type of secure storage, even when they're just sitting on your local computer or development machine. Use Transport Layer Security (TLS), which provides cryptographic protocols for communications over computer networks. When building mobile applications, use secure URLs and recent versions of your dependencies and libraries. Security updates to these are very common; if you stay pinned to an old version forever, you fall behind - known exploits get fixed upstream, but you're still on a version without the fix.
     
  6. For any hardware out in the field, enable over-the-air firmware updates. This will allow you to respond quickly and update your devices when you discover a vulnerability or new threat - and it's likely you will. While you can never stay ahead of every CVE (Common Vulnerabilities and Exposures entry), it's essential to be aware of them and know how to handle these issues when they arise. Stay up to date with websites and mailing lists that publish them in a timely fashion, or learn from engineers and security experts who share insight and best practices daily.
     
  7. Use client-side SSL. Typically, browsers verify that the server you're interacting with is safe, indicated by a green bar or lock. Client-side SSL is the server additionally performing a similar check on you. This isn't really tenable where humans are involved, but in an IoT system, between a device and a server, it allows the server to have confidence at the very edge that it is talking to a party who is who they say they are.

ADM: How can developers better communicate with their executives, clients or project managers about what they need versus what they are already collecting?

Wald: Asks can and should be much more specific than speculative. Maybe in the past, product stakeholders would default to collect tons of information, more than they needed, so that in the future they would have it to use. Today, engineers need to be knowledgeable about the applications they are building and the tools they are using. Executives and marketers need to have a similar level of expertise and be looking around the corner before a request for user data makes its way into a product roadmap. This is incredibly important nowadays because if you are collecting a lot of PII, as a company, you'll have a target on your back.

Who is Ben Wald?

Ben Wald is the co-founder and Head of Client Strategy at Very, where he develops high-level strategies to solve prospective clients' challenges, focused on scalable IoT solutions. In his role, Ben works day-to-day with clients to create and plan IoT projects at any stage of development, assessing their long-term roadmap and ROI. He is passionate about improved security, better UX for the enterprise, and agile IoT. Over the course of his career, Ben has co-founded three successful startups. His first endeavor, an online education software company, was acquired by eCampus, resulting in Ben being named one of Businessweek's Top 25 Young Entrepreneurs.