Applause Platform improves quality assurance platform

Thursday, January 9, 2020

Richard Harris
Applause, the crowdtesting provider, is looking to evolve along with QA. We recently talked with Jonathan Zaleski about the improvements in the Applause Platform.

Quality assurance (QA) is a rapidly changing field, a result of dev and engineering teams moving faster and releasing more frequently. Applause – the leading crowdtesting provider – is looking to evolve along with QA. The company recently launched Applause Labs, an innovation engine where it hopes to develop new concepts for companies’ QA processes.

As the head of Applause Labs, Jonathan Zaleski is focused on improving the Applause Platform using cutting-edge capabilities like artificial intelligence and machine learning, and exploring the next big innovations in testing.

ADM: What is the impact on testing as development teams become increasingly Agile?

Zaleski: The movement to Agile has changed the quality assurance process. It’s gone from “test everything” to “test as fast as you can.” That’s because Agile methodologies are focused on speed, so quality too often becomes secondary. The approaches of the past – like lab-only testing and offshoring – are not able to keep pace with Agile teams. But there are other approaches that can deliver quality with speed. Companies are embracing automation and crowdtesting because they work with Agile and shift testing left in the SDLC. I expect this trend to continue so teams can move quickly and be confident in the quality of their releases.

ADM: Where do you think the future of testing is headed?

Zaleski: The move to automation and crowdtesting has already begun, in large part due to the ubiquity of Agile. Automation can now be used for more testing types, which has helped it grow in popularity. Plus, improvements in throughput have decreased latency. Crowdtesting, too, is constantly improving and becoming “smarter.” At Applause, for example, we are leveraging AI to match the right testers to the right projects. This has enabled us to quickly segment our community and provide customers with testers who have the desired capabilities and skills for their projects.
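
The interview does not describe the matching system in detail, so the sketch below is only a rough illustration of skill-based tester-to-project matching: the Tester and Project classes, their fields, and the scoring weights are hypothetical assumptions, not Applause's actual AI-driven implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only: the classes, fields, and weights below are
# hypothetical and do not represent Applause's actual matching system.

@dataclass
class Tester:
    name: str
    skills: set        # e.g. {"payments", "accessibility"}
    devices: set       # e.g. {"iPhone 13", "Pixel 7"}
    languages: set     # e.g. {"en", "de"}

@dataclass
class Project:
    required_skills: set
    target_devices: set
    target_languages: set

def match_score(tester: Tester, project: Project) -> float:
    """Score a tester against a project by overlap with its requirements."""
    def coverage(have: set, need: set) -> float:
        return len(have & need) / len(need) if need else 1.0
    # Weight skills most heavily, then device and language coverage.
    return (0.5 * coverage(tester.skills, project.required_skills)
            + 0.3 * coverage(tester.devices, project.target_devices)
            + 0.2 * coverage(tester.languages, project.target_languages))

def top_matches(testers: list, project: Project, k: int = 5) -> list:
    """Return the k best-matching testers for the project."""
    return sorted(testers, key=lambda t: match_score(t, project), reverse=True)[:k]
```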

ADM: What kind of companies use crowdtesting?

Zaleski: Thousands of leading companies rely on crowdtesting as a best practice to deliver high-quality digital experiences that their customers love. Next-generation brands like Google, Airbnb, and eBay all turn to Applause for their testing. These companies all need testing solutions that are both global and highly scalable. Crowdtesting delivers on these promises by providing testers across the world, who speak different languages, match customer demographics and have access to the relevant devices that need to be tested. Having access to such a large community of testers means that companies can ensure all types of applications – from websites and mobile apps to voice experiences – work for real users under real-world conditions.

ADM: When is the right time to release software? How do companies know when it’s the right time?

Zaleski: Today, engineers are essentially making a gut call when it comes to releasing software. The “right” time to release is highly dependent on a number of factors, namely the needs of the business. The application’s maturity also plays a role in how quickly it can be tested and released. The maturity of the engineering team plays a large role as well – i.e., how disciplined the team is about making backward-compatible changes, the robustness of its local and integration testing, and its ability to respond and course-correct rapidly when issues arise. Finally, knowing when it is time to release depends on the industry. Many e-commerce sites release hundreds (if not thousands) of times per week, driven largely by the competitive nature of the market, whereas financial and government firms may release only a handful of times per year due to the need for stability.

ADM: How did the quality score come to be?

Zaleski: The quality score started in our innovation engine – Applause Labs. We were sitting on over a decade of testing data and felt we were well-positioned to deliver deep insights into product maturity for our customers. With the Applause Quality Score (AQS), our goal is to take the guesswork out of when clients should release software by making it a data-driven decision. The AQS is a calculated value – ranging from 0 to 100 – which describes the level of quality tested for a build during one or more test cycles, based on the testing done and results collected (think of it like a credit score for release quality). The AQS is a comparable metric, providing clients with insights into their overall quality build-over-build, as well as quality trends over time.
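
The formula behind the AQS is not disclosed in the interview, so the snippet below is only a minimal sketch of how a 0-100, data-driven release-quality score could be assembled from test-cycle results: the TestCycleResults fields, weights, and penalties are assumptions for illustration, not the actual AQS calculation.

```python
from dataclasses import dataclass

# Hypothetical sketch of a 0-100 release-quality score built from test-cycle
# results. The inputs, weights, and penalties are illustrative assumptions,
# not the actual Applause Quality Score (AQS) formula.

@dataclass
class TestCycleResults:
    tests_run: int
    tests_passed: int
    critical_bugs: int
    major_bugs: int
    minor_bugs: int

def quality_score(cycle: TestCycleResults) -> float:
    """Combine pass rate and severity-weighted open bugs into one 0-100 score."""
    if cycle.tests_run == 0:
        return 0.0  # no testing done, no basis for a score
    pass_rate = cycle.tests_passed / cycle.tests_run
    # Penalize open bugs, weighted by severity (illustrative weights).
    bug_penalty = (10 * cycle.critical_bugs
                   + 4 * cycle.major_bugs
                   + 1 * cycle.minor_bugs)
    return max(0.0, min(100.0, 100 * pass_rate - bug_penalty))

# Example: 190 of 200 tests passed, one major and two minor bugs open.
print(quality_score(TestCycleResults(200, 190, 0, 1, 2)))  # 89.0
```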

ADM: What is the ultimate goal of Applause Labs? What do you hope to accomplish with it?

Zaleski: Applause Labs is a formalized innovation engine. With it, we hope to increase our pace of innovation and release new features (and in some cases, entirely new solutions), sometimes taking our business in completely new directions. The rapid prototyping of exploratory projects that Applause Labs enables will help source and prove out the company’s next great ideas.

ADM: How did Applause Labs come to be?

Zaleski: Applause Labs was conceptualized out of our business need to continue to innovate and differentiate ourselves in the application testing market. We are still improving our core offerings on a daily basis, but Applause Labs goes beyond this day-to-day work, moving us out of our comfort zone in order to evaluate new ideas and concepts that have the potential to make a more dramatic impact on the company and the solutions we deliver to our clients.

ADM: How do you source ideas and how often are new concepts prototyped?

Zaleski: We source new ideas from employees and internal audiences via an Aha! ideas portal. We also source ideas from customers that are part of our Customer Advisory Board. In the future, we may open up the process to allow external audiences to submit ideas to Applause Labs. We are already seeing a ton of ideas come through from our internal audiences, and the team has acted on them quickly. Within the first two weeks of its existence, Applause Labs evaluated and prototyped over ten concepts. Projects are ideally scoped to fit into under two weeks so that we can maintain a “fail fast” mentality and spend our time investigating ideas that we can validate quickly, while discarding projects that do not pan out or deliver the expected value.

ADM: How do you evaluate the prototypes that are produced by Applause Labs?

Zaleski: At the end of each project, we want to arrive at a tangible recommendation on next steps.

These are divided into four categories:

1. Yes – we should get this project onto the product roadmap.
2. No – this project isn’t worthwhile or feasible.
3. Not now – the project could be valuable, but the scope is too big, there is a technical limitation, or there is some other blocker.
4. Keep going – the project is close to completion and needs to go a bit further in order to provide a clearer recommendation.

