Artificial intelligence and app development


Posted 11/29/2016 12:01:15 PM by RICHARD HARRIS, Executive Editor


The growth of artificial intelligence and machine learning is driving a whole new class of application possibilities. From chatbots to predictive analytics and beyond, developers and businesses should explore ways to utilize their customer and business data to deliver better customer service, launch new products, and reimagine their processes. What went wrong with Microsoft’s Tay bot on Twitter? How will AI impact app development in the next 12 months? In a recent conversation, Barry Coleman, CTO at Agent.ai, explored the paradigm shift in development that will change the way we interact, work and even craft the perfect smoked meats.

ADM: How is artificial intelligence (AI) influencing app development today?


Coleman: AI has been an influential force in application development for several years, perhaps most visibly to consumers in the form of Apple’s Siri. As machine learning has moved out of its infancy, it has opened the door to a huge variety of applications that the relative inflexibility of static, rules-based algorithms kept out of reach. The availability and growing sophistication of AI and machine learning are causing a paradigm shift in the way developers think about modeling algorithms and interactions within applications.

For many applications, the outputs are constrained by the programmer who wrote the algorithm in the first place and by their expectations about usage and performance. If the outcome isn’t clearly evident from the input, the system often doesn’t perform well. Adopting AI and machine learning techniques enables the inputs and outputs to evolve as the environment changes: a system using these techniques can look at the inputs, learn from them over time, and adjust the outputs.

An example of this is a traditional thermostat vs. Nest. With a standard thermostat, a homeowner sets a temperature that triggers the heater or air conditioner to switch on. A Nest-style “smart” device is more dynamic: it learns the family’s preferences, makes adjustments based on them over time, and adapts to new inputs.
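The Nest-style behavior Coleman describes can be sketched in a few lines: instead of one fixed setpoint, the device averages the temperatures the user has chosen at each hour of the day and uses that average as the target. The class name and the averaging strategy here are illustrative assumptions, not Nest's actual algorithm.

```python
from collections import defaultdict

class LearningThermostat:
    """Toy sketch of a learning thermostat: each manual adjustment
    the user makes is a training example for that hour of the day."""

    def __init__(self, default_setpoint=20.0):
        self.default = default_setpoint
        self.history = defaultdict(list)  # hour -> chosen temperatures

    def record_adjustment(self, hour, chosen_temp):
        # Learn from what the user actually did.
        self.history[hour].append(chosen_temp)

    def setpoint(self, hour):
        # Fall back to the static default until we have observations.
        samples = self.history.get(hour)
        if not samples:
            return self.default
        return sum(samples) / len(samples)

    def should_heat(self, hour, current_temp):
        return current_temp < self.setpoint(hour)
```

The contrast with the standard thermostat is the `history` dict: the static device only ever has `default_setpoint`, while this one shifts its target as new inputs arrive.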

Developers today should spend a lot more time thinking about how to collect, sort and store the data that their applications gather so they can feed it into the machine learning algorithm, as opposed to the standard static algorithm that does not change. 

ADM: What made this possible? What’s changed over the last 3-5 years to really move it forward?


Coleman: One of the key technology shifts that makes complex machine learning and AI possible is cloud and big data computing. Now that a company doesn’t have to own, manage and maintain its own data center to operate complex algorithms, it is feasible to collect and utilize terabytes of information without maxing out on capacity or cost. It’s as simple as spinning up a few new servers in AWS or any cloud platform when needed and turning them off when not utilized. 
The availability of GPU-based systems in the cloud has brought the compute power needed to run large-scale learning algorithms at a price point that is affordable even for startups. These huge physical server farms used to be the realm of large companies and government agencies.

Bigger projects like IBM Watson have brought AI under the spotlight, as well. Now Watson, and many other similar projects, are making their way into a huge variety of enterprise applications. The algorithms have been there all along; the computational capability is what made them practical.

ADM: Where is AI and machine learning gaining traction?


Coleman: There are endless possibilities for AI and machine learning to impact consumers and businesses. Driverless cars, search recommendations, chatbots and predictive analytics all use these technologies. In the advertising space, advertisers use machine learning to predict the outcome of the bidding process for a given advertising slot: forecasting, in real time, the likelihood of a click (or, more recently, a downstream revenue event) and deciding not only whether to bid but how much they are willing to pay for that specific impression.
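The bid decision described above might be sketched like this, assuming a model elsewhere has already produced a click probability. The function names, the margin parameter, and the bid-shading rule are all illustrative assumptions, not any real ad exchange's API.

```python
def expected_value(p_click, revenue_per_click):
    """Predicted value of winning this impression, given a model's
    click probability and the revenue a click is worth."""
    return p_click * revenue_per_click

def decide_bid(p_click, revenue_per_click, min_margin=0.2):
    """Return (should_bid, max_bid): bid only when the impression has
    positive expected value, and shade the bid so that winning at
    max_bid still leaves `min_margin` of that value as profit."""
    ev = expected_value(p_click, revenue_per_click)
    if ev <= 0:
        return (False, 0.0)
    return (True, ev * (1 - min_margin))
```

For example, a 2% click probability on a $5-per-click campaign gives an expected value of $0.10, so the sketch would bid up to $0.08 per impression.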

Agent.ai is utilizing AI and machine learning to compute customer interaction data and feed that information into chatbots to help businesses respond more quickly and effectively to customer needs. 

Any closed-loop system is ripe for machine learning. If there is a decent volume of historical data that can be used to help predict future outcomes, AI can have an impact.

ADM: Why does some so-called “AI” really miss the mark?


Coleman: In most cases, algorithms need some kind of constraints to make them useful. Microsoft put “Tay” up on Twitter without those constraints and the Twitterverse “taught” the AI some pretty ridiculous things.

One specific challenge is that people refer to applications as “AI” when they’re not. There’s an expectation when you put the word “intelligence” on something, and if it’s not truly built to learn, users are disappointed. One example of this is in chatbots that you might interact with if you select the “live chat” option on a website. If the bot isn’t given enough input and “training,” its scope of response will be very limited and often frustrating to customers. This “training” phase is critical and should incorporate as much information as possible to become truly useful.
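A toy version of that “training” phase, assuming a simple bag-of-words overlap rather than any real NLP model: the more example utterances you feed in, the wider the bot's scope of response, and a low-confidence match falls through to None instead of guessing.

```python
def train(examples):
    """examples: list of (utterance, intent) pairs. Builds a toy
    bag-of-words 'model': intent -> set of words seen in training."""
    model = {}
    for utterance, intent in examples:
        model.setdefault(intent, set()).update(utterance.lower().split())
    return model

def classify(model, utterance, min_overlap=2):
    """Pick the intent whose training vocabulary overlaps the
    utterance most. If nothing overlaps by at least `min_overlap`
    words, return None so the bot can hand off rather than guess."""
    words = set(utterance.lower().split())
    best, best_score = None, 0
    for intent, vocab in model.items():
        score = len(words & vocab)
        if score > best_score:
            best, best_score = intent, score
    return best if best_score >= min_overlap else None
```

The `min_overlap` threshold is the point Coleman makes about under-trained bots: with too few examples, most inputs fall below it and every reply becomes the frustrating fallback.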

ADM: How can we expect AI to mature over the coming years?


Coleman: This is an exciting moment for AI and, with so many companies developing applications with it, it’s maturing quickly. In the case of chatbots, we can expect them to become more mature relatively quickly, in the next one to two years. Compared with interactive voice response (IVR), chatbots deal with text interactions that are more constrained than voice recognition. Has Siri ever suggested calling someone with a name radically different from the one you asked for? Voice recognition is a hard problem that has a long way to go; chatbots skip the speech-to-text problem and allow us to focus just on the language.

In the near future, you might actually prefer a chatbot that’s available 24/7 over a long wait in a call center queue to talk to an agent. But rather than replace a customer service agent’s job with a chatbot, we are focusing on increasing the efficiency of agents, leading to faster response times for customers. AI and machine learning allow the chatbot to learn the common conversation threads, what a call center agent would call the “easy” questions, and handle those automatically. The agent’s time is then freed up for the more complex issues that humans are particularly well suited to solve. In a world where millennials are the leading population in the workforce, self-service via chat is sure to be a growing channel for businesses of all kinds.
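The triage Coleman describes, easy questions answered automatically and everything else escalated to a human, can be sketched as follows. The FAQ contents and the queue mechanics are hypothetical.

```python
# Hypothetical "easy" questions the bot has already been trained on.
FAQ = {
    "what are your hours": "We're open 9am-5pm, Monday to Friday.",
    "how do i reset my password": "Use the 'Forgot password' link on the sign-in page.",
}

def handle_message(message, agent_queue):
    """Answer easy questions from the FAQ automatically; push
    everything else onto the human agents' work queue."""
    answer = FAQ.get(message.strip().lower().rstrip("?"))
    if answer is not None:
        return answer          # handled with no agent involved
    agent_queue.append(message)  # complex issue: escalate to a human
    return "Let me connect you with an agent."
```

The efficiency gain is visible in `agent_queue`: only the messages the bot could not answer consume an agent's time.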

On a separate note, I recently built a small learning system for my meat smoker. I sample the air temperature inside and outside the smoker chamber. I sample the temperature of each piece of meat. I also sample the volume of wood pellets being fed into the firebox. I then programmed the system to predict when I need to refill my smoker with wood and when a piece of meat should be “done”.  Those long cooks through the night now involve much less standing by the smoker and much more sleep. So the possibilities really are endless!
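The smoker's prediction step might look like the sketch below, assuming a simple least-squares line through timestamped sensor readings (Coleman doesn't describe his actual model). The same extrapolation predicts both when the meat temperature will rise to "done" and when the pellet level will fall to empty.

```python
def linear_fit(samples):
    """Least-squares line through (minute, reading) samples.
    Returns (slope, intercept)."""
    n = len(samples)
    sx = sum(t for t, _ in samples)
    sy = sum(v for _, v in samples)
    sxx = sum(t * t for t, _ in samples)
    sxy = sum(t * v for t, v in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

def minutes_until(samples, target):
    """Extrapolate the fitted line to predict how many minutes from
    the last sample the reading hits `target` (meat temp rising to
    done, or pellet level falling to empty). None if the current
    trend never reaches it."""
    slope, intercept = linear_fit(samples)
    if slope == 0:
        return None
    t = (target - intercept) / slope
    last = samples[-1][0]
    return t - last if t > last else None
```

With meat readings of 60, 70 and 80 degrees over the first hour, the sketch predicts a 90-degree target another 30 minutes out, which is the kind of estimate that lets the cook sleep instead of standing by the smoker.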

ADM: How do you think AI’s influence will impact businesses and consumers in the next 12 months or so?


Coleman: Well, first of all – better smoked meats! 
Seriously, though, many devices and applications were written with a fixed algorithm that does not adjust based on observed behavior. This is where things will change. As we get more AI- and machine learning-driven apps, we will find ways to utilize the data that businesses are collecting via point-of-sale machines, online traffic, mobile devices and more. The algorithms will be able to sift through this data, find trends and correlations, and adjust the applications themselves to create more meaningful, context-rich experiences that truly do bring meaning to the word “personalized.”

If we look to the future of chatbots specifically, we’ll see a new architecture that utilizes tiers of bots in the same application. For easier coding and faster response time, a chatbot may offer multiple functions (checking on the status of an order, reporting specials, etc.), but individual sub-bots actually execute each task rather than one program tackling them all.
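That tiered architecture might look like this in miniature: a top-level bot that only routes, with each sub-bot owning exactly one task. The keyword routing and the sub-bot replies are illustrative assumptions.

```python
# Hypothetical sub-bots: each one executes a single task.
def order_status_bot(text):
    return "Your order is on the way."

def daily_specials_bot(text):
    return "Today's special: 20% off brisket rubs."

SUB_BOTS = {
    "order": order_status_bot,
    "special": daily_specials_bot,
}

def top_bot(text):
    """Top-tier bot: routes the message to the first sub-bot whose
    keyword appears, rather than handling every task itself."""
    lowered = text.lower()
    for keyword, bot in SUB_BOTS.items():
        if keyword in lowered:
            return bot(lowered)
    return "Sorry, I can't help with that yet."
```

Adding a capability then means registering one more sub-bot in `SUB_BOTS`, not rewriting the monolithic program.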

Editor’s note: Barry Coleman is CTO at Agent.ai, a provider of artificial intelligence-enhanced customer service solutions. Prior to Agent.ai, Coleman served as CTO and vice president of support and customer optimization products at ATG, which was acquired by Oracle for $1 billion. Coleman is the author of several patents and patent applications in the areas of online customer support, including cross-channel data passing, dynamic customer invitation, and customer privacy. He holds a B.A. in Artificial Intelligence from the University of Sussex.



