IBM: Humanizing Digital Conversations
Rama Akkiraju in Analytics, Friday, October 7, 2016
The definitions and examples below are taken from Google and Merriam-Webster.
- Sentiment: A view of or attitude toward a situation or event; an opinion. (ex: Positive, Negative, Neutral)
- Emotion: An instinctive state of mind. (ex: Anger, Disgust, Joy)
- Attitude: A position with regard to a person or thing. (ex: Hostile, Compassionate, Cheerful)
- Tone: The general character or attitude of a place, piece of writing, speech, situation, etc. (speech added to the original definition) (ex: Confident, Tentative, Analytical)
- Intent: Intention or purpose. (ex: buy, try, explore, communicate)
- Mood: A temporary state of mind or a feeling. (ex: humorous, gloomy, nostalgic)
- Instinct: An innate, typically fixed pattern of behavior in animals in response to certain stimuli. (ex: avoid, hide, run, withdraw)
- Feeling: A belief, especially a vague or irrational one. An emotional state or reaction. (ex: bitter, elated, worried, sick).
I know, many of these terms are closely related and often difficult to distinguish and categorize. While fine distinctions between them may or may not matter in the business world, it is still important to have clarity on these terms, to understand how the linguistics and psychology communities think of them, and to be able to work with these forms of human expression. Interestingly, not all of these forms of expression are always detectable. Inner states such as instincts, feelings, emotions and moods are hard to detect until they are expressed in some form, say, via facial expressions or via tones in one's writing. Some people express their feelings, voluntarily or involuntarily, and some do not. So, first of all, we can only observe what has been expressed; when feelings are not expressed, they cannot be detected. This holds true not just for computer-based detection models but also for humans: if someone is stone-faced and shows no reaction during a conversation, it is hard for others to know what that person is thinking or feeling. The same applies to computer models. So, when building computer-based models to detect these internal states, we rely on expressed clues. Humans typically express internal states via text, speech, facial expressions and gestures, and most computer models therefore focus on these mediums to detect human feelings.
In this article I will share my thoughts on a couple of comparisons among these analytics: Sentiment Analysis vs. Emotion Analysis, and Tone Analysis vs. Emotion Analysis.
Sentiment Analysis has been well researched over the past decade. Many companies offer Sentiment Analysis solutions applied to use cases such as client satisfaction assessment, customer relationship management (including loyalty management), brand perception, product feature feedback gathering, campaign management, and employee engagement, to name a few. For the most part, sentiment analysis is about detecting the positive, negative and neutral sentiments expressed by people. Emotion Analysis is a newly emerging and fast-growing area of research in the AI community. Emotions offer finer granularity than sentiments. For example, a negative sentiment could be caused by emotions such as anger, fear, sadness, or disgust, and in some situations knowing these fine-grained emotions is very important for taking meaningful action. Similarly, positive sentiments could be caused by excitement, ecstasy, trust, or satisfaction, among other things. Again, the actions one might take could be very different in each case. Let's consider the following scenarios to illustrate these two cases.
A customer says “This company keeps adding and removing features randomly. They make me nervous. I don’t trust these guys to know what they are doing”. This review is overall negative and most sentiment analysis systems will classify this review correctly. However, there is more information in that review beyond the obvious negative sentiment. The customer is expressing trust issues with the company. She is saying that the company’s approach to product development is not well thought-through and that she would think twice next time before buying a product from that company. This information is invaluable to sales, marketing and product management teams.
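As an illustration of how emotion labels add granularity on top of a polarity judgment, here is a toy lexicon-based sketch. All cue words, category names, and the polarity mapping are hypothetical examples, not the model behind any real product.

```python
# Toy emotion lexicon: each fine-grained emotion maps to a few cue words.
# Entirely illustrative; real systems learn these signals from data.
EMOTION_LEXICON = {
    "anger":   {"furious", "outraged", "annoyed"},
    "fear":    {"nervous", "worried", "afraid"},
    "sadness": {"disappointed", "unhappy", "gloomy"},
    "joy":     {"delighted", "thrilled", "happy"},
    "trust":   {"reliable", "dependable", "confident"},
}

# Coarse mapping from fine-grained emotions back to sentiment polarity.
NEGATIVE_EMOTIONS = {"anger", "fear", "sadness"}
POSITIVE_EMOTIONS = {"joy", "trust"}

def analyze(text):
    """Return (sentiment, emotions): coarse polarity plus finer labels."""
    words = {w.strip(".,!?") for w in text.lower().split()}
    emotions = {e for e, cues in EMOTION_LEXICON.items() if words & cues}
    if emotions & NEGATIVE_EMOTIONS:
        sentiment = "negative"
    elif emotions & POSITIVE_EMOTIONS:
        sentiment = "positive"
    else:
        sentiment = "neutral"
    return sentiment, emotions
```

On a review like the one above, both layers surface: the polarity is negative, but the emotion labels (fear, lack of trust) carry the information that sales and product teams actually act on.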
Let's consider another scenario, from the customer loyalty management domain, this time to see how granularity in positive sentiments can be helpful beyond knowing that a sentiment is positive. Companies are constantly evaluating where their customers fall on the loyalty continuum. Are their customers merely satisfied and content, or are they ecstatic about their products? Ecstatic customers are likely to be great candidates to spread the word to others and offer referrals, and companies try to recruit them as spokespersons in print, online and social media to help advertise their products. In this scenario, where exactly a customer falls in the positive sentiment range is very important for designing different kinds of engagement strategies.
Let us also consider the class of digital virtual agents (DVAs) and cognitive assistants/bots being developed these days, such as Apple's Siri, Microsoft's Cortana, Amazon's Alexa on Echo, and Google's Google Assistant. In all of these dialog systems, knowing the specific emotions expressed by the individual the agent is interacting with is very important to carrying out natural, compassionate and personalized conversations. For example, if a bot detects that the customer is feeling frustrated, it could demonstrate compassion by first acknowledging the frustration and then employing strategies such as allowing them to vent before assuring help or assuring effort to help. Similarly, if a customer expresses confusion, the bot could offer an explanation to clarify the situation. These kinds of dialog strategies can only be employed when fine-grained emotions are detected.
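The dialog strategies just described can be sketched as a simple mapping from a detected emotion to a response strategy. The emotion labels and canned responses below are illustrative; a real bot would obtain the emotion from an emotion-analysis service and generate richer responses.

```python
# Map detected emotions to dialog strategies. Labels and wording are
# hypothetical examples of the strategies described in the text.
DIALOG_STRATEGIES = {
    "frustration": ("I understand this has been frustrating. "
                    "Tell me what happened, and I'll do my best to help."),
    "confusion":   "Let me clarify that step by step.",
    "anger":       ("I'm sorry about this experience. "
                    "Let me escalate this right away."),
}

DEFAULT_RESPONSE = "How can I help you today?"

def respond(detected_emotion):
    """Pick a response strategy based on the detected emotion, if any."""
    return DIALOG_STRATEGIES.get(detected_emotion, DEFAULT_RESPONSE)
```

The point of the table-driven design is that the conversation designer, not the detection model, owns the strategy: adding a new emotion means adding one entry, without touching the detection pipeline.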
Overall, we are seeing Emotion Analysis applied to customer churn reduction, loyalty management, mood prediction, dialog and cognitive assistants, wearable sensors and devices, and Internet of Things scenarios that are distinct from Sentiment Analysis use cases.
In our definition, emotions are what people feel; emotion is an inner state of a person. Tone is the way people choose to express their feelings and thoughts to have an intended effect on their target audience. It may be an exact reflection of what they feel, or some manipulation of it to achieve that effect. So, when detecting emotions, you may look for states such as anger, disgust, fear, joy and sadness, whereas when detecting tones, you look at the way a person is expressing her inner state: how analytical, confident, professional, cheerful, persuasive, polite, formal, or friendly she is. In Watson's Tone Analyzer, we have some of these tones presently, and we are working on adding others.
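Because tone lives in the wording itself rather than in an inner state, a very rough sketch of tone detection can score surface cues in the text. The cue lists below are toy examples and are not the features Watson's Tone Analyzer actually uses.

```python
# Toy tone cues: each tone dimension maps to a few surface markers.
# Purely illustrative; real tone models use far richer linguistic features.
TONE_CUES = {
    "confident":  {"certainly", "definitely", "always", "never"},
    "tentative":  {"maybe", "perhaps", "possibly", "somewhat"},
    "analytical": {"therefore", "because", "hence", "consequently"},
}

def tone_scores(text):
    """Score each tone as the fraction of words matching its cue list."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    total = max(len(words), 1)
    return {tone: sum(w in cues for w in words) / total
            for tone, cues in TONE_CUES.items()}
```

Note the contrast with the emotion sketch earlier: the same underlying feeling could be voiced tentatively ("maybe this could work") or confidently ("this will definitely work"), and only the tone scores change.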
In the previous section, we discussed why detecting emotions is important and the associated business cases. When and where would one apply Tone Analysis? One way to put it is as follows: your choice of tone is all about your brand. In personal and business communications, Tone Analysis can offer feedback about your communications; if your tone is found to be out of sync with your target audience, you can refine it to improve the effectiveness of your message and achieve the intended effect. Marketing firms and individual writers such as bloggers and journalists could use such a tool. More importantly, we see detecting tones, and taking action based on detected tones, as an integral part of Conversational Systems development.
Consider this scenario. Say that you are analyzing contact-center call logs to identify opportunities to improve customer care. If you detect non-courteous, tentative or unprofessional tones in a human agent's communication, that is an opportunity to train and coach the agent to employ appropriate behavior while dealing with customers. In another scenario, say that you have deployed an automated bot as an agent and it detects confusion and frustration in the tone of the customers; the bot might then employ dialog strategies such as clarifying the situation, acknowledging the frustration and assuring effort to help. If you are able to detect tones during the conversation, you can design appropriate dialog strategies in the dialog management module of your bots to deal effectively with the observed tones. If you don't detect these tones, your conversational agent may simply ignore the emotional and tonal aspects of the conversation and sound like a drone: too boring or too rigid to interact with!
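The call-log triage in the first scenario can be sketched as a filter over agent turns. The `detect_tone` stub below stands in for a real tone-analysis service; its keyword rules and the tone labels are purely illustrative.

```python
# Tones that, when detected in an agent's turn, suggest a coaching
# opportunity (labels taken from the scenario in the text).
COACHING_TONES = {"non-courteous", "tentative", "unprofessional"}

def detect_tone(utterance):
    """Stub: a real system would call a tone-analysis API here."""
    lowered = utterance.lower()
    if "whatever" in lowered:
        return "non-courteous"
    if "i guess" in lowered:
        return "tentative"
    return "professional"

def flag_for_coaching(call_log):
    """Return (utterance, tone) pairs whose tone warrants coaching."""
    return [(turn, detect_tone(turn))
            for turn in call_log
            if detect_tone(turn) in COACHING_TONES]
```

The same `detect_tone` hook could feed the bot scenario: instead of flagging turns offline, the dialog manager would consult it per turn and branch to a clarifying or reassuring strategy.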
These are just a couple of examples to illustrate how Sentiment Analysis, Emotion Analysis and Tone Analysis all have related but distinct use cases. In future articles, I will discuss how personality fits into incorporating a persona into a bot. So, you see, it pays to be sentimental, emotional and tone-aware!
Read more: http://www.ibm.com/watson/developercloud/conversat...
This content is made possible by a guest author, or sponsor; it is not written by and does not necessarily reflect the views of App Developer Magazine's editorial staff.