Insight Into How VISR Uses Machine Learning to Monitor Kids' Social Media
|Richard Harris in Big Data Wednesday, May 25, 2016|
We recently chatted with Robert Reichman, founder and CEO of VISR, a predictive wellness startup, to learn about the challenges facing an app publisher in the kids and wellness genres.
The VISR app monitors social media, using data and machine learning to alert parents to issues their kids face online. By analyzing the online activities and interactions of kids - incoming and outgoing - VISR alerts parents to emerging issues, allowing them to be addressed before they become larger and more difficult to resolve. VISR currently supports Instagram, Gmail, YouTube, Facebook, Twitter, Pinterest, and Tumblr.
ADM: Tell us about VISR
Reichman: Our mission is to give parents the ability to keep their kids safe and healthy online while giving them independence at the same time.
We believe that although today’s digital world represents larger issues than ever before, it also offers unprecedented opportunities for preventative health. In our case, now that communication is digital, we can help parents understand issues their children might be facing, and help facilitate communication around those issues, before they escalate.
Our approach focuses on fusing our system insight with parental instinct, while maintaining an atmosphere that welcomes communication and trust - so that issues can actually be dealt with.
ADM: Online safety for children is an ongoing issue – how would you describe the issue? How much of an issue is it and how does it affect kids?
Reichman: I am an eternal optimist, and believe that things are not nearly as dire as they seem. But the threats are real. We can solve the tension between the natural optimism, positivity and desire for exploration that kids have, and the risks that social media creates.
We are just learning how much damage bullying can do - and just how frequent and pervasive it can be. But the anxiety of a self-conscious and overly-watched child has its own developmental risks. Finding that happy medium - between over-protection and hands-off - is what we do.
ADM: Can you explain how the technology behind VISR works and when/how it alerts parents?
Reichman: Our goal is really to help parents understand the state of their child. What issues are they dealing with? When should I act? Is this normal? We don't have the answers to all these questions, but it's the direction we are heading in.
This starts by trying to identify issues or challenges kids are dealing with - like bullying, violence and possible mental health concerns. We do this utilizing data science and natural language processing techniques.
Essentially, the idea is to draw a correlation between the nature of speech and the particular issue. This includes trying to identify various indicators of issues, and then statistically correlating them to elements of speech. Was the post about my child or his pet rock? Was it a figure of speech or an actual threat? We do our best. And when the system does identify a possible issue, it then alerts the parent.
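The indicator-to-issue correlation Reichman describes can be illustrated with a minimal sketch. The lexicons, labels, and threshold below are hypothetical placeholders for illustration only - VISR's actual models, features, and statistical methods are not public, and a production system would use trained NLP classifiers with context, not bare phrase matching:

```python
# Hypothetical indicator lexicons - NOT VISR's real data or categories.
ISSUE_INDICATORS = {
    "bullying": {"loser", "nobody likes", "go away", "hate you"},
    "violence": {"hurt you", "beat you", "threat"},
}

def detect_issues(post: str, threshold: int = 1) -> list[str]:
    """Return issue labels whose indicator phrases appear in the post.

    A stand-in for a statistical model: each matched phrase counts as
    evidence, and an issue is flagged once evidence meets the threshold.
    """
    text = post.lower()
    flagged = []
    for issue, phrases in ISSUE_INDICATORS.items():
        hits = sum(1 for phrase in phrases if phrase in text)
        if hits >= threshold:
            flagged.append(issue)
    return sorted(flagged)

# The "pet rock" ambiguity from the interview: a benign post matches nothing,
# while a hostile one trips the bullying indicators.
print(detect_issues("check out my pet rock"))        # no issues flagged
print(detect_issues("Nobody likes you, loser"))      # flags "bullying"
```

In practice the hard part is exactly what the interview notes: distinguishing figures of speech from genuine threats, which requires statistical context rather than raw keyword hits.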
Our plan is to include more indicators, more data, and various other signals, to continuously provide more insight to parents.
ADM: What can an app developer do to be mindful of child safety at the app development stage?
Reichman: Kids will always find ways to use things in ways nobody could have ever imagined. Often issues take on a life of their own, and really are a product of their atmosphere more than anything else.
Still, just as social apps and services provide fantastic new capabilities and possibilities - they also find themselves in the crosshairs of issues too, whether it’s bullying, anxieties or broader mental health issues.
We believe that the answer lies in giving people the ability to access their own data in their own interest. Just as a social network can be used destructively, it can also be used constructively. Companies like ours exist for the sole purpose of helping keep people safe and healthy by way of their own data. Shouldn't I be able to harness the positive opportunities of my own information, on my own behalf? We think so, and enable apps to offer this to their users by partnering with us.
Beyond the social networks and channels we currently support, we're looking to add many more. Users consistently ask us about adding support for a wide range of social networks and messaging apps, including KiK, Whatsapp, Snapchat, Musical.ly and others. We're excited to partner with as many companies as we can, and encourage them to reach out to us to explore how we can sync efforts in our shared commitment to family safety and health.
ADM: How many incidents and what type of issues regarding online safety has VISR identified?
Reichman: In a random sample of over 4,000 VISR users, we found that over 20% of kids had experienced real issues on Twitter, Gmail, Facebook and other social media outlets.
Our view is to try to understand the macro picture. Still, there's a lot we haven't figured out yet. Our parent community is amazing. We often hear how parents recognize our bigger ambitions, and how they support us because they recognize that we're truly vested in their success.
ADM: Is there an age, where kids are most at risk?
Reichman: From our data, we have seen the most alerts between the ages of 12-15. The best time to begin working together - parents and kids - is as young as possible, and at positive engagement points. Kids will be most receptive to solutions like VISR, or anything else, when they recognize that it's in their interest - that it's there to help them. Our typical child is aged 8-14, with the highest concentration around 13.
ADM: How does the app help parents protect kids and balance their privacy at the same time?
Reichman: This is an age-old question - one that precedes social media, the internet and digital communications in general.
Would a parent say to their child, "Sure, jump in the water and figure out how to swim - I trust you"? Trust isn't given, it's earned - not out of suspicion, but because trust is the sum total of a proper education and the recognition of smart decision-making. Barring that, it would be not only inappropriate, but irresponsible.
I think the question of privacy is often mischaracterized, in a way that polarizes the conversation. To me, it's really about integrity. At no point would I, as a parent, ever want to breach my child's integrity.
How do I respect my child's integrity? By truly and genuinely acting on their behalf. By not undermining them. By teaching them and building their trust along the way.
We continue to work super hard to create an atmosphere that respects children’s integrity. By giving children an opportunity to explore their digital surroundings more freely. By only showing parents potential issues. By allowing parents to not have to sign into their children’s accounts.
Every parent wants to ensure the safest and healthiest future for their kids. They can do that by building them up; respecting their integrity, and using tools like VISR to help them on that journey.
ADM: What types of technology are you compatible with? What platforms do you monitor?
Reichman: One of the great things about VISR is that it doesn't install on a child's device - instead it connects directly to the social networks we support. This means a safer connection that can analyze activity no matter what device they are using. As long as kids are signed into their accounts, we can analyze activity and alert parents. We currently support Instagram, Gmail, YouTube, Facebook, Twitter, Pinterest and Tumblr.
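The device-free architecture Reichman describes - a cloud service polling each connected network's API with the account's credentials - can be sketched as follows. Everything here is an assumed illustration: the function names, the account structure, and the stubbed fetcher stand in for real API clients and OAuth flows, none of which VISR has published:

```python
# Hypothetical sketch of a cloud-side monitoring loop. fetch_recent_posts is
# a stub standing in for a real social network API client (which would use
# the stored OAuth token to call the network server-side - nothing runs on
# the child's device).

def fetch_recent_posts(account: dict) -> list[str]:
    """Stub for an API call; real code would request the account's feed."""
    return account.get("_sample_posts", [])

def scan_account(account: dict, looks_risky) -> list[str]:
    """Poll one connected account and return posts flagged for the parent."""
    return [post for post in fetch_recent_posts(account) if looks_risky(post)]

# Example: one connected account with stubbed activity and a toy analyzer.
account = {
    "network": "instagram",
    "token": "stored-oauth-token",       # placeholder credential
    "_sample_posts": ["fun day at school", "everyone hates you"],
}
alerts = scan_account(account, lambda post: "hates you" in post)
```

Because the analysis runs against the account rather than a device, the same loop covers a phone, a tablet, or a school computer - which is the design choice the interview highlights.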
ADM: How can technologies integrate with platforms like VISR to ensure more child protection?
Reichman: While we are currently strictly cloud-based and integrate with existing APIs, there are many more apps out there without APIs that we would like to support and that need support - apps like Whatsapp, KiK, and Snapchat.
Read more: http://visr.co/