Machine learning gets faster thanks to Lifelong DNN technology
App Developer Magazine




Tuesday, May 15, 2018

Neurala's Lifelong Deep Neural Network technology brings major speed improvements to the machine learning process, reducing AI training time considerably.

Neurala has announced major improvements to machine learning speed in a breakthrough update to its Lifelong Deep Neural Network (Lifelong-DNN) technology. The update allows for a significant reduction in training time compared to a traditional DNN (20 seconds versus 15 hours), a reduction in overall data needs, and the ability for deep learning neural networks to learn without the risk of forgetting previous knowledge, with or without the cloud.

“It takes a very long time to train a traditional DNN on a dataset, and, once that happens, it must be completely re-trained if even a single piece of new information is added. Our technology allows for a massive reduction in the time it takes to train a neural network and all but eliminates the time it takes to add new information,” said Anatoli Gorshechnikov, CTO and co-founder of Neurala. “Our Lifelong-DNN is the only AI solution that allows for incremental learning and is the breakthrough that companies across many industries have needed to make deep learning useful for their customers.”

An off-the-shelf DNN is pretrained on ImageNet, a massive database of images organized by keywords, and on specific datasets. Until now, a traditional DNN was fixed: to add new data, the system had to be retrained on all objects from both datasets.
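The cost of that constraint is easy to see with back-of-the-envelope arithmetic. The numbers below are hypothetical (not from Neurala or the article); they only illustrate how full retraining scales with the whole dataset while incremental learning scales with the new data alone:

```python
# Hypothetical illustration of the retraining cost described above:
# a fixed DNN must be retrained on every class each time a single new
# class is added, so the work grows with the entire dataset, not with
# the new data alone.

images_per_class = 1000   # assumed size of each class
old_classes = 100         # assumed classes already learned

# Full retraining: reprocess all old classes plus the new one.
full_retrain = (old_classes + 1) * images_per_class

# Incremental learning: process only the new class's images.
incremental = 1 * images_per_class

print(full_retrain)   # 101000 images reprocessed
print(incremental)    # 1000 images processed
```

Under these assumed numbers, adding one class the traditional way costs roughly a hundred times more computation than an incremental update, which is the gap the article's 20-seconds-versus-15-hours comparison points at.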

This traditional method required powerful servers, often located in the cloud. Neurala's Lifelong Deep Neural Networks (Lifelong-DNN) learn objects incrementally on the edge, mimicking in software the way cortical and sub-cortical circuits in human and animal brains work “in tandem” to add new information on the fly. Lifelong-DNN can achieve optimal performance using 20 percent of the instances per class, with only a single presentation of each during training, which decreases training time even further.
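Neurala's implementation is proprietary, but the general idea of adding a class without disturbing old knowledge can be sketched with a much simpler stand-in: a nearest-class-mean classifier, where each class is summarized by a running mean, so a new class can be learned from a few examples with a single presentation each and without revisiting previously learned classes:

```python
# Minimal sketch (NOT Neurala's actual method) of incremental, per-class
# learning: each class keeps a running mean of its feature vectors, so
# adding a new class never touches the statistics of existing classes.

class IncrementalNCM:
    def __init__(self):
        self.means = {}   # class label -> mean feature vector
        self.counts = {}  # class label -> number of examples seen

    def learn(self, label, vector):
        """Fold one example into the running mean for its class."""
        n = self.counts.get(label, 0)
        mean = self.means.get(label, [0.0] * len(vector))
        self.means[label] = [(m * n + x) / (n + 1)
                             for m, x in zip(mean, vector)]
        self.counts[label] = n + 1

    def predict(self, vector):
        """Return the label whose class mean is closest (squared Euclidean)."""
        def dist(label):
            return sum((m - x) ** 2
                       for m, x in zip(self.means[label], vector))
        return min(self.means, key=dist)

model = IncrementalNCM()
for v in ([0.9, 0.1], [1.1, -0.1]):
    model.learn("cat", v)
for v in ([-1.0, 0.0], [-0.8, 0.2]):
    model.learn("dog", v)

# Add a brand-new class on the fly: no retraining of "cat" or "dog".
model.learn("bird", [0.0, 1.0])

print(model.predict([1.0, 0.0]))   # prints "cat"
print(model.predict([0.1, 0.9]))   # prints "bird"
```

Because old class means are never recomputed, previously learned classes cannot be forgotten when a new one arrives, which is the property the brain-inspired circuits described above provide for full deep networks.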

“This update is game-changing for edge analytics and for the way servers are used today,” added Gorshechnikov. “We can envision this technology slashing compute power in server farms and enabling networks to be assembled on the fly from custom data. We are only scratching the surface of potential applications.”

