Next year, spending on machine learning is expected to hit $12.5 billion, with $8 billion of that going to business services and machine learning applications. Deep learning algorithms are on the leading edge of that spending wave. Three years ago, most businesses were getting up to speed with data science. Last year, it was machine learning. Now we are into deep learning. The technology moves quickly, but my clients' main question has not changed: what are the practical applications of deep learning for companies not named Google, Facebook, or Apple?
Not every business problem needs the latest solution. That was true with data science and earlier machine learning techniques, and it is true now. Take the problem of patient readmission in healthcare. Federal guidelines now link insurance payouts to patient outcomes, especially readmission rates. Several deep learning approaches have been researched, with only limited gains in accuracy. Given the cost of building, training, and deploying these models, they are simply not cost-effective here. Deep learning's value is in solving problems that could not be addressed with earlier technical approaches. Here are a few practical use cases for deep learning.
Skilled Robotics & Labor Automation
When companies talk about machine learning, the discussion inevitably leads to self-driving cars. It is a good entry point into the potential of deep learning and robotics. Previous generations of robots have been limited in capability. They were programmed to do one repetitive task or a small set of tasks. Deep learning opens those capabilities up significantly. A couple of key advancements, grasping and 2D/3D vision, are driven by deep learning. Google has done some interesting work with grasping, and it is just one of many companies in the space.
The application sounds simple on the surface: robots are now able to identify objects, determine an object's pose or relative position, and grasp it. The use cases for this type of deep learning are a lot more exciting. Robots can now unpack pallets. They can help with inventory management and error checking. They can restock and pull items from store shelves. In manufacturing, they can perform increasingly fine motor tasks and detailed quality control. In some cases, they can do QC with a higher degree of accuracy than a person.
This is not a technology that most businesses will develop internally. Picking a robotics and automation partner requires asking questions about the core deep learning models and assessing their fit for the business's individual needs. Is the training done using reinforcement learning or a supervised deep learning method? How is the initial model trained, and how does it improve over time? How much effort is required by the business to initially train and continually retrain the models? How will the technology scale and adopt new advances? These and many other questions go into selecting a good solution. For any use case involving a third-party solution, the vetting process is highly technical but well worth the effort.
Text Analytics & Natural Language Processing
Text is something people handle natively. Computers, on the other hand, have struggled with it. Traditional machine learning algorithms fail to achieve levels of accuracy that users consider acceptable. Deep learning provides a significant boost for natural language processing in several key areas.
Once a blob of text is broken down and parsed so machines can handle it, it can be mined for intent, sentiment, topic, or relevance to a particular search. Deep learning can make accurate, educated guesses along each of these lines with a minimal amount of training data. That drops the cost of these processes significantly and provides levels of accuracy people find acceptable.
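To make the classification step concrete, here is a deliberately tiny sketch: a logistic-regression text classifier trained in pure Python on a made-up toy dataset. Production systems replace the bag-of-words features and linear model with learned embeddings and deep networks, but the pipeline (featurize, train, score) is the same.

```python
import math
from collections import Counter

# Invented toy training data: (text, label), 1 = positive sentiment.
TRAIN = [
    ("great product fast shipping", 1),
    ("love the quality", 1),
    ("excellent service", 1),
    ("terrible experience slow shipping", 0),
    ("poor quality broke quickly", 0),
    ("awful customer service", 0),
]

def features(text):
    """Bag-of-words word counts; a deep model would learn embeddings instead."""
    return Counter(text.lower().split())

def train(data, epochs=200, lr=0.5):
    """Fit per-word weights with plain stochastic gradient descent."""
    weights, bias = {}, 0.0
    for _ in range(epochs):
        for text, label in data:
            feats = features(text)
            z = bias + sum(weights.get(w, 0.0) * c for w, c in feats.items())
            pred = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = label - pred
            bias += lr * err
            for w, c in feats.items():
                weights[w] = weights.get(w, 0.0) + lr * err * c
    return weights, bias

def predict(weights, bias, text):
    """Return the estimated probability that `text` is positive."""
    feats = features(text)
    z = bias + sum(weights.get(w, 0.0) * c for w, c in feats.items())
    return 1.0 / (1.0 + math.exp(-z))

weights, bias = train(TRAIN)
print(predict(weights, bias, "great love"))      # high score
print(predict(weights, bias, "terrible awful"))  # low score
```

The point of the sketch is the shape of the problem, not the model: deep learning earns its keep when hand-picked word features like these stop being good enough.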
The use case for deep learning-based text analytics centers around its ability to parse through massive amounts of text data and either aggregate or analyze. Using deep learning, computers can perform tasks like e-discovery. Large investment houses like JPMC are using deep learning-based text analytics for insider trading detection and regulatory compliance. Hedge funds use text analytics to mine through massive document repositories for insights into future investment performance and market sentiment. Facebook uses text analytics to recommend relevant posts, among other things. Companies use text analytics on social media to gauge brand sentiment or respond to complaints in real time. Communications from messenger apps, emails, phone calls, etc. can be classified by importance. This allows the software to read the deluge of communications coming at an employee every day and surface the most important ones.
In each case, it is not cost-effective to hire the staff necessary to sift through all the documents. Being able to automate that task is not only a cost savings but a competitive advantage. Text analytics is typically a hybrid project. Some of the code necessary to build deep learning text analytics capabilities is available in open-source libraries like Google's TensorFlow and several others. There will be additional work to extend, customize, train, and integrate these libraries. No text analytics solution currently works out of the box, but the returns in productivity and improved capabilities make the investment worthwhile.
Cybersecurity
Deep learning has several applications in cybersecurity. Its advantage over other approaches comes down to accuracy. In most cases the improvement is significant: detection rates of up to 99.9%. The high risk and cost of failing to detect a threat make the expense of deep learning worthwhile.
Deep learning can play several roles within a larger cybersecurity or infosec strategy. It can automate intrusion detection with an extremely high discovery rate. Deep learning also does very well with malware, malicious URL, and malicious code detection. There are emerging use cases as well, but those have not been proven out yet.
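As an illustration of what malicious URL detection works with, here is a sketch of hand-crafted URL features (length, digit ratio, subdomain count, character entropy). A deep model learns richer versions of these signals directly from the raw character sequence; the example URL and thresholds are invented.

```python
import math
from urllib.parse import urlparse

def url_features(url):
    """Simple hand-crafted features of the kind a deep model learns on its own.
    High entropy and digit ratios are weak hints of machine-generated URLs."""
    host = urlparse(url).netloc
    n = len(url)
    counts = Counter = {}
    for ch in url:
        counts[ch] = counts.get(ch, 0) + 1
    entropy = -sum(c / n * math.log2(c / n) for c in counts.values())
    return {
        "length": n,
        "digit_ratio": sum(ch.isdigit() for ch in url) / n,
        "subdomains": host.count("."),
        "entropy": entropy,
    }

# Invented example of a suspicious-looking URL.
print(url_features("http://a1b2-c3d4.example.xyz/login?x=9f8e7d"))
```

A rules-based scorer over features like these is roughly where pre-deep-learning detectors plateaued; character-level deep models improved on them precisely because attackers learn to game any fixed feature list.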
Deep learning for cybersecurity is an interesting mix of unrealized potential and practical applications. That assessment applies to the lion's share of deep learning use cases. Everything about deep learning is subject to a large amount of hype and speculation from uninformed sources. There are two questions to answer with any use case in this category. Is there proof of practical application? This comes in the form of peer-reviewed research and industry benchmarks. Is the solution actually a proven deep learning approach, appropriate for this use case? Again, this is a highly technical vetting process. It is well worth the effort to make sure the time and money spent implementing a solution yields the expected gains.
Time Series – Predictive Deep Learning
Deep learning has several advantages and applications in what is called time series analysis. A time series is exactly what it sounds like: data captured with a timestamp attached to each point. Everything from stock quotes to sensor data to traffic patterns falls into this bucket.
Deep learning methods have a powerful ability to scan large amounts of time series data and find patterns that are difficult for people or traditional data science methods to discover. Training times, data gathering, and engineering effort are all high, but the use cases justify the level of effort.
Predictive maintenance is one of the highest returning use cases. Using anomaly detection and survival analysis, deep learning algorithms can predict when a machine (everything from an airplane engine to machines in manufacturing facilities) will fail. That allows machine downtime to be planned with minimal impact to operations.
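A minimal sketch of the anomaly detection idea behind predictive maintenance, using simulated sensor readings: a rolling z-score flags values that deviate sharply from the recent baseline. Deep models such as recurrent autoencoders take over when failure patterns are too subtle for a simple statistic like this.

```python
import statistics

def rolling_anomalies(series, window=10, threshold=3.0):
    """Flag indices whose value sits more than `threshold` standard
    deviations from the mean of the preceding `window` readings."""
    flags = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu = statistics.mean(past)
        sigma = statistics.pstdev(past)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# Simulated vibration sensor: steady readings with one spike at index 10.
readings = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0, 5.0, 1.0]
print(rolling_anomalies(readings))  # [10]
```

In a real deployment the flagged reading would feed a maintenance workflow; the deep learning version of this detector learns what "normal" looks like across many correlated sensors at once.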
This theme is why deep learning for time series analysis is such a strong use case. Many events, from traffic jams slowing delivery times to weather events causing shortages in stores, have been extremely hard to predict. Companies are forced to react to these events, usually causing inefficiencies. Deep learning can analyze time series data and return accurate predictions for these types of events. That allows companies to plan for what used to be the unexpected.
Prescriptive Deep Learning Systems
In my opinion, this is the most exciting area of deep learning. Once systems begin to predict events, they can use those predictions as inputs and prescribe actions based on optimal outcome criteria. Basically, the system looks at the events to come and recommends what to do to achieve a best-case scenario.
This is an emerging use case and especially difficult to evaluate. When the inputs of a model come from the outputs of a different model, that dependency creates technical challenges with respect to accuracy over time. Any prescriptive system has a failure horizon. With traffic prediction, high accuracy at a horizon of 20-30 minutes is all a delivery company needs to reroute drivers away from delays. With predictive maintenance, a horizon of a few days to a week is sufficient to mitigate the impact of downtime. Both also have a low cost of failure. If the model is wrong, the costs are minimal so being wrong 1 time in 20 does not take away much from the cost savings.
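The rerouting example above can be reduced to an expected-cost calculation. This sketch assumes a hypothetical delay probability produced by an upstream predictive model; a real prescriptive system optimizes over far richer action spaces, but the core step of turning a prediction into a recommended action looks like this.

```python
def best_action(actions):
    """actions maps name -> (base_cost, delay_prob, delay_penalty).
    Returns the action with the lowest expected cost:
    base_cost + delay_prob * delay_penalty."""
    return min(actions, key=lambda a: actions[a][0]
                                      + actions[a][1] * actions[a][2])

# Hypothetical numbers: the delay probability for the usual route
# comes from a (notional) traffic-prediction model.
forecast_delay = 0.5
actions = {
    "usual_route": (10.0, forecast_delay, 40.0),  # cheap, delay is costly
    "detour":      (18.0, 0.05, 10.0),            # pricier, nearly delay-proof
}
print(best_action(actions))  # detour (18.5 expected vs 30.0)
```

Note how the failure horizon and error rate from the prediction model show up here as the `delay_prob` input: if the upstream model is wrong one time in twenty, the expected-cost math absorbs it, which is exactly why low-cost-of-failure use cases are the right place to start.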
I have implemented several of these types of models. In each case, a well-defined scope and well understood accuracy are critical for successful implementation. Human oversight and correction are needed to refine and customize the model. After a few months, the models are usually ready to run with minimal oversight. The model will need monthly maintenance and annual retraining as well.
In a recent survey of the healthcare industry, respondents cited a lack of clarity around use cases as one of the largest barriers to adopting machine learning. From my experience, that sentiment holds true across industries. The technical complexity of deep learning makes it difficult to navigate emerging use cases and decide which ones are right for the business.
That is causing many companies to sit on the sidelines while their competitors gain proficiency with the technology. Deep learning will drive the next five years of software and systems. A mature machine learning strategy will help businesses achieve the cost savings and competitive advantages the technology promises while avoiding the hype and false starts.