We live in an accelerated world. Organizations are pressed to move quickly to meet the demands of markets and stakeholders, win resources, stay competitive, and maintain service levels. The long planning cycles and steady growth of lumbering giants have been challenged by nimble competitors: organizations that are smaller but empowered to move faster, and large ones that have continuously invested in quicker processes to drive new capabilities. Our ability to collect data is greater today than it has ever been, enabling organizations to amass staggeringly large amounts of information. Without powerful tools to turn that data into actionable insights quickly, however, the marginal value of the data is severely limited.
Machine learning (ML) is a powerful tool now available to organizations of all kinds and sizes, enabling them to identify trends and patterns in massive quantities of diverse data without constant human intervention. This includes deep learning (DL), which stacks many highly adaptive analysis layers to achieve the greatest possible sophistication in pattern recognition and description. ML is a key new strategy for reducing the time, and increasing the consistency and efficiency, of day-to-day analysis and decision-making.
ML uses algorithms that allow computer systems to "learn" from data and, ultimately, to support informed decisions based on that learning: AI, in other words. DL handles highly interconnected data by chaining multiple algorithms across its adaptive analysis layers in artificial neural networks, giving the greatest flexibility in observation and analysis. It also enables some of the most impressive machine capabilities, such as speech and image recognition, biochemical design, and climate science.
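To make the idea of an algorithm that "learns" concrete, here is a minimal, purely illustrative sketch (not any specific production system): a model with two parameters adjusts itself to fit example data by gradient descent, the basic mechanism underlying much of ML and DL. The data values and learning rate are invented for the example; real systems would use a library such as scikit-learn, TensorFlow, or PyTorch.

```python
# Illustrative only: fit y = w*x + b to example data by gradient descent.
# The model starts knowing nothing and "learns" w and b from the data.

data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0), (4.0, 9.0)]  # generated by y = 2x + 1

w, b = 0.0, 0.0   # model parameters, initially uninformed
lr = 0.02         # learning rate (hypothetical choice for this toy data)

for _ in range(2000):          # repeated exposure to the data refines the model
    grad_w = grad_b = 0.0
    for x, y in data:
        err = (w * x + b) - y  # prediction error on one example
        grad_w += 2 * err * x / len(data)
        grad_b += 2 * err / len(data)
    w -= lr * grad_w           # nudge parameters to reduce the error
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # parameters approach 2 and 1
```

A deep learning model works on the same principle, but with millions of parameters arranged in layered neural networks rather than two.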
Increasingly, organizations are turning to ML and AI to attack these new data challenges, using the automation and efficiency of ML to shorten the lead time between problem and answer. In this new area of development, however, a challenge arises from skill mismatches within teams, leading to suboptimal execution of ML strategies.
Data scientists have domain expertise in data and extrapolation, and it is in the organization's interest that they concentrate most of their time in these areas. Yet they spend significant time outside of ML modeling and training, dealing instead with the design, configuration, and testing of infrastructure, because performance and reliability are so critical to project success. As organizations built up their data science capabilities, less attention was paid to the other key aspects of the ML life cycle, leaving gaps in model testing, deployment, monitoring, and optimization. Because infrastructure is not the domain of data scientists, and not where their expertise and training can deliver the greatest value, IT needed to be part of the equation. There needed to be a way to bring IT and data science together.