July 31, 2018 | Machine Learning
Machine Learning, alongside a mature Data Science practice, will help to bring IT and business closer together. By leveraging data for actionable insights, IT will increasingly drive business value. Agile and DevOps practices enable the continuous delivery of business value through productionised machine learning models and software delivery.
To many of you reading this blog post, the notion that IT is often viewed as a Cost Centre will come as no surprise. Yet, for many working deep within IT it is rare to question the value IT provides to the business and consider whether it is a “mere Cost Centre”. To an IT engineer, the value of IT is self-evident, surely?
Not so long ago I heard this phrase mentioned again, this time in relation to a project I was engaged in: the IT department in a typical organisation isn’t viewed as the seat of value for the organisation. Stop the press, the world doesn’t revolve around software delivery!
Before we continue let’s clarify. What is a cost centre? A cost centre is variously defined as a part of the business that “does not directly add to profit”, or one that is “detrimental to the bottom line” and has to “maximise efficiencies and reduce costs”. In short, a cost centre creates insufficient direct revenue to enhance the commercial bottom line.
This can seem a cruel assessment, but has helped me to understand why there is often a real disconnect between IT workers and the rest of the business. They operate under fundamentally different assumptions: the business generates value, whereas IT provides utility.
Fortunately, this perception is changing. No doubt this is due in part to a more realistic assessment of IT’s growing influence in the entire value chain. It is hard to imagine, for example, that even fifteen years ago businesses were still debating the need for having an online presence. In 2018 this is merely common sense.
It is true that a lot of backend processes still fly under the radar, but the everyday tech of companies like Google, Facebook, Apple, and Uber has changed public perception: IT is no longer merely a support appendage working silently in the background. Nevertheless, inside organisations this change isn’t yet complete.
Machine learning presents an opportunity to bring IT and business much closer together – perhaps finally breaking the legacy distinction between IT as a cost centre and business as a value centre. The reason for this is the central role that data plays in machine learning.
Data is unarguably one of any business’ most critical assets. Indeed, if a business should lose its critical data, it could well go out of business in short order. However, there is more to data than keeping a business afloat. How it is used can also give a business its competitive edge. As Arun Murthy of Hortonworks says “It is only through insights on data that businesses truly get to competitive differentiation nowadays. Today’s digital world is otherwise just too complex to run intuitively.”
Data is gleaned from a large variety of sources – internal systems, transaction systems, field devices – and presents a multidimensional view of a business’ operating environment. Interpretation can be hugely complex, but from this mass of data a business can derive meaningful and actionable insights that potentially give it an edge over its rivals, or help it to find promising new leads to pursue.
To put it differently, a business that is able to gain commercial insights in a timely manner and respond quickly will not only survive, but thrive.
Who are the people in the organisation best placed to make use of Data in this way, discovering or generating insights to advise the business? Typically, these will be Business Intelligence analysts, Data Scientists, and Machine Learning Engineers. The last of these is more of an IT or engineering role, whereas the first two are more business focused, but all three share a central concern with deriving value from data. When they work together the Business and IT sides are really just two sides of the same coin.
It is instructive to understand the difference between Data Science and Machine Learning a little more. David Robinson wrote a popular blog post on this topic earlier this year where he made three claims:
Data science produces insights.
Machine learning produces predictions.
Artificial intelligence produces actions.
This apparently neat delineation isn’t without controversy, but I think it goes a long way towards explaining our intuitive understanding of these fields of work.
It suggests that we would normally start with the Data Science effort to derive insights before formalising and optimising the process with Machine Learning. When they work together efficiently they turn into something more than the sum of their parts. In a phrase, they become Business Artificial Intelligence.
Business Artificial Intelligence produces actionable insights for business value.
You may know Mark Schwartz from his thought provoking book The Art of Business Value (if you haven’t read it, you should). In it he suggests that although a typical Agile engineering team looks to the product owner for information regarding the business value they are producing, engineers would learn more about it by working directly with actors in the rest of the enterprise.
In A Seat at the Table he develops this theme much further. He points out that the CIO is usually on the back foot with respect to other C-level leaders by having to continually demonstrate value to earn his or her seat at the leadership table.
In the light of IT’s status as a cost centre this should not come as a surprise. It is even reflected in the language we use. For example, we speak of the business requirements for a software project. This is the kind of term one would more typically use when engaging a contractor.
A better phrase, Mark Schwartz suggests, would be to speak of business objectives. By speaking of business objectives the IT team is able to take ownership of a particular business objective and strive towards outcomes for the business. In this way, IT would view itself as an integral part of the business rather than working in an isolated fashion with only a product owner as a proxy for the business.
What about DevOps? When we think of DevOps we usually think in terms of automation and collaboration. These are mainly engineering functions, even in the case of collaboration where the silos of Development and Operations, and maybe Testing too, are transcended by working together – either across teams, or together in small teams – and communicating using a shared domain language.
Mark Schwartz has a slightly different take on this theme. He defines DevOps as a “Cross-functional Community of Practice for Continuous Delivery”. There are two really interesting ideas in his definition.
Firstly, the notion of a Community of Practice goes beyond the idea of a cross-functional Agile team that exists for the duration of a project (and includes people rolling on or off the project). Community suggests a broader domain in which people share their knowledge and expertise. The whole IT department can be such a community. And dare I say it, we could draw the line even further to include people from Business and other organisational departments. Imagine an entire organisation sharing its expertise in a Community of Practice! (Maybe I’m going too far in this case, but you get the idea.)
The second interesting idea results from adding Continuous Delivery into the mix. As engineers we tend to think of continuous delivery as involving software artefacts, updates, etc. However if we consider the difference between business requirements and business objectives we begin to see a different perspective. DevOps is ultimately about the continuous delivery of business objectives, not just engineering outcomes. By taking ownership of the business objectives (as opposed to business requirements), IT becomes much more closely involved in the value chain.
How does Machine Learning fit into all of this? Machine learning can take place in two different modes. The first is offline mode, which is the more traditional approach. It involves learning in large batches and is both time and resource intensive since it requires learning from all available data. The second is online mode, in which learning is incremental and ongoing, for instance through mini-batching or streaming of input data.
In the traditional offline mode machine learning operates very much at a disconnect from other systems. It works perfectly well, but it isn’t responsive to new information until a new batch has been processed. Online mode, on the other hand, is part of an increasingly popular data engineering paradigm that leverages streaming technologies and updates models in near real-time. In online mode, models are updated incrementally, which saves both time and resources.
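To make the contrast concrete, here is a minimal sketch of online learning using scikit-learn’s `SGDClassifier` and its `partial_fit` method; the data stream is synthetic and purely illustrative.

```python
# A minimal sketch of online (incremental) learning with scikit-learn.
# The mini-batches here are synthetic, standing in for a real data stream.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(42)
model = SGDClassifier(random_state=42)
classes = np.array([0, 1])  # all classes must be declared on the first call

# Simulate a stream of mini-batches arriving over time.
for _ in range(20):
    X_batch = rng.normal(size=(32, 4))
    # A simple synthetic rule: the label depends only on the first feature.
    y_batch = (X_batch[:, 0] > 0).astype(int)
    # Incrementally update the model on the new batch only --
    # no need to retrain on all historical data, unlike offline mode.
    model.partial_fit(X_batch, y_batch, classes=classes)

# The model can serve predictions between updates.
X_new = rng.normal(size=(5, 4))
predictions = model.predict(X_new)
```

An offline equivalent would instead call `fit` on the full accumulated dataset each time, which is exactly the time- and resource-intensive retraining described above.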
By leveraging both online data engineering strategies for ingestion and DevOps best practices for operational efficiencies, machine learning sits at the cutting edge of the new data engineering paradigm. This approach is starting to emerge as a data management field of its own called DataOps, which combines practices from the fields of Data Engineering, DevOps, and Agile. Not only does it enable machine learning models to be updated in near real-time, but it also delivers models at scale.
What makes this valuable to the business? We know that machine learning can improve specific features in a product. For example, if we have a medical diagnostics tool that relies on machine learning to improve its diagnostic capabilities, we can now update the predictive models more regularly as we roll out the product software updates. But this makes it seem like little more than an optimisation strategy. We’re falling into the old trap of looking to engineering for efficiencies rather than business value.
When we look at the bigger picture we see that the business itself can take advantage. By receiving information about its operating environment from all its data nerve endings – be they field devices, online customer-facing applications, or internal systems – the business is now in a position to produce actionable insights on a much more regular basis, and also to respond to this information in a more timely manner.
To illustrate this, imagine we are a retail company selling socks. Our best-selling sock is purple socks, but our real-time machine learning models this morning are predicting that demand for purple socks is dropping rapidly, while demand for white and red socks is shooting through the roof. In other news, England is in a World Cup semi-final … coincidence?! Either way, we have to respond. Fortunately, we have continuous delivery in place and we can quickly roll out complex new red-and-white sock functionality in our amazing online and mobile apps.
This is obviously a slightly frivolous example, but hopefully it gets the point across. Real world data can be processed and modelled in near real-time, affording us an actionable insight and the opportunity for a timely intervention: update our software via continuous delivery pipelines, talk to our suppliers, prepare our customer reps and sales people, etc.
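The step from prediction to actionable insight can be sketched in a few lines. Everything here is hypothetical – the `DemandForecast` type, the product names, and the 50% threshold are all invented for illustration – but it shows the shape of the logic: compare forecast against recent demand and flag anything that warrants a timely intervention.

```python
# A hypothetical sketch of turning model forecasts into actionable insights.
# The data type, products, and threshold are invented for illustration.
from dataclasses import dataclass

@dataclass
class DemandForecast:
    product: str
    predicted_weekly_units: int
    previous_weekly_units: int

def actionable_insights(forecasts, threshold=0.5):
    """Flag products whose predicted demand shifts by more than `threshold`."""
    actions = []
    for f in forecasts:
        change = (f.predicted_weekly_units - f.previous_weekly_units) \
            / f.previous_weekly_units
        if change >= threshold:
            actions.append(f"Increase stock and promote: {f.product}")
        elif change <= -threshold:
            actions.append(f"Reduce orders: {f.product}")
    return actions

forecasts = [
    DemandForecast("purple socks", 2_000, 10_000),
    DemandForecast("red socks", 18_000, 6_000),
    DemandForecast("white socks", 15_000, 5_000),
    DemandForecast("blue socks", 5_100, 5_000),
]
insights = actionable_insights(forecasts)
```

In a real deployment the forecasts would arrive from the online models described earlier, and the resulting actions might feed a dashboard, an alerting channel, or a continuous delivery pipeline.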
Given the central role of data in machine learning, and the value of regular and updated actionable insights to the business, it follows that competitive enterprises will almost certainly be those that can successfully leverage machine learning using DevOps and Agile.
The capability to act on insights and predictions is what Robinson defines as artificial intelligence. When we add agility and responsiveness to the mix we could think of such an artificially intelligent business as a complex adaptive system.
According to the Business Dictionary, a complex adaptive system is an “Entity consisting of many diverse and autonomous components or parts (called agents) which are interrelated, interdependent, linked through many (dense) interconnections, and behave as a unified whole in learning from experience and in adjusting (not just reacting) to changes in the environment.” (my emphasis).
To be able to learn from experience and successfully adjust to changes in the market and operating environment is a dream for every business. That describes exactly how a responsive Machine Learning practice with DevOps agility assists an enterprise’s workforce.
Such an enterprise is an adaptive organism, detecting changes in the environment through its senses (the data received from the business environment, business transactions, internal data, etc.), processed as insights and predictions, and adjusting its behaviour with the aid of the skill and expertise of its human workforce.
Data Science and Machine Learning are at the heart of it.
Machine Learning, via data and Data Science, will help to bring IT and Business closer together. By leveraging data for actionable insights IT will increasingly drive business value. Agile and DevOps practices enable the continuous delivery of business value through productionised machine learning models and software delivery.