One can’t read the news today without encountering a barrage of articles about data science, machine learning, and artificial intelligence. Just recently:
- Jeff Bezos opened up his private MARS (Machine Learning, Automation, Robotics, and Space) conference to the public
- There is a conspiracy theory stating that the Facebook 10 year challenge is just a way to help train algorithms on aging
- A recent issue of Wired was titled “Less Artificial, More Intelligent”
To better assess all of this talk and hype, I recently had the opportunity to sit down with Brian Sampsel, VP and Chief Analytics Innovator for the Columbus Collaboratory.

Drenik: Brian, when you look at all of this talk about machine learning, how are companies responding?
Brian Sampsel: Given all the talk and hype, it is surprising to find that so many large enterprises are doing things the way they have always been done, especially as it pertains to back office functions.
One glaring example of this is forecasting. Most organizations have a multitude of people making all kinds of forecasts: in Finance, where FP&A predicts sales; in IT, where teams predict future storage and compute requirements (especially as workloads move to the cloud); and in Planning, where teams decide how many products of a certain type are needed and how to distribute them across different sizes or models.
The typical workflow still looks like this:
- Data exists in an ERP system or some other type of database
- The analyst extracts aggregated data using a BI tool
- The extract gets pasted into Excel
- The analyst goes through a highly manual set of Excel calculations or macros set up to generate a forecast
- The forecast is then delivered just in time to business decision makers
- The process is repeated as decision makers modify the assumptions
Drenik: That process sounds familiar, but where does artificial intelligence or analytics fit in?
Sampsel: That’s my point – right now, for many, it doesn’t fit in. Highly accurate forecasts are the start of so many processes and are a primary goal for these organizations, but most don’t yet understand how or why to embed analytics into the process.
Drenik: Before we get to the “how”, help us understand the “why” – what are the primary benefits of incorporating analytics into the forecasting process?
Sampsel: There are a lot of benefits to designing this process using a statistician’s or data scientist’s workflow.
First, many of these steps can be automated, saving a significant amount of time. So often, each forecast can take many hours of someone’s week to produce, week in and week out. And, when a forecast is created, a VP or CFO will ask “what if we change X?”. This starts the process all over again, likely adding more hours to the task. Automating the process enables decision makers to use their time more efficiently by analyzing and understanding the results instead of crunching numbers and recalculating formulas in a spreadsheet.
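To make the automation point concrete, here is a minimal, hypothetical sketch — not the Collaboratory’s actual tooling — of replacing the spreadsheet math with a script. The sales figures and the simple linear-trend model are purely illustrative; when a VP asks “what if we change X?”, you edit one input and re-run in seconds:

```python
# Hypothetical sketch: automate the extract -> Excel -> forecast loop with a
# simple linear-trend model. All data and model choices are illustrative.

def linear_trend_forecast(history, periods_ahead):
    """Fit y = a + b*t by ordinary least squares, then project forward."""
    n = len(history)
    t_mean = (n - 1) / 2
    y_mean = sum(history) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(history))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    intercept = y_mean - slope * t_mean
    # Project the fitted line h periods past the end of the history.
    return [intercept + slope * (n - 1 + h) for h in range(1, periods_ahead + 1)]

monthly_sales = [100, 105, 98, 110, 115, 120]   # illustrative extract
next_quarter = linear_trend_forecast(monthly_sales, 3)
```

Re-running a “what if” becomes a one-line change to `monthly_sales` instead of hours of spreadsheet rework.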
Second, most organizations looking at forecasts would benefit from some understanding of variability. So many forecasts are delivered as a single number. There it is – that’s the number. However, we know that number is going to be wrong, we just don’t know how wrong. I think many leaders would make different decisions if they knew the range of likely outcomes and whether it was narrow around that estimate or quite wide.
Third, most forecasts would benefit from some third-party data. Weather is the obvious example here, but we could also bring in commodity prices, local events, and even the marketing calendar (which is likely another Excel nightmare). Prosper’s forward-looking customer intention data can provide large increases in forecasting accuracy, enabling predictions informed by how consumers are likely to behave. Adding these data sources can create new insights and lead to more accurate forecasts.
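The enrichment step itself is mechanical: join the third-party feed to the internal extract on a shared key so a model can use it as a feature. A hypothetical sketch with made-up values, using weather keyed by date:

```python
# Hypothetical sketch: enrich an internal sales extract with a third-party
# weather feed keyed by date. All values are made up for illustration.

sales = {"2024-01-01": 120, "2024-01-02": 95, "2024-01-03": 130}   # units sold
weather = {"2024-01-01": 34.0, "2024-01-02": 28.5, "2024-01-03": 41.2}  # avg temp (F)

# One enriched row per date; a model can now learn any weather effect.
rows = [
    {"date": d, "units": sales[d], "avg_temp_f": weather.get(d)}
    for d in sorted(sales)
]
```

The same pattern extends to commodity prices, local events, or a marketing calendar: one join per source, one new feature per column.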
And last, many leaders would want to know which assumptions have the greatest impact on the forecast. This is often known as a ‘sensitivity analysis’. For example, what happens to the forecast if the price of commodity X increases by 5%? What if it increases by 20%? Should I even care about the price of commodity X?
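A sensitivity analysis can be sketched in a few lines: hold everything else constant, bump one assumption, and watch the forecast move. The demand function and elasticity below are purely illustrative, not calibrated to any business:

```python
def forecast(commodity_price, base_demand=1000, elasticity=-1.5):
    """Toy demand forecast: demand falls as the commodity price rises.
    Parameters are illustrative assumptions, not calibrated values."""
    return base_demand * (commodity_price / 100) ** elasticity

base = forecast(100)
for bump in (0.05, 0.20):
    changed = forecast(100 * (1 + bump))
    print(f"+{bump:.0%} price -> {(changed - base) / base:+.1%} demand")
```

If a 20% price shock barely moves the forecast, commodity X can safely drop off the leadership agenda; if a 5% shock moves it materially, it deserves a hedge.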
These benefits can all be had if we move our forecasting methods away from the traditional approaches and towards machine learning and statistical models.
Drenik: That makes sense. So how would a company move forward? What cautions or risks might it face along the way?
Sampsel: There are some potential bumps along the way:
- This will be a cultural shift and will take some time to get used to. I would recommend piloting alongside the current approach in a region or subset of the business. Use this pilot to prove the concept on a smaller scale and then build support in the organization gradually.
- There will need to be integration with other systems. Engage your IT partners as early as possible so they are along for the entire journey, and make sure you design a process that will manage data efficiently going forward. For the pilot itself, however, simple manual data extracts are advisable to prove the concept without getting bogged down in technical complexity.
- Determine the level of interpretability you would like. In other words, how many attributes or assumptions do you want to keep flexible so they can influence the result? Some machine learning techniques are more of a “black box” than others, but may also provide better results.
Drenik: So, if a company adopts an artificial intelligence enabled forecasting process, will that eliminate their need for Excel?
Sampsel: I don’t ever see that happening. Excel is a great tool and of course very familiar to many. It’s just that adding artificial intelligence models into the process can increase both accuracy and consistency (reducing manual errors in spreadsheets – which are a significant problem). In fact, many data scientists use Excel extracts and spreadsheets as data sources for AI models.
Drenik: One of the common reactions that many have is that artificial intelligence will replace humans. If an organization builds these highly accurate analytical prediction engines, is that likely to happen here?
Sampsel: No. Remember, the goal is not to replace people. The benefit is to free up people’s time for other value-added activities – i.e., thinking of ways to grow sales rather than going through the manual steps of forecasting them. People can interpret the results and apply strategic context, intuition, and experience to the discussion, while the data and analytics are computed efficiently and accurately for them. There will always be a place for humans to make the appropriate strategic decisions that affect the business.
Drenik: Thanks Brian!
Columbus Collaboratory is a rapid innovation firm that works with companies to create synergistic solutions to complex cybersecurity and analytics challenges. www.columbuscollaboratory.com