mea culpa /mā′ə koo͝l′pə, kŭl′-, mē′ə/
noun
American Heritage Dictionary
- An acknowledgment of a personal error or fault.
I think it is time to admit the inaccuracies in some positions I held over the last two decades as a supply chain analyst. Here, I write a Mea Culpa of sorts.
Background
Each analyst firm defines “research” differently. As an industry analyst since 2002, I have learned a lot, often the hard way. When working with industry analysts, a good question is, “How do you define research?”
An analyst is different from a consultant. While a consultant knows the answers, an analyst attempts to understand the right questions to ask. The focus is on what practices drive improvement and value.
The work is not easy. There are many strong opinions in the industry. (Ever notice how many more “experts” appeared when the Ever Given got stuck in the Suez Canal?) I find the proliferation of opinions increasing and data-driven analysis diminishing. The reason? True research is time-consuming and expensive.
My History
When I was at Gartner, the research methodology was to write answers to the most commonly asked questions. The concept was that there was a pattern in the questions. The cycle was vendor briefing, recognition of patterns in client inquiries, and published market responses. The model worked. My problem was that the audience was late-adopter IT teams, which is not my area of interest. I like writing for business innovators.
So, when I went to work at AMR Research, I was excited to write for business leaders again.
AMR had two research disciplines that excited me—benchmarking and quantitative research. At the time, this focus was missing from the Gartner model. I developed a passion for writing quantitative surveys during my tenure at AMR from 2005 to 2010. I had some excellent coaches. We averaged a survey a month.
I ran into some issues. The benchmarking service was discontinued in 2008, and I was sad to find that the survey data was not based on “named responses.” (A named response carries a unique identifier for the respondent.) Instead, AMR, like most research companies and consultants I have encountered, used a hired research panel. The problem? You cannot validate the response; the survey carries only the reported demographic data. As a result, you could never connect respondent data to balance sheet outcomes.
When I started Supply Chain Insights, I designed the business model to use my LinkedIn followers as named respondents. On reflection, I realized that I had stubbed my toe several times in prior analyst positions by giving advice that deserved retraction.
My Mea Culpa
I think that my advice in the past decade was wrong on several fronts. Here, I share.
1. Hierarchy of Supply Chain Metrics. Debra Hofman developed the supply chain metrics hierarchy at AMR Research before I arrived. We worked together to refine it, and I used to sit in on many benchmarking calls.
I used the hierarchy in many strategy sessions. The concept was simple: improve the demand forecast to improve customer service, cost, and cash positions.
The image shown in Figure 1 is based on the benchmark analysis of 82 manufacturers at different points in time. The hierarchy was never tested for statistical rigor, and the manufacturers were never compared on their progression within their peer groups on balance sheet metrics. (Each industry has a different potential; my learning is that peers need to be compared to peers.)
On reflection, I think there may be some causality in the relationships, but the model is flawed. Benchmarking is hard work. The snapshots were captured at different points in time with different levels of granularity, and the benchmarking never considered performance against peer-group potential. In addition, there was no analysis of forecastability, Forecast Value Added (FVA), or the form and function of inventory.
Based on the work with Georgia Tech, operating margin is the most crucial metric for driving value. Margin should be at the top of a supply chain hierarchy. Demand error is an input into the process, not the independent variable.
I often used this model in strategy sessions to advocate improvements in demand planning and inventory management. I turned a deaf ear to the clients who asked, “What if our demand is not forecastable?” When they did ask, I would talk to them about forecasting techniques for complex items. We assumed forecastability.
With broader product portfolios, regional preferences, and distorted demand history, we can no longer assume forecastability, nor can we assume that traditional demand planning techniques are sufficient.
We also need to take a more holistic view of inventory. Safety stock today is a smaller share of total inventory (15-20% in my recent classes). And if a company’s demand cannot be forecasted, safety stock is not the right lever, because safety stock is sized from forecast error: the less forecastable the demand, the larger the buffer has to be. (A short sketch at the end of this point illustrates the FVA and safety stock math.)
I have learned much from my students and their modeling of outside-in concepts and demand streams.
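To make the forecastability and safety stock points concrete, here is a minimal sketch, with made-up numbers, of a Forecast Value Added (FVA) check against a naive forecast and a textbook safety stock calculation. It is illustrative only, not the method behind any of the benchmarks above; the demand series, the 95% service level (z = 1.65), and the two-week lead time are all assumptions.

```python
# Illustrative sketch only: an FVA check against a naive forecast, plus a
# textbook safety stock calculation, using hypothetical numbers.
import math
import statistics

actuals  = [100, 120, 90, 150, 110, 130, 95, 160]    # hypothetical weekly demand
forecast = [105, 115, 100, 140, 120, 125, 105, 150]  # hypothetical planner forecast

# Naive forecast: last period's actual. Compare both methods on periods 2..n.
naive_errors    = [abs(a - n) for a, n in zip(actuals[1:], actuals[:-1])]
forecast_errors = [a - f for a, f in zip(actuals[1:], forecast[1:])]

mae_forecast = statistics.mean(abs(e) for e in forecast_errors)
mae_naive    = statistics.mean(naive_errors)
fva = mae_naive - mae_forecast  # positive: the planning process adds value over naive

# Textbook safety stock: z * stdev(forecast error) * sqrt(lead time in periods).
z, lead_time_weeks = 1.65, 2.0  # ~95% cycle service level, assumed 2-week lead time
safety_stock = z * statistics.stdev(forecast_errors) * math.sqrt(lead_time_weeks)

print(f"Forecast MAE: {mae_forecast:.1f}  Naive MAE: {mae_naive:.1f}  FVA: {fva:.1f}")
print(f"Safety stock for ~95% service: {safety_stock:.0f} units")
```

If the FVA comes out near zero or negative, the planning process is not beating a naive forecast; the error term that sizes the safety stock stays large, and adding buffer stock treats the symptom rather than the right lever.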
2. S&OP Maturity Model. In 2006, I built the model in Figure 2. The premise of the report was that top-performing supply chains achieved a balance between the “S” and the “OP” by focusing on the ampersand (&). I would tell clients that the journey on S&OP was 60% change management, 30% process, and 10% technology. However, technology was essential to achieve a feasible plan.
What is wrong with this model? It focuses on people, process, and technology but does not mention governance, and I failed to mention the critical element of leadership. After a decade of working with large companies (greater than $5B in annual revenue), I now believe that leadership plays a significant role. My revised split is 40% leadership, 30% change management, 20% process, and 10% technology.
I was naive about the magnitude of corporate politics in the evolution of global supply chains. A clear definition of operational excellence is critical to helping companies achieve balance. Most companies I work with lack this clarity and throw the supply chain out of balance by focusing on functional goals and divisional bonus incentives. The leadership needed to bring the S&OP process back into balance is missing.
Most programs derail due to a lack of clear governance and process excellence. My model did not reflect the level of required leadership.
3. Supply Chain Centers of Excellence. A supply chain center of excellence sounds like a good idea, right? Yet over the last decade, few achieved their objectives. By comparison, centers of analytics were far more successful. The reason? A center of analytics has a clearer mission: using advanced analytics to help companies gain insights from data has obvious value. In contrast, the mission of the supply chain center of excellence was murky, and many did not survive downsizing.
Wrap-up
So, raise a glass this Friday and celebrate learning. Even if it includes a few Mea Culpa moments.