As global manufacturers face rising complexity across demand planning, configurable product portfolios, and multi-system operations, supply chain transformation is increasingly driven by data quality, systems thinking, and applied artificial intelligence.
Juliet Mirambo is a rising leader in integrated supply chain operations and currently part of MilliporeSigma’s Operations Leadership Development Program, a rotational initiative designed to develop the next generation of manufacturing and supply chain leaders.
She combines an academic background in engineering and chemistry with hands-on responsibility for strengthening global demand planning tied to a €250 million core model; her work spans process optimization, cross-functional alignment, and the practical application of AI in forecasting and material flow.
In this interview, Mirambo discusses how rotational leadership experiences shape systems-level thinking, why data governance is foundational to AI-driven planning, and how transparent, decision-focused architectures are redefining the future of smart operations and digital manufacturing.
1. You are part of MilliporeSigma’s Operations Leadership Development Program, which is known for developing the next generation of manufacturing and supply chain leaders. How has this rotational experience shaped the way you approach integrated supply chain challenges?
The program gave me a completely new way to look at the kind of impact you can have right out of the gate. It gave me a new perspective on scale, and it also showed me the true nature of the most costly problems in supply chain: the invisible handoffs between systems, the assumptions built into legacy processes, and the data quality issues that compound as data flows downstream.
In one rotation, I was tasked with improving forecast accuracy, which sounds like a relatively simple assignment, but there was more to it. Leadership wanted better numbers, but after mapping the actual data flow, I discovered that the forecasts themselves were accurate; the issue was that three different ERP systems were describing the same products in different units of measure.
By the time the data reached our forecasting team, it had undergone so many manual reconciliations that we were essentially predicting fiction. The rotational experience forced me to think like an owner and not just an operator.
I had to make decisions where improving efficiency in planning might create chaos in manufacturing or where standardizing data definitions saved millions in inventory, but required convincing engineering teams to change 20 years of practice.
The program not only teaches you about different functions but also teaches you how to see your company as a system of interconnected systems where every optimization has second and third-order effects.
2. In your role as a process optimization lead, you support a planning model tied to more than two hundred fifty million euros in configurable materials. What does it take to build end-to-end process maps that truly reflect the complexity of global demand planning?
There’s a lot going on when you have to deal with €250 million across 16 sites. Everyone is looking for that all-encompassing solution. A complete process map covering all the touchpoints, all the systems, and all the decision points is the holy grail. But here’s the challenge: by the time you complete the process map, the business will have changed. Products will have been introduced.
Systems will have been updated. Employees will have moved into new positions. So I took a different approach. Rather than creating a complete process map up front, I chose one pain point that was causing the business serious trouble and resolved it quickly. Deliver tangible evidence of results. Build credibility. Then grow from there.
For example, the planning group was consistently battling forecast errors. When I dug deeper, the root cause was not the forecasting model; it was bad data coming in from three separate ERP systems that disagreed on even the unit of measure (e.g., “case” versus “each”). The forecasters did not know what to do with this mess and were essentially forecasting fiction.
I didn’t create a complete demand planning process map. I concentrated on solving one bottleneck, the one I mentioned: unit of measure harmonization. I created a process using machine learning and physics-based validation to automatically correct units of measure, got it running in production, and documented $7.2M in savings.
Now everyone wants to discuss what else can be optimized. That first win gave me the ability to take on larger projects. For configurable materials specifically, the complexity is overwhelming. You are no longer managing one product. You are managing dozens or even hundreds of configurations, each with its own bill of materials, each with its own supplier list, each with its own customer demand pattern.
Traditional process mapping fails because there are simply too many combinations. Instead of documenting every possible path, I document the decision logic. I ask myself: What decisions are planners making repeatedly? What information do they need to make these decisions? Where does this information come from? Is it reliable?
Then I built infrastructure that makes the right decision easier than the wrong one: automated data quality checks, validation rules that prevent impossible scenarios from entering the planning system, and dashboards that surface exceptions rather than requiring planners to go hunting for problems.
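To make that concrete, here is a minimal, hypothetical sketch of such validation rules in Python with pandas. The column names (material_id, uom, quantity, lead_time_days) and the unit codes are illustrative assumptions, not the actual planning schema described in the interview.

```python
# Minimal sketch of rule-based data quality checks on planning records.
# Column names and unit codes are illustrative, not the production schema.
import pandas as pd

VALID_UOMS = {"EA", "CS", "KG", "L"}  # hypothetical harmonized unit codes

def validate_planning_records(df: pd.DataFrame) -> pd.DataFrame:
    """Return the input with an 'exceptions' column listing failed rules."""
    checks = {
        "unknown_uom": ~df["uom"].isin(VALID_UOMS),
        "non_positive_qty": df["quantity"] <= 0,
        "negative_lead_time": df["lead_time_days"] < 0,
        "implausible_lead_time": df["lead_time_days"] > 365,
    }
    exceptions = pd.DataFrame(checks)
    out = df.copy()
    out["exceptions"] = exceptions.apply(
        lambda row: [name for name, failed in row.items() if failed], axis=1
    )
    return out

if __name__ == "__main__":
    records = pd.DataFrame({
        "material_id": ["A-100", "A-101", "A-102"],
        "uom": ["CS", "PALLET", "EA"],
        "quantity": [120, 40, -5],
        "lead_time_days": [30, 400, 14],
    })
    flagged = validate_planning_records(records)
    # Only rows with at least one exception would surface on a dashboard.
    print(flagged[flagged["exceptions"].str.len() > 0])
```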
3. You have been exploring ways to introduce AI into forecasting workflows. What opportunities do you see for AI to increase agility and accuracy in supply chain planning, especially for configurable product portfolios?
Traditional forecasting for configurable products treats each configuration as an individual SKU with its own demand history. If you have a product with 17 configuration options across multiple attributes, you may be dealing with hundreds of possible configurations, and many of them will have very little historical data. Statistical models do not perform well in this type of environment.
Either the model will over-aggregate and lose valuable signals, or it will over-segment and build a forecast on noise. What I have been focused on is developing artificial intelligence that can learn the underlying structure of the data: which configuration attributes drive demand variability, how customer preferences for different configurations relate to one another, and how the configuration mix shifts with market conditions.
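One simple way to picture attribute-level learning, purely as an illustration rather than the actual model, is to forecast at the product-family level where history is dense and then allocate demand to configurations using option attach rates learned from history. The sketch below assumes hypothetical attributes (voltage, seal), made-up quantities, and a naive independence assumption between attributes.

```python
# Illustrative sketch of attribute-level demand modeling for a configurable
# portfolio: forecast the family, then allocate via learned attach rates.
# All data, attributes, and the family forecast are hypothetical.
import pandas as pd

history = pd.DataFrame({
    "family": ["PUMP"] * 6,
    "voltage": ["110V", "110V", "220V", "220V", "220V", "110V"],
    "seal": ["std", "chem", "std", "std", "chem", "std"],
    "qty": [40, 10, 55, 60, 15, 35],
})

family_forecast = 250  # next-cycle forecast for the whole family (assumed given)

# Learn attach rates per attribute independently (a naive independence assumption).
rates = {
    attr: history.groupby(attr)["qty"].sum() / history["qty"].sum()
    for attr in ["voltage", "seal"]
}

# Allocate the family forecast across configurations by multiplying attach rates.
configs = pd.MultiIndex.from_product(
    [rates["voltage"].index, rates["seal"].index], names=["voltage", "seal"]
)
alloc = pd.Series(
    [family_forecast * rates["voltage"][v] * rates["seal"][s] for v, s in configs],
    index=configs,
    name="forecast_qty",
)
print(alloc.round(1))
```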
AI for configurable products does improve forecast accuracy, and it provides a new architecture for understanding demand. But the greatest value creation comes from what I refer to as “error correction velocity”: how rapidly you can recognize and correct when reality deviates from the forecast.
In a configurable portfolio, you are making hundreds of small bets every planning cycle. Many of the bets made will be successful, while others will fail. The key is determining how rapidly you can identify the failed bets and adjust accordingly. In this regard, AI can create significant value over traditional methods.
To recognize when a forecast has failed and to correct the root causes of those failures, I developed a reinforcement learning system that continuously monitors the gap between forecast and actual values in real time and autonomously adjusts the underlying data quality issues that caused the error.
As abstract as that may sound, the practical benefits are quite tangible. A concrete example is forecast errors caused by unit of measure mismatches between systems: planning assumes we are ordering in cases, while suppliers ship in pallets, so the forecast is wrong before execution even begins.
My system identifies unit of measure mismatches and corrects them autonomously roughly 94% of the time, processing more than 10,000 records in 2-3 minutes. Before it existed, correcting these mismatches consumed 20% of the planning team’s time.
That represented a fundamental change in how human expertise is allocated within our company. Our planners now spend their time planning rather than fighting data quality issues.
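As a rough illustration of the correct-or-escalate pattern described here, and not the actual production system, which uses a learned model, the following Python sketch routes a record either to autonomous correction or to human review based on a confidence score. The material IDs, pack-size table, and threshold are all hypothetical.

```python
# Simplified sketch of correct-or-escalate routing for UoM mismatches.
# A known conversion table plus a confidence score stand in for the
# learned component; all names and numbers are illustrative.
from dataclasses import dataclass

CASES_PER_PALLET = {"A-100": 48, "A-101": 60}  # hypothetical pack sizes

@dataclass
class Record:
    material_id: str
    ordered_qty: float   # what planning expects (in cases)
    shipped_qty: float   # what the supplier reports (possibly in pallets)

def route(record: Record, confidence_threshold: float = 0.9):
    """Return ('auto_corrected', qty) or ('human_review', None)."""
    factor = CASES_PER_PALLET.get(record.material_id)
    if factor is None:
        return "human_review", None
    implied = record.shipped_qty * factor
    # Confidence: how well the pallet-to-case conversion explains the gap.
    confidence = 1 - abs(implied - record.ordered_qty) / max(record.ordered_qty, 1)
    if confidence >= confidence_threshold:
        return "auto_corrected", implied
    return "human_review", None

print(route(Record("A-100", ordered_qty=480, shipped_qty=10)))  # clean match -> auto
print(route(Record("A-101", ordered_qty=500, shipped_qty=10)))  # ambiguous -> human
```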
The business case for this technology expands when layered beneath demand planning platforms. We recently completed the implementation of Logility for demand planning across all of our $250M in operations.
Without my middleware automation harmonizing forecasts across three disparate ERP systems, the entire implementation would have been non-functional. The published results, a 97-99% service level and a same-day delivery competitive advantage, are a direct result of having clean and reliable data flowing into the planning system.
4. Integrated supply chains depend heavily on transparency and cross-functional alignment. How do you bridge the communication gap between engineering, manufacturing, procurement, and planning teams when optimizing material flow?
I was reminded early in my professional career that no matter how good an engineer you are, you cannot fix all of an organization’s problems on your own. And while collaboration workshops can help, they are rarely enough on their own to address structural and process problems.
The biggest hurdle to cross-functional collaboration isn’t a lack of desire to collaborate. Rather, each department is often optimizing for a different metric and therefore works from a slightly different reality. Engineering focuses on specifications.
Manufacturing focuses on cycle time and yield. Procurement focuses on cost savings and supplier diversity. Planning focuses on forecasting and service level performance. Each of these is a valid business objective; however, when everyone is working towards a different goal, using different data to assess progress toward those goals, achieving alignment is impossible.
My approach has been to separate the problem of trusting each other from the problem of telling each other the truth. The truth problem, getting everyone to work from the same information, can be solved with technology. I built middleware using KNIME to reconcile data from our three enterprise resource planning (ERP) systems in near real time.
When engineering updates the Bill of Materials (BOM), manufacturing sees it. When procurement changes the lead time of a part, planning sees it. The data flows back and forth and validates against both physics and business rules automatically. While this appears to be purely technical, it is fundamentally a leadership intervention.
Prior to building this infrastructure, most cross-functional meetings began with 15 minutes of debating which number set was correct. Today, we begin with “Here are the numbers. What do they tell us? What should we do?” That type of change in conversation quality is transformative.
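The real middleware is a KNIME workflow, but the reconciliation logic it implements can be pictured with a small pandas sketch. The system labels, fields, and values below are hypothetical and only illustrate the idea of flagging fields where the three ERPs disagree.

```python
# Rough sketch of cross-ERP reconciliation: merge extracts from three
# systems and flag fields where they disagree. All data is made up.
import pandas as pd

erp_a = pd.DataFrame({"material_id": ["A-100", "A-101"], "lead_time_days": [30, 21], "uom": ["CS", "EA"]})
erp_b = pd.DataFrame({"material_id": ["A-100", "A-101"], "lead_time_days": [30, 35], "uom": ["CS", "EA"]})
erp_c = pd.DataFrame({"material_id": ["A-100", "A-101"], "lead_time_days": [28, 21], "uom": ["CS", "PAL"]})

merged = (
    erp_a.add_suffix("_a").rename(columns={"material_id_a": "material_id"})
    .merge(erp_b.add_suffix("_b").rename(columns={"material_id_b": "material_id"}), on="material_id")
    .merge(erp_c.add_suffix("_c").rename(columns={"material_id_c": "material_id"}), on="material_id")
)

# Flag any field where the three systems disagree, so planners see one
# exception list instead of three conflicting reports.
for field in ["lead_time_days", "uom"]:
    cols = [f"{field}_a", f"{field}_b", f"{field}_c"]
    merged[f"{field}_conflict"] = merged[cols].nunique(axis=1) > 1

print(merged[["material_id", "lead_time_days_conflict", "uom_conflict"]])
```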
5. Your academic background combines engineering and chemistry. How does this multidisciplinary foundation give you an advantage when navigating technical operations and data-driven planning environments?
The chemistry foundation instilled in me what I refer to as “precision paranoia,” and in managing $250 million of goods and services, that paranoia is an asset. In chemistry, if you mix up milliliters and liters in a procedure, the results can be disastrous. You learn quickly that precision is not optional and that a hunch is not validation.
That discipline translates directly to data quality management. When I see a unit of measure conversion in our system, my chemistry training asks: does this math check out against the laws of physics? Can you really convert 1 kilogram of water to 5 liters? Physics will tell you that you cannot.
So if our system is saying that conversion is valid, there is a fundamental flaw in our system. For this reason, I designed physics-based validation into my AI system as a necessary layer, not a nice-to-have. The machine learning portion of the system learns patterns from historical corrections, which is beneficial.
However, before executing any correction, the system performs validations against NIST (National Institute of Standards and Technology) standards. If the conversion is physically invalid, the system will flag it for human review, regardless of what the machine learning model predicted.
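As an illustration of that kind of physics check, and not the production validation layer itself, a mass-to-volume conversion can be tested against plausible density bounds. The materials and bounds below are illustrative assumptions.

```python
# Minimal sketch of a physics sanity check on mass -> volume conversions.
# Density bounds are illustrative; a production system would validate
# against reference standards rather than this hard-coded table.
PLAUSIBLE_DENSITY_KG_PER_L = {
    "water": (0.95, 1.05),
    "ethanol": (0.78, 0.80),
}

def conversion_is_plausible(material: str, mass_kg: float, volume_l: float) -> bool:
    """True if the implied density falls inside the material's plausible range."""
    low, high = PLAUSIBLE_DENSITY_KG_PER_L[material]
    implied_density = mass_kg / volume_l
    return low <= implied_density <= high

# "Can you really convert 1 kilogram of water to 5 liters?" Physics says no:
print(conversion_is_plausible("water", mass_kg=1.0, volume_l=5.0))  # False -> flag for review
print(conversion_is_plausible("water", mass_kg=1.0, volume_l=1.0))  # True  -> allow
```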
That multi-layer validation is what has made 100% accuracy in production possible for twelve months straight. The engineering portion of my education provides a different advantage: systems thinking and constraint optimization. Engineers are trained to identify bottlenecks in a system, design for failures, and optimize around the limiting factors.
When I’m mapping out supply chain processes, I use the same analytical framework I use for engineering systems design and ask: what are the inputs to the process? What transformations occur during the process? What are the outputs of the process? Where is the critical path in the process? What would happen if component X failed?
This is especially useful with configurable products. Each configuration represents a different bill of materials that routes through the same constrained resources. Engineering taught me to model these multi-constraint optimization problems formally rather than intuitively.
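As a toy example of that formal modeling, and not anything taken from the actual planning system, two configurations competing for the same constrained resources can be expressed as a small linear program. All margins, usage rates, and capacities below are made up.

```python
# Toy sketch: configurations competing for shared constrained resources,
# expressed as a linear program. All numbers are illustrative.
from scipy.optimize import linprog

# Two configurations of one product, with different contribution margins per unit.
margins = [120.0, 95.0]

# Hours consumed per unit on two shared resources.
# Rows: resources (assembly line, test bench); columns: configurations.
usage = [
    [2.0, 1.0],   # assembly-line hours per unit
    [1.5, 3.0],   # test-bench hours per unit
]
capacity = [400.0, 600.0]  # available hours per resource this cycle

# linprog minimizes, so negate margins to maximize total contribution.
result = linprog(
    c=[-m for m in margins],
    A_ub=usage,
    b_ub=capacity,
    bounds=[(0, None), (0, None)],
    method="highs",
)
print("units to build per configuration:", result.x.round(1))
print("total contribution:", round(-result.fun, 2))
```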
6. Many organizations struggle with balancing standardization and flexibility in their planning models. What approaches have you found effective for building structure while still allowing processes to adapt as demand and product complexity evolve?
The conflict between flexibility and consistency is at the heart of operational excellence. Most companies view standardization and flexibility as opposite ends of a single axis: the more you have of one, the less you can have of the other.
I disagree with this and instead argue that you can standardize the infrastructure and governance of your processes and still design for the variability of the execution of those same processes.
To illustrate my point, let me start with a failure. Early in my career, I built what I thought was a perfectly standardized planning process: documented procedures, clearly defined templates, and well-defined approval gateways.
Within about three months, half of the organization was bypassing it, because there were valid differences in how they needed to run their businesses. I had developed a standardized process, but what I had actually created was a fragile system that only looked controlled.
A much better way to achieve operational excellence is by applying architectural thinking similar to what engineers apply to buildings. For example, a solidly constructed building has a fully standardized load-bearing structural frame that cannot be altered or compromised in any way. However, everything else within the building, such as the interior layouts, finishes and uses, can vary depending upon the requirements of each space.
When you design with the layers of standardization explicitly outlined, individuals understand where they have the freedom to act independently versus where they do not.
Clarity about the boundaries of independent action reduces resistance and improves both compliance and innovation. Lastly, culture plays a role in successful standardization. Imposing standardization from the top down without taking the local context into account is typically the reason standardization fails.
Involving the individuals doing the work in the design of the standardization is typically the best method of gaining buy-in and reducing resistance.
When planners were involved in developing the automated quality system’s decision framework, they did not see it as restrictive; they saw it as the foundation that allowed them to spend less time on repetitive data-cleansing tasks and to focus on actual planning work.
The final test of a standardization program is whether it equips people to manage increased complexity or forces them to pretend that complexity does not exist.
7. Beyond your core role, you volunteer with the SPARK program and lead hands-on STEM activities in the Curiosity Cube. How has mentoring students influenced the way you think about leadership, innovation, and the future workforce?
Working with students has significantly altered how I view knowledge transfer and organizational learning, and it has made me a better leader in ways I did not anticipate. The reason is simple: when you are explaining supply chain optimization or data quality issues to 12-year-old students in the Curiosity Cube, you cannot hide behind jargon.
You cannot say, “We are establishing a multi-ERP harmonization framework with reinforcement learning decision architecture,” and expect to be understood. Instead, you have to describe it in terms that have some relevance.
That discipline of simplifying complex ideas into primary elements has allowed me to be much more successful in delivering presentations to executives regarding the necessity of investing in AI infrastructure.
When I ask for that investment, I use the same clarity I learned from the 12-year-olds who asked, “Why do you need three different computer systems instead of just one?” However, the larger influence has been on how I conceptualize innovation itself. Students do not know what they are “not supposed” to try.
Sometimes students suggest ways of solving problems that defy conventional wisdom, and occasionally those uninhibited suggestions reveal assumptions we have been making that are not as valid as we thought.
For example, a student once asked me, “If the computer can learn to correct errors, why can’t it learn to prevent them in the first place?” That question prompted me to add root cause analysis to my automation system, not simply error correction. Now, we find systemic data quality problems prior to their propagation (as opposed to cleaning up after they occur).
That insight came from someone who had no idea that “that wasn’t how it was done.” What I fear most about the upcoming workforce is not a lack of technical knowledge; students absorb AI concepts intuitively because they have grown up with algorithms. Rather, it is the ethical framework surrounding AI implementation that concerns me.
I wrestle with these questions daily: At what point should AI make decisions rather than humans? How do you validate that AI is not perpetuating biases? What happens when AI gets it wrong at massive scale? Who is accountable? Working with students continually forces me to confront those questions, and I hope the future workforce never relies on AI so completely that they lose their own decision-making skills.
8. Looking ahead, what trends in digital manufacturing, smart operations, or supply chain transformation excite you the most, and where do you see yourself contributing as these technologies continue to reshape the field?
I am both excited and concerned about the transition from AI that makes recommendations to AI that acts and decides on its own, without direct human oversight. Companies have reached a level of maturity with the technology; they have a clear business rationale for implementing AI-driven solutions; and there is growing trust within organizations to allow AI to execute decisions based on analysis, not just provide insights.
However, what keeps me up at night is that most organizations are currently taking the wrong approach.
Organizations are primarily focused on determining what AI can do and then building the solution, rather than starting with the critical questions: What types of decisions are we prepared to delegate to AI? Under what circumstances would we use AI decision-making? What safeguards or checks and balances would we employ? The UOM harmonization system that I manage now automates 94% of corrections.
The system identifies incorrect information, updates our ERP systems, and provides documentation related to corrective actions — all without human intervention. However, the 6% of errors or discrepancies that are routed to human review for correction are not failures of the AI system.
Instead, they are intentional design choices that preserve the nuances and complexities of the business environment that data patterns and algorithms alone cannot capture. AI can be a powerful tool in manufacturing and supply chain as long as the right rules and guidelines are built into the workflows.
