Rules Don’t Work: AI Predictions For Manufacturing And Supply Chain Part 2

Part 2 of the interview with Kumar Srivastava, Vice President of Engineering at Noodle, a company that builds AI solutions that bring radical efficiency to manufacturing and supply chains across industries.

Read Part 1 of the interview here.

Kumar, the Covid-19 crisis was difficult to predict. Can you tell us what the impact of the crisis was on the market and on the demand for AI solutions?

The year 2020 was an interesting year. We came out of the crisis well prepared because our products and services are designed for a world where things change and shift so rapidly that the rules in your systems are invalidated, because the assumptions they were based on are no longer accurate.

This year we saw the change on two different levels. On one side, some companies could not keep up with the demand for certain products and certain types of products; every time a lockdown was about to be announced, most locations ran out of them. On the other side, goods and services connected to the travel or entertainment industries saw a huge drop in demand. So we had customers across the spectrum: some struggling to keep up with demand, others producing excess inventory because demand had gone down while they were still relying on old business rules that said, “this is what you should do in July of any year.”

How do we approach that problem? We incorporate external data and tell you what the demand for a certain product should be, and that knowledge saves our customers a lot of money and resources.

If their demand was spiking, we would be able to tell them how to address it by shifting their shipments and their manufacturing distribution to the locations where they were needed most.

So, does it mean that you’ve got ideal conditions where your solutions can be applied and bring value across different industries?

Absolutely. Noodle’s capabilities sit on top of the systems where decisions are encoded. We provide our customers with the information they need to make better decisions and to track the impact in their workflow. Now, instead of saying, “I’m going to produce X amount of this product because that’s what the rule says,” we say, “The supply is going to change, and this is what you should do,” which is a more intelligent way of looking at the problem.

Photo by Mika Baumeister on Unsplash

What can companies do to maximize the value of AI solutions?

It’s been an impressive few years, as there was a spike in AI/ML interest. You could see that in the number of vendors in the market claiming to have AI/ML capabilities and in the legacy vendors rebranding themselves as now doing AI. That first wave of AI was very open-ended: vendors came in and said, “We have machine learning, we have AI, what problem do you want to solve?” Or, “We have a platform where your data scientists can train models all day long and solve your business problems.”

In fact, many of those vendors didn’t have machine learning or AI capabilities. They had the most basic models, or just the ability to leverage open-source software that was itself still developing and improving year over year at a very rapid pace. Some companies were trying to productize that. Others said, “Five years ago, we had statistical models in a black box; today we have machine learning-powered or AI solutions, but we can’t show you that because it’s proprietary.”

The problem is that even the best platform, on its own, is not enough. If you have one, you also need an army of trained data scientists in-house to use it. Do you see what the issue was a few years ago?

There were companies that bought machine learning platforms almost at random. However, if you do not have the skills to use a platform, it doesn’t produce any value. So instead of focusing your resources on solving your business problem, you are spending time and effort building a platform. And by the time you have finished that platform, the requirements you designed it against might have changed, so the investment is wasted.

Many false starts have eroded confidence in the market, because people suddenly realized they were not getting any value, or had accumulated a lot of tech debt. The issue is that the market needs problem-centric solutions powered by machine learning, delivered in a format designed to engage users in the field; and they need to be demonstrably better than legacy products.

When we talk to customers, we start by saying we have a product that solves a specific problem, a specific set of issues, and we will prove that it works better than their current system. That should be the sales pitch for any vendor building machine learning products. Sometimes customers will say, “Can you give us the ability to configure something ourselves?” The platform centricity comes later. Regardless of whether you’re a vendor or a customer looking for a solution, the way to cut through the noise and the jadedness is to start by saying, “I have this specific problem. I need something that solves it in a better way than my current solution.”

The hypothesis is that, because it’s powered by machine learning, a good model considers a lot more information and signals, so solving the same problem will produce a better answer. That’s where you start. Then you say, “I can run a test against multiple vendors with the same data set and see who predicts more accurately.”
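That vendor bake-off can be as simple as scoring each vendor’s forecast against the same held-out actuals with a common error metric. A minimal sketch of the idea, using MAPE as the metric; the vendor names and all numbers are purely illustrative, not from the interview:

```python
def mape(actual, predicted):
    """Mean absolute percentage error across a series; lower is better."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual) * 100

# Hypothetical held-out actuals (units sold per week) and two vendors' forecasts.
actual_demand = [120, 95, 140, 110]
forecasts = {
    "vendor_a": [118, 101, 133, 115],
    "vendor_b": [100, 90, 160, 90],
}

# Score every vendor on the same data set and pick the most accurate one.
scores = {name: mape(actual_demand, f) for name, f in forecasts.items()}
best = min(scores, key=scores.get)
print(best, scores)
```

Any common forecast metric (MAE, RMSE, weighted MAPE) works the same way; the key point from the interview is that every vendor is evaluated on an identical data set, so the comparison isolates model quality.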

Even if it’s a platform I’m going after, that platform should enable my in-house data science team to produce, train, and deploy models frequently. But for that, you have to have the business need to deliver these products quickly, and you have to have the skill sets. The industry is learning. Customers are learning how to approach vendors and figure out what they need based on whether they are trying to solve a business problem or an efficiency problem. Based on that, you pick the right product or platform strategy.

Photo by Alora Griffiths on Unsplash

What Big Data tools do you use or plan to invest in?

We are looking at problems that diverge widely from each other. On the supply chain side, we look at data from planning systems like SAP. On the manufacturing side, we look at IoT scenarios where sensors running on various machines and equipment produce data multiple times a second, which is a completely different data scale. That means no one solution covers the whole spectrum.

So we see the whole spectrum of data problems in this industry. On one end, you have operational data curated through user actions. On the other, you have data created by sensors reading and measuring at very high frequency and then sending that data over.

When we think about data, we have to think of it holistically. How do we capture it at the right rate, where do we store it, and what is the right place to capture it? How do we transport it across the various network boundaries and systems required to deliver it to its final location, where it can be processed? Then there is the processing itself: training machine learning models, generating predictions, and serving them. These things happen all the time, often in parallel. And then there are the technologies required to power our applications, because with machine learning applications the front end cannot simply say, “here are your predictions,” and stop there.

A successful application builds a workflow that gives the user the context to understand a prediction and make a decision. Once a prediction has been presented as a recommendation, the user gets supporting information or evidence. This is known as AI transparency: it lets the user see how a particular decision was made. The nature of machine learning applications is that you have a specific set of data, the predictions, being delivered into the user interface or an API, and then many more analytical queries around each prediction to help the user understand how the AI arrived at it and whether they should trust it.

The nature of a machine learning app is very analytical. You end up needing to support many different kinds of queries, some of them ad hoc, that the user will run to understand the prediction and build trust.

For that information layer, we have to have a different set of technologies and designs. We have experimented with and used everything from RDBMS systems to clusters; we have invested in technologies like sharding and messaging systems. We also use Hadoop and Apache Spark in some instances. And we look at turnkey managed services that might be equally well suited to the kinds of workloads we have but are cheaper for us to operate.

You can end up with multiple copies of the same data sets. It is essential to understand what you need to store and at what level of granularity, because that has massive implications on compute. If you store data at a granularity that does not match the business problem, you will need a lot more compute.

Many factors go into picking AI technology. We continually evaluate different technologies at different levels of the stack, from ingestion to training to serving. It all comes down to our decision matrix: what kinds of queries and processing you need, what user experience you need to enable, and how to put that together to minimize development, operational, and maintenance costs.

Since we are in a highly regulated industry, we also ensure that we follow all compliance requirements. Based on that, we can pick the right technology. Generally, it is a combination. Data that will be processed has to be held in memory to train a model or even produce a prediction. Whether we are predicting an event or forecasting actual data, we support both ends of the spectrum of data technologies.

Kumar, thanks for your work and the insights from the industry.

Stay tuned for the next great interviews coming your way!