Supply Chain Planning Blog

In-Memory Computing—Only the beginning

Posted by Cyrus Hadavi on Thu, Jul 21, 2016

In a few recent sessions with industry analysts, we were surprised to be asked whether our software uses in-memory computing! Given that for over 20 years we have designed our applications to hold all the data in memory for computation, our immediate response was: is there any other way of doing it? The answer was yes, there are others that bring in the data from the database only when they need it, but they are now changing and seeing orders-of-magnitude improvements in speed. That improvement must have caught the analysts' attention, which brings us to the core subject of this article: there is more to application speed than just having all the data in memory. The latter is the easy part.

There are also some vendors who try to improve speed by abstraction and over-simplification. I am sure you are aware of quite a few who deploy "spreadsheet" style capacity planning in their S&OP applications, that is, forming weekly or monthly buckets with fixed lead times! This approach typically either dumbs down how capacity is handled or ignores it altogether. It is the old method, with NO notion of product mix or real processing times, that has been around for decades, just dressed in a new user interface that makes it slightly more attractive. Any gain in speed is therefore offset by a very inaccurate and unrealistic plan. In addition, it carries no order-level information or order-level pegging functionality. You might as well use your spreadsheets, since they give you even more control!
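To make the bucket-planning objection concrete, here is a minimal sketch (all product names, processing times, and capacity figures below are invented for illustration): a spreadsheet-style model charges every unit the same nominal hours against a weekly bucket, while a mix-aware model uses each product's real processing time on the resource.

```python
# Toy illustration of why fixed-rate bucket planning misjudges capacity.
# All names and numbers are hypothetical.

WEEKLY_CAPACITY_HOURS = 100.0

# Spreadsheet-style model: every unit consumes the same nominal hours,
# regardless of which product it is.
NOMINAL_HOURS_PER_UNIT = 0.5

# Mix-aware model: each product has its own real processing time.
HOURS_PER_UNIT = {"A": 0.25, "B": 1.5}

def bucket_load_spreadsheet(demand):
    """Load as a bucketed spreadsheet computes it (product mix ignored)."""
    return sum(demand.values()) * NOMINAL_HOURS_PER_UNIT

def bucket_load_mix_aware(demand):
    """Load using the actual per-product processing times."""
    return sum(qty * HOURS_PER_UNIT[p] for p, qty in demand.items())

demand = {"A": 50, "B": 70}  # units demanded in one weekly bucket

print(bucket_load_spreadsheet(demand))  # 60.0  -> "feasible" vs 100 h
print(bucket_load_mix_aware(demand))    # 117.5 -> actually over capacity
```

The spreadsheet model reports 60 hours of load and declares the week feasible; once the product mix is priced in, the same demand needs 117.5 hours against 100 available, so the "fast" plan is simply wrong.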

 

To gain real improvement in speed with a proper representation of the capacity of resources and equipment, deep modeling capability is needed, and the mix of products must be taken into account. In addition to IMC, one needs data representations and algorithms that provide real-time answers for very complex supply chains at the order level. As an example, if one material is not available, does the system go back and search all over again for a new method of making the product, or does it backtrack just one step to find an immediate substitute pegged to that order? If a resource is a bottleneck, will it look for a whole new routing, or for an alternative process or piece of equipment? How this data is represented, and how the algorithms divide and conquer with parallel processing, is what makes the application fast. Using IMC is only the beginning; a lot more goes into a comprehensive planning system that can analyze tens of millions of data points, from material availability to resources, tools, and skill levels, to name a few, in near real time.
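The "backtrack one step" idea can be sketched as follows (the component names, substitute table, and inventory figures are invented for this example): instead of replanning the whole routing when a material is short, the resolver first tries the component pegged to the order, then its direct substitutes, and only escalates to a full re-search if none fit.

```python
# Hypothetical sketch of one-step material substitution.
# All component names and quantities are invented.

SUBSTITUTES = {"resin-X": ["resin-Y"], "chip-A": ["chip-B", "chip-C"]}
ON_HAND = {"resin-Y": 500, "chip-C": 120}  # everything else is at zero

def resolve_material(component, qty_needed):
    """Return a usable material for the order, trying the pegged component
    first and then its direct substitutes, without replanning the route."""
    for candidate in [component] + SUBSTITUTES.get(component, []):
        if ON_HAND.get(candidate, 0) >= qty_needed:
            return candidate
    return None  # only now would a full re-search of routings be triggered

print(resolve_material("chip-A", 100))   # chip-A and chip-B short -> "chip-C"
print(resolve_material("resin-X", 1000)) # no substitute covers it -> None
```

The point is the search order: a local, pegged substitution is a constant-time lookup, while rebuilding the routing from scratch is the expensive path that should be the last resort.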

Topics: Supply Chain Performance Management, Planning Data Integration, Supply Chain Data, Spreadsheets, Attributes, Sales & Operations Planning, SCP System, S&OP, Adexa

Worried About Integration Of Your Supply Chain Planning System?

Posted by Bill Green on Wed, Jul 28, 2010
Planning a supply chain is a complicated task and requires a lot of complex calculations. However, it does not require a lot of different kinds of data. This is an important point to remember. In this blog post I want to discuss what to worry about when it comes to getting data for your supply chain planning system, and what not to worry about.

Basically, planning calculations need to consider many different data elements at the same time, so the system needs a data architecture that is separate from the ERP system's. This means that all planning systems, whether offered by an ERP provider or a Best-of-Breed provider, need to be bolted on to the ERP system with integration middleware. I know that even the thought of integrating with an ERP system makes many IT folks nervous, but when it comes to supply chain planning systems, the integration should not be the biggest worry.

Let’s break it down: there are three data factors for IT folks to think about when a new planning system is being launched. In order from easiest to hardest, they are:

1) Integration to the ERP transactional system

2) Getting the planning data complete and clean

3) Making sure the planning system will scale and be reliable

Integration to the ERP Transactional System

This is the one that everybody spends the most time worrying about, but it really is not that hard, and it is the easiest to get past. First, only about 15-20 data tables are required for supply chain planning, and the ERP transactional system will contain only about 10 of the key data elements. Things like process routes, resource alternatives, resource capacity and availability, and demand planning hierarchies do not have strong representations in ERP transactional systems; these need to be defined in the Manufacturing Execution System (MES) or in the supply chain planning system itself. Second, data integration in and out of ERP systems has become standardized: the data exits have been set up and are well documented, and both Best-of-Breed and ERP vendors use the same data exits. Third, the major ERP systems, such as SAP® and Oracle®, have offered standard interfaces for supply chain planning for years. To sum it up: a small set of data needs to be interfaced using a method that has been refined for over 10 years. Not much to worry about here.
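The split described above can be pictured with a toy model (every table name and field below is invented): a small ERP extract supplies the transactional elements, the planning system supplies what ERP lacks, and the two are merged into one planning model.

```python
# Illustrative sketch only; table and field names are hypothetical.

erp_extract = {               # the kind of elements an ERP can supply
    "items":       [{"id": "P100", "lead_time_days": 5}],
    "bom":         [{"parent": "P100", "child": "C200", "qty": 2}],
    "open_orders": [{"order": "SO-1", "item": "P100", "qty": 40}],
}

planning_defined = {          # elements ERP lacks; maintained in planning/MES
    "routings":      [{"item": "P100", "steps": ["cut", "weld", "paint"]}],
    "alt_resources": [{"step": "weld", "resources": ["robot-1", "robot-2"]}],
}

def build_planning_model(erp, local):
    """Combine both sources; locally maintained planning data wins on overlap."""
    model = dict(erp)
    model.update(local)
    return model

model = build_planning_model(erp_extract, planning_defined)
print(sorted(model))  # a handful of tables, not the whole ERP schema
```

The takeaway matches the post: the interface to ERP is a small, well-bounded set of tables, while the planning-specific data lives and is maintained on the planning side.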

Getting the Planning Data Complete and Clean

This is one of the factors to worry about. You must pay attention to the Master Data Management (MDM) part of the application that comes with your supply chain planning system. The data for planning needs to be verified carefully to get the best results, so strong master data management is critical. This means the MDM part of the supply chain planning system needs to be able to identify errors in the data and its structure, and also make sure the data is correct in context with other data. These contextual checks are what make MDM for supply chain planning systems so important. Basically, the MDM for planning needs to take the data from other systems and clean it up with input from the user. It also needs to make it easy for the user to add data that does not exist anywhere else. Note that attribute-based master data management makes this easier by enabling data definition based on product characteristics.
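The difference between a structural check and a contextual check can be shown in a few lines (record layouts and values invented for the sketch): the first looks at each record on its own, the second validates a record against other tables.

```python
# Hypothetical MDM-style validation; all records are invented.

items = [{"id": "P100", "lead_time_days": 5},
         {"id": "P200", "lead_time_days": -3}]   # structural error
routings = [{"item": "P100", "step": "weld"},
            {"item": "P999", "step": "cut"}]     # contextual error

def validate(items, routings):
    errors = []
    known = set()
    for it in items:            # structural: each field sane on its own
        if it["lead_time_days"] < 0:
            errors.append(f"item {it['id']}: negative lead time")
        known.add(it["id"])
    for r in routings:          # contextual: routing must reference a real item
        if r["item"] not in known:
            errors.append(f"routing step {r['step']}: unknown item {r['item']}")
    return errors

for e in validate(items, routings):
    print(e)
```

A negative lead time is caught record by record, but the orphaned routing only surfaces when the routing table is checked against the item master; that cross-table view is what plain per-file cleansing misses.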

Making Sure the Data Structure Can Scale

This is another thing to worry about. Even if the data is clean and complete, the system will not work properly if it is not structured to be efficient and reliable. If the system is slow, the users will not be able to complete the needed steps in the process and will give up. Make sure to check how the data is structured for scalability. A point to remember: the memory-resident cache is critically important for planning systems. Understand how it is architected for all the planning modules you need. Is there a single point of failure? Can you add user-defined attributes to the memory-resident cache? Can you change the embedded logic in the memory-resident cache? If the answers do not show that the system is scalable, configurable, and reliable, then the system will fail, because the IT staff will not be able to change any of this on their own.
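To illustrate the user-defined attribute question in the simplest possible terms (the record structure below is invented, not any vendor's actual cache design): an extensible in-memory record accepts new attributes at runtime without a schema change, which is one property the questions above are probing for.

```python
# Sketch of an attribute-extensible in-memory planning record.
# The structure is hypothetical, for illustration only.

class PlanningRecord:
    def __init__(self, record_id, **core_fields):
        self.record_id = record_id
        self.core = dict(core_fields)   # fixed, schema-defined fields
        self.attributes = {}            # user-defined attributes live here

    def set_attr(self, name, value):
        """Attach a user-defined attribute without touching the schema."""
        self.attributes[name] = value

rec = PlanningRecord("P100", lead_time_days=5)
rec.set_attr("voltage_class", "5V")         # added at runtime
rec.set_attr("customer_grade", "automotive")
print(rec.attributes)
```

If the cache instead requires a rebuild or a vendor change request for every new attribute, the IT staff has no way to adapt the system, which is exactly the failure mode the post warns about.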

In summary, data integration between the ERP and a new planning system is the easiest part of the implementation, yet most IT folks focus most of their worries on it, perhaps because it is under the IT staff's control to fix. What you should really be concerned about is the part that is not under their control: the underlying power of the planning system's data management capabilities, which has a big impact on planning runs and takes far more effort to fix. An even bigger concern should be the questions related to the scalability and configurability of the data cache within it. If that is not right, you are dead in the water.

About the Author: Bill Green is the Vice President of Solutions at Adexa. For more information about him, please visit http://adexa.com/company/green.asp

 

For more information about different types of Supply Chain Planning systems visit: Demand Planning, Inventory Planning, or Sales and Operations Planning.

Topics: Planning Analytics, Attribute Based Planning, Supply Chain Data, Integration, Adexa

What is The Real Cost of SAP®-APO?

Posted by Bill Green on Thu, Oct 08, 2009
The question of whether a company should use a Best-of-Breed supply chain planning system or SAP®-APO has been debated for a long time. Business users have typically wanted more functionality than SAP offers, but SAP has always sold to IT executives on the strength of the lower cost of an integrated system. If the discussion were in any area other than supply chain planning, the IT departments' position would be tough to challenge, but APO is not part of the core SAP system. It is a separate system that, just like any other Best-of-Breed planning system, is bolted onto R/3. Furthermore, the technology for memory-resident planning is very different from that for transactional computing, so it is much harder to leverage core SAP R/3 technology for any cost advantage.

SAP has gotten a pass on the real cost of implementing APO for too long. It is time to examine APO's real cost of ownership. It would help a lot if the IT and supply chain professionals who read this contributed to an open dialogue by answering the following questions: What is the cost of ownership of SAP's APO? How easy was it to integrate all the data required to run APO? Was the big "integration" story worth the results?

For the sake of this analysis, the purpose is not to compare functionality, but only to examine the cost of getting the system running in a way that supports the desired business process. Are there really IT and cost advantages to implementing SAP's APO? The areas worth looking at include: the cost of the software, the cost of the hardware required to run the system, the cost of implementing similar functionality, the cost of implementing required supporting systems (such as using Business Warehouse to support the need for analytic views), the cost and complexity of integration to and from all sources of data, the cost of supporting the system once it is up and running, and so on.

I am hoping we can keep an informal discussion and feedback going on this topic, so please feel free to share your experiences or answers with us, or email me at wgreen@adexa.com.

 

About the Author: Bill Green is the Vice President of Solutions at Adexa. For more information about him, please visit link

 

Topics: Supply Chain Planning, Supply Chain Data, Integration, Cost of SCP, SAP APO

The Key To Handling Your Supply Chain Planning Data

Posted by Kameron Hadavi on Tue, Sep 23, 2008

New enhancements are always being added to Adexa's Supply Chain Planning suite, and some of them are worth a special look so you can keep up on the latest key changes. In this post we are highlighting what has been going on with Master Data Management.

This is a big area of new development at Adexa. One of the toughest challenges a company faces is managing the data required for supply chain planning. The data may come from many different sources and lack the completeness required for a quality supply chain planning process. It may need to be supplemented, since some of it does not exist at any other source in the company. When various sources of data are brought together, there may be inconsistencies or missing data elements. Often people are required to manage the data, yet no system exists that enables a person to view and manage exceptions in it. All of these considerations are addressed by Adexa's Master Data Management module.

Visit us at http://www.adexa.com/ for more information, or click on contact us if you have any questions.

Topics: Supply Chain, Supply Chain Planning, Supply Chain Data, Adexa