

Analytics Magazine

Profit Center: Forecasting and optimization

July/August 2013

Beyond supply chain planning: the development of optimization in complex sourcing.

By Arne Andersson

Optimization is the perfect technology for sourcing since it deals with selecting the best element that meets specified criteria from some set of available alternatives, i.e., “finding the best deal.” Its use is becoming more widespread in industry as data handling, processing power and solvers have improved to the extent that there is no other way to handle the levels of data and complexity in sourcing projects that are run today.

What constitutes the “best deal” is a discussion in its own right, but it is ultimately determined by the person or organization doing the buying. If it is a commodity with a clearly defined specification, it could well be a question of the lowest price. If, however, it is a service that is being sourced, “softer” criteria will more likely need to be met, so price is only one factor that will be considered. As more criteria are introduced, the complexity increases, and it is this ability to handle complexity that has dramatically changed the way large organizations approach sourcing.

Fifty years ago the telephone and a notepad were the tools available for sourcing, so the levels of complexity were relatively low. Spreadsheets and e-mail dramatically increased the levels of complexity that could be handled, but even these techniques pale into insignificance with the levels of complexity that are handled by online sourcing platforms today. A typical “buying event” today will have thousands of items, and tens of thousands of offers from hundreds of suppliers. Even the simplest event will have potentially millions of combinations of goods and suppliers, so optimization is the only way to analyze this level of data.
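As a back-of-the-envelope illustration of that combinatorial growth (the figures below are invented, not from any real event), even a tiny buying event where each item can be awarded independently to any bidding supplier already produces millions of possible award combinations:

```python
# Hypothetical figures for illustration only: even a very small buying
# event has an enormous number of possible award combinations.
items = 10          # items in a very small buying event
suppliers = 5       # suppliers bidding on every item

# Each item is awarded to exactly one of the bidding suppliers.
combinations = suppliers ** items
print(combinations)  # 9765625 -- nearly ten million, for just 10 items
```

A real event with thousands of items and hundreds of suppliers is astronomically larger, which is why exhaustive enumeration is out of the question and an optimization solver is required.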

Handling Complexity

The ability to handle large amounts of data has also seen sourcing change from a one-way process where suppliers are asked to make offers for individual items based on the sourcing companies’ criteria to a process of collecting an array of information from suppliers and analyzing the information collected in order to find the best solution. This combinatorial approach allows suppliers to express their strengths by creating their own groups of items so they can make their most competitive offers. Trade Extensions, for example, carried out the first online combinatorial auction in February 2001 when it worked with Volvo on a packaging tender, which involved 600 items, 15 suppliers and had a value of approximately $15 million.
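The core of a combinatorial auction can be sketched in a few lines. The toy data below is invented: each bid names a supplier, a package of items, and a price, and the buyer wants the cheapest collection of bids that covers every item exactly once. Real platforms solve this with integer programming rather than the brute-force search shown here, which is only viable for tiny instances.

```python
from itertools import combinations

# Toy combinatorial auction (invented data): each bid is
# (supplier, package-of-items, price). Supplier A offers a package
# discount on the bundle, which single-item bids cannot match.
bids = [
    ("A", frozenset({"box", "lid"}), 7.0),   # package bid
    ("A", frozenset({"box"}), 5.0),
    ("B", frozenset({"lid"}), 4.0),
    ("B", frozenset({"box", "lid"}), 8.5),
]
items = {"box", "lid"}

def best_award(bids, items):
    """Brute-force search for the cheapest set of bids covering
    every item exactly once (an exact-cover requirement)."""
    best = (float("inf"), None)
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            covered = [i for _, package, _ in combo for i in package]
            if sorted(covered) == sorted(items):   # each item exactly once
                cost = sum(price for _, _, price in combo)
                if cost < best[0]:
                    best = (cost, combo)
    return best

cost, award = best_award(bids, items)
print(cost)  # 7.0 -- the package bid beats the single-item bids (5 + 4 = 9)
```

The point the combinatorial approach exploits is visible even here: letting supplier A group items into its own package produces a cheaper overall award than the best item-by-item allocation.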

To show how far the levels of complexity have increased, a U.S. bank recently used the platform to allocate a spend of $1 billion sourcing the materials to produce and deliver two billion items of direct mail after collecting 400,000 bids from more than 100 suppliers for 65,000 items. This level of complexity is commonplace nowadays, and many recent projects take the complexity to another level by integrating sourcing and planning.

Moving Beyond Sourcing

Companies that have become familiar with using the technology for bid collection and analysis now realize that the software can be configured to solve any constraint-driven challenge. For example, one of our customers is using the platform to define the manufacturing process of its products stage by stage. This customer has numerous manufacturing sites and even more assets at its disposal. In this case, assets are pieces of manufacturing equipment owned and operated by external suppliers. Each asset has been qualified by the company to perform a certain operation for each product. The challenge is to optimize the manufacturing process so that each product goes through the correct sequence of processes required to produce the finished product, using only qualified assets, while taking into account the various costs – raw material, production, transport, warehousing, inventory, etc. It’s a simple concept and a classic optimization challenge, and it is made more complex by introducing further constraints. For example, it is possible to increase the number of operations individual assets can perform on different products, but this qualification process costs time and money, and there is a qualification budget that cannot be exceeded.
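A stripped-down version of that qualification trade-off can be sketched as follows. All numbers and names below are invented for illustration: each product must be run on one asset, running a product on an asset that is not yet qualified for it incurs a qualification cost, and total qualification spend may not exceed a budget. Real instances would be handled by a solver, not by the exhaustive search used here.

```python
from itertools import product as cartesian

# Toy qualification-budget problem (all data invented): assign each
# product to one asset at minimum total cost.
run_cost = {            # production cost of running product p on asset a
    ("p1", "a1"): 10, ("p1", "a2"): 3,
    ("p2", "a1"): 7,  ("p2", "a2"): 9,
}
qualified = {("p1", "a1"), ("p2", "a1")}   # already-qualified pairs
qual_cost = 5                               # cost to qualify one new pair
budget = 5                                  # total qualification budget

products, assets = ["p1", "p2"], ["a1", "a2"]

def best_plan():
    """Enumerate every product-to-asset assignment, discard those that
    exceed the qualification budget, and keep the cheapest."""
    best = (float("inf"), None)
    for assignment in cartesian(assets, repeat=len(products)):
        pairs = list(zip(products, assignment))
        spend = sum(qual_cost for pr in pairs if pr not in qualified)
        if spend > budget:
            continue  # qualification budget exceeded -> infeasible
        total = sum(run_cost[pr] for pr in pairs) + spend
        if total < best[0]:
            best = (total, pairs)
    return best

total, plan = best_plan()
print(total, plan)
```

Even in this tiny example the budget does real work: qualifying asset a2 for product p1 pays for itself through a lower production cost, while qualifying both products on a2 would blow the budget and is ruled out.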

Figure 1: Asset optimization based on monthly forecasts for each product and market, taking into account operational constraints.

To identify the most appropriate assets to use, the manufacturer optimizes its production based on monthly demand forecasts for each product and per market. It is an incredibly complex system in terms of data, but optimization transforms the data into tangible information that the business uses to determine its day-to-day operations. And because the data is continually updated, it essentially creates a dynamic model of the supply chain on which further analysis can be carried out. For example: What happens if there is a natural disaster that completely closes site Y? What happens if there is a 15 percent wage increase in China? What happens if Supplier X goes bust? If the data is handled in the correct way, there are no limits on the “what if?” questions companies can ask, so they can see the impact of any proposed changes before implementation.
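Mechanically, a “what if?” question is simply the same optimization re-run on modified data. The toy model below (invented numbers, and a deliberately simple single-product model where a cheapest-first greedy allocation happens to be optimal) meets a fixed demand from a set of sites and then asks what happens if one site closes:

```python
# Toy "what if?" scenario analysis (invented data): meet demand at
# minimum cost from a set of sites, then re-solve with one site closed.
def cheapest_sourcing(sites, demand):
    """sites: {name: (unit_cost, capacity)}. For this single-product,
    linear-cost model, filling demand cheapest-site-first is optimal."""
    total, remaining = 0.0, demand
    for name, (cost, cap) in sorted(sites.items(), key=lambda kv: kv[1][0]):
        take = min(cap, remaining)
        total += take * cost
        remaining -= take
    if remaining > 0:
        raise ValueError("demand cannot be met")
    return total

sites = {"X": (2.0, 60), "Y": (3.0, 50), "Z": (5.0, 100)}
baseline = cheapest_sourcing(sites, demand=100)             # uses X, then Y
scenario = cheapest_sourcing(
    {k: v for k, v in sites.items() if k != "Y"}, 100)      # site Y closed

print(baseline, scenario)  # 240.0 320.0 -- closing Y costs an extra 80
```

The same pattern scales up: keep the model, perturb the inputs (close a site, raise a wage rate, remove a supplier), re-solve, and compare objective values.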

Flexibility Creates Complexity

The flexibility that is provided to organizations in terms of analysis creates its own problems, and a large proportion of the research that we are carrying out at the moment deals with improving our optimization software, both in terms of capability and user friendliness. There are many factors to consider.

First of all, one must always show respect for complexity. The mathematical problems that need to be tackled are known in the scientific world as NP-complete. Simply put, no guarantee can be given for how long they will take to solve. Therefore, we have carefully developed our skills and experience in how to properly handle hard optimization problems in practice. In our experience, with a proper re-formulation or relaxation of the hardest problems, there is basically always a working solution available, with or without tweaking.

As an example, consider the following business rule: “For each product, we want two suppliers, but no individual supplier should be awarded less than 20 percent.” This seems like a natural rule. Let us re-formulate it slightly: “For each product, no supplier is awarded more than 80 percent, and the total number of suppliers is at most two.” Are the two rules identical? No; there are some subtle differences. For example, if there is only one available supplier for a product, the first rule would create an infeasible problem, while the second could still be handled. But, more importantly, the difference in execution time on a solver may be very large when these rules are combined with other rules. By helping clients re-formulate rules in this way, we can provide significant assistance in tackling the most challenging problem instances.
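The subtlety is easy to demonstrate by encoding both rules literally and testing them against candidate allocations. The reading below is one possible interpretation (shares are fractions of a product’s volume, and “we want two suppliers” is read as “at least two”); the point is only that two rules that sound interchangeable accept different allocations:

```python
# One literal encoding of the two business rules quoted above
# (an illustrative reading, not the platform's actual rule engine).
def rule_original(shares):
    # "We want two suppliers, but no individual supplier should be
    # awarded less than 20 percent."
    active = [s for s in shares.values() if s > 0]
    return len(active) >= 2 and min(active) >= 0.20

def rule_reformulated(shares):
    # "No supplier is awarded more than 80 percent, and the total
    # number of suppliers is at most two."
    active = [s for s in shares.values() if s > 0]
    return max(active) <= 0.80 and len(active) <= 2

even_split = {"A": 0.5, "B": 0.5}
three_way  = {"A": 0.4, "B": 0.3, "C": 0.3}

print(rule_original(even_split), rule_reformulated(even_split))  # True True
print(rule_original(three_way), rule_reformulated(three_way))    # True False
```

For any two-supplier split the rules coincide, but the three-way split satisfies the original wording while violating the re-formulation, and similar edge cases are exactly where feasibility and solver performance diverge.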

Figure 2: Optimization used in conjunction with large-scale data and effective reporting is transforming sourcing and moving into areas beyond supply chain planning and asset optimization.

Another example where much care is needed relates to numeric precision. It is not uncommon for very large numbers to be mixed with very small numbers in the same sourcing/optimization project (e.g., when a retailer sources products where volumes differ by several orders of magnitude between product categories). However, the small numbers are just as significant as the large numbers, and they cannot be ignored.
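A small demonstration of how small terms can vanish next to large ones in standard double-precision arithmetic (the numbers are contrived to make the effect obvious):

```python
import math

# When billion-scale values sit in the same sum as fractions of a unit,
# naive floating-point accumulation can silently drop the small terms.
values = [1e16] + [0.01] * 100 + [-1e16]   # large numbers bracket small ones

naive = sum(values)          # each 0.01 is below the precision of 1e16
exact = math.fsum(values)    # compensated summation keeps the small terms

print(naive, exact)  # 0.0 versus (approximately) 1.0
```

This is why careful scaling of units, and numerically robust accumulation where it matters, is part of building a trustworthy optimization model.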

We also have to remember that we are working with people in sourcing departments and not computer scientists from academic institutions, and often users will create impossible or illogical queries to solve. Therefore, helping users to identify conflicting rules and constraints is of great importance. Not only may we face conflicting rules, but sometimes it may be very hard to understand why a particular solution is the optimal one. For example, we may ask ourselves why one particular supplier is not included in the optimal solution, and access to good automatic explanations is of vital importance. Such an explanation could be, “Supplier X not allocated because of Rule Y,” or “Not allocated because price is too high,” etc.

Alongside the challenge of formulating the correct query is the practical problem of computing power. Because the queries are NP-complete problems and it is impossible to predict how long they will take to solve, they can tie up a significant amount of computer resources. The Trade Extensions platform solves this by dynamically allocating computer resources over the cloud. While it’s commonplace for “the cloud” to be used for data storage, using it for data processing is still quite rare, yet it allows an unlimited number of complex queries to be solved simultaneously.


Optimization is transforming sourcing and its influence on other areas of business is only going to increase. Data handling, equation definition, solvers and reporting are improving all the time so the number of people and organizations able to access these incredibly sophisticated tools will grow and optimization applications will only be limited by individuals’ creativity.

Arne Andersson co-founded Trade Extensions in June 2000. He is one of the world’s leading experts in designing algorithms for optimization, and he has published more than 50 articles in major scientific journals. In 2012 he became a member of the Royal Swedish Academy of Engineering Sciences. Previously, Andersson was a professor of computer science at Uppsala University, the oldest university in Sweden (founded in 1477) and one of the highest ranked universities in Europe.
