

Analytics Magazine

Profit Center: Crawl, Walk or Run

March/April 2010


Whether you’re launching an analytics career or project, is it best to start big, small or somewhere in the middle?

By E. Andrew Boyd

A common question when starting down the path of analytics is whether to start big, small or somewhere in between. Like so many things in life, there’s no one simple answer. There are, however, some useful guidelines.

Starting Small

Common wisdom is to start small: find a good, specific project, make it a success and then build on that success. Management at a waste management firm, for example, might ask whether it could reduce the number of trucks and drivers it uses through better dispatching. The problem can be examined by an individual with a combination of operations research and computer skills, backed by a project sponsor with the clout to get access to the necessary data and people.
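To make the dispatching question concrete, here is a minimal sketch of the kind of calculation such a project might start with. The data and truck capacity are invented for illustration, and real dispatching models would add time windows, travel times and driver rules; this simply estimates a truck count by treating routes as loads to be packed, using the classic first-fit-decreasing heuristic.

```python
def min_trucks(route_loads, truck_capacity):
    """Greedy first-fit-decreasing estimate of the trucks needed
    to cover a set of route loads (a bin-packing heuristic)."""
    trucks = []  # remaining capacity of each truck already dispatched
    for load in sorted(route_loads, reverse=True):
        for i, remaining in enumerate(trucks):
            if load <= remaining:
                trucks[i] -= load  # fit the route on an existing truck
                break
        else:
            trucks.append(truck_capacity - load)  # dispatch a new truck
    return len(trucks)

# Hypothetical example: nine routes, trucks that each carry 10 tons
loads = [6, 5, 4, 4, 3, 3, 2, 2, 1]
print(min_trucks(loads, 10))  # prints 3
```

Since the loads sum to 30 tons, three 10-ton trucks is also a lower bound here, so the heuristic happens to be optimal for this instance; in general it only provides an upper bound on the minimum fleet size.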

Starting small can be an effective approach, but in practice it often indicates a lack of corporate interest. If the project sponsor is the CEO, the project’s on solid ground. However, if the sponsor is a mid-level manager with a new idea, the project has a good chance of going nowhere.

When starting small, it’s necessary to have a clear plan for expanding. Project participants need to spark excitement in the people they interact with, both before and during the project. It’s also necessary to get management commitment up front for moving forward if the project is successful. If the commitment is, “OK, give it a try, and we’ll look at the results and think about what to do next,” the project’s almost certainly not worth the time to begin with. If the commitment is, “OK, give it a try, and if it’s successful, I’ll help you get support to roll out the results in the Eastern region,” the project has a fighting chance. It’s important to get that level of commitment early on. If not, it’s better to look for another project or find another sponsor.

The best way to get a commitment is to define the project around a problem of clear interest to someone else. The manager of a waste disposal facility isn’t interested in reducing the number of trucks he has. More likely, he’s willing to fight to keep them. Finding out how to help the facility manager solve a problem he’s dealing with — say, having fewer angry customers through better dispatching practices — is the best way to get project commitment (assuming, of course, that the facility manager’s ideas align with corporate goals).

Starting Big

While starting small can be effective, sometimes it’s neither possible nor desirable to do so. If a big opportunity presents itself, and momentum within the organization is to move forward, it’s almost always best to grab the reins and go.

A good example outside the analytics domain is the introduction of human resource planning software, which is designed to manage human resources across an entire organization. Even if it might make sense to try the software on a small part of the organization before undertaking a corporate overhaul, software sales agents sell a corporate vision, and organizations buy that same vision. Such a large project will certainly be broken into smaller, more manageable rollouts, but the concept from the very beginning is, in a word, big. A good example in the analytics domain is airline revenue management software, where decisions on what seat inventory to sell on one flight affect similar inventory decisions on all other flights throughout the network.
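The network effect is what makes airline revenue management a big problem, but even the single-flight building block shows the analytics involved. The sketch below applies Littlewood's two-fare rule, a classic single-leg result (offered here as background, not as the method of any particular system); the fares and the normal demand parameters are invented for illustration.

```python
from statistics import NormalDist

def protection_level(high_fare, low_fare, mu, sigma):
    """Littlewood's rule: protect y* seats for high-fare demand, where
    P(high-fare demand > y*) = low_fare / high_fare.
    Demand is assumed Normal(mu, sigma)."""
    return NormalDist(mu, sigma).inv_cdf(1 - low_fare / high_fare)

# Hypothetical example: $400 full fare vs. $150 discount fare,
# with full-fare demand roughly Normal(mean=60, std=15)
y = protection_level(400, 150, 60, 15)
print(round(y))  # roughly 65 seats protected for the full fare
```

Discount bookings are accepted only while remaining seats exceed the protection level; the network version of the problem must make these trade-offs jointly across every flight leg an itinerary touches.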

In big projects, the focus shifts from convincing management that analytics is of value to making the project successful. The very nature of a big project is that the value decision has been made. Nonetheless, as an analytics project, part of the value will be how much money the project makes or saves the organization based on the decision support it provides. When human resource planning software is implemented, the value in terms of improved workflow has already been accounted for. The same is true of an airline revenue management system, but airlines will still want after-the-fact demonstrations of its value. How much money did the revenue management system make last month? Is it giving results that make sense? Measurement and evaluation are common to all analytics projects, a topic we'll address in a future column.

Starting in the Middle

There are, of course, intermediate projects. A good example is a grocery store chain testing a new pricing strategy at an individual store. Pilot projects of this nature require the involvement of multiple individuals. The organization has enough faith in the effort to give analytics a try, but the ultimate decision has yet to be reached. In many ways, pilot projects are the most difficult, since they carry the burden of both developing corporate commitment, like small projects, and making a project with many moving parts a success.

Pilot projects seem like a good idea, and they can be. An analytics project that demonstrates value in the "laboratory" is "shaken out" in the "real world." Problems arise because the real world brings analytics professionals with new ideas face to face with professionals who've been doing their jobs for many years. As a result, pilot projects are often more fractious than helpful. With no clear go/no-go decision made by management, there's endless room for bickering. In theory, bringing together many different opinions is a good idea, with the best decision rising to the top. In practice, the go/no-go decision is a complex mix of facts and politics.

Some of the problems can be mitigated if upper management conveys that a decision to move forward has been made and that the pilot project is an effort to iron out details. As a result, it’s worth the effort to seek this level of commitment before the project starts. If more work needs to be done to demonstrate that the underlying analytics has sufficient value, it’s easier to do so before beginning a pilot project. That doesn’t mean ignoring those individuals in the field, outside the laboratory.

Quite the contrary. Analytics can only be successful if it deals with all the real-world complexities of a problem. But it's far more efficient to air the facts, make a decision and move forward than to remain in a state of constant indecision. Too often, the decision to undertake a pilot project simply pushes problems into the future that are better dealt with now.


Crawling, walking and running all have their advantages and disadvantages when starting down the path of analytics. But even if one approach were clearly best, the approach that's ultimately taken is largely shaped by the circumstances. Common to all approaches, and really to all managerial decision-making, is the need to understand the environment, have clear goals and choose a carefully reasoned path, a path to success.

Andrew Boyd served as executive and chief scientist at an analytics firm for many years. He can be reached at


