

Analytics Magazine

Profit Center: Crawl, Walk or Run

March/April 2010


Whether you’re launching an analytics career or project, is it best to start big, small or somewhere in the middle?

By E. Andrew Boyd

A common question when starting down the path of analytics is whether to start big, small or somewhere in between. Like so many things in life, there’s no one simple answer. There are, however, some useful guidelines.

Starting Small

Common wisdom is to start small: find a good, specific project, make it a success and then build on that success. Managers at a waste management firm might ask whether better dispatching could reduce the number of trucks and drivers the firm uses. The problem can be examined by an individual with a combination of operations research and computer skills, supported by a project sponsor with the clout to get access to the necessary data and people.
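To make the dispatching example concrete, here is a minimal sketch of the kind of calculation such a project might start from. The stop data, shift length and the first-fit-decreasing heuristic are all illustrative assumptions, not the firm's actual method; real dispatching would also account for routing, time windows and truck capacities.

```python
# Toy dispatching sketch: pack pickup stops (hours of work) onto trucks,
# each limited to a fixed shift length, using first-fit decreasing.

def trucks_needed(stop_hours, shift_hours=8.0):
    """Return per-truck stop assignments covering every stop."""
    trucks = []  # each entry: [remaining_hours, [assigned stop hours]]
    for hours in sorted(stop_hours, reverse=True):
        for truck in trucks:
            if truck[0] >= hours:          # fits on an existing truck
                truck[0] -= hours
                truck[1].append(hours)
                break
        else:                              # no fit: open a new truck
            trucks.append([shift_hours - hours, [hours]])
    return [t[1] for t in trucks]

stops = [3.5, 2.0, 4.0, 1.5, 2.5, 3.0, 1.0]   # hypothetical workload
routes = trucks_needed(stops)
print(len(routes), "trucks for", sum(stops), "hours of work")
```

Even a back-of-the-envelope heuristic like this can show a sponsor how much slack better dispatching might recover, before anyone invests in a full routing system.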

Starting small can be an effective approach, but in practice it often indicates a lack of corporate interest. If the project sponsor is the CEO, the project’s on solid ground. However, if the sponsor is a mid-level manager with a new idea, the project has a good chance of going nowhere.

When starting small, it’s necessary to have a clear plan for expanding. Project participants need to spark excitement in the people they interact with, both before and during the project. It’s also necessary to get management commitment up front for moving forward if the project is successful. If the commitment is, “OK, give it a try, and we’ll look at the results and think about what to do next,” the project’s almost certainly not worth the time to begin with. If the commitment is, “OK, give it a try, and if it’s successful, I’ll help you get support to roll out the results in the Eastern region,” the project has a fighting chance. It’s important to get that level of commitment early on. If not, it’s better to look for another project or find another sponsor.

The best way to get a commitment is to define the project around a problem of clear interest to someone else. The manager of a waste disposal facility isn’t interested in reducing the number of trucks he has. More likely, he’s willing to fight to keep them. Finding out how to help the facility manager solve a problem he’s dealing with — say, having fewer angry customers through better dispatching practices — is the best way to get project commitment (assuming, of course, that the facility manager’s ideas align with corporate goals).

Starting Big

While starting small can be effective, sometimes it’s neither possible nor desirable to do so. If a big opportunity presents itself, and momentum within the organization is to move forward, it’s almost always best to grab the reins and go.

A good example outside the analytics domain is the introduction of human resource planning software into an organization. The software is designed to manage human resources throughout an entire organization. Even if it might make sense to try the software on a small part of the organization before undertaking a corporate overhaul, software sales agents sell a corporate vision, and organizations purchase that same vision. Such a large project will certainly be broken into smaller, more manageable rollouts. But the concept from the very beginning is, in a word, big. A good example in the analytics domain is airline revenue management software, where decisions on what seat inventory to sell on one flight impact similar inventory decisions on all other flights throughout the network.

In big projects, the focus shifts from convincing management that analytics is of value to making the project successful. The very nature of a big project is that the value decision has been made. Nonetheless, as an analytics project, part of the value will be how much money the project makes or saves the organization based on the decision support it provides. When human resource planning software is implemented, the value in terms of improved workflow has already been accounted for. The same is true of an airline revenue management system, but airlines will nonetheless want after-the-fact demonstrations of the value. How much money did the revenue management system make last month? Is it giving results that make sense? Measurement and evaluation are common to all analytics projects, and are a topic we’ll address in a future column.
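An after-the-fact value check of the kind airlines ask for can be as simple as comparing realized revenue under the revenue management policy with a naive baseline. The sketch below uses invented fares, demand and protection levels purely for illustration; a real evaluation would use recorded bookings across many flights.

```python
# Hypothetical value check for seat-protection on a single flight:
# compare revenue under a protection policy with a first-come,
# first-served baseline. All figures are made up.

CAPACITY = 100
HIGH_FARE, LOW_FARE = 400, 150

def revenue(protect, low_demand, high_demand):
    """Low-fare demand books first, capped at capacity minus protected
    seats; high-fare demand arrives later and takes what remains."""
    low_sold = min(low_demand, CAPACITY - protect)
    high_sold = min(high_demand, CAPACITY - low_sold)
    return low_sold * LOW_FARE + high_sold * HIGH_FARE

low_d, high_d = 120, 35                # hypothetical realized demand
baseline = revenue(0, low_d, high_d)   # no seats protected
managed = revenue(30, low_d, high_d)   # 30 seats held for late high-fare
print("uplift per flight:", managed - baseline)
```

Turning seats away from early low-fare demand costs some revenue up front, and the check shows whether the late high-fare bookings more than made up for it.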

Starting in the Middle

There are, of course, intermediate projects. A good example is a grocery store chain testing a new pricing strategy at an individual store. Pilot projects of this nature require the involvement of multiple individuals. The organization has enough faith in the effort to give analytics a try, but the ultimate decision has yet to be reached. In many ways, pilot projects are the most difficult, since they carry the burden of both developing corporate commitment, like small projects, and making a project with many moving parts a success.
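One common way to evaluate a pilot of this sort is a difference-in-differences comparison: measure the pilot store's change in sales against the change at comparable control stores over the same period. The numbers below are invented for illustration; a real study would use many stores and weeks, adjust for seasonality, and test for statistical significance.

```python
# Hypothetical pilot evaluation: difference-in-differences on weekly
# sales (in thousands of dollars). All figures are invented.

def mean(xs):
    return sum(xs) / len(xs)

# weekly sales before/after the new pricing strategy went live
pilot_before, pilot_after = [52.0, 50.0, 51.0], [58.0, 57.0, 59.0]
control_before, control_after = [48.0, 49.0, 47.0], [50.0, 51.0, 49.0]

pilot_change = mean(pilot_after) - mean(pilot_before)        # trend + pricing
control_change = mean(control_after) - mean(control_before)  # trend only
effect = pilot_change - control_change                       # pricing estimate
print("estimated weekly lift:", effect)
```

Subtracting the control stores' change strips out market-wide trends, so the remaining difference is a cleaner estimate of what the pricing change itself contributed.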

Pilot projects seem like a good idea, and they can be. An analytics project that demonstrates value in the "laboratory" is "shaken out" in the real world. Problems arise when the real world brings analytics professionals with new ideas face to face with professionals who've been doing their jobs for many years. As a result, pilot projects are often more fractious than helpful. With no clear go/no-go decision made by management, there's endless room for bickering. Theoretically, bringing together many different opinions is a good idea, with the best decision rising to the top. In practice, the go/no-go decision is a complex mix of facts and politics.

Some of the problems can be mitigated if upper management conveys that a decision to move forward has been made and that the pilot project is an effort to iron out details. As a result, it’s worth the effort to seek this level of commitment before the project starts. If more work needs to be done to demonstrate that the underlying analytics has sufficient value, it’s easier to do so before beginning a pilot project. That doesn’t mean ignoring those individuals in the field, outside the laboratory.

Quite the contrary. Analytics can only be successful if it deals with all the real world complexities of a problem. But it’s far more efficient to air the facts, make a decision and move forward than to be in a state of constant indecision. Too often, the decision to undertake a pilot project simply pushes problems into the future that are better dealt with now.


Crawling, walking and running all have their advantages and disadvantages when starting down the path of analytics. But even if one approach were clearly best, the approach that's ultimately taken is largely shaped by the circumstances. Common to all approaches — and really to all managerial decision-making — is the need to understand the environment, have clear goals and choose a carefully reasoned path — a path to success.

Andrew Boyd served as executive and chief scientist at an analytics firm for many years. He can be reached at

