

Analytics Magazine

Profit Center: Crawl, Walk or Run

March/April 2010


Whether you’re launching an analytics career or project, is it best to start big, small or somewhere in the middle?

By E. Andrew Boyd

A common question when starting down the path of analytics is whether to start big, small or somewhere in between. Like so many things in life, there’s no one simple answer. There are, however, some useful guidelines.

Starting Small

Common wisdom is to start small: find a good, specific project, make it a success and then build on that success. Management at a waste management firm might ask, for example, whether better dispatching could reduce the number of trucks and drivers the firm uses. The problem can be examined by an individual with a combination of operations research and computer skills, backed by a project sponsor with the clout to get access to the necessary data and people.
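To make the dispatching example concrete, here is a minimal sketch of the kind of back-of-the-envelope analysis such a project might begin with. It packs a day's pickup routes into the fewest driver shifts using a first-fit decreasing heuristic; the route durations and the 8-hour shift length are made-up illustrative assumptions, not data from the article.

```python
# Hypothetical sketch: estimate how many trucks a day's pickup routes need
# by packing route durations into 8-hour driver shifts (first-fit decreasing).
# Route durations and shift length are made-up illustrative assumptions.

def trucks_needed(route_hours, shift_hours=8.0):
    """Pack routes into the fewest shifts using first-fit decreasing."""
    shifts = []  # remaining hours in each dispatched truck's shift
    for hours in sorted(route_hours, reverse=True):
        for i, remaining in enumerate(shifts):
            if hours <= remaining:
                shifts[i] -= hours  # assign route to an existing truck
                break
        else:
            shifts.append(shift_hours - hours)  # dispatch another truck
    return len(shifts)

routes = [5.5, 4.0, 3.5, 3.0, 2.5, 2.5, 2.0, 1.5]
print(trucks_needed(routes))  # → 4, versus 8 with one truck per route
```

A real dispatching study would of course model geography, time windows and truck capacities, but even a toy calculation like this helps frame what "reduce the number of trucks" might mean in practice.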

Starting small can be an effective approach, but in practice it often indicates a lack of corporate interest. If the project sponsor is the CEO, the project’s on solid ground. However, if the sponsor is a mid-level manager with a new idea, the project has a good chance of going nowhere.

When starting small, it’s necessary to have a clear plan for expanding. Project participants need to spark excitement in the people they interact with, both before and during the project. It’s also necessary to get management commitment up front for moving forward if the project is successful. If the commitment is, “OK, give it a try, and we’ll look at the results and think about what to do next,” the project’s almost certainly not worth the time to begin with. If the commitment is, “OK, give it a try, and if it’s successful, I’ll help you get support to roll out the results in the Eastern region,” the project has a fighting chance. It’s important to get that level of commitment early on. If not, it’s better to look for another project or find another sponsor.

The best way to get a commitment is to define the project around a problem of clear interest to someone else. The manager of a waste disposal facility isn’t interested in reducing the number of trucks he has. More likely, he’s willing to fight to keep them. Finding out how to help the facility manager solve a problem he’s dealing with — say, having fewer angry customers through better dispatching practices — is the best way to get project commitment (assuming, of course, that the facility manager’s ideas align with corporate goals).

Starting Big

While starting small can be effective, sometimes it's neither possible nor desirable. If a big opportunity presents itself and momentum within the organization favors moving forward, it's almost always best to grab the reins and go.

A good example outside the analytics domain is the introduction of human resource planning software into an organization. The software is designed to manage human resources throughout an entire organization. Even if it might make sense to try the software on a small part of an organization before undertaking a corporate overhaul, software sales agents sell a corporate vision, and organizations purchase that same vision. Such a large project will certainly be broken into smaller, more manageable rollouts, but the concept from the very beginning is, in a word, big. A good example in the analytics domain is airline revenue management software, where decisions on what seat inventory to sell on one flight affect similar inventory decisions on all other flights throughout the network.

In big projects, the focus shifts from convincing management that analytics is of value to making the project successful. The very nature of a big project is that the value decision has been made. Nonetheless, as an analytics project, part of the value will be how much money the project makes or saves the organization based on the decision support it provides. When human resource planning software is implemented, the value in terms of improved workflow has already been accounted for. The same is true of an airline revenue management system, but airlines will nonetheless want after-the-fact demonstrations of the value. How much money did the revenue management system make last month? Is it giving results that make sense? Measurement and evaluation are common to all analytics projects, and are a topic we’ll address in a future column.

Starting in the Middle

There are, of course, intermediate projects. A good example is a grocery store chain testing a new pricing strategy at an individual store. Pilot projects of this nature require the involvement of multiple individuals. The organization has enough faith in the effort to give analytics a try, but the ultimate decision has yet to be reached. In many ways, pilot projects are the most difficult, since they carry the burden of both developing corporate commitment, like small projects, and making a project with many moving parts a success.
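To illustrate how a pilot like the pricing test might be evaluated, here is a minimal sketch of a difference-in-differences estimate: the revenue change at the test store beyond the trend seen at comparable control stores. All figures are made-up illustrative data, not results from any actual pilot.

```python
# Hypothetical sketch: difference-in-differences estimate of a pricing
# pilot's effect, comparing the test store against control stores.
# All revenue figures are made-up illustrative data (weekly revenue, $k).

def diff_in_diff(test_before, test_after, ctrl_before, ctrl_after):
    """Lift at the test store beyond the trend seen at control stores."""
    mean = lambda xs: sum(xs) / len(xs)
    test_change = mean(test_after) - mean(test_before)
    ctrl_change = mean(ctrl_after) - mean(ctrl_before)
    return test_change - ctrl_change

test_before = [100, 102, 98, 101]
test_after  = [109, 111, 108, 112]  # after the new pricing strategy
ctrl_before = [95, 97, 96, 96]
ctrl_after  = [98, 99, 97, 98]      # controls drift up slightly

print(diff_in_diff(test_before, test_after, ctrl_before, ctrl_after))  # → 7.75
```

Subtracting the control stores' change guards against crediting the pricing strategy for a lift that every store experienced, which is exactly the kind of measurement question a pilot is supposed to settle.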

Pilot projects seem like a good idea, and they can be. An analytics project that demonstrates value in the “laboratory” is “shaken out” in the real world. Problems arise because the real world brings analytics professionals with new ideas face to face with professionals who’ve been doing their jobs for many years. As a result, pilot projects are often more fractious than helpful. With no clear go/no-go decision from management, there’s endless room for bickering. In theory, bringing together many different opinions is a good idea, with the best decision rising to the top. In practice, the go/no-go decision is a complex mix of facts and politics.

Some of the problems can be mitigated if upper management conveys that a decision to move forward has been made and that the pilot project is an effort to iron out details. As a result, it’s worth the effort to seek this level of commitment before the project starts. If more work needs to be done to demonstrate that the underlying analytics has sufficient value, it’s easier to do so before beginning a pilot project. That doesn’t mean ignoring those individuals in the field, outside the laboratory.

Quite the contrary. Analytics can only be successful if it deals with all the real world complexities of a problem. But it’s far more efficient to air the facts, make a decision and move forward than to be in a state of constant indecision. Too often, the decision to undertake a pilot project simply pushes problems into the future that are better dealt with now.


Crawling, walking and running all have their advantages and disadvantages when starting down the path of analytics. But even if one approach were clearly best, the approach that’s ultimately taken is largely shaped by circumstances. Common to all approaches — and really to all managerial decision-making — is the need to understand the environment, have clear goals and choose a carefully reasoned path — a path to success.

Andrew Boyd served as executive and chief scientist at an analytics firm for many years. He can be reached at


