

Analytics Magazine

Challenges for public sector analytics

March/April 2012

By E. S. Levine

Analytics is a key function undergoing rapid growth in businesses across the globe [1], as the readers of this magazine are well aware. Mirroring the trend in business, government agencies are also incorporating analytics into more and more of their decision-making, such as evaluating educational outcomes and healthcare treatments. However, analytics in the public sector faces a different set of challenges than it does in the private sector.

Ample opportunities exist to apply analytics to public sector decision-making. Most importantly, the stakes of decision-making far exceed the threshold to justify analytic effort; making an informed decision can save lives, swing billions of dollars of spending to more efficient use, or save citizens countless hours of time and effort. Furthermore, many government agencies already collect data and have well-established privacy guidelines, legal protections and oversight procedures that regulate how that data can be used. When data are not available or must be augmented, the government has many methods of accessing relevant subject matter expertise to be incorporated into the analyses.

Historically, the use of analytics in government has produced several success stories. For example, the military has long been an innovator in analytics, having pioneered the field of operations research during World War II [2]. Intelligence analysis, though not what most businesspeople would consider quantitative analytics, is an analytical process already well-integrated into decision-making. Federal agencies analyze the costs, benefits and other impacts of proposed regulations. Additionally, the public sector has been a leader in developing performance management analytics for measuring the impact of hard-to-measure programs [3]. These are just a few of many possible examples.

From my personal experience, I have seen analytics usefully applied to a variety of decisions. As an example, I will briefly describe a risk assessment used to inform resource allocation at the Tucson Sector of the United States Border Patrol (USBP), the busiest sector for alien and drug trafficking along the Southwest land border with Mexico. The assessment was carried out in anticipation of a $600 million supplemental appropriation allocated by Congress and the president to the Southwest border in August 2010. The Tucson Sector USBP expected to receive a portion of these funds and Tucson’s chief patrol agent needed to decide how to allocate the additional resources across a 262-mile border, which is divided into zones, and among a spectrum of countermeasures, including personnel, fencing, surveillance equipment, specialty units and forward operating bases.

Working in concert with a team of analysts from the Department of Homeland Security, the USBP executed an assessment that incorporated descriptive analytics to identify “hot spots” and gaps in resources and prescriptive analytics to connect the analytic conclusions to the decision-maker’s alternatives. Data were gathered from the USBP’s records and from subject matter experts and then analyzed to identify the countermeasure allocations expected to produce the best risk reduction return on investment. After the assessment and its findings were presented to the chief patrol agent, he used the results to inform the allocation of a thousand USBP agents, the placement of a forward operating base and the development of an operations plan for part of the sector.
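The connection between such an assessment and a resource decision can be sketched as a simple prioritization by risk-reduction return on investment. The countermeasure names, costs and risk-reduction values below are entirely hypothetical, not drawn from the actual Tucson Sector analysis, and the greedy rule is only one minimal way to turn analytic findings into an allocation:

```python
# Hypothetical countermeasure options across border zones:
# (name, cost in $M, expected risk reduction in arbitrary risk units).
# All values are illustrative only.
options = [
    ("Zone A: surveillance towers", 40, 120),
    ("Zone B: additional agents",   60, 150),
    ("Zone C: fencing upgrade",     80, 100),
    ("Zone A: forward op. base",    30,  90),
    ("Zone B: specialty unit",      20,  50),
]

def allocate(options, budget):
    """Greedily fund options with the best risk reduction per dollar."""
    funded = []
    remaining = budget
    # Sort by risk-reduction ROI (reduction / cost), highest first.
    ranked = sorted(options, key=lambda o: o[2] / o[1], reverse=True)
    for name, cost, reduction in ranked:
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    return funded, budget - remaining

funded, spent = allocate(options, budget=150)
```

With a $150M budget, the greedy rule funds the four highest-ROI options and skips the fencing upgrade. A greedy pass is not guaranteed optimal for indivisible options; a real allocation of this kind is a knapsack-style problem, but the ROI ranking captures the core idea of the assessment.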

Though opportunities to apply analytics in the public sector abound, cultural, organizational and technical challenges must be surmounted before government agencies can claim to be fully developed, enterprise-wide, analytically competitive organizations. Progress is being made on each of these fronts, but much work remains to be done.

Building an analytical culture, where data is widely used to evaluate hypotheses, is critical to becoming an analytically competitive organization. Despite the successes the public sector has had with analytics in the past, analytics has not been integrated into most decision-making processes. In part, this may be due to the tremendous variety of tasks in many different subject areas that government organizations carry out. In such a varied environment, one-size-fits-all approaches to cultural change are often ineffective, and tailored training, policies and incentives are necessary. All of these potential solutions require effort, commitment and time to come to fruition.

In both the public and private sectors, another reason an analytical culture is difficult to establish is that analytical talent is scarce. In contrast to the private sector, many components of the federal government spent the early 2000s outsourcing analytical work to contracting companies. Only recently has there been a concerted effort to bring analytical talent in-house as federal employees (a legal requirement if analytics is to be applied to procurement [4]). Competing to hire talent into federal positions is difficult but not impossible; disadvantages in compensation (compared to finance and other quantitative industries that recruit analytical professionals) can be counterbalanced by an emphasis on mission importance, meaningful projects and public service. The general lack of career paths for analysts who do not transition to management roles is a serious issue for employee retention. Finally, the current budgetary climate is, with some exceptions, unfriendly to the addition of new, expensive capabilities, even when those capabilities would save money and improve efficiency in the long run.

Several other challenges stem from the organization of the U.S. public sector; these are unlikely to be alleviated because they are intended consequences of the federal system’s design. Although the public sector does collect a lot of data, it is spread among federal, state and local authorities with no central repository. Furthermore, though information sharing has been prioritized in the security and intelligence communities in the wake of the 9/11 Commission Report [5], resulting in the creation of fusion centers across the country, sharing data in other realms has not been pursued to the same degree. In part, this is by design; centralized data raises a host of valid privacy, security and comparability concerns. However, there are occasions when data shared between parts of the government, incorporating proper privacy protections, would lead to improved efficiency (as it has in the intelligence community). In these cases, the necessary memoranda of understanding and legal judgments are slow to develop and are outside the skill set of most technical analysts, thus requiring coordination among teams with disparate backgrounds. The participating organizations often do not share a common chain of command, so partnerships will continue to be required to make progress on analytics.

Public sector analytics must be responsive to dissimilar groups of stakeholders. It is necessary, but not sufficient, to design analyses to satisfy the primary customer (often a decision-maker higher in the chain of command); care must also be taken to accommodate the desires of Congress and the public, which generally do not have consensus on their objectives. In addition, when analytic findings are perceived to favor one policy alternative over another, stakeholders may dispute the analytic process in order to protect their positions and belief systems. This behavior is a common response to private sector analytics as well.

Finally, measuring the impact of analytics in the public sector is a difficult technical challenge because government’s metrics for success are much more complex than simpler private sector measures such as increased profit or value. For example, counterterrorism decisions are made to minimize the number of fatalities, injuries and economic consequences of a successful attack, the cost of countermeasures and the impact on legitimate commerce, to name a few of the objectives. Reasonable people agree that all of these objectives are important, but disagree as to the values of the tradeoffs between the objectives, so it is impractical to construct a single utility function to inform these decisions because it is unclear whose tradeoffs should be used. The decision-maker’s? The president’s? Congress’s? The public’s? Sometimes alternatives can be identified that are preferred across a wide range of tradeoff values, but this is not always the case.
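The difficulty of choosing whose tradeoffs to use can be made concrete with a small sketch. Assuming hypothetical scores for two alternatives on three objectives and a simple weighted-sum utility (one of many possible forms), different stakeholder weightings flip the preferred alternative, and a sweep over a grid of weightings checks whether any alternative dominates:

```python
import itertools

# Hypothetical scores (higher is better) for two countermeasure portfolios
# on three objectives; every number here is illustrative, not a real assessment.
alternatives = {
    "Portfolio X": {"lives_protected": 0.9, "cost_efficiency": 0.3, "commerce_impact": 0.6},
    "Portfolio Y": {"lives_protected": 0.6, "cost_efficiency": 0.8, "commerce_impact": 0.5},
}
objectives = ["lives_protected", "cost_efficiency", "commerce_impact"]

def utility(scores, weights):
    """Weighted-sum utility: one simple way to encode tradeoffs between objectives."""
    return sum(weights[k] * v for k, v in scores.items())

def preferred(weights):
    """The alternative with the highest utility under a given set of tradeoff weights."""
    return max(alternatives, key=lambda a: utility(alternatives[a], weights))

# Two stakeholders with different tradeoff weights reach different conclusions.
security_first = {"lives_protected": 0.7, "cost_efficiency": 0.2, "commerce_impact": 0.1}
budget_first   = {"lives_protected": 0.2, "cost_efficiency": 0.7, "commerce_impact": 0.1}

# Sweep a coarse grid of weightings summing to 1: if one alternative won
# everywhere, it would be preferred regardless of whose tradeoffs are used.
grid = [w for w in itertools.product([0.2, 0.4, 0.6], repeat=3)
        if abs(sum(w) - 1.0) < 1e-9]
winners = {preferred(dict(zip(objectives, w))) for w in grid}
```

Under the security-first weights Portfolio X scores higher; under the budget-first weights Portfolio Y does; and the grid sweep finds both portfolios winning somewhere, so neither choice is robust to whose tradeoffs are used. This is exactly the situation in which no dominant alternative exists and the choice of utility weights becomes the decision.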

Outside of the federal government, state and local governments are also working toward meeting these challenges but vary in the amount of progress they have made. Only a few of these organizations have the critical mass of analytic professionals necessary to integrate analysis into all decision-making, but many organizations have enough analytical talent to make substantial advances. Some innovative approaches have begun at the local level, such as the New York City Police Department’s CompStat program, a data-driven perspective on policing that has spread to several major cities and has been expanded to include additional service functions via CitiStat systems. In the future, the “laboratories of democracy” that are an integral part of the U.S. system of governance will continue to develop novel uses and approaches for analytics.

Efforts at all levels of the public sector give hope that analytics will become better integrated into decision-making despite the challenges described in this article. The growing cadre of trained analysts, the increasing availability of analytical tools, the accumulation of data and the promulgation of success stories all point toward improved integration. Many government organizations have also prioritized the development of analytical capabilities, such as the Department of Homeland Security’s emphasis on risk analysis. I believe that public sector analytics will continue to grow, but consistent effort will be required to surmount the challenges and transform government into an analytically competitive organization.

Dr. E. S. Levine is the chief scientist of the U.S. Department of Homeland Security’s Office of Risk Management and Analysis. He can be contacted at


  1. Davenport, T.H., and Harris, J.G., “Competing on Analytics,” Harvard Business School Press, 2007.
  2. Gass, S.I., and Assad, A.A., “An Annotated Timeline of Operations Research,” Springer, 2005.
  3. For a recent example, see the “Performance and Management” section of the president’s proposed “Budget of the U.S. Government,” FY2013, Analytical Perspectives volume, available at
  4. See Office of Federal Procurement Policy (OFPP) Letter 11-01, “Performance of inherently governmental and critical functions,” Sept. 12, 2011, for more detail.
  5. Available at
