Analytics Magazine

Challenges for public sector analytics

March/April 2012

By E. S. Levine

Analytics is a key function undergoing rapid growth in businesses across the globe [1], as the readers of this magazine are well aware. Mirroring the trend in business, government agencies are incorporating analytics into more and more of their decision-making, for example to evaluate educational outcomes and healthcare treatments. However, the growth of analytics in the public sector faces challenges different from those in the private sector.

Ample opportunities exist to apply analytics to public sector decision-making. Most importantly, the stakes of decision-making far exceed the threshold to justify analytic effort; making an informed decision can save lives, swing billions of dollars of spending to more efficient use, or save citizens countless hours of time and effort. Furthermore, many government agencies already collect data and have well-established privacy guidelines, legal protections and oversight procedures that regulate how that data can be used. When data are not available or must be augmented, the government has many ways of bringing relevant subject matter expertise into its analyses.

Historically, the use of analytics in government has produced several success stories. For example, the military has long been an innovator in analytics, having pioneered the field of operations research during World War II [2]. Intelligence analysis, though not what most businesspeople would consider quantitative analytics, is an analytical process already well-integrated into decision-making. Federal agencies analyze the costs, benefits and other impacts of proposed regulations. Additionally, the public sector has been a leader in developing performance management analytics for measuring the impact of hard-to-measure programs [3]. These are just a few of many possible examples.

From my personal experience, I have seen analytics usefully applied to a variety of decisions. As an example, I will briefly describe a risk assessment used to inform resource allocation at the Tucson Sector of the United States Border Patrol (USBP), the busiest sector for alien and drug trafficking along the Southwest land border with Mexico. The assessment was carried out in anticipation of a $600 million supplemental appropriation allocated by Congress and the president to the Southwest border in August 2010. The Tucson Sector USBP expected to receive a portion of these funds and Tucson’s chief patrol agent needed to decide how to allocate the additional resources across a 262-mile border, which is divided into zones, and among a spectrum of countermeasures, including personnel, fencing, surveillance equipment, specialty units and forward operating bases.

Working in concert with a team of analysts from the Department of Homeland Security, the USBP executed an assessment that incorporated descriptive analytics to identify “hot spots” and gaps in resources and prescriptive analytics to connect the analytic conclusions to the decision-maker’s alternatives. Data were gathered from the USBP’s records and from subject matter experts, then analyzed to identify the countermeasure allocations expected to produce the best risk reduction return on investment. After the assessment and its findings were presented to the chief patrol agent, he used them to inform the allocation of a thousand USBP agents, the placement of a forward operating base and the development of an operations plan for part of the sector.
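The article does not describe the underlying model, but a minimal sketch of the prescriptive step, under assumed inputs, might rank candidate (zone, countermeasure) options by expected risk reduction per dollar and fund the best options until the budget runs out. Every zone, countermeasure, cost and effectiveness value below is hypothetical and illustrative only; this is not the USBP’s actual method.

```python
# Illustrative sketch only: a greedy risk-reduction-per-dollar allocation.
# The zones, countermeasures, costs and effectiveness values are hypothetical
# and are not drawn from the actual USBP assessment.

budget = 600.0  # available funds, in $ millions (illustrative)

# Hypothetical baseline risk score for each border zone.
zone_risk = {"zone_A": 80.0, "zone_B": 55.0, "zone_C": 30.0}

# Hypothetical countermeasures: (cost in $ millions, fraction of a zone's
# remaining risk the measure is expected to eliminate).
countermeasures = {
    "agents": (20.0, 0.25),
    "fencing": (50.0, 0.40),
    "surveillance": (15.0, 0.15),
}

allocation = []              # funded (zone, countermeasure) decisions
remaining = dict(zone_risk)  # risk left in each zone as measures are funded
funded = set()               # pairs already funded (each at most once)

while True:
    # Score every unfunded, affordable (zone, countermeasure) pair by
    # expected risk reduction per dollar.
    candidates = []
    for zone, risk in remaining.items():
        for name, (cost, effect) in countermeasures.items():
            if cost <= budget and (zone, name) not in funded:
                reduction = risk * effect
                candidates.append((reduction / cost, reduction, cost, zone, name))
    if not candidates:
        break
    # Fund the option with the best expected return on investment.
    _, reduction, cost, zone, name = max(candidates)
    allocation.append((zone, name, cost, reduction))
    funded.add((zone, name))
    remaining[zone] -= reduction
    budget -= cost

for zone, name, cost, reduction in allocation:
    print(f"{zone}: fund {name} (${cost:.0f}M), expected risk reduction {reduction:.1f}")
```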

Though opportunities to apply analytics in the public sector abound, cultural, organizational and technical challenges must be surmounted before government agencies can claim to be fully developed, enterprise-wide, analytically competitive organizations. Progress is being made to surmount these challenges, but much work remains to be done.

Building an analytical culture, where data is widely used to evaluate hypotheses, is critical to becoming an analytically competitive organization. Despite the successes the public sector has had with analytics in the past, analytics has not been integrated into most decision-making processes. In part, this may be due to the tremendous variety of tasks in many different subject areas that government organizations carry out. In such a varied environment, one-size-fits-all approaches to cultural change are often ineffective, and tailored training, policies and incentives are necessary. All of these potential solutions require effort, commitment and time to come to fruition.

In both the public and private sectors, another reason an analytical culture is difficult to establish is that analytical talent is scarce. In contrast to the private sector, many components of the federal government spent the early 2000s outsourcing analytical work to contracting companies. Only recently has there been a concerted effort to bring analytical talent in-house as federal employees (a legal requirement if analytics is to be applied to procurement [4]). Competing to hire talent into federal positions is difficult but not impossible; disadvantages in compensation (compared to financial and other quantitative industries that recruit analytical professionals) can be counterbalanced by an emphasis on mission importance, meaningful projects and public service. A general lack of career paths for analysts who do not transition into management is a serious issue for employee retention. Moreover, the current budgetary climate is, with some exceptions, not friendly toward adding new, expensive capabilities, even if in the long run those capabilities save money and improve efficiency.

Several other challenges stem from the organization of the U.S. public sector; these are unlikely to be alleviated because they are intended consequences of the federal system’s design. Although the public sector does collect a lot of data, it is spread among federal, state and local authorities with no central repository. Furthermore, though information sharing has been prioritized in the security and intelligence communities in the wake of the 9/11 Commission Report [5], resulting in the creation of fusion centers across the country, sharing data in other realms has not been pursued to the same degree. In part, this is by design; centralized data raises a host of valid privacy, security and comparability concerns. However, there are occasions when data shared between parts of the government, incorporating proper privacy protections, would lead to improved efficiency (as it has in the intelligence community). In these cases, the necessary memoranda of understanding and legal judgments are slow to develop and are outside the skill set of most technical analysts, thus requiring coordination among teams with disparate backgrounds. The participating organizations often do not share a common chain of command, so partnerships will continue to be required to make progress on analytics.

Public sector analytics must be responsive to dissimilar groups of stakeholders. It is necessary, but not sufficient, to design analyses to satisfy the primary customer (often a decision-maker higher in the chain of command); care must also be taken to accommodate the desires of Congress and the public, which generally do not have consensus on their objectives. In addition, when analytic findings are perceived to favor one policy alternative over another, stakeholders may dispute the analytic process in order to protect their positions and belief systems. This behavior is a common response to private sector analytics as well.

Finally, measuring the impact of analytics in the public sector is a difficult technical challenge because government’s metrics for success are far more complex than private sector measures such as increased profit or value. For example, counterterrorism decisions seek to minimize, among other objectives, the fatalities, injuries and economic consequences of a successful attack, the cost of countermeasures, and the impact on legitimate commerce. Reasonable people agree that all of these objectives are important but disagree about the tradeoffs among them, so it is impractical to construct a single utility function to inform these decisions: it is unclear whose tradeoffs should be used. The decision-maker’s? The president’s? Congress’s? The public’s? Sometimes alternatives can be identified that are preferred across a wide range of tradeoff values, but this is not always the case.
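To make that last point concrete, one common approach (sketched here with purely illustrative alternatives, scores and weights, not the department’s actual method) is to score alternatives with an additive multi-attribute value model and sweep the objective weights over a grid: if one alternative wins for every weighting considered, it can be recommended without settling whose tradeoffs govern; otherwise the disagreement genuinely matters.

```python
# Illustrative sketch: checking for a dominant alternative under an additive
# multi-attribute value model when stakeholders disagree on the weights.
# Alternatives, scores (0-1, higher is better) and the weight grid are hypothetical.
import itertools

# Scores on three objectives: lives protected, countermeasure cost avoided,
# and impact on legitimate commerce avoided.
alternatives = {
    "alt_1": (0.9, 0.4, 0.6),
    "alt_2": (0.7, 0.8, 0.7),
    "alt_3": (0.5, 0.9, 0.5),
}

def value(scores, weights):
    """Weighted additive value of one alternative."""
    return sum(s * w for s, w in zip(scores, weights))

# Sweep a coarse grid of weight vectors that sum to one.
grid = [i / 10 for i in range(11)]
winners = set()
for w1, w2 in itertools.product(grid, grid):
    w3 = 1.0 - w1 - w2
    if w3 < 0:
        continue
    weights = (w1, w2, w3)
    best = max(alternatives, key=lambda a: value(alternatives[a], weights))
    winners.add(best)

if len(winners) == 1:
    print(f"{winners.pop()} is preferred for every weighting considered.")
else:
    print(f"No single winner; the choice depends on the tradeoffs: {sorted(winners)}")
```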

Outside of the federal government, state and local governments are also working toward meeting these challenges but vary in the amount of progress they have made. Only a few of these organizations have the critical mass of analytic professionals necessary to integrate analysis into all decision-making, but many organizations have enough analytical talent to make substantial advances. Some innovative approaches have begun at the local level, such as the New York City Police Department’s CompStat program, a data-driven perspective on policing that has spread to several major cities and has been expanded to include additional service functions via CitiStat systems. In the future, the “laboratories of democracy” that are an integral part of the U.S. system of governance will continue to develop novel uses and approaches for analytics.

Efforts at all levels of the public sector give hope that analytics will become better integrated into decision-making despite the challenges described in this article. The growing cadre of trained analysts, the increasing availability of analytical tools, the accumulation of data and the promulgation of success stories all point toward improved integration. Many government organizations have also prioritized the development of analytical capabilities; the Department of Homeland Security’s emphasis on risk analysis is one example. I believe that public sector analytics will continue to grow, but consistent effort will be required to surmount the challenges and transform government into an analytically competitive organization.

Dr. E. S. Levine is the chief scientist of the U.S. Department of Homeland Security’s Office of Risk Management and Analysis. He can be contacted at evan.levine@dhs.gov.

REFERENCES & FURTHER READING

  1. Davenport, T.H., and Harris, J.G., “Competing on Analytics,” Harvard Business School Press, 2007.
  2. Gass, S.I., and Assad, A.A., “An Annotated Timeline of Operations Research,” Springer, 2005.
  3. For a recent example, see the “Performance and Management” section of the president’s proposed “Budget of the U.S. Government,” FY2013, Analytical Perspectives volume, available at http://www.whitehouse.gov/OMB/BUDGET/ANALYTICAL_PERSPECTIVES
  4. See Office of Federal Procurement Policy (OFPP) Letter 11-01, “Performance of inherently governmental and critical functions,” Sept. 12, 2011, for more detail.
  5. Available at http://www.911commission.gov/report/911Report.pdf.
