Executive Edge: Graph databases, journalists & the Panama Papers
Mining huge data sets: The powerful technology behind one of the biggest data leaks in history.
By Emil Eifrem
The Panama Papers, the unprecedented leak of 11.5 million files from the database of the global law firm Mossack Fonseca, opened up the offshore tax accounts of the rich, famous and powerful – laying bare how they have exploited secretive offshore tax regimes for decades. At 2.6 terabytes of data, the Panama Papers is the biggest data leak in history, dwarfing both the U.S. diplomatic cables released by WikiLeaks in 2010 and the intelligence documents later handed over by Edward Snowden.
The investigation into the Panamanian law firm’s dealings and those of its elite clients was the direct result of work carried out by journalists at The International Consortium of Investigative Journalists (www.icij.org). Such was the scale of the data that more than 370 reporters from 80 countries worked on it for a year. As part of its endeavors, the ICIJ also released a searchable database of 300,000 entities harvested from the Panama Papers and its earlier Offshore Leaks investigation.
The Panama Papers displayed the murky side of offshore accounts, identifying high-ranking government and public officials and pushing some out of office. But another major aspect that stands out is the power of the data itself and how it was sifted. It wasn’t searched and manipulated by experienced data scientists, but by a team of journalists, many of whom would not identify themselves as very technical.
How did the journalists manage to pick out meaningful data from such huge, unstructured files? The answer is graph database technology, which enabled journalists to surface connections between the data, much like joining the dots, to form a picture.
Mar Cabra, head of the data and research unit at the ICIJ, has described graph database technology as “a revolutionary discovery tool that’s transformed our investigative journalism process.”
The unique strength of graph databases is their ability to spot and understand relationships between data at huge scale. Unlike relational databases, which store information in rigid tables, graph databases store data as nodes, the edges (relationships) that connect them, and properties attached to both. The connections between entities are thus recorded directly in the data model itself, rather than reconstructed at query time through joins.
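To make the model concrete, here is a minimal sketch of a property graph in plain Python. This is an illustration only, not Neo4j's API: the node and relationship names (officers, entities, addresses) are hypothetical examples in the spirit of the Panama Papers data.

```python
# A toy property graph: nodes carry properties; edges are first-class,
# labeled relationships between node identifiers.
nodes = {
    "officer_1": {"label": "Officer", "name": "A. Example"},
    "entity_1": {"label": "Entity", "name": "Example Holdings Ltd."},
    "addr_1": {"label": "Address", "city": "Panama City"},
}

edges = [
    ("officer_1", "DIRECTOR_OF", "entity_1"),
    ("entity_1", "REGISTERED_AT", "addr_1"),
]

def relationships_of(node_id):
    """Return every labeled relationship touching the given node."""
    return [(s, rel, t) for (s, rel, t) in edges if node_id in (s, t)]

# Who or what is connected to entity_1, and how?
for source, rel, target in relationships_of("entity_1"):
    print(source, rel, target)
```

Because relationships are stored explicitly rather than implied by foreign keys, a question like "who is connected to this entity?" is a direct lookup rather than a multi-table join.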
This is a boon for investigative journalists, but it is also a powerful tool for any business looking to tackle big data and connected data issues.
Graph databases are an excellent way to make sense of terabytes of connected data in an efficient manner. Why? Because unlike relational databases, which break data down into tables, graph databases use a structure that mimics the way humans intuitively look at information. Once the data is modeled as a graph, a graph database is unbeatable at analyzing the connections in large, complex data sets. This enables any business to build and query big, connected data structures easily.
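The "joining the dots" described above amounts to path-finding over the graph. The sketch below shows the idea with a breadth-first search over a hypothetical toy network of officers, shell companies and intermediaries; a real graph database executes queries like this natively at far greater scale.

```python
from collections import deque

# Hypothetical toy network: adjacency list of connected parties.
graph = {
    "Officer A": ["Shell Co 1"],
    "Shell Co 1": ["Officer A", "Intermediary X"],
    "Intermediary X": ["Shell Co 1", "Shell Co 2"],
    "Shell Co 2": ["Intermediary X", "Officer B"],
    "Officer B": ["Shell Co 2"],
}

def shortest_path(start, goal):
    """Breadth-first search: surface the chain of connections
    linking two parties, if one exists."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no connection found

print(shortest_path("Officer A", "Officer B"))
# → ['Officer A', 'Shell Co 1', 'Intermediary X', 'Shell Co 2', 'Officer B']
```

In a relational database, discovering a chain of unknown length like this requires recursive, increasingly expensive self-joins; in a graph, it is the database's basic operation.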
Tech giants such as Google, Facebook and LinkedIn have recognized the power of graph databases for some time. For example, Facebook and LinkedIn’s tools for mapping real-time networks and connections that let us walk through social networks are founded on graph technology. Now that graph database technology has started to go mainstream, this highly scalable connected data analysis is available to all organizations, from startups to blue chips and government.
Graph databases are set to come into their own with the Internet of Things (IoT), where billions of connected devices mean dealing with petabytes of data. Graph databases will enable enterprises to mine data in ways that just aren’t possible using data warehouses and relational database technology. Graph technology is increasingly becoming the tool of choice for international agencies, governments, financial services companies and enterprises looking to make real-time connections between data and discover the patterns that make up their relationships.
We will undoubtedly be hearing more about the power of graph databases in the business world as more and more organizations latch on to the unique capabilities they offer.
Emil Eifrem is co-founder and CEO of Neo Technology (http://neo4j.com/), developers of the graph database Neo4j.