A new survey of federal agencies suggests that some are maturing much faster than others at harnessing big data, and that the core elements of success go well beyond technology and the availability of data scientists.
“What surprised me most is that there’s such a big and distinct difference between the high achievers and low achievers” among big data users at federal agencies, said Adelaide O’Brien, research director for IDC Government Insights, which conducted the survey. The study is among the first to establish a benchmark of maturity in the use of big data and analytic tools by federal agencies.
O’Brien, speaking June 20 at the Federal Big Data Summit, described five stages of maturity among federal agencies, beginning with the ad hoc use of big data tools and progressing through opportunistic, repeatable, managed and optimized phases of big data use. The latter phases reflect more integrated, automated and measured approaches in using big data, with the most mature agencies able to prove the value of big data within their organizations.
IDC measured the maturity of agencies with at least 5,000 employees along five dimensions: the level of intent to harness big data, technology deployed, data analyzed, commitment to staffing and process development tied to big data use.
Overall, the adoption of big data among federal agencies followed a typical bell curve: nearly two-thirds (63 percent) of respondents said their agencies were at the midpoint of the maturity continuum, generating “repeatable” results, while 17 percent were at the more mature “managed” phase and 19 percent at the less mature “opportunistic” stage. Only 1 percent apiece placed their agency at either extreme, the “ad hoc” or “optimized” phase.
O’Brien noted that those agencies considered to be high achievers, or more mature, in their use of big data, tended to:
- successfully recruit, train and reward not only data scientists and statisticians, but also business and program analysts who were connected to agency end goals;
- actively collaborate and communicate with other agencies or work groups on big data and analytics initiatives;
- have senior executive involvement that contributed to resourcing big data projects;
- use pilot projects, continuous process improvement and quantitative feedback;
- use advanced predictive analytics tools and a high level of automation that helped guide agency decision making.
Agencies with more siloed strategies, which lacked intra-organizational coordination on data projects and top-level sponsorship, typically fell further behind on the maturity curve, O’Brien said.
While budget constraints remain an overarching factor in the evolution of big data analytics in government, O’Brien said the survey revealed that some agencies have figured out how to overcome those barriers while developing their big data capabilities.
The study indicated that agencies are demonstrating stronger intent and developing more strategies to harness big data than was the case a year ago. At the same time, O’Brien spotted an imbalance in results, with agencies having the technology to analyze big data sets, but not the management infrastructure to share the results.
The processes to harness big data and to make better decisions still remain relatively immature. “A lot of the technology is being used for individual projects, but the line-of-business folks aren’t involved in these projects,” she said.