Cloud driving ‘sea change’ in federal data utility, says State’s cloud director

Senior leaders are getting their hands on more data, thanks to the cloud, and as they do, they're getting hooked.

The advent and ubiquity of cloud computing in the federal government have significantly increased federal officials' — particularly senior leaders' — focus on leveraging data as an asset, according to Brian Merrick, head of cloud programs at the State Department.

In an interview with FedScoop, Merrick said there has recently been "a sea change" in the way officials think about data, due in large part to the shift from a model in which data is tied to a single application to a more openly architected environment where data can flow freely between applications.

“Our traditional sort of system-box diagram model that we’re used to in the old sort of waterfall application world, where your data is associated with an application and you have an owner of that application, everything’s tied to that and it’s about doing an interface between this box and that box. That kind of dynamic is no longer the driver,” Merrick said of data in legacy environments. “And so that frees up people to really look at that a little bit differently when it’s not necessarily tied directly to a specific application. Often what we’re finding now is there are many applications sharing data by virtue of necessity between these different environments.”

This shift has made data the common denominator and the most important asset in an IT environment. "It's really about the data," Merrick said.

On top of this, the cloud also introduced added benefits of data integration and sharing, speed of delivery, and evidence-based decision-making, he said.

“When a senior leader comes and says, ‘Look, I want to do this thing, I need an ability to get this particular task done,’ and we go back and say, ‘Well, you know what, if we have these two or three data elements from over here and combine it over here, we can get you that in a week,’ the doors open,” Merrick said. “And all of a sudden, they’re knocking on the door of the data owner saying, ‘OK, you need to make sure that this group gets access to these data elements, and we can fix this business problem.’”

Such a structure has “elevated data concepts out of the IT silo to a senior management viewpoint,” Merrick said. “They now see the value in getting access to that data, which opens the doors to that data, really from a policy and an ownership standpoint, and I think that’s been a huge change.”

And as those senior leaders get more experience using data to add new value or make better decisions, they get hooked, Merrick told FedScoop.

“Leaders have realized that … they were probably not basing decisions in the past on as good of data sources they have now. And that has really increased the awareness of it and put the emphasis on it,” he said. “And I think we’re gonna see a lot more of that as time goes on. And as we start bringing in disparate data elements we hadn’t maybe thought about before, like sensor data from the Internet of Things — a whole host of other data elements from other outside data sources — it starts to paint pictures when we’re trying to solve real-world programmatic problems. And I think we will see that start to scale even faster.”

This story was featured in FedScoop Special Report: Government Powered by Data

Written by Billy Mitchell

Billy Mitchell is Senior Vice President and Executive Editor of Scoop News Group's editorial brands. He oversees operations, strategy and growth of SNG's award-winning tech publications, FedScoop, StateScoop, CyberScoop, EdScoop and DefenseScoop. After earning his journalism degree at Virginia Tech and winning the school's Excellence in Print Journalism award, Billy received his master's degree from New York University in magazine writing while interning at publications like Rolling Stone.
