Agencies must factor in users when releasing data, panel says
Federal agencies ought to do a better job reflecting on how the public will use the data they are increasingly opening up, government and outside experts said Wednesday.
At an event held by the trade group Data Transparency Coalition and the Government Accountability Office, Jerry Johnston, geospatial information officer at the Interior Department, said it isn’t enough to publish data simply to satisfy a statutory mandate and then see what happens.
“We haven’t done, I feel, a great job as a community of asking downstream consumers what they want,” Johnston said at the event.
But it’s something he and colleagues are working on.
Johnston said he collaborated with other members of the Federal Geographic Data Committee, an interagency group that coordinates geospatial data sharing, to identify 200 nationally significant data sets from its 80,000 records on Data.gov.
“For those, we’re going to go beyond metadata,” he said, adding that the group wants to publish life cycle management metrics for the data and move toward developing data content standards.
Even so, the government geospatial community needs to work more closely with open data advocates and take advantage of recent open data initiatives, Johnston said.
Wednesday’s event, which focused on the cultural barriers keeping the government from opening data, came amid a major push to open up federal data.
In 2013, President Barack Obama signed an executive order for agencies to publish data in machine-readable formats. And next month, the Treasury Department and the White House Office of Management and Budget are slated to release governmentwide data standards for federal spending under the Digital Accountability and Transparency Act, or DATA Act.
Later in the discussion, the panelists touched on legal barriers that could prevent agencies from opening their data.
Joel Gurin, president of the Center for Open Data Enterprise, noted that a recent roundtable with the U.S. Patent and Trademark Office brought to light unique challenges the agency had in cleaning its data: changing the wording on a patent to make it more searchable, for example, could affect the patent’s scope.
“I think it’s going to take some really creative public-private collaboration because there are going to be many cases where third parties are going to be able to do more than the agencies themselves,” he said.
The Interior and Agriculture departments, along with several public lands agencies, have been trying to tap into that outside talent.
At another panel, Rick DeLappe, Recreation One-Stop program manager at the National Park Service, highlighted a hackathon agencies held earlier this month to encourage technologists to use public lands data. Ahead of the event, agencies released an application programming interface, or API, with booking information.
The event came after advocacy and startup groups complained that a draft request for proposals for a contractor to run Recreation.gov, the main website that books reservations for federal lands, didn’t do enough to promote open data sharing.
DeLappe would not divulge the details of the final solicitation, to be released later this year. However, he said agencies would pursue a more agile approach to the contract and include the U.S. Digital Service Playbook as an attachment to the RFP, as they did in the draft.