Report: Federal open data needs better feedback channels

A new report, the result of input from agency officials and open data experts, identifies some ideas for improving federal data.

The federal government still has a ways to go in optimizing its opening of agency data, according to a report published last week.

The Center for Open Data Enterprise published its “potential best practices” in a report after convening nearly 300 federal agency officials, data experts and data users to participate in roundtables on open data, which it co-hosted with the White House Office of Science and Technology Policy.

One notable recommendation was that data providers should consider developing user feedback systems for the public to flag problems with the quality of federal data.

“Roundtable participants specifically suggested developing stronger feedback channels” for the website, according to the report, which notes that “while the website has a high volume of traffic, it receives very little feedback through the simple ‘report a problem’ buttons that are on each page.”

In addition to its insights on data quality, the report notes that a better feedback loop on the site would help agencies learn which data sets users care about.

The report offers recommendations in the four different areas the roundtables focused on: data privacy, data quality, sharing research data, and public and private collaboration.

The report’s other suggestions on data quality include focusing on quality at the point of collection, “which is more efficient and effective than quality improvement at later stages,” according to the report. Eliminating manual data entry wherever possible, the report notes, can also help.

The report also urges agencies and organizations to publish “both raw data and improved data with transparency about quality and provenance.”

Beyond data quality, the report also suggests ways to make data easier to use. In that vein, it recommends that federal data use common data standards and taxonomies, and that international collaborations establish standards for their data. Where that isn’t possible, the report suggests developing “an additional data layer to enhance interoperability.”

The report touted the Census Bureau’s CitySDK software development kit as an example of providing that interoperability. The program, which uses data from the Decennial Census, helps people combine different data sets to generate new insights.

Through the program, people can retrieve data from multiple sources, geocode it and more, all through “a simple programming approach,” according to the report.
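CitySDK itself is a JavaScript toolkit, and its actual interface is not described in the report. As a rough, hypothetical illustration of the kind of data combination the report praises — joining federal and local datasets on a shared geographic identifier — a minimal sketch in Python (with made-up data and names) might look like this:

```python
# Hypothetical sketch: merging a federal dataset with a local one on a
# shared FIPS county code. The figures below are invented for illustration;
# real CitySDK workflows call the Census Bureau's APIs from JavaScript.

federal_population = {  # stand-in for a Census extract
    "48201": {"county": "Harris County, TX", "population": 4_500_000},
    "17031": {"county": "Cook County, IL", "population": 5_200_000},
}

local_permits = {  # stand-in for a city or county open dataset
    "48201": {"building_permits": 52_000},
    "17031": {"building_permits": 31_000},
}

def combine(federal: dict, local: dict) -> dict:
    """Merge records that share a FIPS code into a single dataset."""
    merged = {}
    for fips, record in federal.items():
        if fips in local:
            merged[fips] = {**record, **local[fips]}
    return merged

combined = combine(federal_population, local_permits)
```

Each record in `combined` now carries both the federal population figure and the local permit count, keyed by the same FIPS code — the "bridging" of data layers the report highlights.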

The project “bridges the data gap between federal, state, and local data,” according to the recently released White House fact sheet on open data.

“This kind of open source solution could be applied in many areas where different kinds of users need to discover, access and connect disparate standardized datasets,” the report notes.

The center is now working on a transition report that will recommend open data practices for the next White House, outlining what the center’s president, Joel Gurin, called the “best opportunities” for the next administration. He said the report will be released in October.

Written by Samantha Ehlinger

Samantha Ehlinger is a technology reporter for FedScoop. Her work has appeared in the Houston Chronicle, Fort Worth Star-Telegram, and several McClatchy papers, including the Miami Herald and The State. She was part of a McClatchy investigative team on the “Irradiated” project on nuclear worker conditions, which won a McClatchy President’s Award. She is a graduate of Texas Christian University. Contact Samantha via email, or follow her on Twitter at @samehlinger. Subscribe to the Daily Scoop for stories like this in your inbox every morning.
