An API strategy for the U.S. government
13 Jan 2014
Photo: U.S. Chief Technology Officer Todd Park with President Obama (White House)
I was asked to provide some thoughts on what is next for the U.S. government’s application programming interface (API) strategy. I’ve put a lot of thought into it during my work and travels in the couple of months since I left Washington, D.C., and I keep coming back to one thought: strengthen what we have.
I wish I had some new technology or platform that would ensure the success of the next wave of government APIs in Washington, but in reality we need to do what we’ve been doing, only at scale, and get organized and collaborative about how we do it.
Release more data sets
There are thousands of data sets currently available via data.gov, across 176 agencies and numerous categories. We need more. When any content or data is published via a government website, that data also needs to be made available via agencies’ data repositories and data.gov. Agencies need to understand that releasing open data sets is not something you do every once in a while to meet a mandate or deadline–it is something to do always, forever.
Refine existing data sets
There is a lot of data available currently. However, much of it comes in varying formats, follows inconsistent data models and isn’t always immediately usable in spreadsheets, applications and analysis. There is a great deal of work to be done cleaning, normalizing and refining existing data, as well as deploying APIs around open data, which would increase adoption and the chances it will be put to use.
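As a sketch of the kind of refinement this involves, the snippet below normalizes records that describe the same thing with different field names and date formats. All field names and formats here are invented for illustration, not a real government schema.

```python
# Hypothetical sketch: map inconsistent field names and date formats
# found in published data sets to one common, predictable model.
from datetime import datetime

def normalize_record(raw):
    """Reduce several hypothetical field-name variants to one model."""
    name = raw.get("agency_name") or raw.get("Agency") or raw.get("agency")
    date = raw.get("published") or raw.get("pub_date")
    # Accept two date formats commonly seen in the wild, emit ISO 8601.
    for fmt in ("%m/%d/%Y", "%Y-%m-%d"):
        try:
            date = datetime.strptime(date, fmt).date().isoformat()
            break
        except ValueError:
            continue
    return {"agency": name.strip(), "published": date}

print(normalize_record({"Agency": " GSA ", "pub_date": "01/13/2014"}))
# → {'agency': 'GSA', 'published': '2014-01-13'}
```

Small as it looks, this is the unglamorous work that makes a data set usable in spreadsheets, applications and analysis.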
Refine existing APIs
Like open data, there are many existing APIs across the federal government, and these APIs could use a lot of work to make them more usable by developers. With a little elbow grease, existing APIs could be standardized by generating common API definitions like Swagger, API Blueprint and RAML, which would help quantify all APIs, but also generate interactive documentation and code samples, and provide valuable discovery tools for understanding where interfaces are and what they offer. The mission for agencies up until now has been to deploy APIs, and while this remains true, the need to evolve and refine APIs will go a long way towards building the valuable case studies needed to get to the next level.
Robust /developer areas
There are over 50 agencies that have successfully launched a /developer area to support open data and API efforts. Much like the open data and the APIs themselves, they represent a mishmash of approaches, and provide varied amounts of resources and necessary support materials. HowTo.gov already provides some great information on how to evolve an agency’s developer area; we just need some serious attention spent on helping each agency make it so. It doesn’t matter how valuable the open data or APIs are–if they are published without proper documentation, support and communication resources, they won’t be successful. Robust developer areas are essential to federal agencies finding success in their API initiatives.
Hire evangelists
Every successful API initiative in the private sector, from Amazon to Twilio, has employed evangelists to spread the word and engage developers, helping them find success in putting API resources to use. Each federal agency needs its own evangelist to work with internal stakeholders, making sure open data is published regularly, new APIs are deployed and existing resources are kept operational and up to date. Evangelists should have counterparts at the OSTP / OMB / GSA levels providing feedback and guidance, as well as regular engagement with evangelists in the private sector. Evangelism is the glue that will hold things together across the agency, as well as provide the critical outreach to the private sector to increase adoption of government open data and APIs.
Build public-private sector partnerships
Opening up data and APIs by the federal government is about sharing the load with the private sector and the public at large. Open data and APIs represent the resources the private sector will need to build applications and sites, and to fuel industry growth and job creation. A new type of public-private sector partnership needs to be defined, allowing companies and non-profit groups to access and use government services and resources in a self-service, scalable way. Companies should be able to build businesses around government Internet services, much like the ecosystem that has grown from the IRS e-File system, with applications like TurboTax that reach millions of U.S. citizens and allow corporations to help government share the load while also generating necessary revenue.
Establish meaningful case studies
When it comes to open data and APIs, nothing gets people on board more than solid examples of the impact open data, APIs and the applications built on them have made in government. Open government proponents use weather data and GPS as solid examples of open data and technology impacting not just government, but also the private sector. We need to fold the IRS e-file ecosystem into this lineage, but also work towards establishing numerous other case studies we can showcase and tell stories about why open data and APIs are important–in ways that truly matter to everyone, not just tech folks.
Educate and tell stories within government
In order to take open data and APIs to the next level in government, there needs to be an organized and massive effort to educate people within government about the opportunities around open data and APIs, as well as the pitfalls.
Regular efforts to educate people within government about the technology, business and politics of APIs need to be scaled, weaving in relevant stories and case studies as they emerge around open data and APIs. Without regular, consistent education efforts and sharing of success stories across agencies, open data and APIs will never become part of our federal government DNA.
Inspire and tell stories outside government
As within government, the stories around government open data and APIs need to be told outside the Washington echo chamber, educating citizens and companies about the availability of open data and APIs, and inspiring them to take action by sharing success stories of others using open data and APIs in the development of applications.
The potential of popular platforms like Amazon and Twitter spread through word of mouth amongst developer and power user communities; the same path needs to be taken with government data and API resources.
The next 2-3 years of the API strategy for the U.S. government will be about good old-fashioned hard work, collaboration and storytelling. We have blueprints for what agencies should be doing when it comes to opening up data, deploying APIs and enticing the private sector to innovate around government data; we just need to repeat and scale until we reach the next level.
How do we know when we’ve reached the next level? When the potential of APIs is understood across all agencies, and the public instinctively knows they can go to any government /developer domain and find the resources they need, whether they are an individual or a company looking to develop a business around government services.
The only way we will get there is by building a solid group of strong case studies of success in making change in government using open data and APIs. Think of the IRS e-file system, how many citizens this ecosystem reaches, and the value generated through commercial partnerships with tax professionals. We need 10-25 similar stories of how APIs have impacted people’s lives, strengthened the economy and made government more efficient before we can consider ourselves at the next level.
Even with the housekeeping we have to do, what should be next for Data.gov?
Data.gov has continued to evolve, adding data sets, agencies and features. With recent, high-profile stumbles like Healthcare.gov, it can be easy to fall prey to the historical stereotype that government can’t deliver tech very well. While this may apply in some cases, I think we can get behind the movement that is occurring at Data.gov, with 176 agencies working to add 54,723 data sets in the last 12 months.
I feel pretty strongly that before we look towards the future of what the roadmap looks like for Data.gov, we need to spend a great deal of time refining and strengthening what we currently have available at Data.gov and across the numerous government agency developer areas. Even with these beliefs, I can’t help but think about what is needed for the next couple years of Data.gov.
Maybe I’m biased, but I think the next step for Data.gov is to set its sights squarely on the API. How do we continue evolving Data.gov, and prepare to not just participate, but lead in the growing API economy?
Management tools for agencies
We need to invest in management tools for agencies, both from commercial providers and as umbrella, government-wide solutions. Agencies need to be able to focus on the best quality data sets and API designs, and not have to worry about the more mundane necessities of API management like access, security, documentation and portal management. Agencies should have consistent analytics, helping them understand how their resources are being accessed and put to use. If OSTP, OMB, GSA and the public expect consistent results when it comes to open data and APIs from agencies, we need to make sure they have the right management tools.
Endpoint design tools for data sets
Agencies should be able to go from data set to API without much additional work. Tools should be made available for easily mounting published data sets, then allowing non-developers to design API endpoints for easily querying, filtering, accessing and transforming them. While data download will still be the preferred path for many developers, making high value data sets available via APIs will increase the number of people who can access them–people who may not have the time to deal with the overhead of downloads.
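To illustrate the idea, here is a minimal sketch of the kind of generic query endpoint such tooling could generate over any published data set. The data set and its fields are invented for illustration.

```python
# Hypothetical example data set, as it might be mounted from a published file.
DATASET = [
    {"agency": "GSA", "year": 2013, "budget": 25.9},
    {"agency": "DOE", "year": 2013, "budget": 28.4},
    {"agency": "GSA", "year": 2012, "budget": 26.3},
]

def query(dataset, **filters):
    """Generic endpoint logic: every query parameter becomes an
    equality filter, with no per-data-set code required."""
    return [row for row in dataset
            if all(row.get(k) == v for k, v in filters.items())]

print(query(DATASET, agency="GSA", year=2013))
# → [{'agency': 'GSA', 'year': 2013, 'budget': 25.9}]
```

The point is that once data sets share a consistent shape, the querying, filtering and transforming machinery can be built once and reused everywhere.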
Common portal building blocks
When you look through each of the 50+ federal agency /developer areas, you see 50+ different approaches to delivering the portals that developers will depend on to learn about, and integrate with, each agency’s APIs. A common set of building blocks is needed to help agencies standardize how they deliver their developer portals. Each approach might make sense within its own agency, but for a consumer trying to work across agencies, it can be a confusing maze of API interactions.
Standard developer registration
As a developer, I currently need to establish a separate relationship with each federal agency. This quickly becomes a barrier to entry, one that will run off even the most seasoned developers. We want to incentivize developers to use as many federal APIs as possible; providing them with a single point of registration and a common credential that works across agencies will stimulate integration and adoption.
Standard application keys
To accompany standard developer registration, a standard approach to user and application keys is needed across federal agencies. As a user, I should be able to create a single application definition and receive API keys that will work across agencies. The work required to develop an application while managing multiple API keys will prevent developers from adopting multiple federal agency APIs. Single registration and application keys will reduce the barriers to entry for the average developer looking to build on top of federal API resources.
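Here is a rough sketch of what a single shared credential could look like from the developer’s side. The agency base URLs and the X-Api-Key header name are assumptions for illustration, not an existing government standard.

```python
# Hypothetical registry of agency API base URLs a developer might target.
AGENCY_APIS = {
    "census": "https://api.census.gov",
    "energy": "https://api.energy.gov",
}

def build_request_headers(api_key):
    """One credential, issued once, sent to every agency the same way."""
    return {"X-Api-Key": api_key}

def agency_url(agency, path):
    """Build a request URL against any participating agency."""
    return AGENCY_APIS[agency].rstrip("/") + "/" + path.lstrip("/")

# The same key and header work regardless of which agency is called.
headers = build_request_headers("my-single-federal-key")
print(agency_url("census", "/population"))
# → https://api.census.gov/population
```

The value is entirely in the consistency: the developer writes the authentication code once and reuses it across every federal API.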
Developer tooling and analytics
When integrating with private sector APIs, developers enjoy a wide range of tools and analytics that assist them in discovering, integrating, managing and monitoring their applications’ integration with APIs. This is something that is very rare when integrating with federal government APIs. Standard tooling and analytics for developers need to become part of the standard operating procedures for federal agency /developer initiatives, helping developers be successful in all aspects of their usage of government open data and APIs.
Swagger, API Blueprint, RAML
All APIs produced in government should be described using one of the common API definition formats that have emerged, like Swagger, API Blueprint and RAML. These all provide a machine readable description of an API and its interface that can be used in discovery, interactive documentation, code libraries and SDKs, and many other uses. Many private sector companies are already doing this, and the federal government should follow their lead.
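For illustration, here is a minimal, hypothetical Swagger-style definition for an imaginary agency budget endpoint, built as a plain structure so the machine readable nature is obvious. The paths, parameters and agency domain are invented.

```python
import json

# A minimal, hypothetical Swagger-style API definition. A real definition
# would describe every endpoint, parameter and response the API exposes.
definition = {
    "apiVersion": "1.0",
    "swaggerVersion": "1.2",
    "basePath": "https://api.example-agency.gov",
    "apis": [{
        "path": "/budget/{year}",
        "operations": [{
            "method": "GET",
            "nickname": "getBudgetByYear",
            "summary": "Retrieve the agency budget for a given fiscal year",
            "parameters": [{"name": "year", "paramType": "path",
                            "type": "integer", "required": True}],
        }],
    }],
}

# Because the definition is plain data, tools can consume it to drive
# interactive documentation, code generation and discovery.
print(json.dumps(definition, indent=2))
```

Once every agency publishes a definition like this, documentation, SDKs and discovery stop being hand-built, one-off efforts.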
Discoverability, portable interfaces and machine readable by default
As with open data, APIs need to be discoverable, portable and machine readable by default. Describing APIs in Swagger, API Blueprint or RAML accomplishes this, allowing APIs to be presented, distributed and reused in new ways. Each agency can publish its own APIs, while aggregators take the machine readable definitions from each and publish them in a single location. This approach allows for meaningful interactions, such as a single budget API site providing access to every federal agency’s budget without having to visit each /developer area–and there are many more examples like this that will increase API usage and extend the value of government APIs.
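The aggregation step above can be sketched in a few lines: harvest each agency’s machine readable definition and roll them into one cross-government catalog. The agency names and definition contents here are invented for illustration.

```python
def aggregate(agency_definitions):
    """Combine per-agency, Swagger-style definitions into a single
    discoverable catalog of every API across government."""
    catalog = []
    for agency, definition in agency_definitions.items():
        for api in definition.get("apis", []):
            catalog.append({"agency": agency, "path": api["path"]})
    return catalog

# Hypothetical definitions harvested from two agency /developer areas.
definitions = {
    "treasury": {"apis": [{"path": "/budget"}]},
    "energy":   {"apis": [{"path": "/budget"}, {"path": "/facilities"}]},
}

# One catalog now answers "who publishes a budget API?" across agencies.
print(aggregate(definitions))
```

This is exactly the move that makes a single cross-agency budget site possible without any agency giving up ownership of its own API.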
Mashape, API Hub
A new breed of API directories has emerged. API portals like Mashape and API Hub don’t just provide a directory of APIs; they simplify API management for providers and integration for consumers. Federal agencies need to make their APIs friendly to these API hubs, maintaining active profiles on all platforms, keeping each API listed within the directories and actively engaging consumers via the platforms. Federal agencies shouldn’t depend on developers coming to their /developer areas to engage with their APIs; agencies need to reach out to where developers are already actively using APIs.
Consistent API interface definition models
Within the federal government, each API is born within its own agency’s silo. Very few interface designs and data models are shared across agencies, resulting in APIs that may do the same thing, but do it in radically different ways. Common APIs such as budget or facility directories should use a common interface design and underlying data model. Agencies need to share interface designs, and work together to make sure the best patterns across the federal government are used.
Webhooks
In the federal government, APIs are often a one-way street that allows developers to come and pull information. To increase the value of data and other API driven resources, and help reduce the load on agency servers, APIs need to push data out to consumers, reducing polling and making API integration much more real-time. Technologies such as webhooks–which allow an API consumer to provide a web URL to which agencies can push newly published data, changes and other real-time events–are being widely used to make APIs much more of a two-way street.
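A minimal, in-memory sketch of the webhook pattern: consumers register a callback URL, and the agency pushes events to them instead of being polled. In a real system, the delivery step would be an HTTP POST to each registered URL; here it is injected as a function so the flow is visible.

```python
# Registered consumers: consumer id → callback URL (values are invented).
subscribers = {}

def register(consumer_id, callback_url):
    """A consumer subscribes by handing the agency a URL to push to."""
    subscribers[consumer_id] = callback_url

def notify(event, deliver):
    """Push an event to every registered consumer. In production,
    deliver(url, event) would POST the JSON payload to the URL."""
    for consumer_id, url in subscribers.items():
        deliver(url, event)

# Simulate one consumer receiving a push when a data set is published.
received = []
register("app-1", "https://consumer.example.com/hooks")
notify({"dataset": "budget-2014", "action": "published"},
       deliver=lambda url, event: received.append((url, event)))
print(received)
```

The agency does one push per event instead of answering thousands of polling requests, which is the whole appeal for government servers.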
Hypermedia
As the world of web APIs evolves, new approaches are emerging for delivering the next generation of APIs, and hypermedia is one of these trends. Hypermedia brings a wealth of value, but most importantly it provides a common framework for APIs to communicate, supplying essential business logic and direction for developers, and helping them use APIs in line with API provider goals. Hypermedia has the potential to not just make government assets and resources available, but ensure they are used in the best interest of each agency. Hypermedia is still gaining traction in the private sector, but we are also seeing a handful of government groups take notice. Hypermedia holds a lot of potential for federal agencies, and the groundwork and education around this trend in APIs needs to begin.
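As a hedged sketch of the idea, here is what a hypermedia-style response might look like using a HAL-like _links convention: alongside the data, the API tells the client what it can do next. The resource and link names are invented.

```python
def budget_resource(agency, year, total):
    """Build a hypothetical hypermedia response: the data plus links
    describing the actions the agency wants to make available."""
    return {
        "agency": agency,
        "year": year,
        "total": total,
        "_links": {
            "self":       {"href": f"/budget/{agency}/{year}"},
            "next-year":  {"href": f"/budget/{agency}/{year + 1}"},
            "line-items": {"href": f"/budget/{agency}/{year}/items"},
        },
    }

resource = budget_resource("gsa", 2014, 26.1)
# A client follows the links it is given rather than hard-coding URLs,
# letting the agency steer how its resources are used.
print(resource["_links"]["self"]["href"])
# → /budget/gsa/2014
```

Because the links come from the provider, an agency can evolve or redirect its resources without breaking well-behaved clients, which is exactly the control this section argues agencies need.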
Evangelists
The first thing you notice when you engage with a government API is that nobody is home. There is nobody to help you understand how it works or overcome obstacles when integrating. There is no face behind the blog posts, the tweets or the forum replies. Federal government APIs have no personality. Evangelists are desperately needed to bring this human element to federal government APIs. All successful private sector APIs have an evangelist, or an army of evangelists, spreading the word, supporting developers, and making things work. We need open data and API evangelists at every federal agency, letting us know someone is home.
Read / Write
I know this is a scary one for government, but as I said above in the webhooks section–APIs need to be a two-way street. There are proven ways to make APIs writeable without jeopardizing the integrity of API data. Allow for trusted access, and let developers prove themselves. There is a lot of expertise, “citizen cycles,” and value available in the developer ecosystem. When a private sector company uses federal government data and improves on it, the government and the rest of the ecosystem should be able to benefit as well. The federal government needs to allow for both read and write on APIs–this will be the critical next step that makes government APIs a success.
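One way the “let developers prove themselves” idea might work is tiered access, sketched below with invented keys and status strings. This is an illustration of the pattern, not an existing government system.

```python
# Hypothetical key registry: every registered key can read, but only
# keys that have earned "trusted" status are allowed to write back.
KEYS = {
    "new-dev-key": {"trusted": False},
    "proven-key":  {"trusted": True},
}

def handle(api_key, method):
    """Gate writes on trust: reads are open to all registered keys,
    writes (POST/PUT/DELETE) require trusted status."""
    key = KEYS.get(api_key)
    if key is None:
        return "401 unauthorized"
    if method in ("POST", "PUT", "DELETE") and not key["trusted"]:
        return "403 write access requires trusted status"
    return "200 ok"

print(handle("proven-key", "POST"))   # → 200 ok
print(handle("new-dev-key", "POST"))  # → 403 write access requires trusted status
```

Promoting a key from untrusted to trusted then becomes a policy decision each agency can make based on a developer’s track record, rather than an all-or-nothing gamble.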
These are just 14 of my recommendations for the next steps in the API strategy for the federal government. As I said earlier, none of this should be done without first strengthening what we have already done in the federal government around open data and APIs. However, even as we play catch-up on what’s already there, we can’t stop looking towards the future and understanding what needs to come next.
None of these recommendations are bleeding edge, or technology just for technology’s sake. This is about standardizing how APIs are designed, deployed and managed across the federal government, emulating what has already been proven to work in the private sector. If the federal government wants to add to the OpenData500, and establish the meaningful stories needed to deliver on the promise of open data and APIs, this is what’s needed.
With the right leadership, education and evangelism, open data and APIs can become part of the DNA of our federal government. We have to put aside a purely techno-solutionist view and realize this is seriously hard work, with many pitfalls and challenges, and that in reality it won’t happen overnight.
However, if we dedicate the resources needed, we can not just change how government works, making it machine readable by default–we can forever alter how the private and public sectors work together.