
API Partners News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API partner conversation, and that I wanted to include in my research. I'm using all of these links to better understand how the space is approaching partnerships, going beyond just the technology and understanding the business details of each relationship.

API Evangelist And Streamdata.io

Some of you in my backchannels know that I’ve been shopping around for a job lately. I’m looking to make a shift in API Evangelist, as I’ve written about some (and will write about more), and I’m also looking for a shift in how I fund my world. During a multi-week search I opened up conversations about a couple of different roles, and one particularly interesting partnership came my way from a company I’ve been working with for a while now. I’ve entered into a partnership with the Streamdata.io team, to help them chart the course for their real-time, streaming, event-sourced future, and they’ll continue to invest in me being the API Evangelist. In the past I’ve had several partners at any one time, but moving forward I’m going to limit it to being with a single partner–changing the formula a little bit.

I'll talk about why the older model wasn't working in other posts on my personal blog, but moving forward API Evangelist will continue to be about my research into the wider API space, while increasingly having a focus on what I'm also doing with Streamdata.io. I'm interested in helping Streamdata.io understand the API sector, while I am helping my readers understand what Streamdata.io does. I will be actively telling stories about what I'm up to on a daily basis via API Evangelist, and I will continue to research the wider API space. I will also be applying this knowledge to helping the Streamdata.io team define their place in the API sector, telling the stories in real-time on API Evangelist--this is valuable to Streamdata.io, and hopefully it is also valuable to you.

Streamdata.io is all about helping define the real-time, streaming layer of the API space--something I have been focused on for a number of years. They are also invested in helping define the data streaming, event-sourced future of the API space, something I've lightly invested in with my webhooks research, but am keen to further understand when it comes to Kafka and other emerging trends. It fits well with what I'm seeing happen in the API space, as well as my interests when it comes to technology. Honestly, it also fits with my interest in having a stable income, and not having to chase stories in the name of page views, or other random projects just because they might bring in some money to pay the bills. I'm not too proud to say that I was ready for Streamdata.io's help, and it is greatly appreciated.

This partnership wasn't the next step I envisioned for API Evangelist, but it was the quickest, most natural, and most positive offer to show itself--right when I needed it. To the API echo chamber, I think what I offer via API Evangelist will change substantially; however, I'm guessing API providers and API service providers probably won't even notice a difference. I'm looking forward to partnering with Streamdata.io to help get the word out about the dead simple real-time streaming solutions they provide on top of existing APIs, and I'm also excited that they'll continue to invest in me being the API Evangelist. Here's to a great partnership, Streamdata.io--I look forward to the stories we can tell together in 2018. I'm looking forward to continuing to help define the API space, with a focus on the perspective that Streamdata.io brings to the table. Something that goes well beyond just streaming data--stay tuned!
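
To give a feel for what this streaming layer looks like in practice, here is a minimal sketch of consuming a Streamdata.io-style proxy from Python. The proxy URL format, token parameter, and event names are my assumptions based on how the service has been documented, and the target API is hypothetical--treat this as a sketch, not an official client.

    import json
    import requests   # pip install requests
    import jsonpatch  # pip install jsonpatch

    TARGET_API = "https://api.example.com/stocks"               # hypothetical API to stream
    PROXY_URL = "https://streamdata.motwin.net/" + TARGET_API   # assumed proxy format
    APP_TOKEN = "YOUR_APP_TOKEN"                                # placeholder credential

    # The proxy holds the connection open and pushes Server-Sent Events:
    # a full JSON snapshot first, then JSON Patch diffs as the data changes.
    response = requests.get(PROXY_URL, params={"X-Sd-Token": APP_TOKEN},
                            headers={"Accept": "text/event-stream"}, stream=True)

    snapshot, event = None, None
    for line in response.iter_lines(decode_unicode=True):
        if line.startswith("event:"):
            event = line.split(":", 1)[1].strip()
        elif line.startswith("data:"):
            payload = json.loads(line.split(":", 1)[1].strip())
            if event == "data":      # initial snapshot of the API response
                snapshot = payload
            elif event == "patch":   # incremental JSON Patch (RFC 6902) diff
                snapshot = jsonpatch.apply_patch(snapshot, payload)
            print(snapshot)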


Learning To Play Nicely With Others Using APIs

This is a topic I talk about often, write about rarely, but experience on a regular basis doing APIs. It has to do with encounters I have with people in companies who do not know how to share and play nicely with other companies and people, yet want to do APIs. For a variety of reasons these folks approach me to learn more about APIs, but are completely unaware of what it takes, and how it involves working with external actors. Not all of these APIs are public, but many of them involve engaging with individuals outside the corporate firewall, and come with a heavy focus on the technical and business aspects of doing APIs, while rarely ever considering the human, and more political, aspects of API engagements.

I find that people tend to have wildly unrealistic expectations of technology, and believe that APIs will magically connect them to some unlimited pool of developers, or seamlessly connect disparate organizations across vast distances. People come to me hoping I will be able to explain it all to them in a single conversation, or provide it in a short white paper, a state of being that also makes individuals very susceptible to vendor promises. People want to believe that APIs will fix the problems they have introduced into their operations via technology, and effortlessly connect them with outside revenue streams and partnerships, unencumbered by the same human challenges they face within their firewalls.

Shortly after getting to know people it often becomes apparent that they possess a lot of baggage, legacy processes, and beliefs that they are often unaware of. Or, more appropriately, they are unaware that these legacy beliefs are not normal, assuming they are something everybody faces, when they are usually dysfunctions unique to a specific organization. People usually have hints that these problems aren't normal, but just aren't equipped to operate outside their regular sphere of influence, and within days or weeks of engagement, their baggage becomes more visible. They start expecting you to jump through the same hoops they do. Respond to constraints that exist in their world. See data, content, and algorithms through the same lens as their organization. At this point folks are usually unaware they've opened the API door on their world, and the light is shining in on their usually insulated environment--revealing quite a mess.

This is why you'll hear me talk of "dating" folks when it comes to API engagements. Until you start opening, and leaving open, the door on an organization for multiple days at a time, you really can't be sure what is happening behind those closed doors. I am also not always confident that someone can keep internal dysfunction from finding its way out. I don't expect all organizations to be dysfunction free, I'm just looking for folks who are aware of their dysfunction, and are looking to actively control, rate limit, and minimize its external impacts, as well as being willing to work through some of this dysfunction as they engage with external actors. I have to also acknowledge that this game is a two-way street, and I also encounter API consumers who are unable to leave their baggage at home, and bring their seemingly natural internal dysfunction to the table when looking to put external resources to work across their operations. In general, folks aren't always equipped to play nicely with others by default, it takes some learning, and evolving, but this is why we do APIs. Right?

This is why I suggest people become API consumers before they become API providers, so that they can truly understand what it is like to be on both sides of the equation. API providers who haven't experienced what it's like to be on the receiving end of an external API rarely make for empathetic API providers. Our companies, organizations, institutions, and government agencies often insulate us from the outside world, which works in many positive ways, but as you are aware, it also works in some negative, and very damaging, political ways. The trick with doing APIs is to slowly change this behavior, letting in outside ideas and resources without too much disruption, internally or externally. Then over time, we learn to play nicely with others, and build relationships with other groups and individuals who might benefit our organizations, while we might also help bring value to their way of doing things as well.


Google Partner API As A Blueprint For Other APIs

I've been tracking on how API providers operate their partner programs for a while now, in hopes of pulling together some sort of blueprint that other API providers can follow. I'm always happy to find a stop along the API life cycle where an API provider has already developed a robust operational API, like the Google Partner API.

The Google Partner API provides the essential building blocks of a pretty sophisticated partner program API, including company, messaging, lead, offer, status, profile, user, relations, and analytics. It is a nicely designed API, providing a complete set of paths, with lots of detail and robustness when it comes to the surface area of the API and the data it returns.

When you look at the documentation for the Google Partner API, it provides a good example of where Google is taking their API design across the company, providing a REST as well as a gRPC version of the resource. All future APIs at the search giant will follow this approach, providing simpler REST APIs, with higher performance gRPC editions for those who want them.
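
As a rough illustration of what the REST edition looks like to a consumer, here is a sketch in Python. The base path, parameters, and response fields are assumptions made for illustration--consult Google's documentation for the actual surface area.

    import requests

    BASE_URL = "https://partners.googleapis.com/v2"  # assumed base path
    API_KEY = "YOUR_API_KEY"                         # placeholder credential

    # List partner companies matching a name, using a plain HTTP GET--the
    # simpler REST edition of the API described above.
    response = requests.get(f"{BASE_URL}/companies",
                            params={"key": API_KEY, "companyName": "Acme"})
    response.raise_for_status()

    for company in response.json().get("companies", []):
        print(company.get("id"), company.get("name"))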

Next time I update my partner API research, this story will be present. I will look for other partner API examples, and then take the design patterns present in the Google Partner API, and publish the OpenAPI for the project, as a suggestion for how you could design your partner API. Maybe I can even convince someone to craft some server-side implementations of the API, on a variety of platforms.


Complementary APIs For The Oxford Dictionaries API

Many API providers I meet have the "build it and they will come" mentality, thinking that if they build an API, developers will come and use it. It does happen, but many APIs only have so many direct uses, and will have a limited number of resulting implementations. This is one of the reasons I recommend companies do APIs in the first place, to get beyond the obvious and direct implementations, and incentivize entirely new applications that a provider may not have considered.

Developing innovative applications an API provider may not have considered is the primary focus of companies I talk to, but only a handful have begun thinking about the other APIs out there that might complement an API. An example of this is with my friends over at the Oxford Dictionaries API, where building applications with built-in dictionaries is pretty obvious. However, the complementary API partnering and usage might not be as obvious, something I'm encouraging their team to think more about.

What are some examples of complementary APIs for the Oxford Dictionaries API?

  • API Design - This example is more of a meta API than a complementary API, but I'd like to see more API design tooling begin to use dictionaries as part of their GUI and IDE interfaces. Providing more structure to the way we design our APIs, helping ensure that the design of leading APIs is more intuitive and coherent--achieving a less technical interface, and something that speaks to humans.
  • Machine Learning - I'm using a number of machine learning APIs to accomplish a variety of business tasks from object recognition in images, to text and language pattern recognition in content I produce or curate. There are a number of opportunities to extend the responses I get back from these APIs using a dictionary, helping increase the reach of the machine learning services I employ, and making them more effective and meaningful in my business operations.
  • Search - I use the Algolia API to build a variety of search indexes for the API space, helping me curate and recall a variety of information that I track on as part of my API monitoring. I'm considering building an augmented synonym layer that helps me search for terms, then expand those searches based upon suggestions from the Oxford Dictionaries API (see the sketch after this list). I'm looking to augment and enrich the Algolia search capabilities with the suggestions from the dictionary API, expanding beyond any single keyword or phrase I identify as relevant to any industry or topic being served by the API sector.
  • Bots - If you are going to build convincing bots using the Slack or Facebook APIs you will need a robust vocabulary to work with. You won't just need a single downloaded and stored dictionary, you will need one like the Oxford Dictionaries API that evolves and changes with the world around us to enrich your bot's presence. 
  • Voice - Similar to bots, if you are going to deliver a robust voice interface via Alexa, or other voice enabled platform, you will need a rich dictionary to work from. Similar to how I'm expanding and enriching the other suggestions listed above, a dictionary layer will expand on how you will be able to interpret the text that is derived from each voice interface engagement.
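
Here is a rough sketch of that synonym-expansion idea in Python. The Oxford Dictionaries endpoint path, version, header names, and response structure are assumptions drawn from my reading of their public docs, and the Algolia side is left as a comment--verify everything against the current documentation before using it.

    import requests

    OD_BASE = "https://od-api.oxforddictionaries.com/api/v1"        # assumed version
    HEADERS = {"app_id": "YOUR_APP_ID", "app_key": "YOUR_APP_KEY"}  # placeholders

    def expand_terms(term):
        """Return the term plus any synonyms the dictionary API suggests."""
        resp = requests.get(f"{OD_BASE}/entries/en/{term}/synonyms", headers=HEADERS)
        if resp.status_code != 200:
            return [term]
        terms = [term]
        # Walk the (assumed) response structure, collecting synonym text.
        for result in resp.json().get("results", []):
            for lexical_entry in result.get("lexicalEntries", []):
                for entry in lexical_entry.get("entries", []):
                    for sense in entry.get("senses", []):
                        terms += [s["text"] for s in sense.get("synonyms", [])]
        return terms

    # Each expanded term would then be sent to the Algolia index as its own
    # query, widening recall beyond the original keyword.
    print(expand_terms("stream"))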

These are just a few examples of how one API could provide complementary features to other APIs out there. It is important for API providers to be reaching out to other API providers and their communities when evangelizing their own APIs. Sure, you should be incentivizing developers to build applications on top of your APIs, but your definition of "application" should include other APIs that your resources complement, enriching the solutions being developed on top of their platforms.

For a space where integration and interoperability appear to be priority number one, the API sector is notoriously closed, and many API providers I meet have their blinders on. As an API provider you should always be studying the rest of the API sector, looking for good examples to follow, as well as bad examples to avoid--similar to what I do as the API Evangelist. Along this journey you should also be looking for APIs complementary to your platform, and vice versa. Think of SendGrid and Twilio as an example--email, SMS, and voice all go together very well, and their communities significantly overlap, providing the makings for a ripe partnership.

What are a couple of existing APIs that your APIs complement? This is your homework assignment for the week! ;-)


Moving APIs Out Of The Partner Realm And Making Them More Public

It is common for API providers to be really private with their APIs, and we often hear about providers restricting access as time goes by. So, when API providers loosen up restrictions on their APIs, inviting wider use by developers, and making them public, I think it is worth taking notice.

A recent example of this in the wild is from the API poster child Twitter, with their Periscope video service. Twitter has announced that they are slowly opening up access to the Periscope video API, something that has been only available to a handful of trusted partners, and via the mobile application--there was no way to upload a video without using your mobile device. Twitter is still "limiting access to fewer strategic partners for a period", but at least you can apply to see if your interests overlap with Twitter's interests. It also sounds like Twitter will continue to widen access as time goes on, and as it makes sense to their Periscope strategy.

While I wish we lived in a world where API developers were all well behaved, and API providers could open up services to the public from day one, this isn't the reality we find on the web today. Twitter's cautious approach to rolling out the Periscope API should provide other API providers with an example of how you can do this sensibly, only letting in the API consumers that are in alignment with your goals--slowly opening up, making sure the service is stable and meets the needs of consumers and partners, while also helping meet the platform's business objectives.

Hopefully, Twitter's approach to launching the Periscope API provides us with a positive example of how APIs can become more public, instead of always locking things down. There is no single way to launch your APIs, but I'd say that Twitter's cautious approach is probably a good idea when you operate in such a competitive space, or when you don't have a lot of experience operating a fully public API--start with a handful of trusted partners, and open things up to a slightly wider group as you feel comfortable.

Thanks, Twitter for providing me with a positive example I can showcase--it is good to see that after a decade you can still launch new APIs, and understand the value of them to your core business--even amidst the often intense and volatile environment that is the Twitter platform.


Looking For Partners At Every Turn When Planning Your API Evangelism

One reason for having a well thought out, comprehensive API strategy is that you are thinking about all the moving parts, and at every turn you can weave things together, potentially amplifying the forward motion of your API operations. With every new release you should be considering all other areas of your API strategy, how you can include your API consumers and your partners, and where all of the opportunities lie when it comes to evangelism and storytelling.

This approach was present in the latest release from Postman, with the release of their Run in Postman embeddable button. As part of the release of the new embeddable button for their API client tooling, Postman also made a call for partners. It was a savvy marketing technique to reach out to partners like this, something that will not just generate implementations in the wild, but will also establish new partnerships and strengthen existing ones, while generating new blog posts and press releases around API platform operations.

When it comes to API Evangelist operations, the approach by Postman is gold when I consider it across many areas of my own strategy. Postman is showcased as part of my API client research, their product release exists within my embeddable API research, and it also lives within my API evangelism and API partners research. The moral of this story is that Postman thought across their whole strategy when crafting the release of their new feature, and I put similar thought across my own strategy while reading their blog post, giving us both an opportunity to establish the biggest bang possible around a single API platform release.


How Do I Provide A List Of Certified Applications To My API Ecosystem Partners

I was emailed by someone working in government, asking some pretty interesting questions around using an application showcase to make trusted applications available to an ecosystem of partners. I'm not going to talk specifically about the agency, as I have not gotten approval to talk publicly, but I think the question is an interesting mix of several areas I am researching, which I wanted to explore a little further, in an anonymous fashion.

This is a heavily edited, summarized version of what was asked, but it essentially went like this:

There are 400 apps that want to get data from an organization, but only some portion of them meet or exceed the governance criteria to be deemed "trustworthy". This usually involves ensuring certain legal commitments are followed and other privacy requirements are satisfied. These 400 apps will be evaluated, and if they qualify, they will be put into an application registry. Another aspect is that rather than each of the 400 apps having to go to each of the partners to get authorized and access the partner's API, it would be more efficient for partners to rely on the application registry to determine if they can expose their APIs to an app's request(s).

Application showcases that share approved applications with an ecosystem are common, but what I found interesting about this agency's question is that they want to also use the application approval process as a sign of trust for other partners, when it comes to accessing their own API resources. As I said, this conversation spans three key areas of what I'm seeing occur in modern API ecosystems:

  • 3rd Party Applications (i.e. approval, showcase, case studies, etc.)
  • Partner Programs (i.e. certification, access tiers, etc.)
  • Service Composition (i.e. plans, monetization, rate limiting, etc.)

Modern approaches to API planning, and well thought-out API driven partner programs, provide the approaches you need to approve partners (i.e. companies), and the applications they build. Then your service composition, using API management infrastructure like 3scale, is how you craft the partner tiers of access, and provide the infrastructure APIs your partners need to know which applications have which levels of access. I've written about service composition many times, but my partner research is just getting going, so I wanted to take a look at the API pioneers as potential blueprints in these areas--let's start with Amazon's approach.

With Amazon, both the individual and group levels offer several layers of certification, depending on the individual, as well as the software being sold (i.e. government, hardware, AWS marketplace, SaaS, etc.). For some more examples of partner programs, I am taking a look at Salesforce with their certification program, eBay's Solutions Directory, Twitter's official Partner Program, Facebook's Marketing Partners program, and the Edmunds Certified Developer Network, for a start.

I felt this conversation provided me an interesting look at certifying partners (companies, organizations, agencies, individuals), and the software integration(s) they produce (app showcase, directory, etc.), then using that as a trust driver for the API service composition of your own platform, while also extending that identity, trust, and authentication for your partners to use in their own API service composition.

The application showcase portion of this story is interesting, but I think the use of partner program certification, and extending it to your partners so that they can use it in their own API service composition, is unique. Something that is totally possible if your API management, planning, and partner layers all have APIs. You can then use your infrastructure APIs to provide facades for your partners to access users, applications, partner program details, and other aspects of API operations, in a seamless way using APIs. *mind blown*
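
To make the registry idea a little more concrete, here is a hypothetical sketch of the trust check a partner's API could perform before serving an application. Every endpoint, field, and status value here is invented for illustration--the actual registry would be whatever the agency's API management layer exposes.

    import requests

    REGISTRY_API = "https://registry.agency.example.gov"  # hypothetical registry API

    def is_certified(app_id):
        """Ask the shared application registry if an app passed governance review."""
        resp = requests.get(f"{REGISTRY_API}/applications/{app_id}")
        if resp.status_code != 200:
            return False
        return resp.json().get("status") == "certified"  # invented field/value

    def handle_partner_request(app_id, requested_scope):
        # The partner only exposes its API when the registry vouches for the
        # app, instead of running its own approval process for all 400 apps.
        if not is_certified(app_id):
            return {"error": "application not in trusted registry"}, 403
        return {"data": f"records granted for scope {requested_scope}"}, 200

    print(handle_partner_request("app-123", "read:records"))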




Striking The Right Balance With API Evangelist Partners

I get a lot of requests from individuals and companies who want to "partner" with me, with wide variation in what this actually means. As a one person operation I have to be very careful who I engage with, because it is very easy for large organizations, with more resources, to drown me in talk, meetings, emails, and other alternatives to actually doing what it is that I do.

Another big challenge in partnering with new startups is that they often do not have a lot of money--lots of potential value, yet an unproven track record. I love believing in new startups, but in a world where they need me in the beginning, and not so much once they've made it, I have to be wary of any company that walks in the door. You may be a great bunch of folks, but once you pile on enough investors, and changes in executive management along the way--things change.

I have a lot of startups, VCs, and other companies who love to engage with me, "pick my brain", "understand my expertise", "get my take on where things are going", "craft their strategy", and many other informal ways to tap the experience and perspective I bring to the table. Honestly, I thrive on doing this, but after 5 years, and being screwed by numerous startups and open source projects, I am starting to get very wary about who I help.

I understand you are super excited about your new API idea or service. I can easily get that way too! You see, that excitement will fade once you get traction and become attractive to investors, and your willingness to work with me, and share your ideas, tools, and resources with the community, will fade with it. Twitter is the best example of this in the wild. No, I didn't help Twitter get going, but I've seen the same syndrome play out with 50+ API and service provider startups in the last five years of operating.

Startups offer me equity to alleviate my concerns. Yawwwwn! Means nothing. I have a file in the filing cabinet of worthless options. Others offer me some sort of return on traffic I send to them, and conversions I generate. I guess this is a start, but it doesn't touch on the big partner deals I can often help bring to the table, or the education and general awareness I work to build in the API sector. Traditional tracking mechanisms will only capture a small portion of the value I can potentially generate for an API startup, and honestly are a waste of our time.

This leaves me with just getting to know startups, and dating, as I like to say, for quite a while before I engage too heavily. My only defense is to be really public with all my partnerships from day one. State publicly that I am partnering with company X. Then tag each post, white paper, or research project that I do on behalf of the relationship. Don't get me wrong, I get a lot of value out of this work, otherwise I wouldn't be doing it. However, the line I want to draw in the sand is just a public acknowledgement that I am helping a company figure out their strategy, tell stories, and help shape their layer of the API space.

This post is just me brainstorming a new tier of my partner program, which I'm thinking I will call "strategy and storytelling partners". When I am approached by, or discover a new API company that I find interesting, and I begin investing time and resources into this relationship, I will publish this company on my list of partners. These relationships almost always do not involve money, and usually involve a heavy investment on my part when it comes to helping them walk through strategy, and storytelling around their products, services, and potentially helping define and evolve the sector they are looking to operate within.

In the end, after several weeks of mulling over this subject, I do not see any contractual or technological solution to tracking how I help API startups in the space--it has to be a human solution. I will almost always share the existence of a partnership with a company from day one, and it is up to me to modulate how much "investment" I give. As this benefits (or not) the startup, it will be up to the company itself to kick back to me (and the community) as payback. If you don't, and you end up taking or keeping the lion's share of the value generated by my work, and the community I bring to the table, it is on you. The timeline will be there for everyone else to judge. #karma


The New Mind Control APIs That Salesforce Is Testing On Conference Attendees Are Available To Premier Partners

The Dreamforce conference is happening this week in San Francisco, a flagship event for the Platform as a Service (PaaS) company. Salesforce is one of the original pioneers in API technology, allowing companies to empower their sales force using the latest in technology. In 2015, Salesforce is taking this to the next level, with a handful of attendees and partners at the conference.

Using smart pillow technology, Salesforce will be testing out a new set of subliminal mind control APIs. All attendees of the Dreamforce conference have agreed to be part of the tests, through their acceptance of the event terms of service, but only a small group of 500 individuals will actually be targeted. Exactly which attendees are selected will be a secret, even from the 25 partners who will be involved in the test.

Through carefully placed hotel pillows, targeted attendees will receive subliminal messages, transmitted via smart pillow APIs developed by Salesforce. Messages will be crafted in association with partners, testing out concepts of directing attendees in what they will eat the next day, which sessions they will attend, where they will go in the exhibit hall, and who they will network with. The objective is to better understand how open the conference attendees are to suggestion, in a conference environment.

While some partners of this mind control trial are just doing random tests to see if the technology works, others are looking to implement tasks that are in sync with their sales objectives. Ernst Stavro Blofeld, CEO of Next Generation Staffing Inc, says "the Salesforce test represents the future of industry, and the workforce--this week's test is about seeing what we can accomplish at a conference, but represents what we will be able to achieve in our workforce on a daily basis."

Salesforce reminded us that this is just a simple test, but an important one that reflects the influence the company already has over its constituents. The company enjoys one of the most loyal bases of business users out of all the leading software companies in the world, and this new approach to targeting a loyal base of users is just the beginning of a new generation of API engineered influence.


SnapLogic and 3scale Announce Partnership to Bring API Management Capabilities to Integration Platform as a Service

SnapLogic, an industry leader in enterprise integration platform as a service (iPaaS), and 3scale, the leading API Management Platform, today announced a strategic partnership that aims at simplifying the development, publication and execution of any integration process. Under the terms of the partnership, the two companies have certified their respective platforms to interoperate seamlessly together, making it easy for application and data integration experts to expose SnapLogic’s Elastic Integration Platform dataflow pipelines as web APIs. These APIs, compliant with the REST architectural style, can be invoked by any authorized user, application, web backend or mobile app through a simple and standardized HTTP call in order to trigger the execution of the SnapLogic pipeline.

SnapLogic’s unified integration platform as a service (iPaaS) allows citizen integrators and developers to easily author multi-point integration pipelines that connect cloud and on-premise applications as well as disparate enterprise data sources for big data analytics and expose them as RESTful APIs. Once these pipelines are built, they can be released to developers through 3scale’s API Management Platform, accelerating the development and dissemination of public and private APIs.  

3scale’s distributed architecture and self-serve platform offer flexibility, performance and ability to scale. Powerful API access, policy and traffic controls make it simple to authenticate traffic, restrict by policy, protect backend services, impose rate limits and create access tiers. API documentation is friendly, interactive, intuitive, and clear with 3scale ActiveDocs, based on the Swagger Framework. Built-in analytics help API owners understand and control their traffic, identify the most active users, applications and methods and can help pinpoint traffic patterns.
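
For a sense of what invoking one of these pipeline-backed APIs might look like, here is a generic sketch. The endpoint, path, and payload are invented for illustration, and the user_key query parameter is just one common 3scale authentication pattern--none of this reflects SnapLogic's or 3scale's actual surface area.

    import requests

    ENDPOINT = "https://api.example.com/pipelines/orders-sync/run"  # hypothetical
    USER_KEY = "YOUR_3SCALE_USER_KEY"  # placeholder credential issued at signup

    # Trigger the integration pipeline behind the API management layer with a
    # simple, standardized HTTP call, as the press release describes.
    response = requests.post(ENDPOINT,
                             params={"user_key": USER_KEY},
                             json={"source": "crm", "since": "2015-01-01"})
    response.raise_for_status()
    print(response.json())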

“3scale shares our modern standards approach to application and data integration with a no-compromise, highly scalable API management platform,” said Jack Kudale, vice president of field operations at SnapLogic. “This partnership expands the enterprise integration possibilities for our joint solutions and allows us to deliver greater value to our customers.”  

About SnapLogic
SnapLogic is the industry's first unified data and application integration platform as a service (iPaaS) that allows enterprise IT organizations and lines of business to connect faster and gain a better return on their cloud application and big data investments. SnapLogic's modern architecture is powered by more than 300 Snaps, pre-built integration components that simplify and automate complex enterprise integration processes. Funded by leading venture investors, including Andreessen Horowitz and Ignition Partners, and co-founded by Gaurav Dhillon, co-founder and former CEO of Informatica, SnapLogic is used by prominent companies in the Global 2000. For more information call +1.888.494.1570 or visit www.snaplogic.com.

About 3scale
3scale is the leading self-serve, high performance API management platform, powering more than 600 customer APIs. API providers can easily package, distribute, manage and monetize APIs through a SaaS infrastructure that is powerful, flexible, secure and Web scalable. The 3scale platform enables the distribution of a company's data, content or services to multiple devices or mobile/Web applications, as well as the ability to easily productize APIs. Customers span the Fortune 500, government, academia, and startups. 3scale customers include Coldwell Banker, Johnson Controls, SITA, Crunchbase, Campbell's Soup, UC Berkeley, and Wine.com, among others. The company also powers APItools for API consumers and APIs.io, the world's first open source API search engine. For more information, visit http://www.3scale.net.


Hack/Reduce partners with state, universities for big-data ambassador program | BetaBoston


When Cort Johnson made the move from Terrible Labs to Atlas Venture, some wondered what his specific role at the venture capital firm might be. With the launch of hack/reduce's "College Ambassador Program," that becomes a little more clear. Johnson will spearhead the initiative to connect hack/reduce, the MassTech Collaborative, and local colleges and universities in order to build a robust big-data community and mentor the potential leaders of the next generation of big-data companies.

The program plans to have two students designated as hack/reduce big-data ambassadors on the campus of a number of Boston-area schools. The ambassadors' role will be to promote the growth of big-data ideas and startups at the university level, through events, hackathons, and connecting early stage data-focused companies with space to work and learn at hack/reduce. The benefit for the ambassadors is the access they will have to the hack/reduce community (big-data startups like Sqrrl, Nutonian, and others), as well as access to venture capitalists, angel investors, and other experienced members of Boston's tech ecosystem. Johnson said that there are plans in the works to bring the students together with successful entrepreneurs and investors for intimate dinners throughout the year. "There is no organization right now that is doing a good job bringing the college community in sync with the Boston tech community," Johnson said. "What we want to do is create great content and events for students to get involved in the community."

The hack/reduce program is an extension of Governor Deval Patrick's push for more innovation using big data in Massachusetts. In April, as part of the larger state-wide big-data initiative, Patrick announced a $3 million investment in the Massachusetts Open Cloud project, which was described as "a university-industry collaboration designed to create a new public cloud computing infrastructure to spur big data innovation." The announcement coincided with a report that predicts that 3,000 new big-data jobs will be created in Massachusetts in the next year. The entire initiative is being driven by the Massachusetts Tech Leadership Council and hack/reduce.

Located just outside Kendall Square, hack/reduce is a nonprofit workspace where people and companies working with big data can share the projects they are working on, share resources, and help mentor other startups. The program was founded by Atlas Venture partner Chris Lynch, who is a key investor in many local big-data startups. Shanan Kumar, a Tufts student who is working with Cort Johnson and Lynch on the ambassador program, explained the relationship between the initiative and hack/reduce. "A lot of the activities we want the ambassadors to do on their own campus," she said. "There is not going to be that many events at hack/reduce but we will come out here to meet with each other, network, and share ideas." Johnson said he was excited by what Kumar has done so far and looking forward to working with her on the hack/reduce Ambassador Program. "She is going to be helping us to promote this program across all the local campuses," he said.

URL: http://betaboston.com/news/2014/09/17/hackreduce-partners-with-state-universities-for-big-data-ambassador-program/

An API Evangelism Strategy To Map The Global Family Tree

In my work every day as the API Evangelist, I get to have some very interesting conversations with a wide variety of folks about how they are using APIs, as well as brainstorming other ways they can approach their API strategy to be more effective. One of the things that keeps me going in this space is this diversity. One day I'm looking at Developer.Trade.Gov for the Department of Commerce, the next talking to WordPress about APIs for 60 million websites, and then I'm talking with The Church of Jesus Christ of Latter-day Saints about the Family Search API, which is actively gathering, preserving, and sharing genealogical records from around the world.

I'm so lucky I get to speak with all of these folks about the benefits, and perils, of APIs, helping them think through their approach to opening up their valuable resources using APIs. The process is nourishing for me because I get to speak to such a diverse number of implementations, push my understanding of what is possible with APIs, while also sharpening my critical eye, and my understanding of where APIs can help, or where they can possibly go wrong. Personally, I find a couple of things very intriguing about the Family Search API story:

  1. Mapping the world's genealogical history using a publicly available API--going big!!
  2. Potential for participation by not just big partners, but the long tail of genealogical geeks
  3. Transparency, openness, and collaboration shining through as the solution, beyond just the technology
  4. The mission driven focus of the API overlapping with my obsession for API evangelism intrigues and scares me
  5. They have an existing developer area, APIs, and the seemingly necessary building blocks, but have failed to achieve a platform level

I'm open to talking with anyone about their startup, SMB, enterprise, organizational, institutional, or government API, always leaving open a 15 minute slot to hear a good story--which turned into more than an hour of discussion with the FamilySearch team. See, FamilySearch already has an API, they have the technology in order, and they even have many of the essential business building blocks as well, but where they are falling short is when it comes to dialing in both the business and politics of their developer ecosystem, to discover the right balance that will help them truly become a platform--which is my specialty. ;-)

This brings us to the million dollar question: How does one become a platform?

All of this makes FamilySearch an interesting API story. Given the scope of the API, to take something this big to the next level, FamilySearch has to become a platform--and not a superficial "platform" where they are just catering to three partners, but one nourishing a vibrant long tail ecosystem of website, web application, single page application, mobile application, and widget developers. FamilySearch is at an important inflection point; they have all the parts and pieces of a platform, they just have to figure out exactly what changes need to be made to open up, and take things to the next level.

First, let's quantify the company: what is FamilySearch? "For over 100 years, FamilySearch has been actively gathering, preserving, and sharing genealogical records worldwide", believing that "learning about our ancestors helps us better understand who we are—creating a family bond, linking the present to the past, and building a bridge to the future".

FamilySearch holds 1.2 billion total records, with 108 million completed so far in 2014, 24 million awaiting processing, and 386 active genealogical projects going on. FamilySearch provides the ability to manage photos, stories, documents, people, and albums--allowing people to be organized into a tree, knowing the branch everyone belongs to in the global family tree.

FamilySearch started out as the Genealogical Society of Utah, founded in 1894 and dedicated to preserving the records of the family of mankind, looking to "help people connect with their ancestors through easy access to historical records". FamilySearch is a mission-driven, non-profit organization, run by The Church of Jesus Christ of Latter-day Saints. All of this comes together to define an entity that possesses an image that will appeal to some, while leaving concern for others--making for a pretty unique formula for an API driven platform, one that doesn't quite have a model anywhere else.

FamilySearch considers what they deliver as a set of record custodian services:

  • Image Capture - Obtaining a preservation quality image is often the most costly and time-consuming step for records custodians. Microfilm has been the standard, but digital is emerging. Whether you opt to do it yourself or use one of our worldwide camera teams, we can help.
  • Online Indexing - Once an image is digitized, key data needs to be transcribed in order to produce a searchable index that patrons around the world can access. Our online indexing application harnesses volunteers from around the world to quickly and accurately create indexes.
  • Digital Conversion - For those records custodians who already have a substantial collection of microfilm, we can help digitize those images and even provide digital image storage.
  • Online Access - Whether your goal is to make your records freely available to the public or to help supplement your budget needs, we can help you get your records online. To minimize your costs and increase access for your users, we can host your indexes and records on FamilySearch.org, or we can provide tools and expertise that enable you to create your own hosted access.
  • Preservation - Preservation copies of microfilm, microfiche, and digital records from over 100 countries and spanning hundreds of years are safely stored in the Granite Mountain Records Vault—a long-term storage facility designed for preservation.

FamilySearch provides a proven set of services that users can take advantage of via a web application, as well as iPhone and Android mobile apps, resulting in the online community they have built today. FamilySearch also goes beyond their basic web and mobile application services, and is elevated to the software as a service (SaaS) level by having a pretty robust developer center and API stack.

Developer Center
FamilySearch provides the required first impression when you land in the FamilySearch developer center, quickly explaining what you can do with the API, "FamilySearch offers developers a way to integrate web, desktop, and mobile apps with its collaborative Family Tree and vast digital archive of records”, and immediately provides you with a getting started guide, and other supporting tutorials.

FamilySearch provides access to over 100 API resources in twenty separate groups: Authorities, Change History, Discovery, Discussions, Memories, Notes, Ordinances, Parents and Children, Pedigree, Person, Places, Records, Search and Match, Source Box, Sources, Spouses, User, Utilities, and Vocabularies--connecting you to the core FamilySearch genealogical engine.
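
As a brief illustration of what working with these resources looks like, here is a sketch of reading a person record from the Person group. The base URL, media type, and sample person id reflect my reading of the FamilySearch docs and should be treated as assumptions; obtaining the OAuth 2.0 access token is out of scope here.

    import requests

    BASE_URL = "https://familysearch.org/platform"  # assumed base path
    ACCESS_TOKEN = "YOUR_OAUTH2_ACCESS_TOKEN"       # placeholder credential

    # Fetch a person from the tree; KWQS-BBQ is a commonly cited sample id.
    response = requests.get(f"{BASE_URL}/tree/persons/KWQS-BBQ",
                            headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                                     "Accept": "application/x-fs-v1+json"})
    response.raise_for_status()

    for person in response.json().get("persons", []):
        print(person.get("display", {}).get("name"))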

The FamilySearch developer area provides all the common, and even some forward leaning, technical building blocks. To support developers, FamilySearch provides a fairly standard support setup, augmented by some other interesting building blocks. Setting the stage for FamilySearch evolving into a platform, they even possess some of the necessary partner level building blocks.

There is even an application gallery showcasing the web, Mac & Windows desktop, and mobile applications developers have built. FamilySearch even encourages developers to "donate your software skills by participating in community projects and collaborating through the FamilySearch Developer Network".

Many of the ingredients of a platform exist within the current FamilySearch developer hub, at least the technical elements, and some of the common business, and political building blocks of a platform, but what is missing? This is what makes FamilySearch a compelling story, because it emphasizes one of the core elements of API Evangelist—that all of this API stuff only works when the right blend of technical, business, and politics exists.

Establishing A Rich Partnership Environment

FamilySearch has some strong partnerships that have helped establish FamilySearch as the genealogy service it is today. FamilySearch knows they wouldn't exist without the partnerships they've established, but how do you take it to the next level, and grow into a much larger, organic, API driven ecosystem, where a long tail of genealogy businesses, professionals, and enthusiasts can build on, and contribute to, the FamilySearch platform?

FamilySearch knows the time has come to make a shift to being an open platform, but is not entirely sure what needs to happen to actually stimulate not just the core FamilySearch partners, but also establish a vibrant long tail of developers. A developer portal is not just a place where geeky coders come to find what they need, it is a hub where business development occurs at all levels, in both synchronous, and asynchronous ways, in a 24/7 global environment.

FamilySearch acknowledges they have some issues when it comes to investing in API driven partnerships:

  • “Platform” means their core set of large partners
  • Not treating all partners like first class citizens
  • Competing with some of their partners
  • Don’t use their own API, creating a gap in perspective

FamilySearch knows that if they can work out the right configuration, they can evolve FamilySearch from a digital genealogical web and mobile service to a genealogical platform. If they do this they can scale beyond what they've been able to do with a core set of partners, and crowdsource the mapping of the global family tree, allowing individuals to map their own family trees, while also contributing to the larger global tree. With a proper API driven platform this process doesn't have to occur via the FamilySearch website and mobile app, it can happen in any web, desktop, or mobile application anywhere.

FamilySearch already has a pretty solid development team taking care of the tech of the FamilySearch API, and they have 20 people working internally to support partners. They have a handle on the tech of their API; they just need to get a handle on the business and politics of their API, and invest in the resources needed to help scale the FamilySearch API from being just a developer area, to a growing genealogical developer community, to a full blown ecosystem that spans not just the FamilySearch developer portal, but thousands of other sites and applications around the globe.

A Good Dose Of API Evangelism To Shift Culture A Bit

A healthy API evangelism strategy brings together a mix of business, marketing, sales, and technology disciplines into a new approach to doing business for FamilySearch. Done right, it can open up FamilySearch to outside ideas, and with the right framework allow the platform to move beyond just certification and partnering, to also investing in, and acquiring, data, content, talent, applications, and partners via the FamilySearch developer platform.

Think of evangelism as the grease in the gears of the platform, allowing it to grow, expand, and handle a larger volume of outreach and support. API evangelism works to lubricate all aspects of platform operation.

First, let's kick off with setting some objectives for why we are doing this, and what we are trying to accomplish:

  • Increase Number of Records - Increase the number of overall records in the FamilySearch database, contributing to the larger goal of mapping the global family tree.
  • Growth in New Users - Grow the number of new users who are building on the FamilySearch API, increasing the overall headcount for the platform.
  • Growth In Active Apps - Increase not just new users, but the number of actual apps being built and used, not just counting people kicking the tires.
  • Growth in Existing User API Usage - Increase how existing users are putting the FamilySearch APIs to use. Educate about new features, and increase adoption.
  • Brand Awareness - One of the top reasons for designing, deploying, and managing an active API is to increase awareness of the FamilySearch brand.
  • What else?

What does developer engagement look like for the FamilySearch platform?

  • Active User Engagement - How do we reach out to existing, active users to find out what they need, and how do we profile them and continue to understand who they are and what they need? Is there a direct line to the CRM?
  • Fresh Engagement - How is FamilySearch contacting newly registered developers each week to see what their immediate needs are, while their registration is fresh in their minds?
  • Historical Engagement - How are historically active and/or inactive developers being engaged, to better understand what their needs are and what would make them active or increase activity?
  • Social Engagement - Is FamilySearch profiling the URL, Twitter, Facebook, LinkedIn, and Github profiles of developers, and then actively engaging via these active channels?

Establish a Developer Focused Blog For Storytelling

  • Projects - There are over 390 active projects on the FamilySearch platform, plus any number of active web, desktop, and mobile applications. All of this activity should be regularly profiled as part of platform evangelism. An editorial assembly line of technical projects that can feed blog stories, how-tos, samples, and Github code libraries should be taking place, establishing a large volume of exhaust via the FamilySearch platform.
  • Stories - FamilySearch is great at writing public, and partner facing, content, but there is a need for writing, editing, and posting stories derived from the technically focused projects, with SEO and API support by design.
  • Syndication - Syndication to Tumblr, Blogger, Medium, and other relevant blogging sites on a regular basis with the best of the content.

Mapping Out The Genealogy Landscape

  • Competition Monitoring - Evaluation of the regular activity of competitors, via their blog, Twitter, Github, and beyond.
  • Alpha Players - Who are the vocal people in the genealogy space with active Twitter accounts, blogs, and Github accounts?
  • Top Apps - What are the top applications in the space, whether built on the FamilySearch platform or not, and what do they do?
  • Social - Mapping the social landscape for genealogy, who is who, and who the platform should be working with.
  • Keywords - Establish a list of keywords to use when searching for topics at search engines, QA sites, forums, social bookmarking, and social networks. (This should already be done by the marketing folks.)
  • Cities & Regions - Target specific markets in cities that make sense for the evangelism strategy: what are the local tech meetups, and the local organizations, schools, and other gatherings? Who are the tech ambassadors for FamilySearch in these spaces?

Adding To The Feedback Loop From Forum Operations

  • Stories - Deriving stories for the blog from forum activity, and the actual needs of developers.
  • FAQ Feed - Is this being updated regularly with fresh material?
  • Streams - Are there other streams giving the platform a heartbeat?

Being Social About Platform Code and Operations With Github

  • Setup Github Account - Set up a FamilySearch platform developer account, and bring the internal development team together under a team umbrella.
  • Github Relationships - Managing of followers, forks, downloads and other potential relationships via Github, which has grown beyond just code, and is social.
  • Github Repositories - Managing of code sample Gists, official code libraries and any samples, starter kits or other code samples generated through projects.

Adding To The Feedback Loop From The Bigger FAQ Picture

  • Quora - Regular trolling of Quora, and responding to relevant FamilySearch or industry related questions.
  • Stack Exchange - Regular trolling of Stack Exchange / Stack Overflow and responding to relevant FamilySearch or industry related questions.
  • FAQ - Add questions from the bigger FAQ picture to the local FamilySearch FAQ for referencing locally.

Leverage Social Engagement And Bring In Developers Too

  • Facebook - Consider setting up a new API specific Facebook company page. Post all API evangelism activities, and manage friends.
  • Google Plus - Consider setting up a new API specific Google+ company page. Post all API evangelism activities, and manage followers.
  • LinkedIn - Consider setting up a new API specific LinkedIn profile page that will follow developers and other relevant users for engagement. Post all API evangelism activities.
  • Twitter - Consider setting up a new API specific Twitter account. Tweet all API evangelism activity and relevant industry landscape activity, discover new followers, and engage with followers.

Sharing Bookmarks With the Social Space

  • Hacker News - Social bookmarking of all relevant API evangelism activities as well as relevant industry landscape topics to Hacker News, to keep a fair and balanced profile, as well as network and user engagement.
  • Product Hunt - Product Hunt is a place to share the latest tech creations, providing an excellent format for API providers to share details about their new API offerings.
  • Reddit - Social bookmarking of all relevant API evangelism activities as well as relevant industry landscape topics to Reddit, to keep a fair and balanced profile, as well as network and user engagement.

Communicate Where The Roadmap Is Going

  • Roadmap - Provide regular roadmap feedback based upon developer outreach and feedback.
  • Changelog - Make sure the change log always reflects the roadmap communication or there could be backlash.

Establish A Presence At Events

  • Conferences - What are the top conferences occurring that we can participate in or attend? Pay attention to calls for papers at relevant industry events.
  • Hackathons - What hackathons are coming up in 30, 90, 120 days? Which should be sponsored, attended, etc.?
  • Meetups - What are the best meetups in target cities? Are there different formats that would best meet our goals? Are there any sponsorship or speaking opportunities?
  • Family History Centers - Are there local opportunities for the platform to hold training, workshops and other events at Family History Centers?
  • Learning Centers - Are there local opportunities for the platform to hold training, workshops and other events at Learning Centers?

Measuring All Platform Efforts

  • Activity By Group - Summary and highlights from weekly activity within each area of the API evangelism strategy.
  • New Registrations - Historical and weekly accounting of new developer registrations across APIs.
  • Volume of Calls - Historical and weekly accounting of API calls per API.
  • Number of Apps - How many applications are there?
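
A hypothetical sketch of what that weekly accounting could look like, assuming the platform's management layer can export registrations as simple records; the field names and data here are invented for illustration.

    from collections import Counter
    from datetime import datetime

    # Invented sample export; in practice this would come from the API
    # management layer's own reporting API.
    registrations = [
        {"developer": "dev1", "registered": "2014-09-01"},
        {"developer": "dev2", "registered": "2014-09-03"},
        {"developer": "dev3", "registered": "2014-09-10"},
    ]

    # Tally new developer registrations per ISO week for the weekly report.
    weekly = Counter()
    for reg in registrations:
        week = datetime.strptime(reg["registered"], "%Y-%m-%d").isocalendar()[1]
        weekly[week] += 1

    for week, count in sorted(weekly.items()):
        print(f"week {week}: {count} new developer registrations")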

Essential Internal Evangelism Activities

  • Storytelling - Telling stories about an API isn't just something you do externally; what stories need to be told internally to make sure an API initiative is successful?
  • Conversations - Incite internal conversations about the FamilySearch platform. Hold brown bag lunches if you need to, or internal hackathons, to get people involved.
  • Participation - It is very healthy to include other people from across the company in API operations. How can we include people from other teams in API evangelism efforts? Bring them to events and conferences, and potentially expose them to local, platform focused events.
  • Reporting - Sometimes providing regular numbers and reports to key players internally can help keep operations running smoothly. What reports can we produce? Make them meaningful.

All of this evangelism starts with a very external focus, which is a hallmark of API and developer evangelism efforts, but if you notice, by the end we bring it home to the most important aspect of platform evangelism: the internal outreach. A lack of internal evangelism is the number one reason APIs fail--failing to educate top and mid-level management, as well as lower level staff, to get buy-in and direct hands-on involvement with the platform, and to justify the budget costs for the resources needed to make a platform successful.

Top-Down Change At FamilySearch

The change FamilySearch is looking for already has top level management buy-in; the problem is that the vision is not in lockstep with actual platform operations. When projects developed via the FamilySearch platform are regularly showcased to top level executives, and stories consistent with platform operations are told, management will echo what is actually happening via the FamilySearch platform. This will provide a much more ongoing, deeper message for the rest of the company, and for partners, around what the priorities of the platform are, making it not just a meaningless top down mandate.

An example of this in action is the recent mandate from President Obama that all federal agencies should go “machine readable by default”, which includes using APIs and open data outputs like JSON, instead of document formats like PDF. This top down mandate makes for a good PR soundbite, but in reality has little effect on the ground at federal agencies. It has taken two years of hard work on the ground, at each agency, between agencies, and with the public, to even begin to make this mandate a reality at over 20 federal agencies.

Top down change is a piece of the overall platform evolution at FamilySearch, but only a piece. Without proper bottom-up, and outside-in change, FamilySearch will never evolve beyond just being genealogical software as a service with an interesting API. It takes much more than leadership to make a platform.

Bottom-Up Change At FamilySearch

One of the most influential aspects of APIs I have seen at companies, institutions, and agencies is the change of culture brought when APIs move beyond just a technical IT effort, and become about making resources available across an organization, and enabling people to do their job better. Without an awareness, buy-in, and in some cases evangelist conversion, a large organization will not be able to move from a service orientation to a platform way of thinking.

If a company as a whole is unaware of APIs, both within the organization and out in the larger world of popular platforms like Twitter, Instagram, and others, it is extremely unlikely it will endorse, let alone participate in, moving from being a digital service to a platform. Employees need to see the benefits of a platform in their everyday job, and their involvement cannot require what they would perceive as extra work to accomplish platform related duties. FamilySearch employees need to see the benefits the platform brings to the overall mission, and play a role in making this happen, even if it originates from a top-down mandate.

Top bookseller Amazon was already on the path to being a platform with its set of commerce APIs when, after a top down mandate from CEO Jeff Bezos, Amazon internalized APIs in such a way that the entire company interacted and exchanged resources using web APIs, resulting in one of the most successful API platforms—Amazon Web Services (AWS). Bezos mandated that if an Amazon department needed to procure a resource from another department, like server or storage space from IT, it needed to happen via APIs. This wasn’t a meaningless top-down mandate; it made employees’ lives easier, and ultimately made the entire company more nimble and agile, while also saving time and money. Without buy-in, and execution from Amazon employees, what we know as the cloud would never have occurred.

Change at large enterprises, organizations, institutions, and agencies can be expedited with the right top-down leadership, and with the right platform evangelism strategy, one that treats internal stakeholders not just as targets of outreach efforts but includes them in operations, it can result in sweeping, transformational changes. This type of change at a single organization can affect how an entire industry operates, similar to what we’ve seen from the ultimate API platform pioneer, Amazon.

Outside-In Change At FamilySearch

The final layer of change that needs to occur to bring FamilySearch from being just a service to a true platform is opening up the channels to outside influence, not just in platform operations, but in organizational operations as well. The bar is high at FamilySearch. The quality of services, the expectation of the process, and adherence to the mission are strong, but if you are truly dedicated to providing a database of all mankind, you are going to have to let mankind in a little bit.

FamilySearch is still the keeper of knowledge, but to become a platform you have to let in the possibility that outside ideas, processes, and applications can bring value to the organization, as well as to the wider genealogical community. You have to evolve beyond the notion that the best ideas come only from inside the organization, or just from the leading partners in the space. There are opportunities for innovation and transformation in the long-tail stream of an API platform, but you have to have the platform set up to encourage and participate in that stream, and be able to identify the value in it.

Twitter is one of the best examples of how any platform will have to let in outside ideas, applications, companies, and individuals. Much of what we consider Twitter today was built in the platform ecosystem, from the iPhone and Android apps, to the desktop app TweetDeck, to terminology like the #hashtag. Over the last 5 years, Twitter has worked hard to find the optimal platform balance, regarding how they educate, communicate with, invest in, acquire, and incentivize their platform ecosystem. Listening to outside ideas goes well beyond the fact that Twitter is a publicly available social platform; with such a large base of API developers it is impossible to let in every idea, but through a sophisticated evangelism strategy of in-person and online channels, in 2014 Twitter has managed to find a balance that is working well.

Having a public facing platform doesn’t mean the flood gates are open for ideas and thoughts to just flow in; this is where service composition, and the certification and partner framework for FamilySearch, will come in. Through clear, transparent partner tiers, and open and transparent operations and communications, an optimal flow of outside ideas, applications, companies, and individuals can be established—enabling a healthy, sustainable amount of change from the outside world.

Knowing All Of Your Platform Partners

The hallmark of any mature online platform is a well established partner ecosystem. If you’ve made the transition from service to platform, you haven’t just established a pretty robust approach to certifying and onboarding your partners, you have also stepped it up in knowing and understanding who they are, what their needs are, and investing in them throughout the lifecycle.

First off, profile everyone who comes through the front door of the platform. If they sign up for a public API key, who are they, and where do they potentially fit into your overall strategy? Don’t be pushy, but understand who they are and what they might be looking for, and make sure you have a well defined track for this type of user.

Next, qualify and certify as you have been doing. Make sure the process is well documented, but also transparent, allowing companies and individuals to quickly understand what it will take to be certified, what the benefits are, and see examples of other partners who have achieved this status. As a developer building a genealogical mobile app, I need to know what I can expect, and have some incentive for investing in the certification process.

Keep your friends close, and your competition closer. Open the door wide for your competition to become platform users, and potentially partners. The 100+ year old technology company Johnson Controls (JCI) was concerned about what the competition might do if they opened up their building efficiency data resources to the public via the Panoptix API platform, but after it launched, they realized their competitors were now their customers, and partners in this new approach to doing business online for JCI.

When the Department of Energy decides what data and other resources it makes available via Data.gov or the agency’s developer program, it has to deeply consider how this could affect U.S. industries. The resources the federal agency possesses can be of pretty high value, with huge benefits for the private sector, but in some cases opening up APIs, or limiting access to APIs, could help or hurt the larger economy, as well as the Department of Energy developer ecosystem—there are lots of considerations when opening up API resources, and they vary from industry to industry.

There are no silver bullets when it comes to API design, deployment, management, and evangelism. It takes a lot of hard work, communication, and iterating before you strike the right balance of operations, and every business sector will be different. Without knowing who your platform users are, and being able to establish a clear and transparent road for them to follow to achieve partner status, FamilySearch will never elevate to a true platform. How can you scale the trusted layers of your platform, if your partner framework isn’t well documented, open, transparent, and well executed? It just can’t be done.

Meaningful Monetization For Platform

All of this will take money to make happen. Designing and executing on the technical and evangelism aspects I’m laying out will cost a lot of money, and on the consumer side, it will take money to design, develop, and manage the desktop, web, and mobile applications built around the FamilySearch platform. How will both the FamilySearch platform and its participants make ends meet?

This conversation is a hard one for startups and established businesses, let alone for a non-profit, mission driven organization. Internal developers cost money; servers and bandwidth are getting cheaper but are still a significant platform cost; and sustaining sales, bizdev, and evangelism will not be cheap either. It takes money to properly deliver resources via APIs, and even if the lowest tiers of access are free, at some point consumers are going to have to pay for access, resources, and advanced features.

The conversation around how you monetize API driven resources is going on across government, from cities up to the federal level, where the thought of charging for access to public data is unheard of. These are public assets, and they should be freely available. While this is true, think of the same situation when it comes to physical public assets that are owned by the government, like parks. You can freely enjoy many city, county, and federal parks; there are sometimes small fees for usage; but if you want to actually sell something in a public park, you will need to buy permits, and often share revenue with the managing agency. We have to think critically about how we fund the publishing and refinement of publicly owned digital assets; as with physical assets, there will be much debate in coming years around what is acceptable, and what is not.

Woven into the tiers of partner access, there should always be provisions for applying costs, covering overhead, and even generating a little revenue to be applied in other ways. With great power comes great responsibility, and along with great access for FamilySearch partners, many will also be required to cover the costs of compute capacity, storage, and the other hard facts of delivering a scalable platform around any valuable digital assets, whether privately or publicly held.

Platform monetization doesn’t end with covering the costs of platform operation. Consumers of FamilySearch APIs will need assistance in identifying the best ways to cover their own costs as well. Running a successful desktop, web, or mobile application will take discipline, structure, and the ability to manage overhead costs, while also being able to generate some revenue through a clear business model. As a platform, FamilySearch will have to bring to the table some monetization opportunities for consumers, providing guidance as part of the certification process regarding best practices for monetization, and even some direct opportunities for advertising, in-app purchases, and other common approaches to application monetization and sustainment.

Without revenue greasing the gears, no service can achieve platform status. As with all other aspects of platform operations, the conversation around monetization cannot be one-sided, and just about the needs of the platform provider. Pro-active steps need to be taken to ensure both the platform provider and its consumers are monetized in the healthiest way possible, bringing as much benefit to the overall platform community as possible.

Open & Transparent Operations & Communications

How does all of this talk of platform and evangelism actually happen? It takes a whole lot of open, transparent communication across the board. Right now the only active part of the platform is the FamilySearch Developer Google Group; beyond that you don’t see any activity that is platform specific. There are active Twitter, Facebook, Google+, and mainstream and affiliate focused blogs, but nothing that serves the platform, contributing to the feedback loop that will be necessary to take the service to the next level.

On a public platform, communications cannot all be private emails, phone calls, or face to face meetings. One of the things that allows an online service to expand into a platform, then scale and grow into a robust, vibrant, and active community, is a stream of public communications, including blogs, forums, social streams, images, and video content. These communication channels cannot all be one way; they need to include forum and social conversations, and showcase platform activity by API consumers.

Platform communication isn’t just about getting direct messages answered; it is about public conversation, so everyone shares in the answer, and public storytelling to help guide and lead the platform. Together with support via multiple channels, this establishes a feedback loop that, when done right, will keep growing, expanding, and driving healthy growth. The transparent nature of platform feedback loops is essential to providing everything consumers will need, while also bringing a fresh flow of ideas and insight within the FamilySearch firewall.

Truly Shifting The FamilySearch Culture

Top-down, bottom-up, outside-in, with a constant flow of oxygen via a vibrant feedback loop, and the nourishing, sanitizing sunlight of platform transparency: this is how, week by week and month by month, change can occur. It won’t all be good; there are plenty of problems that arise in ecosystem operations, but all of this has the potential to slowly shift culture when done right.

One thing that shows me the team over at FamilySearch has what it takes is that when I asked if I could write this up as a story, rather than just a proposal I email to them, they said yes. This is a true test of whether or not an organization might have what it takes. If you are unwilling to be transparent about the problems you currently have, and the work that goes into your strategy, it is unlikely you will have what it takes to establish the amount of transparency required for a platform to be successful.

When internal staff, large external partners, and long tail genealogical app developers and enthusiasts are in sync via a FamilySearch platform driven ecosystem, I think we can consider that a shift to platform has occurred for FamilySearch. The real question is how do we get there?

Executing On Evangelism

This is not a definitive proposal for executing on an API evangelism strategy, merely a blueprint, a seed that can be used to start a slow, seismic shift in how FamilySearch engages its API area, in a way that will slowly evolve it into a community, one that includes internal, partner, and public developers. Some day, with the right set of circumstances, FamilySearch could grow into a robust, social, genealogical ecosystem where everyone comes to access, and participate in, the mapping of mankind.

  • Defining Current Platform - Where are we now? In detail.
  • Mapping the Landscape - What does the world of genealogy look like?
  • Identifying Projects - What are the existing projects being developed via the platform?
  • Define an API Evangelist Strategy - Actually fleshing out a detailed strategy.
    • Projects
    • Storytelling
    • Syndication
    • Social
    • Channels
      • External Public
      • External Partner
      • Internal Stakeholder
      • Internal Company-Wide
  • Identify Resources - What resources currently exist? What is needed?
    • Evangelist
    • Content / Storytelling
    • Development
  • Execute - What does execution of an API evangelist strategy look like?
  • Iterate - What does iteration look like for an API evangelism strategy?
    • Weekly
    • Review
    • Repeat

As with many providers, you don’t want this to take 5 years, so how do you take a 3-5 year cycle and execute it in 12-18 months?

  • Invest In Evangelist Resources - It takes a team of evangelists to build a platform
    • External Facing
    • Partner Facing
    • Internal Facing
  • Development Resources - We need to step up the number of resources available for platform integration.
    • Code Samples & SDKs
    • Embeddable Tools
  • Content Resources - A steady stream of content should be flowing out of the platform, and syndicated everywhere.
    • Short Form (Blog)
    • Long Form (White Paper & Case Study)
  • Event Budget - FamilySearch needs to be everywhere, so people know that it exists. It can’t just be online.
    • Meetups
    • Hackathons
    • Conferences

There is nothing easy about this. It takes time, and resources, and there are only so many elements you can automate when it comes to API evangelism. For something that is very programmatic, it takes more of the human variable to make the API driven platform algorithm work. With that said it is possible to scale some aspects, and increase the awareness, presence, and effectiveness of FamilySearch platform efforts, which is really what is currently missing.

While, as the API Evangelist, I cannot personally execute on every aspect of an API evangelism strategy for FamilySearch, I can provide essential planning expertise for the overall FamilySearch API strategy, as well as regular check-ins with the team on how things are going, and help plan the roadmap. The two things I bring to the table, reflected in this proposal, are an understanding of where the FamilySearch API effort currently is, and of what is missing to help get FamilySearch to the next stage of its platform evolution.

When operating within the corporate or organizational silo, it can be very easy to lose sight of how other organizations and companies are approaching their API strategy, and miss important pieces of how you need to shift your own. This is one of the biggest inhibitors of API efforts at large organizations, and one of the biggest imperatives for companies to invest in their API strategy, and begin the process of breaking operations out of their silo.

What FamilySearch is facing demonstrates that APIs are much more than the technical endpoint most believe them to be; it takes many other business and political building blocks to truly go from API to platform.


My Response To How Can the Department of Education Increase Innovation, Transparency and Access to Data?

I spent considerable time going through the Department of Education RFI, answering each question in as much detail as I possibly could. You can find my full response below. In the end I felt I could provide more value by summarizing my response, eliminating much of the redundancy across different sections of the RFI, and just cut through the bureaucracy as I (and APIs) prefer to do.

Open Data By Default
All publicly available data at the Department of Education needs to be open by default. This is not just a mandate, this is a way of life. There is no data that is available on any Department of Education websites that should not be available for data download. Open data downloads are not separate from existing website efforts at Department of Education, they are the other side of the coin, making the same content and data available in machine readable formats, rather than available via HTML—allowing valuable resources to be used in systems and applications outside of the department’s control.

Open API When There Are Resources
The answer to whether or not the Department of Education should provide APIs is the same as whether or not the agency should deploy websites—YES! Not all individuals and companies will have the resources to download, process, and put downloadable resources to use. In these situations APIs can provide much easier access to open data resources, and when open data resources are exposed as APIs it opens up access to a much wider audience, even non-developers. Lightweight, simple API access to the open data inventory should be the default, along with data downloads, when resources are available. This approach of APIs by default will act as the training ground not just for 3rd party developers, but also internally, allowing Department of Education staff to learn how to manage APIs in a safe, read-only environment.

Using A Modern API Design, Deployment, and Management Approach
As usage of the Internet matured in 2000, many leading technology providers like SalesForce and Amazon began using web APIs to make digital assets available to 3rd party partners, and 14 years later there are some very proven approaches to designing, deploying, and managing APIs. API management is not a new and bleeding edge approach to making assets available; in the private sector there are numerous API tools and services available, and this has begun to extend to the government sector with tools like API Umbrella from NREL, employed by api.data.gov and other agencies, as well as other tools and services delivered by 18F at GSA. There are many proven blueprints for the Department of Education to follow when embarking on a complete API strategy across the agency, allowing innovation to occur around specific open data and other program initiatives, in a safe, proven way.

Use API Service Composition For Maximum Access & Control
One benefit of 14 years of evolution around API design, deployment, and management is the establishment of sophisticated service composition of API resources. Service composition refers to the granular, modular design and deployment of APIs, while being able to manage who has access to these resources. Modern API access is not just direct, public access to a database. API service composition allows for designing exactly the access to resources that is necessary, in alignment with business objectives, while protecting the privacy and security of everyone involved. Additionally, service composition allows for real-time awareness of how all data, content, and other resources at the Department of Education are accessed and put to use, allowing new APIs to be designed to support specific needs, and existing APIs to evolve based upon actual demand, not just speculation.

Deeper Understanding Of How Resources Are Used
A modern API service composition layer opens up the possibility of a new analytics layer that is not just about measuring and reporting access to APIs; it is about understanding precisely how resources are accessed in real-time, allowing API design, deployment, and management processes to be adjusted in a more rapid and iterative way that contributes to the roadmap, while providing the maximum enforcement of security and privacy for everyone involved. When the Department of Education internalizes a healthy, agency-wide API approach, a new real-time understanding will replace the very RFI centered process we are participating in, allowing for a new agility, with more control and flexibility than current approaches. An RFI cycle takes months, and will contain a great deal of speculation about what would be, whereas API access, coupled with healthy analytics and feedback loops, answers all the questions being addressed in this RFI in real-time, reducing resource costs and wasted cycles.

APIs Open Up Synchronous and Asynchronous Communication Channels
Open data downloads represent a broadcast approach to making Department of Education content, data, and other resources available, representing a one way street. APIs provide two-way communication, bringing external partners and vendors closer to the Department of Education, while opening up feedback loops with the agency, reducing the distance between it and its private sector partners—potentially bringing valuable services closer to students, parents, and the companies or institutions that serve them. Feedback loops at the Department of Education are currently much wider, occurring annually or monthly, at the speed of email or phone calls, with the closest being in person at events, something that can be a very expensive endeavor. Web APIs provide a real-time, synchronous and asynchronous communication layer that will improve the quality of service between the Department of Education and the public, for a much lower cost than traditional approaches.

Building External Ecosystem of Partners
With the availability of high value API resources, coupled with a modern approach to API design, deployment, and management, an ecosystem of trusted partners can be established, allowing the Department of Education to share the workload with an external partner ecosystem. API service composition allows the agency to open up access to resources only to the partners who have proven they will respect the privacy and security of resources, and are dedicated to augmenting and helping extend the mission of the Department of Education. As referenced in the RFI, think about the ecosystem established by the IRS modernized e-file system, and how the H&R Blocks and Jackson Hewitts of the world help the IRS share the burden of the country's tax system. Where is the trusted ecosystem for the Department of Education? The IRS ecosystem has been in development for over 25 years; the Department of Education has to get to work on theirs now.

Security Fits In With Existing Website Security Practices
One of the greatest benefits of web APIs is that they utilize the existing web technologies employed to deploy and manage websites. You don’t need additional security approaches to manage APIs beyond those of existing websites. Modern web APIs are built on HTTP, just like websites, and security can be addressed right alongside current website security practices—instead of delivering HTML, APIs are delivering JSON and XML. APIs even go further: by using modern API service composition practices, the Department of Education gains an added layer of security and control, which introduces granular levels of access to all resources, something that does not exist for websites. With a sensible analytics layer, API security isn’t just about locking down; it is about understanding who is accessing resources and how they are using them, striking a balance between the security and accessibility of resources, which is the hallmark of APIs.

oAuth Gives Identity and Access Control To The Student
Beyond basic web security, and the heightened level of control modern API management delivers, there is a 3rd layer to the security and privacy of APIs that does not exist anywhere else—oAuth. Open Authorization, or oAuth, provides an identity and access layer on top of an API that gives end-users, the owners of personal data, control over who accesses their data. Technology leaders in the private sector are all using oAuth to give platform users control over how their data is used in applications and systems. oAuth is the heartbeat of API security, giving API platforms a way to manage security, and how 3rd party developers access and put resources to use, in a way that gives control to end users. In the case of the Department of Education APIs, this means putting the parent and student at the center of who accesses and uses their personal data, something that is essential to the future of the Department of Education.
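
To make the oAuth flow concrete, here is a minimal sketch of the oAuth 2.0 authorization code flow described above, written in Python. The endpoints, client credentials, and scope names are hypothetical placeholders assumed for illustration, not an actual Department of Education API.

```python
# Sketch of an oAuth 2.0 authorization code flow. All URLs, IDs, and
# scopes below are hypothetical, assumed for illustration only.
import requests

AUTHORIZE_URL = "https://api.example.gov/oauth/authorize"  # hypothetical
TOKEN_URL = "https://api.example.gov/oauth/token"          # hypothetical
REDIRECT_URI = "https://app.example.com/callback"          # hypothetical

# Step 1: the student is sent to the authorization URL, where they
# approve (or deny) the application's request for access.
print(f"{AUTHORIZE_URL}?client_id=APP_ID&response_type=code"
      f"&scope=student.records.read&redirect_uri={REDIRECT_URI}")

# Step 2: the platform redirects back with a short-lived code, which
# the application exchanges for an access token.
code = "code-returned-on-redirect"
token = requests.post(TOKEN_URL, data={
    "grant_type": "authorization_code",
    "code": code,
    "client_id": "APP_ID",
    "client_secret": "APP_SECRET",
    "redirect_uri": REDIRECT_URI,
}).json()["access_token"]

# Step 3: every API call carries the token, so the platform can log and
# throttle usage, and the student can revoke access at any time.
records = requests.get(
    "https://api.example.gov/v1/student/records",  # hypothetical
    headers={"Authorization": f"Bearer {token}"},
)
```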

How Will Policy Be Changed?
I'm not a policy wonk, nor will I ever be one. One thing I do know is that you will never understand the policy implications in one RFI, nor will you change policy to allow for API innovation in one broad stroke--you will fail. Policy will have to be changed incrementally, a process that fits nicely with the iterative, evolutionary life cycle of API management. The cultural change at the Department of Education, as well as evolutionary policy change at the federal level, will be the biggest benefits of APIs at the Department of Education.

An Active API Platform At Department of Education Would Deliver What This RFI Is Looking For
I know it is hard for the Department of Education to see APIs as something more than a technical implementation, and you want to know, understand, and plan everything ahead of time—this is baked into the risk averse DNA of government. Even with this understanding, as I go through the RFI, I can’t help but be frustrated by the redundancy, bureaucracy, over planning, and waste that is present in this process. An active API platform would answer every one of the questions you pose, with much more precision than any RFI can ever deliver.

If the Department of Education had already begun evolving an API platform for all open data sets currently available on data.gov, the agency would have the experience in API design, deployment, and management to address 60% of the concerns posed by this RFI. Additionally, the agency would be receiving feedback from existing integrators about what they need, who they are, and what they are building to better serve students and institutions. Because this does not exist, there will be much speculation about who will use Department of Education APIs, and how they will use them to better serve students. While much of this feedback will be well meaning, it will not be rooted in actual use cases, applications, and existing implementations. An active API ecosystem answers these questions while keeping the answers rooted in actual integrations, centered around specific resources, and actual next steps for real world applications.

The learning that occurs from managing read-only API access to low-level data, content, and resources would provide the education and iteration necessary for key staff at the Department of Education to reach the next level: read / write APIs, complete with oAuth level security, which would be the holy grail in serving students and achieving the mission of the Department of Education. I know I’m biased, because of my focus on APIs, but read / write access to all Department of Education resources over the web and via mobile devices, in a way that gives full control to students, is the future of the agency. There is no "should we do APIs"; there is only the how, and I’m afraid we are wasting time. We need to just do it, and learn to ask these questions along the way.

There is proven technology and processes available to make all Department of Education data, content and resources available, allowing both read and write access in a secure way, that is centered around the student. The private sector is 14 years ahead of the government in delivering private sector resources in this way, and other government agencies are ahead of the Department of Education in doing this as well, but there is an opportunity for the agency to still lead and take action, by committing the resources necessary to not just deploy a single API, but internalize APIs in a way that will change the way learning occurs in the coming decades across all US institutions.


A. Information Gaps and Needs in Accessing Current Data and Aid Programs

1. How could data sets that are already publicly available be made more accessible using APIs? Are there specific data sets that are already available that would be most likely to inform consumer choice about college affordability and performance?

Not everyone has the resources to download, process, and put open datasets to use. APIs can make all of the publicly available datasets more accessible to the public, allowing for easy URL access, deployment of widgets and visualizations, as well as integration with existing tools like Microsoft Excel. All datasets should have the option of being published in this way, but ultimately the Dept. of Ed API ecosystem should speak to which datasets are highest value, and warrant API access.
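
As a simple illustration of the kind of URL access described above, here is a hedged sketch in Python; the endpoint, parameters, and response fields are hypothetical, standing in for whatever datasets the department ultimately exposes.

```python
# Hypothetical example: the same public dataset that is offered as a
# bulk download, also reachable as machine readable JSON via a URL.
import requests

response = requests.get(
    "https://api.example.gov/v1/college-affordability",  # hypothetical
    params={"state": "OR", "per_page": 25},
)
for school in response.json()["results"]:  # assumed response shape
    print(school["name"], school["average_net_price"])
```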

2. How could APIs help people with successfully and accurately completing forms associated with any of the following processes: FAFSA; Master Promissory Note; Loan Consolidation; entrance and exit counseling; Income-Driven Repayment (IDR) programs, such as Pay As You Earn; and the Public Student Loan Forgiveness program?

APIs will help decouple each data point on a form. Introductory information, each question, and other supporting resources can be broken up and delivered via any website or mobile application, evolving a linear, 2-dimensional form into an interactive application that people can engage with, providing the assistance needed to properly achieve the goals surrounding the form.

Each form initiative will have its own needs, and a consistent API platform and strategy from the Department of Education will help identify each form's unique requirements, and the custom delivery of just the resources that are needed for a form's target audience.
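
To illustrate the decoupling described above, here is a sketch of what a single form question might look like when delivered as its own API resource; the field names and values are invented for illustration, not drawn from the actual FAFSA.

```python
# A single decoupled form question, modeled as an API resource.
# The identifiers, labels, and choices are hypothetical.
fafsa_question = {
    "id": "q32",
    "label": "What is your marital status?",
    "help_text": "Answer as of the date you submit your FAFSA.",
    "type": "choice",
    "choices": ["single", "married", "separated", "divorced_or_widowed"],
    "required": True,
}

# Because each question stands alone, an application can reorder,
# translate, or progressively reveal questions as the user engages.
print(fafsa_question["label"])
```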

3. What gaps are there with loan counseling and financial literacy and awareness that could be addressed through the use of APIs to provide access to government resources and content?

First, APIs can provide access to the content that educates students about the path they are about to embark on, before they do, via the web and mobile apps they already frequent, rather than requiring them to visit the source site to learn. Putting the information students need into their hands, via their mobile devices, will increase the reach of the content and the chances that students will consume it.

Second, APIs plus oAuth will give students control over their own educational finances, pushing them to better consider how they will manage all the relationships they enter into, and the details of their loans, grants, and the schools they attend. With more control over data and content will come a forced responsibility in understanding and managing their finances.

Third, this process will open students' eyes to the wider world of online data and information, and to the fact that APIs are driving all aspects of their financial life, from their banking and credit cards to managing their online credit score.

APIs are at the heart of the API driven digital economy. The gift of API literacy, given to students when they first leave home, would carry with them throughout their lives, allowing them to better manage all aspects of their online and financial lives—and the Department of Education gave them that start.

4. What services that are currently provided by title IV student loan servicers could be enhanced through APIs (e.g., deferment, forbearance, forgiveness, cancellation, discharge, payments)?

A consistent API platform and strategy from the Department of Education would provide for the evolution of a suite of verified partners, such as title IV student loan servicers. A well planned partner layer within an ecosystem would allow student loan servicers to access data from students in real-time, with students having a say in who has access to their data, and how. These dynamics, introduced by and unique to API platforms that employ oAuth, provide new opportunities for partnerships to be established, evolve, and even be terminated when not going well.

API platforms using oAuth provide a unique 3-legged relationship between the data platform, 3rd party service providers, and students (users) that can be adopted to bring in existing industry partners, but more importantly provide a rich environment for new types of partners to evolve, who can improve the overall process and workflow a student experiences.

5. What current forms or programs that already reach prospective students or borrowers in distress could be expanded to include broader affordability or financial literacy information?

All government forms and programs should be evaluated for the pros / cons of an API program. My argument within this RFI response is focused on a consistent API platform and strategy from the Department of Education. APIs should be part of every existing program change, and every new initiative in the future.

B. Potential Needs to be Filled by APIs

1. If APIs were available, what types of individuals, organizations, and companies would build tools to help increase access to programs to make college more affordable?

A consistent API platform and strategy from the Department of Education will have two essential components: a partner framework, and service composition. A partner framework defines which external, 3rd party groups can work with Department of Education API resources. Service composition defines how these 3rd party groups can access and ultimately use Department of Education API resources.

All existing groups that the Department of Education currently interacts with should be evaluated for where they exist in the API partner framework, defining levels of access from the general public and students up to certified and trusted developer and business partnerships.

The partner framework and service composition for the Department of Education API platform should be applied to all existing individuals, organizations, and companies, while also allowing new actors to enter the game, potentially redefining the partner framework and adding new formulas for API service composition, opening up the possibilities for innovation around Department of Education API resources.
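
To show how a partner framework and service composition fit together in practice, here is a minimal sketch; the tier names, scopes, and rate limits are assumptions for illustration, not a proposed Department of Education policy.

```python
# Service composition sketch: each partner tier bundles the scopes and
# limits that tier can access. All values here are hypothetical.
SERVICE_TIERS = {
    "public": {"scopes": ["datasets.read"], "hourly_limit": 1_000},
    "certified_partner": {
        "scopes": ["datasets.read", "forms.read", "forms.write"],
        "hourly_limit": 50_000,
    },
    "internal": {"scopes": ["*"], "hourly_limit": None},  # unlimited
}

def can_access(tier: str, scope: str) -> bool:
    """Check whether a partner tier's service bundle includes a scope."""
    scopes = SERVICE_TIERS[tier]["scopes"]
    return "*" in scopes or scope in scopes

assert can_access("certified_partner", "forms.write")
assert not can_access("public", "forms.write")
```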

2. What applications and features might developers, schools, organizations, and companies take interest in building using APIs in higher education data and services?

As with the questions of which Department of Education forms and programs APIs might apply to, and which individuals, organizations, and companies will use them, what applications developers, schools, organizations, and companies might build cannot be known until the platform is in place. These are the questions an API centric company or institution asks of its API platform in real-time. You can’t define ahead of time who will use an API and how they will use it; it takes iteration and exploration before successful applications will emerge.

3. What specific ways could APIs be used in financial aid processes (e.g., translation of financial aid forms into other languages, integration of data collection into school or State forms)?

When a resource is available via an API, it is broken down into the smallest parts and pieces possible, allowing them to be re-used and re-purposed into every configuration imaginable. When you make form questions independently available via an API, it allows you to reorder, translate, and ask them in new ways.

This approach works well with forms, allowing each entry of a form to be accessible, transferable, and open for access, with the proper permissions and access levels controlled by the person who owns the form data. This opens up not just the financial aid process, but all form processes, to interoperate with other systems, forms, agencies, and companies.

With the newfound modularity and interoperability introduced by APIs, the financial aid process could be broken down, allowing parents to take part for their role, schools for theirs, and allowing multiple agencies to be engaged, such as the IRS or the Department of Veterans Affairs (VA). All of this allows any involved entity or system to do its part for the financial aid process, minimizing friction throughout the entire form process, even year over year.

4. How can third-party organizations use APIs to better target services and information to low-income students, first-generation students, non-English speakers, and students with disabilities?

Again, this is a question that should be asked of a Department of Education platform in real-time. Examples of how 3rd party organizations can better target services and information to students are the reason for an API platform. There is no way to know this ahead of time; I will leave it to domain experts to attempt an answer.

5. Would APIs for higher education data, processes, programs or services be useful in enhancing wraparound support service models? What other types of services could be integrated with higher education APIs?

A sensibly designed, deployed, managed, and evangelized API platform would establish a rich environment for existing educational services to be augmented, but also allow for entirely new types of services to be defined. Again, I will leave it to domain experts to speak to specific service implementations based upon their goals and understanding of the space.

C. Existing Federal and Non-Federal Tools Utilizing APIs

1. What private-sector or non-Federal entities currently offer assistance with higher education data and student aid programs and processes by using APIs? How could these be enhanced by the Department’s enabling of additional APIs?

There are almost 10K public APIs available in the private sector. This should be viewed as a palette for developers, something they draw from as they are developing (painting) their apps. It is difficult for developers to know what they will be painting with, without knowing what resources are available. The open API innovation process is rarely able to articulate what is needed, then make that request for resources—API innovation occurs when valuable, granular resources are available from multiple sources, and developers assemble them, and innovate in new ways.

2. What private-sector or non-Federal entities currently work with government programs and services to help people fill out government forms? Has that outreach served the public and advanced public interests?

This is another question that the Department of Education should be answering, and providing us with the answers to. How would you know this without a properly defined partner framework? Stand up an API platform, and you will have the answer.

3. What instances or examples are there of companies charging fees to assist consumers in completing otherwise freely available government forms from other agencies? What are the advantages and risks to consider when deciding to allow third parties to charge fees to provide assistance with otherwise freely available forms and processes? How can any risks be mitigated?

I can't speak to what is already going on in the space regarding companies charging fees to consumers, as I am not an expert on the education space at this level. This is such a new paradigm made possible via APIs and open data that there just aren’t that many examples in the space built around open government data.

First, the partner tiers of API platforms help verify and validate the individuals and organizations who are building applications and charging for services in the space. A properly designed, managed, and policed partner tier can assist in mitigating risk in the evolution of such business ecosystems.

Second, API driven security layers using oAuth give control to end-users, allowing students to decide which applications, and ultimately which service providers, have access to their data, revoking that access when services are done or a provider is undesirable. With proper reporting and rating systems, policing of the API platform can be something that is done within the community, with the last mile of policing done by the Department of Education.

Proper API management practices provide the identity, access, and control layers necessary to keep resources and end-users safe. Ultimately, who has access to data, who can charge fees, and who plays a role in the ecosystem is up to the Department of Education and end-users, when applications are built on top of APIs.

4. Beyond the IRS e-filing example, what other similar examples exist where Federal, State, or local government entities have used APIs to share government data or facilitate participation in government services or processes - particularly at a scale as large as that of the Federal Student Aid programs?

This is a new, fast growing sector, and there are not a lot of existing examples, but there are a few:

Open311
An API driven system that allows citizens to report and interact with municipalities around issues within communities. While Open311 is deployed in specific cities such as Chicago and Baltimore, it is an open source platform and API that can be deployed to serve any size market.

Census Bureau
The US Census provides open data and APIs, allowing for innovation around government census survey data, used across the private sector in journalism, healthcare, and many other ways. The availability of government census data is continually spawning new applications, visualizations and other expressions, that wouldn’t be realized or known, if the platform wasn’t available.

We The People
The We The People API allows for 3rd-party integration with the White House petition process. It currently allows only read-only access to petition information and the petition process, but is possibly one way that write APIs will emerge in the federal government.

There are numerous examples of open APIs and data being deployed in government, even from the Department of Education. All of them are works in progress, and will realize their full potential over time, with maturation, much iteration, and engagement with the public.

D. Technical Specifications

1. What elements would a read-write API need to include for successful use at the Department?

There are numerous building blocks that can be employed in managing read-write APIs, but there are a couple that will be essential to successful read-write APIs in government:

Partner Framework
Defined access tiers for consumers of API data, with appropriate public, partner and private (internal) levels of access. All write methods are only accessible by partner and internal levels of access, requiring verification and certification of companies and individuals who will be building on top of API resources.

Service Management
The ability to compose many different types of API resource access, create service bundles that are made accessible to different levels of partners. Service management allows for identity and access management, but also billing, reporting, and other granular level control over how services are composed, accessed and managed.

Open Authorization (oAuth 2.0)
All data made available via Department of Education API platforms that involves personally identifiable information will require the implementation of an open authorization, or oAuth, security layer. oAuth 2.0 provides an identity layer for the platform, requiring developers to use tokens that throttle access to resources for applications, a process that is initiated, managed, and revoked by end-users—providing the highest level of control over who has access to data, and what they can do with it, to the people whose personal data is involved.

Federated API Deployments
Not all APIs should be deployed and managed within the Department of Education firewall. API platforms can be made open source, so that 3rd party partners can deploy them within their own environments. Then, via a sensible partner framework, the Department of Education can decide which partners it will not just allow to write to its APIs, but also pull data from, via their trusted systems and open API deployments.

APIs provide the necessary access to all federal government API resources, and a sensible partner framework and service management layer, in conjunction with oAuth, will provide the necessary controls for a read / write API in government. If agencies are looking to further push risk outside the firewall, federated API deployments with trusted partners will have to be employed.

2. What data, methods, and other features must an API contain in order to develop apps accessing Department data or enhancing Department processes, programs, or services?

There are about 75 common building blocks for API deployments (http://management.apievangelist.com/building-blocks.html), aggregated after looking at almost 10K public API deployments. Each government API will have different needs when it comes to other supporting building blocks.

3. How would read-only and/or read-write APIs interact with or modify the performance of the Department’s existing systems (e.g., FAFSA on the Web)? Could these APIs negatively or positively affect the current operating capability of such systems? Would these APIs allow for the flexibility to evolve seamlessly with the Department’s technological developments?

There are always risks with API access to resources, but with a partner framework, service management, oAuth, and other common web security practices, these risks can be drastically reduced, and mitigated in real-time.

Isolated API Deployments
New APIs should rarely be deployed directly connected to existing systems. APIs can be deployed as an isolated interface, with an isolated data store. Existing systems can use that same API interface to read / write data and stay in sync with internal systems. API developers will never have access to existing systems and data stores, just the isolated, defined API interfaces that are part of a secure partner tier, only accessing the services they have permission to, and the end-user data that end-users themselves have granted access to.
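
A rough sketch of this isolated pattern, assuming a hypothetical internal sync job: the public API only ever touches its own isolated store, and the system of record pushes data through the same interface developers use.

```python
# Isolated deployment sketch: an internal job syncs the system of
# record into the isolated API store via the same public interface.
# The base URL and token are hypothetical.
import requests

API_BASE = "https://api.example.gov/v1"  # isolated public interface

def sync_from_internal(records: list[dict]) -> None:
    """Push records from the internal system of record into the
    isolated API store; developers never touch internal systems."""
    for record in records:
        requests.put(
            f"{API_BASE}/records/{record['id']}",
            json=record,
            headers={"Authorization": "Bearer INTERNAL_SERVICE_TOKEN"},
        )
```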

Federated Deployments
As described above, if government agencies are looking to further reduce risk, API deployments can be designed and delivered as open source software, allowing partners within the ecosystem to download and deploy them. A platform partner framework can provide a verification and certification process for federated API deployments, allowing the Department of Education to decide who it will pull data from, reducing the risk to internal systems, and providing a layer of trust for integration.

Beyond these approaches to deploying APIs, one of the biggest benefits of web API deployments is that they use the same security as other government websites, just with an additional security layer determining who has access, and to what.

It should be the rare instance when an existing system will have an API deployed with direct integration. API automation will provide the ability to sync API deployments with existing systems and data stores.

4. What vulnerabilities might read-write APIs introduce for the security of the underlying databases the Department currently uses?

As stated above, there should be no compromise in how data is imported into existing databases at the Department of Education. It is up to the agency to decide which APIs they pull data from, and how it is updated as part of existing systems.

5. What are the potential adverse effects on successful operation of the Department’s underlying databases that read-write APIs might cause? How could APIs be developed to avoid these adverse effects?

As stated above, isolated and external, federated API deployments will decouple the risk from existing systems. This is the benefit of APIs: they can be deployed as isolated resources, and then integration and interoperability, internally and externally, is up to the consumer, who decides what is imported and what isn’t.

6. How should APIs address application-to-API security?

A modern API partner framework, service management, and oAuth provide the necessary layers to identify who has access, and what resources can be used, not just by a company or user, but by each application they have developed.

Routing all API access through the partner framework, plus the associated service level, will secure access to Department of Education resources by applications, with user and app level logging of what was accessed and used within an application.

oAuth provides a balance to this application-to-API security layer, allowing the Department of Education to manage the security of API access, and developers to request access for their applications, while ultimately control is in the hands of end users, who define which applications have access to their data.

7. How should the APIs address API-to-backend security issues? Examples include but are not limited to authentication, authorization, policy enforcement, traffic management, logging and auditing, TLS (Transport Layer Security), DDoS (distributed denial-of-service) prevention, rate limiting, quotas, payload protection, Virtual Private Networks, firewalls, and analytics.

Web APIs use the exact same infrastructure as websites, allowing for the re-use of existing security practices employed for websites. However, APIs add a layer of security, logging, auditing, and analytics through the lens of the partner framework and service composition, limited only by the service management tooling available.

8. How do private or non-governmental organizations optimize the presentation layer for completion and accuracy of forms?

Business rules. As demonstrated as part of a FAFSA API prototype, business rules for each form field, along with rejection codes, can be made available via API resources, allowing developers to build a form validation layer into all digital forms.

After submission, and the first line of defense provided by API developers building next generation forms, platform providers can provide further validation, review, and ultimately a status workflow that allows forms to be rejected or accepted based upon business logic.
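
In the spirit of the FAFSA API prototype mentioned above, here is a hedged sketch of field-level business rules with rejection codes; the field name, rules, and codes are invented for illustration.

```python
# Field-level business rules with rejection codes, exposed so that
# developers can validate forms client-side. All rules hypothetical.
BUSINESS_RULES = {
    "adjusted_gross_income": [
        (lambda v: v is not None, "E101", "Value is required."),
        (lambda v: v is None or v >= 0, "E102", "Must not be negative."),
    ],
}

def validate(field: str, value) -> list[tuple[str, str]]:
    """Return (rejection_code, message) pairs for any failed rules."""
    return [(code, msg)
            for check, code, msg in BUSINESS_RULES.get(field, [])
            if not check(value)]

print(validate("adjusted_gross_income", -5))
# [('E102', 'Must not be negative.')]
```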

9. What security parameters are essential in ensuring there is no misuse, data mining, fraud, or misrepresentation propagated through use of read- only or read-write APIs?

A modern API service management layer allows the platform provider to see all API resources that are being accessed, and by whom, and to easily establish patterns for healthy usage, as well as patterns of misuse. When misuse is identified, service management allows providers to revoke access, and take action against companies and individuals.

Beyond the platform provider, APIs allow for management by end-users through common oAuth flows and management tools. Sometimes end-users can identify that an app is misusing their data even before a platform provider might. oAuth gives them the control to revoke access to their data, via the API platform.

oAuth, combined with API service management tooling, allows for a unique security environment in which the platform can easily keep operations healthy, while end-users and developers help police the ecosystem as well. If platform providers give users the proper rating and reporting tools, they can help keep API and data consumers in check.
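
A minimal sketch of the service management side of this, assuming a hypothetical hourly ceiling: per-key usage is logged against a healthy baseline, and keys that fall outside it are revoked.

```python
# Misuse detection sketch: log calls per API key and revoke keys that
# exceed a healthy usage ceiling. The threshold is hypothetical.
from collections import Counter

HOURLY_LIMIT = 10_000  # assumed healthy ceiling for this service tier
calls = Counter()      # api_key -> calls seen this hour
revoked = set()

def record_call(api_key: str) -> bool:
    """Log a call; returns False once the key has been revoked."""
    if api_key in revoked:
        return False
    calls[api_key] += 1
    if calls[api_key] > HOURLY_LIMIT:
        revoked.add(api_key)  # provider takes action against misuse
        return False
    return True
```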

10. With advantages already built into the Department’s own products and services (e.g., IRS data retrieval using FAFSA on the Web), how would new, third-party API-driven products present advantages over existing Department resources?

While existing products and services developed within the department do provide great value, the Department of Education cannot do everything on their own. Because of the access the Department has, some features will be better by default, but this won’t be the case in all situations.

The Department of Education and our government do not have unlimited resources, and with access to ALL the resources available via the department, the private sector can innovate, helping share the load of delivering vital services. It's not about whether public sector products and services are better than private sector ones, or vice versa; it is about the public sector and private sector partnering wherever and whenever it makes sense.

11. What would an app, service or tool built with read-write API access to student aid forms look like?

Applications will look like the TurboTax and TaxAct tools developed within the IRS ecosystem, and like the tools developed by the Sunlight Foundation on top of government open data and APIs.

We will never understand what applications are possible until the necessary government resources are available. All digital assets should be open by default, with a consistent API platform and strategy from the Department of Education, and the platform will answer this question.

E. Privacy Issues

1. How could the Department use APIs that involve the use of student records while ensuring compliance with potentially applicable statutory and regulatory requirements, such as the Family Educational Rights and Privacy Act (20 U.S.C. § 1232g; 34 CFR Part 99) and the Privacy Act (5 U.S.C. § 552a and 34 CFR Part 5b)?

As described above, the partner framework, service management, and oAuth layers provide the control and logging necessary to execute and audit as part of any applicable statutory and regulatory requirement.

I can’t articulate enough how this layer provides a tremendous amount of control over how these resources are accessed, giving control to the involved parties who matter the most—end-users. All API traffic is throttled, measured, and reviewed as part of service management, enforcing privacy through a partnership between the Department of Education, API consumers, and end-users.

2. How could APIs ensure that the appropriate individual has provided proper consent to permit the release of privacy-protected data to a third party? How can student data be properly safeguarded to prevent its release and use by third parties without the written consent often required?

As articulated above, the partner framework, service management, and oAuth address this. This is a benefit of API deployment: breaking down existing digital access, providing granular control, combined with oAuth and logging of all access—APIs take control to a new level.

oAuth has come to represent this new balance in security and control of digital resources, allowing the platform, developers, and end-users to execute within their defined roles on the platform. This balance introduced by APIs and oAuth allows data to be safeguarded, while also opening it up for the widest possible use in the next generation of applications and other implementations.

3. How might read-only or read-write APIs collect, document, and track individuals’ consent to have their information shared with specific third parties?

oAuth. Period.
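To unpack that answer just a little, here is a minimal sketch of how consent is collected and documented via oAuth scopes, using the requests-oauthlib library. The authorization server and the student aid scopes shown are hypothetical:

```python
from requests_oauthlib import OAuth2Session

# Hypothetical authorization server and scopes -- the student aid scopes
# shown here are illustrative, not an actual Department of Education API.
AUTHORIZATION_URL = "https://api.example.gov/oauth/authorize"

oauth = OAuth2Session(
    client_id="my-app-client-id",
    redirect_uri="https://myapp.example.com/callback",
    scope=["fafsa:read", "loan-status:read"],  # each scope is a recorded unit of consent
)

# The student is sent to this URL, sees exactly which scopes the app is
# requesting, and the grant (or denial) is logged by the platform.
url, state = oauth.authorization_url(AUTHORIZATION_URL)
print(url)
```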

4. How can personally identifiable information (PII) and other financial information (of students and parents) be safeguarded through the use of APIs?

Access to personally identifiable information (PII) via Department of Education APIs will be controlled by students and their parents. The most important thing you can do to protect PII is to educate the owners of that data about how to allow developer access to it in responsible ways that will benefit them.

APIs open up access, while oAuth will give students and parents the control they need to integrate with apps and existing systems to achieve their goals, while retaining the greatest amount of control over safeguarding their own data.

5. What specific terms of service should be enabled using API keys, which would limit use of APIs to approved users, to ensure that information is not transmitted to or accessed by unauthorized parties?

A well designed partner layer defines multiple levels of access, combined with sensible service packages, establishing the terms of service that will be bundled with API keys and oAuth-level identity and access to personally identifiable information.

Common approaches to deploying partner layers with appropriate service tiers, using oAuth, have been well established over the last 10 years in the private sector. Controlling access to API resources at a granular level, providing the greatest amount of access that makes sense, while knowing who is accessing data and how they are using it, is what APIs are designed for.
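As a sketch of what this looks like in practice, partner tiers and service composition are often expressed as simple configuration in API management tooling. The tier names, rate limits, and scopes below are hypothetical:

```python
# A sketch of how partner tiers and service composition might be expressed
# in API management tooling -- tier names, limits, and scopes are hypothetical.
PARTNER_TIERS = {
    "public": {
        "rate_limit_per_hour": 1_000,
        "scopes": ["forms:read"],
        "requires_review": False,
    },
    "verified": {
        "rate_limit_per_hour": 50_000,
        "scopes": ["forms:read", "forms:write"],
        "requires_review": True,  # apps reviewed before production keys issued
    },
    "certified": {
        "rate_limit_per_hour": 500_000,
        "scopes": ["forms:read", "forms:write", "pii:read"],
        "requires_review": True,  # PII access only for fully vetted partners
    },
}

def scopes_for(api_key_tier):
    """Resolve which resources an API key can touch, based on its partner tier."""
    return PARTNER_TIERS[api_key_tier]["scopes"]
```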

6. What are the relative privacy-related advantages and disadvantages of using read-only versus read-write APIs for student aid data?

You will face many of the same privacy concerns whether an API is read or write. If it is personally identifiable information, read or write access by the wrong parties violates a student's privacy. Ensuring that data is updated only via trusted application providers is essential.

A properly defined partner layer will separate who has read and who has write access. Proper logging and versioning of data is essential to ensure data integrity, allowing end-users to manage their data via an application or system with confidence.

F. Compliance Issues

1. What are the relative compliance-related advantages and disadvantages of using read-only versus read-write APIs for student aid data?

APIs provide a single point of access to student aid data. With the implementation of a proper partner framework, service management, and oAuth, every single action through this doorway is controlled and logged. When it comes to auditing ALL operations, whether from the public, partners, or internal consumers, APIs excel in satisfying compliance concerns.

2. How can the Department prevent unauthorized use and the development of unauthorized products from occurring through the potential development of APIs? How might the Department enforce terms of service for API key holders, and prevent abuse and fraud by non-API key holders, if APIs were to be developed and made available?

As described above, the partner framework, service management, and oAuth will provide the security layer needed to manage 99% of potential abuse, but overall enforcement via the API platform is a partnership between the Department of Education, API consumers, and end-users. The last mile of enforcement will be executed by the Department of Education, but it will be up to the entire ecosystem and platform to police and enforce in real-time.

3. What kind of burden on the Department is associated with enforcing terms and conditions related to APIs?

The Department of Education will handle the first line of defense, defining the partner tiers and service composition that wrap all access to APIs. The Department will also be the last mile of decision making and enforcement when violations occur. The platform should provide the data needed by the Department to make decisions, as well as the enforcement necessary in the form of API key and access revocation, and banning apps, individuals, and businesses from the ecosystem.

4. How can the Department best ensure that API key holders follow all statutory and regulatory provisions of accessing federal student aid funds and data through use of third-party products?

The first line of defense in ensuring that API key holders follow all statutory and regulatory provisions will be verification and validation of partners upon registration, when applications go into production, and before availability in application galleries and other directories in which students discover apps.

The second line of defense will be reporting requirements and the usage patterns of API consumers and their apps. If applications regularly meet self-reporting requirements, and real-time patterns establish healthy behavior, they can retain their certification. If partners fail to comply, they will be restricted from the API ecosystem.

The last line of defense is the end-users, the students and parents. All end-users need to be educated regarding the control they have, and given reporting and ranking tools that allow them to file complaints and rank the applications that are providing quality services.

As stated several times, enforcement will be a community effort, something the Department of Education has ultimate control of, but requires giving the community agency as well.

5. How could prior consent from the student whom the data is about be provided for release of privacy-protected data to third party entities?

An API with an oAuth layer is this vehicle, providing the access, logging all transactions, and holding all partners to a quality of service. All the mechanisms are there in a modern API implementation; the access just needs to be defined.

6. How should a legal relationship between the Department and an API developer or any other interested party be structured?

I’m not a lawyer. I’m not a policy person. I just can’t contribute to this one.

7. How would a legal relationship between the Department and an API developer or any other interested party affect the Department’s current agreements with third-party vendors that operate and maintain the Department’s existing systems?

All of this will be defined in each partner tier, combined with appropriate service levels. With isolated API deployments, this should not affect current implementations.

However, a benefit of a consistent API strategy is that existing vendors can access resources via APIs, increasing the agility and flexibility of existing contracts. APIs are a single point of access, not just for the public, but for 3rd party partners as well as internal consumers. Everyone involved can participate and receive the benefits of API consumption.

8. What disclosures should be made available to students about what services are freely available in government domains versus those that could be offered at a cost by a third party?

A partner tier for the API platform will define the different levels of partners. Trusted, verified, and certified partners will get different recommendation levels and access than lesser known services and applications from 3rd parties with lower levels of trust.

9. If the Department were to use a third-party application to engage with the public on its behalf, how could the Department ensure that the Department follows the protocols of OMB Memorandum 10-23?

Again, the partner tier determines the level of access for each partner, and the protocols of all OMB memorandums can be built in, requiring that all data, APIs, and code be open sourced, and that appropriate API access tiers show how data and resources are accessed and put to use.

API service management provides the reporting necessary to support government audits and regulations. Without this level of control on top of an API, that just isn't possible at the scale that APIs plus web and mobile applications offer.

G. Policy Issues

1. What benefits to consumers or the Department would be realized by opening what is currently a free and single-point service (e.g., the FAFSA) to other entities, including those who may charge fees for freely-available services and processes? What are the potential unintended consequences?

Providing API access to government resources is an efficient and sensible use of taxpayer money, and reflects the mission of all agencies, not just the Department of Education. APIs introduce the agility and flexibility needed to deliver the next generation of government applications and services.

The economy in a digital age will require a real-time partnership between the public sector and the private sector, and APIs are the vehicle for this. Much like it has done for private sector companies like Amazon and Google, APIs will allow the government to create new services and products that serve constituents with the help of the private sector, while also stimulating job growth and other aspects of the economy.

APIs will not be all upside; each program and initiative will have its own policy problems and unintended consequences. One problem that plagues API initiatives is a lack of resources, in the form of money and skilled workers, to make sure efforts are successful. Without proper management, poorly executed APIs can open up huge security holes, and introduce privacy concerns at a scale never imagined.

APIs need to be managed properly, with sensible real-time controls for keeping operations in check.

2. How could the Department ensure that access to title IV, HEA student aid programs truly remains free, even amidst the potential development of third-party apps that may charge a fee for assistance in participating in free government programs, products, and services with or without providing legitimate value-added services?

Partner Framework + Service Management = Quality of Service Across Platform

3. What other policy concerns should the Department consider with regard to the potential development of APIs for higher education data and student aid processes at the Department?

I am not a policy or education expert, so I will leave this to others to determine. It is also something that should be built into API operations, and discovered on a program by program basis.

4. How would APIs best interact with other systems already in use in student aid processes (e.g., within States)?

The only way you will know is if you do it. The IRS e-file system shows how this can work, but even it isn't a perfect model to follow. We will never know the potential here until a platform is stood up, and resources are made available. All signs point to APIs opening up a huge amount of interoperability, not just between states and the federal government, but also with cities and counties.

5. How would Department APIs benefit or burden institutions participating in title IV, HEA programs?

If APIs aren’t given the proper resources to operate, they can introduce security, privacy, and support concerns that would not have been there before. A properly run API initiative will provide support, while an underfunded, undermanned initiative will just further burden institutions.

6. While the Department continues to enhance and refine its own processes and products (e.g., through improvements to FAFSA or the IDR application process), how would third-party efforts using APIs complement or present challenges to these processes?

These two things should not be separate. The internal efforts should be seen as just another partner layer within the API ecosystem. All future service and products developed internally within the Department of Education should use the same API infrastructure developed for partners and the public.

If APIs are not used internally, API efforts will always fail. APIs are not just about providing access to external resources; they are about opening up the Department to thinking about its resources in an external way that benefits the public, partners, and the government itself.


The Future Of Public Private Sector Partnerships Being Negotiated At The API oAuth Scope Level

A couple of weeks ago I attended a two day API specification session between major California utilities, Southern California Edison (SCE), San Diego Gas & Electric (SDG&E), and Pacific Gas and Electric (PG&E), that was organized by Hypertek Inc. for National Institute of Standards and Technology (NIST), that is looking to push forward the Green Button data and API agenda of the White House and Department of Energy.

The Green Button API and open data model already exists, but the current hurdle for the initiative is to get leading utilities to agree to specific implementation definitions that serve their customers, so it can be ratified as a standard. This entire two day session was dedicated to walking through, line by line, and establishing the oAuth scope that each of the three utility companies would implement when providing Green Button data via the API to 3rd party developers, who are building solutions for utility customers.

An example of this in the wild would be a utility customer who wants a quote for a rooftop solar installation: the installer could use a software solution, developed by a 3rd party vendor, that pulls that customer's energy usage via the Green Button API, and generates a precise quote for the number of solar panels they'd need to cover their energy usage. This is just one example of how energy data could be used, but a very powerful and relevant one in 2014, one that gives customers access to, and control over, their data.
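For illustration, here is a minimal sketch of what that 3rd party pull might look like; the host, resource path, and response handling are placeholders, as each utility will publish its own Green Button endpoints:

```python
import requests

# Illustrative sketch of the solar-quote scenario: a 3rd party app pulls a
# customer's usage history over the Green Button API. The host and resource
# path are hypothetical; real utilities publish their own endpoints.
USAGE_URL = "https://utility.example.com/espi/1_1/resource/Batch/Subscription/123"

def fetch_usage(access_token):
    """The oAuth token scopes exactly which usage data this app may read."""
    response = requests.get(
        USAGE_URL,
        headers={"Authorization": f"Bearer {access_token}"},
    )
    response.raise_for_status()
    return response.text  # Green Button data comes back as an Atom/XML feed

# A solar installer's quoting tool would parse the interval readings from
# this feed and size a rooftop system against actual consumption.
```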

Before we even get there, the Green Button API definition, data model, and oAuth scope has to be finalized with utility providers, before it can be submitted as a standard—with the final oAuth scope being what will be agreed to by each 3rd party developer that integrates with each utility's Green Button API. This oAuth scope sets the tone for the conversation that 3rd party software providers, and vendors can ultimately have with utility customers around their private utility data.

In my opinion this is one potential model for how industry operations will be negotiated in the future. Not all API definitions, data models, and oAuth scopes will be defined by the government and then negotiated with industries, but oAuth will be where this scope is negotiated between industry leaders and governments, no matter which side leads the conversation. Governments and industries working together to define API standards is nothing new; what is new is the presence of the oAuth layer, which gives end-users and citizens an active role. 3rd party developers and end-users do not get a place in the negotiations, but they do when it comes to putting valuable industry data to use in new ways--or not--which will flow back upstream and influence future iterations of APIs, data models, and oAuth scopes.

This introduction of the oAuth layer into how industries operate, potentially coupled with the access and innovation that can occur via API platforms when they are executed in alignment with modern API implementations, will change how industries grow and ultimately how power flows (pun intended). This evolution in how the public and private sector partner will not always be positive by default. I'm not an API solutionist who believes APIs bless everything as good, but with the right players at the table, a little sunlight and transparency in the process, and a feedback loop that includes 3rd party developers and end-users, we might be able to implement meaningful change across many different industries around the globe.


Work With Your Partners To Iterate On Your APIs

I received a press release from AT&T last week about a new partnership with Sabre to develop what is called location information services (LIS). Ben Kepes has a good summary of the partnership over at Forbes, so I won't add my own analysis of what the partnership means. After speaking with the AT&T team today, I'd prefer to focus on the importance of partnerships when developing your own APIs, sharing AT&T's wisdom with my audience.

When listening to Laura Merling and Chris Aron of AT&T talk about the Sabre partnership, I couldn't help but think about the importance of working with trusted partners when developing your valuable API resources. In the open API world, many companies open themselves up to iterating on API resources with a large group of public developers, a process that can make your API strategy feel very schizophrenic. In contrast, designing, developing, and iterating on APIs in a closed environment with trusted partners is much more, well, sane.

Location Information Services With Transportation and Travel Partners
The location information services that AT&T is partnering with Sabre on began as a set of location services AT&T first worked on with AAA, to help them locate drivers in need of roadside assistance. Now AT&T is working with Sabre to evolve these same location APIs, expanding not just into new business sectors like travel and retail, but also improving the accuracy of a mobile user's location by using not just a cell signal, but also wifi and bluetooth.

Mobile Identity Toolkit With Banking and Financial Partners
In addition to location APIs, AT&T recently developed a mobile identity toolkit, and what better way to define the most valuable toolkit possible than to work with banking and financial partners to design and deploy a toolkit that reflects exactly what they need. The result is a set of API fueled services that makes apps more secure, and reduces fraud via mobile devices.

Sponsored Data With Multiple Partners
Beyond location and identity, AT&T has also been working on the APIs that drive their new sponsored data initiative. This new program is much more sensitive than other API resources, as it directly touches AT&T users' bills, and it required working with trusted partners to define a set of services that worked properly. AT&T worked with existing partners in defining everything from how the APIs would work to how these sponsored data services would end up looking on AT&T customers' bills.

These are three examples of how AT&T is leveraging partners to plan, design, develop and evolve their catalog of API resources. While AT&T has their open developer ecosystem, a partner centric approach keeps things on track, by defining APIs that have proven needs, via existing partners, and represent expanding on existing revenue streams.

As with the rest of us in the space, AT&T is working to figure out the best path forward with APIs. Their approach to leveraging their partner ecosystem to provide fuel for their API initiatives is something we can all learn from. I’m not saying stop evolving your public APIs out in the open, but for newer, or possibly higher value API resources, you may want to consider kicking things off with your trusted partners.


Open Data And API Efforts Rendered Useless When Privacy Is Ignored

It is the second anniversary of the Open Government Partnership (OGP), where we are celebrating a "global effort to encourage transparent, effective, and accountable governance", and that:

OGP has grown to 60 countries that have made more than 1000 commitments to improve the governance of more than two billion people around the globe. OGP is now a global community of government reformers, civil society leaders, and business innovators working together to develop and implement ambitious open government reforms and advance good governance.

That is some pretty significant platform growth! While reading this I'm reminded of how any amount of perceived growth and value delivered via an "open data or API platform" can be immediately muted by the omission of very fundamental building blocks like privacy.

Let's review the building blocks of the Open Government Alliance:

  • Expand Open Data - Open Data fuels innovation that grows the economy and advances government transparency and accountability. Government data has been used by journalists to uncover variations in hospital billings, by citizens to learn more about the social services provided by charities in their communities, and by entrepreneurs building new software tools to help farmers plan and manage their crops. Building upon the successful implementation of open data commitments in the first U.S. National Action Plan, the new Plan will include commitments to make government data more accessible and useful for the public, such as reforming how Federal agencies manage government data as a strategic asset, launching a new version of Data.gov, and expanding agriculture and nutrition data to help farmers and communities.
  • Modernize the Freedom of Information Act (FOIA) - The FOIA encourages accountability through transparency and represents a profound national commitment to open government principles. Improving FOIA administration is one of the most effective ways to make the U.S. Government more open and accountable. Today, the United States announced a series of commitments to further modernize FOIA processes, including launching a consolidated online FOIA service to improve customers’ experience and making training resources available to FOIA professionals and other Federal employees.
  • Increase Fiscal Transparency - The Administration will further increase the transparency of where Federal tax dollars are spent by making federal spending data more easily available on USASpending.gov; facilitating the publication of currently unavailable procurement contract information; and enabling Americans to more easily identify who is receiving tax dollars, where those entities or individuals are located, and how much they receive.
  • Increase Corporate Transparency - Preventing criminal organizations from concealing the true ownership and control of businesses they operate is a critical element in safeguarding U.S. and international financial markets, addressing tax avoidance, and combatting corruption in the United States and abroad. Today we committed to take further steps to enhance transparency of legal entities formed in the United States.
  • Advance Citizen Engagement and Empowerment - OGP was founded on the principle that an active and robust civil society is critical to open and accountable governance. In the next year, the Administration will intensify its efforts to roll back and prevent new restrictions on civil society around the world in partnership with other governments, multilateral institutions, the philanthropy community, the private sector, and civil society. This effort will focus on improving the legal and regulatory framework for civil society, promoting best practices for government-civil society collaboration, and conceiving of new and innovative ways to support civil society globally.
  • More Effectively Manage Public Resources - Two years ago, the Administration committed to ensuring that American taxpayers receive every dollar due for the extraction of the nation’s natural resources by committing to join the Extractive Industries Transparency Initiative (EITI). We continue to work toward achieving full EITI compliance in 2016. Additionally, the U.S. Government will disclose revenues on geothermal and renewable energy and discuss future disclosure of timber revenues.

How can you argue with that? It's a very sensible set of open government platform building blocks, right? However, when you look at the bigger picture, you realize there is a significant building block missing, one that those of us in the tech sector have realized is essential to a healthy platform ecosystem:

  • Citizen Data Privacy - Ensuring that government respects the online privacy of each and every U.S. citizen, preventing unwanted harvesting of private data or meta data that exists in cloud environments, computer and mobile devices as well as transported across telecommunications infrastructure locally or abroad. When privacy is compromised in the name of law enforcement or national security, the laws, rules and procedures around these accepted situations are made publicly accessible.

It is great that our government is committed to expanding open data, increasing transparency, efficiently engaging citizens, and sensibly managing public resources. However, if our government wants to act as an open platform, just like any private sector platform, it must respect user privacy.

Without ensuring privacy for users, it doesn't matter how forward thinking your open data, information, and API strategy is. Privacy and security are essential building blocks for any private or public sector entity looking to build an open platform.

Nice work around the Open Government Partnership, but without addressing the privacy of citizens it is rendered pretty useless.


IRS Modernized e-File (MeF): A Blueprint For Public & Private Sector Partnerships In A 21st Century Digital Economy (DRAFT)

Download as PDF

The Internal Revenue Service is the revenue arm of the United States federal government, responsible for collecting taxes, and for the interpretation and enforcement of the Internal Revenue Code.

The first income tax was assessed in 1862 to raise funds for the American Civil War, and over the years the agency has grown and evolved into a massive federal entity that collects over $2.4 trillion each year from approximately 234 million tax returns.

While the IRS has faced many challenges in its 150 years of operations, the last 40 years have demanded some of the agency's biggest transformations at the hands of technology, more than any time since its creation.

In the 1970s, the IRS began wrestling with the challenge of modernizing itself using the latest computer technology. This eventually led to a pilot program in 1986 of a new Electronic Filing System (EFS), which aimed in part to gauge the acceptance of such a concept by tax preparers and taxpayers.

By the 1980s, tax collection had become very complex, time-consuming, costly, and riddled with errors, due to what had become a dual process of managing paper forms while also converting them into a digital form so that they could be processed by machines. The IRS desperately needed to establish a solid approach that would enable the electronic submission of tax forms.

It was a rocky start for the EFS, and Eileen McCrady, systems development branch and later marketing branch chief, remembers, “Tax preparers were not buying any of it--most people figured it was a plot to capture additional information for audits." But by 1990, IRS e-file operated nationwide, and 4.2 million returns were filed electronically. This proved that EFS offered a legitimate approach to evolving beyond a tax collection process dominated by paper forms and manual filings.

Even Federal Agencies Can't Do It Alone

Even with the success of early e-file technology, the program would not have gotten the momentum it needed without the support of two major tax preparation partners--H&R Block and Jackson-Hewitt. These partnerships helped change the tone of EFS efforts, making them more acceptable and appealing to tax professionals. It was clear that e-File needed to focus on empowering a trusted network of partners to submit tax forms electronically, sharing the load of tax preparation and filing with 3rd party providers. And this included not just the filing technology, but a network of evangelists spreading the word that e-File was a trustworthy and viable way to work with the IRS.

Bringing e-File Into The Internet Age

By 2000, Congress had passed IRS RRA 98, which contained a provision setting a goal of an 80% e-file rate for all federal tax and information returns. This, in effect, forced the IRS to upgrade the e-File system for the Internet age, otherwise they would not be able to meet this mandate. A working group was formed, comprised of tax professionals and software vendors, that would work with the IRS to design, develop, and implement the Modernized e-File (MeF) system, which employed the latest Internet technologies, including a new approach to web services using XML that would allow 3rd party providers to submit tax forms in a real-time, transactional approach (this differed from the batch submissions required in a previous version of the EFS).

Moving Beyond Paper One Form At A Time

Evolving beyond 100 years of paper processes doesn't happen overnight. Even with the deployment of the latest Internet technologies, you have to incrementally bridge the legacy paper processes to a new online, digital world. After the deployment of MeF, the IRS worked year by year to add the myriad of IRS forms to the e-File web service, allowing software companies, tax preparers, and corporations to digitally submit forms into IRS systems over the Internet. Form by form, the IRS was transformed from a physical document organization into a distributed network of partners that could submit digital forms through a secure, online web service.

Technological Building Blocks

The IRS MeF solution represents a new approach to using modern technology by the federal government in the 21st century Internet age. In the last 15 years, a new breed of Internet enabled software standards have emerged that enable the government to partner with the private sector, as well as other government agencies, in ways that were unimaginable just a decade ago.

Web Services

Websites and applications are meant for humans. Web services, also known as APIs, are meant for other computers and applications. Web services have allowed the IRS to open up the submission of forms and data into central IRS systems, while also transmitting data back to trusted partners regarding errors and the status of form submissions. Web services allow the IRS to stick with what it does best--receiving, filing, and auditing tax filings--while trusted partners use web services to deliver e-Filing services to customers via custom developed software applications.

Web services are designed to utilize existing Internet infrastructure used for everyday web operations as a channel for delivering trusted services to consumers around the country, via the web.

An XML Driven Communication Flow

XML is a way to describe each element of IRS forms and their supporting data. XML makes paper forms machine readable, so that IRS and 3rd party systems can communicate using a common language. It allows the IRS to share a common set of logic around each form, then use what are known as schemas to validate the XML submitted by trusted partners against a set of established business rules that enforce the IRS code. XML gives the IRS the ability to communicate with 3rd party systems using digital forms, applying business rules to reject or accept the submitted forms, which then can be stored in an official IRS repository in a way that can be viewed and audited by IRS employees (using stylesheets which make the XML easily readable by humans).
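To make the schema validation step concrete, here is a minimal sketch using Python and the lxml library; the schema and submission file names are placeholders:

```python
from lxml import etree

# A sketch of the schema-validation step described above, using lxml.
# "form_schema.xsd" and "submitted_return.xml" are placeholder file names.
schema = etree.XMLSchema(etree.parse("form_schema.xsd"))
submission = etree.parse("submitted_return.xml")

if schema.validate(submission):
    print("Accepted: submission conforms to the form's business rules")
else:
    # Each violation maps to a rejection the partner can surface to the filer.
    for error in schema.error_log:
        print(f"Rejected: line {error.line}: {error.message}")
```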

Identity and Access Management (IAM)

When you expose web services publicly over the Internet, secure authentication is essential. The IRS MeF system is a model for securing the electronic transmission of data between the government and 3rd party systems. The IRS has employed a design of the Internet Filing Application (IFA) and Application to Application (A2A) which are features of the Web Services-Interoperability (WS-I) security standards. Security of the MeF system is overseen by the IRS MITS Cyber Security organization which ensures all IRS systems receive, process, and store tax return data in a secure manner. MeF security involves an OMB mandated Certification and Accreditation (C&A) Process, requiring a formal review and testing of security safeguards to determine whether the system is adequately secured.

Business Building Blocks

To properly extend e-File web services to partners isn't just a matter of technology. There are numerous building blocks required that are more business than technical, ensuring a healthy ecosystem of web service partners. With a sensible strategy, web services need to be translated from tech to business, allowing partners to properly translate IRS MeF into e-filing products that will deliver required services to consumers.

Four Separate e-Filing Options

MeF provided the IRS with a way to share the burden of filing taxes with a wide variety of trusted partners, software developers and corporations who have their own software systems. However MeF is just one tool in a suite of e-File tools. These include Free File software that any individual can use to submit their own taxes, as well as free fillable digital forms that individuals can use if they do not wish to employ a software solution.

Even with these simple options, the greatest opportunity for individuals and companies is to use commercial tax software that walks one through what can be a complex process, or to depend on a paid tax preparer who employs their own commercial version of tax software. The programmatic web service version of e-file is just one option, but it is the heart of an entire toolkit of software that anyone can put to use.

Delivering Beyond Technology

The latest evolution of the e-file platform has technology at heart, but it delivers much more than just the transmission of digital forms from 3rd party providers, in ways that also make good business sense:

  • Faster Filing Acknowledgements - Transmissions are processed upon receipt and acknowledgements are returned in near real-time, unlike the once or twice daily system processing cycles in earlier versions
  • Integrated Payment Option - Tax-payers can e-file a balance due return and, at the same time, authorize an electronic funds withdrawal from their bank accounts, with payments being subject to limitations of the Federal Tax Deposit rules
  • Brand Trust - Allowing MeF to evolve beyond just the IRS brand, letting new trusted commercial brands step up and deliver value to consumers, like TurboTax and TaxAct.

Without improved filing results for providers and customers, easier payment options and an overall set of expectations and trust, MeF would not reach the levels of e-Filing rates mandated by Congress. Technology might be the underpinning of e-File, but improved service delivery is the thing that will seal the deal with both providers and consumers.

Multiple Options for Provider Involvement

Much like the multiple options available for tax filers, the IRS has established tiers of involvement for partners in the e-File ecosystem. Depending on the model and capabilities, e-File providers can step up and participate in multiple ways:

  • Electronic Return Originators (EROs) - EROs prepare returns for clients, or collect returns from taxpayers who have prepared their own, then initiate the electronic transmission of returns to the IRS
  • Intermediate Service Providers - These providers process tax return data that originates from an ERO or an individual taxpayer, and forward it to a transmitter
  • Transmitters - Transmitters are authorized to send tax return data directly to the IRS, from custom software that connects directly with IRS computers
  • Online Providers - Online providers are a type of transmitter that sends returns filed from home by taxpayers using tax preparation software to file common forms
  • Software Developers - Software developers write the e-file software programs that follow IRS specifications for e-file
  • Reporting Agents - An accounting service, franchiser, bank, or other person authorized to e-file Form 940/941 for a taxpayer

The IRS has identified the multiple ways it needed help from an existing, evolving base of companies and organizations. The IRS has been able to design its partner framework to best serve its mission, while also delivering the best value to consumers, in a way that also recognizes the incentives needed to solicit participation from the private sector and ensure efforts are commercially viable.

Software Approval Process

The IRS requires all tax preparation software used for preparing electronic returns to pass the requirements for Modernized e-File Assurance Testing (ATS). As part of the process, software vendors notify the IRS via the e-help Desk that they plan to commence testing, then provide a list of all forms they plan to include in their tax preparation software; the IRS does not require that vendors support all forms. MeF integrators are allowed to develop their tax preparation software based on the needs of their clients, while using pre-defined test scenarios to create test returns that are formatted in the specified XML format. Software integrators then transmit the XML formatted test tax returns to the IRS, where an e-help Desk assister checks data entry fields on the submitted return. When the IRS determines the software correctly performs all required functions, the software is approved for electronic filing. Only then are software vendors allowed to publicly market their tax preparation software as approved for electronic filing--whether for use by corporations, tax professionals, or individual users.

State Participation

Another significant part of the MeF partnership equation is providing seamless interaction with the electronic filing of both federal and state income tax returns at the same time. MeF provides the ability for partners to submit both federal and state tax returns in the same "taxpayer envelope", allowing the IRS to function as an "electronic post office" for participating state revenue services -- certainly better meeting the demands of the taxpaying citizen. The IRS model provides an important aspect of a public / private sector partnership with the inclusion of state participation. Without state level participation, any federal platform will be limited in adoption and severely fragmented in integration.

Resources

To nurture an ecosystem of partners, it takes a wealth of resources. Providing technical how-to guides, templates, and other resources for MeF providers is essential to the success of the platform. Without proper support, MeF developers and companies are unable to keep up with the complexities and changes of the system. The IRS has provided the resources needed for each step of the e-Filing process, from on-boarding, to understanding the addition of the latest forms, and changes to the tax code.

Market Research Data

Transparency of the MeF platform goes beyond individual platform operations, and the IRS acknowledges this important aspect of building an ecosystem of web service partners. The IRS provides valuable e-File market research data to partners by making available e-file demographic data and related research and surveys. This important data provides valuable insight for MeF partners to use in their own decision making process, but also provides the necessary information partners need to educate their own consumers as well as the general public about the value the e-File process delivers. Market research is not just something the IRS needs for its own purposes; this research needs to be disseminated and shared downstream providing the right amount of transparency that will ensure healthy ecosystem operations.

Political Building Blocks

Beyond the technology and business of the MeF web services platform, there are plenty of political activities that make sure everything operates as intended. The politics of web service operations can be as simple as communicating properly with partners and providing transparency, or all the way up to security, proper governance of web services, and enforcement of federal laws.

Status

The submission of over 230 million tax filings annually requires a significant amount of architecture and connectivity. The IRS provides real-time status of the MeF platform for the public and partners, as they work to support their own clients. Real-time status updates of system availability keeps partners and providers in tune with the availability of the overall system, allowing them to adjust availability with the reality of supporting such a large operation. Status of availability is an essential aspect of MeF operations and overall partner ecosystem harmony.

Updates

An extension of MeF platform status is the ability to keep MeF integrators up-to-date on everything to do with ongoing operations. This includes providing alerts when the platform needs to inform partners about specific changes to tax law, resource additions, or other relevant operational news. The IRS also provides updates via an e-newsletter, providing a more asynchronous way for the IRS MeF platform to keep partners informed about ongoing operations.

Updates over the optimal partner channels are an essential addition to real-time status and other resources that are available to platform partners.

Roadmap

In addition to resources, status, and regular updates on the overall MeF system, the IRS provides insight into where the platform is going next, keeping providers apprised of what is next for the e-File program. Establishing and maintaining the trust of MeF partners in the private sector is constant work, and requires a certain amount of transparency--allowing partners to anticipate what is next and make adjustments on their end of operations. Without insight into what is happening in the near and long term future, trust with partners will erode, and overall belief in the MeF system will be disrupted, unraveling over 30 years of hard work.

Governance

The Modernized e-File (MeF) programs go through several stages of review and testing before they are used to process live returns. When new requirements and functionality are added to the system, testing is performed by IRS's software developers and by IRS's independent testing organization. These important activities ensure that the electronic return data can be received and accurately processed by MeF systems. Every time an IRS tax form is changed and affects the XML schema, the entire development and testing processes are repeated to ensure quality and proper governance.

Security

Secure transmissions by 3rd parties with the MeF platform are handled by the Internet Filing Application (IFA) and Application to Application (A2A), which are part of the IRS Modernized System Infrastructure, providing access to trusted partners through the Registered User Portal (RUP). Transmitters using IFA are required to use their designated e-Services user name and password in order to log into the RUP. Each transmitter also establishes an Electronic Transmitter Identification Number (ETIN) prior to transmitting returns. Once the transmitter successfully logs into the RUP, a Secure Socket Layer (SSL) Handshake Protocol allows the RUP and transmitter to authenticate each other, and negotiate an encryption algorithm, including cryptographic keys, before any return data is transmitted. The transmitter and the RUP negotiate a secret encryption key for encrypted communication between the transmitter and the MeF system. As part of this exchange, MeF will only accommodate one type of user credential for authentication and validation of A2A transmitters: username and X.509 digital security certificate. Users must have a valid X.509 digital security certificate obtained from an IRS authorized Certificate Authority (CA), such as VeriSign or IdenTrust, then have their certificates stored in the IRS directory using an Automated Enrollment process.
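From the partner's side of the wire, a transmission secured with an X.509 client certificate might look something like the sketch below; the URL and file paths are placeholders, not actual IRS endpoints:

```python
import requests

# A sketch of what an A2A-style transmission looks like from the partner's
# side: TLS plus an X.509 client certificate. The URL and file paths are
# placeholders, not actual IRS endpoints.
response = requests.post(
    "https://mef.example.gov/a2a/transmit",
    data=open("return_envelope.xml", "rb"),
    headers={"Content-Type": "text/xml"},
    # Client certificate and private key issued by an authorized CA
    cert=("transmitter_cert.pem", "transmitter_key.pem"),
)
print(response.status_code)
```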

The entire platform is accredited by the Executive Level Business Owner, who is responsible for the operation of the MeF system, with guidance provided by the National Institute of Standards and Technology (NIST). The IRS MITS Cyber Security organization and the business system owner are jointly responsible and actively involved in completing the IRS C&A Process for MeF, ensuring complete security of all transmissions with MeF over the public Internet.

A Blueprint For Public & Private Sector Partnerships In A 21st Century Digital Economy

The IRS MeF platform provides a technological blueprint that other federal agencies can look to when exposing valuable data and resources to other agencies as well as the private sector. Web services, XML, and proper authentication can open up access and interactions between trusted partners and the public in ways that were never possible prior to the Internet age.

While this web services approach is unique within the federal government, it is a common way to conduct business operations in the private sector--something widely known as Service Oriented Architecture (SOA), an approach that is central to a healthy enterprise architecture. A service oriented approach allows organizations to decouple resources and data, and open up very wide or granular levels of access to trusted partners. The SOA approach makes it possible to submit forms, data, and other digital assets to government, using XML as a way to communicate and validate information in a way that supports proper business rules, wider governance, and federal law.

SOA provides three essential ingredients for public and private sector partnership:

  • Technology - Secure usage of modern approaches to using compute, storage and Internet networking technology in a distributed manner
  • Business - Adherence to government lines of business, while also acknowledging the business needs and interest of 3rd party private sector partners
  • Politics - A flexible understanding and execution of activities involved in establishing a distributed ecosystem of partners, and maintaining an overall healthy balance of operation

The IRS MeF platform employs this balance at a scale that is currently unmatched in the federal government. MeF provides a working blueprint that can be applied across the federal government, in areas ranging from the veterans claims process to the financial regulatory process.

The United States federal government faces numerous budgetary challenges and must find new ways to share the load with other federal and state agencies as well as the private sector. A SOA approach like MeF allows the federal government to better interact with existing and future contractors in a way that provides better governance, while also allowing for partnerships with the private sector that go beyond simple contracting. The IRS MeF platform encourages federal investment in a self-service platform that enables trusted and proven private sector partners to access IRS resources in predefined ways--all of which support the IRS mission, while providing enough incentive that 3rd party companies will invest their own money and time into building software solutions that can be fairly sold to US citizens.

When an agency builds an SOA platform, it is planting the seeds for a new type of public / private partnership whereby government and companies can work together to deliver software solutions that meet a federal agency's mission and the market needs of companies. This also delivers value and critical services to US citizens, all the while reducing the size of government operations, increasing efficiencies, and saving the government and taxpayers money.

The IRS MeF platform represents 27 years of the laying of a digital foundation, building the trust of companies and individual citizens, and properly adjusting the agency's strategy to work with private sector partners. It has done so by employing the best of breed enterprise practices from the private sector. MeF is a blueprint that cannot be ignored and deserves more study, modeling, and evangelism across the federal government. This could greatly help other agencies understand how they too can employ an SOA strategy, one that will help them better serve their constituents.

You Can View, Edit, Contribute Feedback To This Research On Github


The Right Partnership for the API Strategy & Practice Conference

The API Strategy & Practice Conference is kicking off in 12 days in New York City. The event will feature keynotes, sessions, and panels from over 60 leading individuals across all sectors of the API industry.

For a while now, I’ve wanted an API industry event that was about ideas, practice, and strategy, not just about companies, their products, and clients. This is something I could never achieve on my own; I just don’t have the money or all the connections to bring everyone together.

To make API Strategy & Practice happen, it took the right partnership. That partnership is with API service provider 3Scale. The 3Scale team views the industry much like I do: by openly educating people about the benefits of APIs, and discussing the technical, business, and political challenges we face in the industry, we can truly make change across all business sectors, allowing all of us to be successful in whatever products and services we provide.

Through the regular discussions I have with the 3Scale team, we discovered our shared desire to have such an API industry event. We discussed what the event would look like, then quickly started figuring out what it would take to make it happen. We felt we both had the resources necessary to pull it off, and even though there was a lot of risk, it was something that had to happen, for the benefit of the industry.

Here we are, 8 weeks later, and the event is a reality. I just wanted to say thanks to 3Scale for not just sharing my vision of the API industry, but also partnering with me to make this event a reality. I hope all my readers can make it to the event and participate in the discussions about APIs, platforms, mobile, and how we can all work to solve the biggest challenges in the space.  This discussion will allow all of us to make significant change across all business sectors and at all levels of government using APIs.

Make sure you are there November 1st and 2nd in New York City for the API Strategy & Practice Conference, so your voice can be part of the solution.

Thanks 3Scale!


Landscape Analysis and Monitoring as Part of Your API Evangelism

One very important aspect of API Evangelism is landscape analysis. When you launch an API, you need to have an intimate understanding of the landscape where you will be evangelizing your API and engaging with potential consumers.

I always start by establishing potential 10K foot target areas that I will focus on in my research. While these may vary from industry to industry, my starting list is:

  • Individuals - Who are the key individual influencers in your target landscape
  • Competitors - Know your competitors; you can learn a lot from what they do right or wrong
  • Platforms - There is an endless number of platforms that may benefit your company, such as Drupal, Wordpress and Salesforce
  • Media - Who are the relevant media outlets and blogs you should be following and engaging with
  • Partners - Existing partners who you should be in tune with on the landscape

After tackling these areas and identifying who the players are, you will identify other areas that are unique to your API evangelism efforts, so don't stress over whether you have the right areas in the first week. While researching these areas of my landscape I try to identify specific channels for getting plugged in with landscape targets:

  • Website URL - What is the target’s root URL for their website
  • Blog URL - What is the target’s blog URL
  • Blog RSS - Secondary to the blog, what is the RSS feed, so I can programmatically monitor it (see the sketch after this list)
  • Twitter - What is a company’s or individual’s Twitter handle
  • LinkedIn - Identify and sometimes connect with a target’s LinkedIn account
  • Facebook - Identify and sometimes connect with a target’s Facebook account
  • Github - Identify, follow and engage with a company’s Github profile and appropriate repositories
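Once the Blog RSS channels are gathered, the programmatic monitoring mentioned above can be as simple as the sketch below, using the feedparser library; the feed URLs are placeholders:

```python
import feedparser  # pip install feedparser

# A minimal sketch of programmatic landscape monitoring; the feed URLs
# are placeholders for the Blog RSS channels gathered above.
TARGETS = {
    "competitor-blog": "https://competitor.example.com/feed.xml",
    "industry-media": "https://apinews.example.com/rss",
}

def poll_targets():
    """Pull the latest entries from each target's RSS feed."""
    for name, url in TARGETS.items():
        feed = feedparser.parse(url)
        for entry in feed.entries[:5]:  # most recent five items per target
            print(f"[{name}] {entry.title} -- {entry.link}")

# Run this on a schedule (cron, etc.) to keep the landscape in view daily.
poll_targets()
```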

After getting the primary channels for each target on my landscape identified, I will sometimes go deeper and identify key individuals associated with target companies, either the founders or key employees. I look for Twitter, LinkedIn, Facebook, and Github accounts for individuals as well, and sometimes even personal blogs when available.

While landscape analysis is an initial project during the first couple weeks of designing your API evangelism strategy, it will be an ongoing initiative: monitoring all target channels in real-time, while also identifying new target groups, companies, and individuals, or even specific channels not listed above, like Tumblr or Google+, each week.

Each industry will be different when identifying the landscape for your API evangelism efforts, but this is a good first round blueprint and has worked well for me on multiple projects. Let me know if there are any landscape analysis and monitoring tricks you use.


If you think there is a link I should have listed here feel free to tweet it at me, or submit as a Github issue. Even though I do this full time, I'm still a one person show, and I miss quite a bit, and depend on my network to help me know what is going on.