LinkedIn Announces Open Source Rest.li API Hub

Today, the LinkedIn engineering team is announcing the open source release of its Rest.li API Hub. LinkedIn released Rest.li itself as open source just over a year ago. In that time, adoption has been swift: over half of remote calls at LinkedIn now go through Rest.li. In a nutshell, Rest.li is a framework for building scalable RESTful architectures. On the eve of these announcements, ProgrammableWeb caught up with LinkedIn engineers Karan Parikh and Joe Betz to learn more.

For starters, we asked for the thirty-second elevator pitch for the framework:

Of the many REST frameworks, Rest.li fills an important niche: applying REST at scale. Rest.li scales on two main dimensions: across large developer teams, and to large, high-performance production deployments.

When we started the project, we were unable to find any existing framework that scaled on both of these dimensions, so we built one. Rest.li is opinionated enough about how REST APIs should be built to foster strong uniformity, consistency, and REST best practices, even across large developer teams. Rest.li also includes the whole toolset of technologies used to scale modern, high-performance, distributed architectures, all packaged cleanly with the REST framework.

As mentioned, in less than a year, Rest.li has already ramped up to around half of remote calls to LinkedIn services. We asked the team if they had an explanation for the quick adoption rate:

The developer productivity gains of Rest.li became obvious early on, and this resulted in strong support from both leadership and the engineering community at LinkedIn. Here are some of the reasons why Rest.li has become popular within LinkedIn:

(1) It allows uniform modeling of our data that can be produced or consumed by online systems (like our web services) and offline systems (like our machine learning systems).

(2) It allows teams to develop clients and servers in different languages. The team provides bindings and infrastructure for JVM-based languages, but since the protocol is simply JSON over HTTP, it is easy to write non-JVM applications. Within LinkedIn, teams have written clients in Python and Node.js, and servers in Python.

(3) It promotes uniform APIs (since it is based on REST). This reduces the learning curve when communicating with a new service.
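Because the wire protocol is plain JSON over HTTP, a non-JVM client needs nothing beyond an HTTP library and a JSON parser. The sketch below shows what a hand-rolled Python client request might look like; the host name, resource name, and response body are hypothetical, and the `X-RestLi-Protocol-Version` header is part of the Rest.li wire protocol.

```python
import json
from urllib.request import Request

# Hypothetical resource URL: Rest.li exposes collection resources at
# paths like /<resourceName>/<key>. The host and resource name here
# are assumptions for illustration.
BASE_URL = "http://example.com/greetings"

def build_get_request(entity_id):
    """Build (but do not send) a plain HTTP GET for one entity.

    Aside from the X-RestLi-Protocol-Version header, this is ordinary
    JSON over HTTP, which is why non-JVM clients are easy to write.
    """
    return Request(
        url=f"{BASE_URL}/{entity_id}",
        headers={
            "Accept": "application/json",
            "X-RestLi-Protocol-Version": "2.0.0",
        },
        method="GET",
    )

req = build_get_request(42)

# A response body is just a JSON document, so any language's JSON
# parser can consume it; this sample body is made up for illustration.
sample_body = '{"id": 42, "message": "Hello, world!"}'
greeting = json.loads(sample_body)
```

A Node.js client would look much the same: build the URL from the resource name and key, set the headers, and parse the JSON response.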

Keeping up that rate of adoption seems like a tall order. However, the team believes the rate will do more than hold steady; LinkedIn expects adoption to continue increasing over time:

The adoption rate has been accelerating for the last few quarters, and we expect to finish the bulk (90%+) of the migrations by mid-year. Given that trajectory, it makes more sense to go to 100% and remove our legacy RPC frameworks from our code completely than to continue to maintain them. The last few percent will be tough; getting from almost complete to fully complete is always a challenge at our size, but we're going to make it happen. There will still be other protocols in use for data systems and queueing systems, but all remote calls previously handled by our RPC frameworks will be migrated to Rest.li.

To get a better understanding of where Rest.li fits in the current LinkedIn environment, the team gave us a little insight into its current use:

Several services at LinkedIn use Rest.li for production traffic. This includes external products such as the Home Page Feed, and internal services such as our A/B testing service and our recommendations service. Rest.li is also used heavily to build our more recent mobile applications; mobile developers find the REST APIs far easier to integrate with.

Once we understood where Rest.li is deployed today, we asked for some use case scenarios that the team thought were a good fit for adoption:

Developers and organizations interested in transitioning to a modern Java development stack seem particularly attracted to Rest.li. It fits well with the other tools and techniques that form the modern Java development stack, such as the Gradle build system, lightweight server frameworks, and non-blocking request handling.

We believe organizations building large service architectures will benefit the most dramatically from Rest.li. But any Java developer building service architectures or publishing public APIs on the web should give it a try.

Rest.li certainly seems straightforward and attractive. However, how realistic is it to migrate a large-scale environment to Rest.li? Would the result be worth the pains associated with a complete API framework migration? The team gave us a clear, straightforward set of steps for such a move:

Rest.li is built for a modern Java stack. While it supports Java versions as old as 1.6, it does require the Gradle build system and works best within a servlet container or as a standalone Netty server.
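As an illustration of the Gradle requirement, a Rest.li project's build file typically pulls in LinkedIn's Pegasus Gradle plugin, which drives the framework's code generation. This is a hypothetical sketch; the plugin coordinates shown are an assumption, and the version placeholder is left unfilled, so check the Rest.li documentation for the current values.

```groovy
// Hypothetical build.gradle sketch -- coordinates and version are
// illustrative assumptions, not authoritative.
buildscript {
  repositories { mavenCentral() }
  dependencies {
    classpath 'com.linkedin.pegasus:gradle-plugins:<version>'
  }
}

apply plugin: 'pegasus'
```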

Developers already using Gradle and building REST APIs should find the transition fairly straightforward. The basic workflow is:

(1) Define the data schemas for your RESTful resource's JSON representations using a simple schema language.

(2) Write REST resource implementations as Java classes using data bindings that are generated from the schemas defined in step #1.

(3) Write client code to call the REST resource using client bindings generated from the resource implementation defined in step #2.

(4) Transition clients from calling the old APIs they already use to the new APIs.
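For step 1, a minimal example of what a data schema might look like in Rest.li's JSON-based schema language. The record and field names here are hypothetical:

```json
{
  "type": "record",
  "name": "Greeting",
  "namespace": "com.example.greetings",
  "doc": "A simple greeting entity (hypothetical example).",
  "fields": [
    { "name": "id", "type": "long" },
    { "name": "message", "type": "string" }
  ]
}
```

From a schema like this, the framework generates the Java data bindings used in steps 2 and 3, which is how the same definition stays consistent between servers and clients.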

Due to the scale of the migrations we've done at LinkedIn, this developer workflow has been refined repeatedly.

At this point, Rest.li is mature and stable. As far as enhancements and future plans go, the team will continue building out the ecosystem. That includes partnering with other companies on the overall project, developing support for Python and JavaScript, and offering more open source tools and plugins to ease integration with other software. Those interested in learning more should check out some of the ecosystem projects currently in the works (e.g., the API Hub, SBT Plugin, and Skeleton Generator).
