Kimono Labs Acquisition Leaves Devs Hanging

This Monday, scraping-as-a-service provider Kimono Labs announced that it is being acquired by Palantir Technologies and will be shuttering its service.

According to a message posted on the Kimono Labs website, the Kimono team will be joining Palantir, a heavily-funded private software and services firm best known for its work with clients in the intelligence world and financial services industry.

"When we started two years ago, we had a clear mission to help people everywhere structure and extract value from data," the company explained. "Since then, we've realized that continuing to work in isolation on a general data collection tool simply won’t allow us to make the impact we want. When we met the team at Palantir, we were instantly excited by the potential — the incredible talent and access to the world's most important data problems – even if it meant no longer working on the kimono product."

That's bad news for the more than 125,000 developers who were using Kimono's hosted cloud service. And it gets worse: Kimono is shutting that service down at the end of February, giving developers just two weeks to make alternative plans.

In an effort to make things right with developers, Kimono has created Kimono for Desktop, a Mac OS X and Windows app that will enable developers to do data collection on their local machines. The company will also give developers until March 31 to transfer the APIs they've created with Kimono to their local environments for use with Kimono for Desktop.

What About Best Practices?

There are few hard and fast rules for shutting down a cloud service, but it's hard to argue that Kimono's two-week notice passes the reasonableness test. As one Hacker News poster commented, "That is an abysmally small amount of time and if I was a paying customer, [I'd] be furious."

While Kimono for Desktop, which is free, appears to be a respectable olive branch, a desktop app that cannot function in a server environment is unlikely to meet the needs of Kimono's most technical users. It may also prove too complex for Kimono's least technical users, some of whom may have selected the service for its Google Sheets integration, which will also stop working at the end of the month.

What's not clear is why Kimono has to shutter its service so rapidly. Palantir has raised more than $2 billion in funding and was reportedly valued at as much as $20 billion late last year, so it ostensibly has the financial resources to support Kimono for a longer period of time even if it would prefer not to.

Alternative Solutions Abound

The good news for developers is that the thirst for data, coupled with the reality that not every company will offer an official API, has led to the development of a multitude of solutions that serve the same purpose as Kimono.

Hosted alternatives include ParseHub, Diffbot, Scrapinghub and Apifier. There are also a number of open-source solutions that developers can deploy themselves. These include Gargl, Portia, Scrapy and Apache Nutch.
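For developers weighing the open-source route, here is a minimal sketch of what a self-hosted Scrapy spider looks like. The target URL, CSS selectors and field names below are hypothetical placeholders, not taken from Kimono or any particular site, and would need to be adapted to whatever data a given Kimono API was extracting.

```python
# Minimal Scrapy spider sketch. The URL and selectors are placeholder
# assumptions -- replace them with the site and fields you actually scrape.
import scrapy


class ExampleSpider(scrapy.Spider):
    name = "example"
    start_urls = ["https://example.com/products"]  # hypothetical listing page

    def parse(self, response):
        # Yield one structured record per item found on the page.
        for item in response.css("div.product"):
            yield {
                "title": item.css("h2::text").get(),
                "price": item.css("span.price::text").get(),
            }

        # Follow a "next" link, if present, to crawl subsequent pages.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Saved as a standalone file, a spider like this can be run with `scrapy runspider example_spider.py -o output.json` on any machine or server the developer controls, which is precisely the kind of environment a desktop-only tool can't cover.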

For developers and businesses that opt to switch to another hosted service like Kimono, the Kimono shutdown is a reminder that outsourcing core functionality to a third-party cloud service is still a risky proposition. For that reason, it's no surprise that a number of the hosted providers, such as Scrapinghub, are behind the open-source solutions, giving the developers who use them slightly greater comfort that they can prepare for a Plan B scenario well before it's needed.

Patricio Robles




We ended up moving to Feedity. So far so good. Their online tool is simple to use and support is fast & friendly.


I looked at several options on the market for both content capture and browser automation, i.e. logging in to systems and inserting data, such as interacting with a CRM system. I found there are plenty of options out there that do scraping of public-facing sites. My requirement was to interact with a site - log in, go to a specific page, enter details, then grab the content and bring it back in a structured form. I went for Content Grabber. Why?

Price - one-off fee with ongoing support if you need it; lots of others charge per month and per scrape.

Support - I emailed them asking for something complex to see how they would react; within a few hours I got a file that I imported and it worked instantly. Support here is awesome; they are always very helpful and willing to get what's needed working.

API - The API allows you to execute the scrape/interaction with ease from any coding platform, bringing back data in CSV, XML, Excel, JSON, etc., or posting the data to any database such as MySQL, MS SQL or Oracle.

Internal applications - My requirement was to interact with an internal system that wasn't internet-facing and was never going to be, so 90% of what was out there was removed from the shortlist because it's internet-hosted. I installed it on a Windows server, and within 40 minutes the 'agent' (that's what the process that gets/inserts data is called) was up and running.

We now have 50+ agents running data interactions internally, making systems interact where they previously didn't and where updating records was a manual job. Other agents fetch data from sites on a schedule and store it in a database, giving us business intelligence unrivalled by anything we had previously.


ParseHub is very similar to Kimono and just as beautiful. It has a bit more of a learning curve but does a much better job extracting data. With Kimono, I found you can only get data from static sites; with ParseHub I was able to deal with all sorts of interactivity.


All are welcome with a free account, incl. tech support and extra resources at very fair prices!

We can also help you transport all your Kimono robots for a small fee!

Kind regards

Jacob Laurvigen


Executive Officer