How to Build a Free, Scalable API Testing Solution With Microsoft Excel

One of the most overlooked steps in designing, developing, and maintaining APIs is testing them for data consistency, response times, and availability. Automated API testing is invaluable both during the API software development lifecycle and once the APIs are out in the wild in production. API providers can take a variety of approaches to testing APIs throughout their normal lifecycle: manually write code that makes the specific API calls, run manual API executions using a REST client like Postman, or script their own tests with cURL and bash, or with Python. There is no shortage of ways to develop your own API testing framework. And more recently, the need for API testing has given rise to companies like API Science and Runscope, so API providers can also turn to a third-party solution.

One other approach is to develop your own testing framework, as I have done, with tools that already exist on every Windows desktop that has Microsoft Excel installed. Microsoft Excel?! Say what?! The big advantages of this approach are that it scales relatively easily to hundreds of tests (try to manage hundreds of tests using a Web-based solution and you will see what I mean), relies on a product that is installed on millions of computers (let's face it, Excel is everywhere, and we depend on it more than we realize or like to acknowledge), and costs nothing compared to a commercial solution.

In this article, I'm going to show you how to continually test your APIs using a framework I developed.

The framework is a data-driven, automated API testing solution that allows developers and testers to create automated tests for JSON Web Services. It uses Microsoft Excel as the front-end client to manage the automated test scripts, identify expected results, capture the actual results, log API response times, and track historical test metrics for each test cycle. The JSON Web Services Automated Testing solution is free to download and use. Key benefits:

  • Doesn't require any coding experience
  • Supports the test early and often software engineering tenet
  • Reduces the number of manual tests while increasing test accuracy and reliability
  • Promotes the creation of quality test data and reusable test scripts
  • Keeps test scripts, test results, and test metrics in one place
  • Automates QA test metrics collection and reporting
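The data-driven idea behind this approach is simple: each spreadsheet row holds a test name, a target endpoint, a request payload, a response field to check, and an expected value, and one loop executes every row. Here is a minimal sketch in Python (the endpoint URL and the call_api helper are hypothetical stand-ins, since the real tool does this with Excel and VBA):

```python
# Hypothetical stand-in for the HTTP call the tool makes; it returns a
# canned response here so the sketch runs offline.
def call_api(url, payload):
    return {"sub_total": "44.38"}

# Each dict mirrors one spreadsheet row of the test script.
test_rows = [
    {"name": "Landed Cost API - Sub Total Test",
     "url": "https://api.example.com/landed_cost",  # hypothetical endpoint
     "payload": {"merchant": "demo_merchant"},
     "field": "sub_total",
     "expected": "44.38"},
]

# One loop runs every row: call the API, pull out the field under test,
# and compare the actual value against the expected value.
results = []
for row in test_rows:
    response = call_api(row["url"], row["payload"])
    actual = response[row["field"]]
    results.append((row["name"], actual == row["expected"]))
```

Adding a test is then just adding a row of data, which is why no coding experience is required of the tester.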

Creating an Automated API Test

The following tutorial creates an automated API test that verifies a calculated subtotal value for an import duties and taxes JSON-based Web Service. The documentation for this API is available here; it identifies the API's target endpoint URL and the JSON elements required in both the Web Service's Request and Response representations.

To get started, download the JSON API Automated Testing software. Click here to download.

Next, open the software in the same manner you would open any Microsoft Excel Spreadsheet. Enable macros for this spreadsheet, if prompted.

Once the spreadsheet is open, create a test case called Landed Cost API - Sub Total Test, and then enter this value in the Test Name cell. Explanation: As with any test case in any test plan, you should label your automated tests uniquely for reference purposes.

[Screenshot: the Test Name cell]

Enter the API's target endpoint URL in the Target URL cell. Explanation: Each automated test has a testable target API endpoint, and you need to know its URL in order to test it. Your API documentation should clearly identify the target API endpoints, which will help expose the available API URLs that should be tested.

[Screenshot: the Target URL cell]

Enter POST in the HTTP VERB cell. Explanation: The two most commonly used HTTP methods in a typical HTTP Request-Response exchange are POST and GET. POST submits data to be processed by the Web Service (as in this tutorial), while GET requests data from the Web Service.

[Screenshot: the HTTP VERB cell]
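To make the POST/GET difference concrete, here is a small, self-contained Python sketch that spins up a throwaway local web service (a stand-in for the real API, which the tool reaches over the network) and exercises both verbs. Note that the GET carries no request body, while the POST delivers a JSON document in its body:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Tiny local stand-in web service that reports which verb it received.
class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET requests data from the Web Service; there is no request body.
        self._reply({"verb": "GET"})

    def do_POST(self):
        # POST submits data to be processed; the JSON document arrives
        # in the request body.
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        self._reply({"verb": "POST", "received": body})

    def _reply(self, payload):
        data = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# GET: just a URL, no body.
get_result = json.load(urlopen(f"{base}/"))

# POST: the JSON document travels in the request body.
req = Request(f"{base}/",
              data=json.dumps({"merchant": "demo_merchant"}).encode(),
              headers={"Content-Type": "application/json"})
post_result = json.load(urlopen(req))

server.shutdown()
```

In the spreadsheet, choosing POST in the HTTP VERB cell is what tells the tool to send the JSON Document cell's contents as the request body.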

Enter the following code snippet into the JSON Document (HTTP POST Only) cell. Explanation: This is the data required to satisfy this Web Service's incoming data requirements. (Additional details: This JSON data contains the user_id and password values of a test account. We have written permission to use this account and post this information publicly.)

{
  "merchant": "demo_merchant",
  "cart": [
    {
      "name": "Boys Shirt",
      "product_id": "731056",
      "price": "23.99",
      "category": "Button-Up",
      "department": "Kid’s Casual Shirt",
      "attributes": "Cotton",
      "weight": "1",
      "country_of_origin": "US"
    },
    {
      "name": "Kid’s Backpack",
      "product_id": "1171547",
      "price": "20.39",
      "quantity": "1",
      "category": "Backpacks",
      "department": "Street Gear Bags",
      "weight": "1",
      "country_of_origin": "US"
    }
  ]
}

After entering the above JSON Document code snippet, the worksheet will look like the following:

[Screenshot: the JSON Document cell]
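As a sanity check on the expected result, the subtotal this test verifies can be computed directly from the cart data. A short Python sketch (the first cart item's quantity is not shown in the article, so a missing quantity is assumed to default to 1):

```python
from decimal import Decimal

# Cart items mirroring the tutorial's JSON document (prices as strings,
# just as the API expects them).
cart = [
    {"name": "Boys Shirt", "price": "23.99"},
    {"name": "Kid's Backpack", "price": "20.39", "quantity": "1"},
]

# Subtotal = sum of price * quantity; Decimal avoids float rounding
# surprises when working with money.
expected_subtotal = sum(
    Decimal(item["price"]) * int(item.get("quantity", "1")) for item in cart
)
```

This is the value you would enter as the expected result for the Sub Total Test: 23.99 + 20.39 = 44.38.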




Comments (7)


Will this work on Mac OS X? Will it work in other applications supporting Excel files?



This platform will not currently work on Mac OS X. A Mac-compatible solution is on our product roadmap for Q2 2016. Hope this helps. -Marc


Why is the VBA source code protected with a password?   


VBA Newbie...


[Transparency: I work for SmartBear, I cover SoapUI open source and other tools]

I agree that minimalist solutions people are already familiar with are worth considering, not wrong to entertain the thought of Excel, and most people don't care if something's open source so long as it's free to them. And API testing at any level is better than none at all.

But for those who know and live the API testing space, it seems like a bit of an omission to not mention SoapUI, the open source tool that has been fulfilling this need for a decade, for free, at the top of its class.

It's great that there's finally some competition, honestly, though the new kid solutions are overly simplistic for most of the SOAP and REST APIs my community uses SoapUI to test. Real API testing happens after you unit test your endpoints. Professional testers will understand that point, so I guess the Excel solution works for...other people then.

Sorry for the sour grapes, but it seems strange for SoapUI to be omitted, especially due to behind-the-scenes competitive dynamics between API vendors in the space. I'd love to hear professional testers' opinions on this, even if it counters my points, but it's their experience that really matters.



Hi Paul,

Nice to meet you. Appreciate your comments. I am a long-time user/customer of SmartBear products. SoapUI was not intentionally omitted. This article was written for tutorial purposes, not to cover the current testing tools market. The testing solution mentioned in this article focuses on integration testing, not unit testing; JUnit and other tools are more appropriate for unit testing. I hope software testers will give it a sincere look for integration and black-box testing, as it has been a very useful solution for our APIs. Thank you for your comments.



Well, try to bespoke one to each customer to start with. Let's assume a 20k-seat call centre looking to migrate to an automation tool for a chunk of the calls, for instance. Too many people employed to perform the job of instrument matching and tool building are too analytical and not creative enough. With voice, for instance, it is too difficult to measure true sentiment utilising existing tools, so there are flawed tools people know very well, and a prejudice towards features over true accuracy. The first step should be to acknowledge the weighting that should be given to each tool in the marketplace you think is valid for the project, and that only a few should be used (and create/blend a new tool if the weighting is too low). Whether it's Big Data or live data, the tone needs to be adequately recognizable as mood, and from there fed into some phonetic/musical frequency adjudication system for every key consumer segment, then matched out to an agreed group size of that consumer demographic. Psychographic segmentation might need to be established early in the experience through the testing of various voice styles and tones as an introduction to each segment. Once a true spectrum of audience is understood, you can start cross-referencing and refining to establish an even more quantified customer type. Then the challenge is to either a) change the experience for every identified segment or b) more cheaply, simply pitch one of 100 researched preferences to the audience who match, and then and only THEN start product development and identification. Perhaps a simpler goal (starting point) would be just the agents in call centres: take a measurement of their quantitative test results, KPIs and tests, and match that to a database with the best phonetic representation of, say, 7-10 criteria. From there the data sets can commence taking on some form.
Each criteria tool is used for standard QA scoring based on its own capability and audience; then we see if the difference between the quantitative interpretation and the phonetic judgement tool(s), based upon those 7-10 agreed criteria, is closely aligned. We establish our best research team, based on industry voice and analytic instrument calibration, as a panel for show, listening in on calls to further refine and re-calibrate the slice and dice of the call, the recognisable variances in pitch at various stages, and the final mood sentiment score and cost savings against existing and live consumer mood and engagement scores.


Voice 2.0 built. Next a client who may use Voice 2.22


This is not even half my pitch. It's already changed and better...


Just spent 20 minutes on it twice and it had an error on send.


Point is, we create better tools by first measuring properly what the existing ones can do when matched with certain agents and certain styles. Look for mood, timing, outcome of call etc., and then find another 3-4 which are real and create them.


Use voice and speech experts, data analysts and a guy called Serg.


The justification of the integration tool is its ability to put this stuff together time and again. Sell the sizzle and forget the sausage.



What if I want to verify the category field, which is nested under cart?