swagger-sheet

Swagger + Excel Sheets, a wonderful way of validating REST APIs

Swagger files (aka the OpenAPI Specification) are the most popular way of documenting API specifications, and Excel sheets provide an easy and simple way of writing structured data. Anybody can write data in an Excel sheet, irrespective of their programming skills. Introducing vREST NG (an enterprise-ready application for Automated API Testing), which combines the power of both to make your API Testing experience more seamless. This approach is also known as Data Driven Testing.

Data Driven testing is an approach in which test data is written separately from the test logic or script.
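
The idea can be shown with a minimal sketch: the test data lives in a plain data structure, while a single piece of test logic runs once per row. The `create_contact` function and the row fields below are hypothetical stand-ins, not part of vREST NG.

```python
# Test data kept separate from test logic (in vREST NG this would
# live in a CSV file on disk; here it is inline for illustration).
TEST_DATA = [
    {"summary": "empty name is rejected", "name": "", "expected_status": 400},
    {"summary": "valid name is accepted", "name": "Jane", "expected_status": 201},
]

def create_contact(name):
    """Stand-in for a real POST /contacts call; returns a status code."""
    return 201 if 0 < len(name) <= 35 else 400

# The test logic is written once and iterated over every data row.
for row in TEST_DATA:
    actual = create_contact(row["name"])
    status = "PASS" if actual == row["expected_status"] else "FAIL"
    print(f"{status}: {row['summary']}")
```

Adding a new test condition then means adding a row of data, not writing more test code.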

So, this is what the process looks like:

process

vREST NG uses swagger files to generate all of the test logic and sample test data CSV files. It then reads the test data from the CSV files, iterates over the available rows, and runs the iterations one by one. In this post, we will look at the following in detail:

  1. How you may generate test cases from swagger files.
  2. How you may feed test data to those generated test cases through an Excel (CSV) sheet.

How to perform Data Driven API Testing in vREST NG

To elaborate on the process, I will take a sample test application, named Contacts, which provides CRUD APIs. I will guide you through the following steps:

  1. Setup the test application
  2. Download and Install vREST NG Application
  3. Perform Data Driven API Testing in vREST NG

1. Setup the Test Application:

You may skip this step if you want to follow the instructions with your own test application instead.

Otherwise, just download the sample test application from this repository link. It is a NodeJS-based application, tested with NodeJS v10.16.2.

To setup this application, simply follow the instructions mentioned in the README file of the repository.

2. Download and Install vREST NG Application

Now, simply download the application from the vREST NG website and install it. Installation is simple, but if you need OS-specific instructions, you may follow this guide link.

After installation, start the vREST NG Application and use vREST NG Pro version when prompted in order to proceed further.

Now, first set up a project by dragging any empty directory from your file system into the vREST NG workspace area. vREST NG will treat it as a project and store all the tests in that directory. For more information on setting up a project, please read this guide link.

For a quick start, if you don’t want to follow the whole process and just want to see the end result, you may download this project directory and add it to the vREST NG application directly.

3. Performing Data Driven API Testing in vREST NG

vREST NG provides a quick three-step process for performing data driven API Testing:

  1. Import the Swagger File
  2. Write Test Data in CSV Files
  3. Setup Environment

Now, we will see these steps in detail:

(a) Import the Swagger File

To import the Swagger file, simply click on the Importer button available in the top left corner of the vREST NG Application.

Importer

An import dialog window will open. In this dialog window:

  1. Select “Swagger” as Import Source
  2. Tick the option `Generate Data Driven Tests`. If this option is ticked, the vREST NG Importer will generate data driven test cases for each API spec available in the swagger file.
  3. Provide the swagger file. For this demonstration, I will use the swagger file from the test application repository. Download Swagger File.
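
Conceptually, the importer walks the swagger document and produces one test case per operation, grouped into suites by tag. The sketch below illustrates this idea on a made-up swagger fragment; it is not vREST NG's implementation.

```python
# Made-up swagger fragment for a hypothetical Contacts API.
swagger = {
    "paths": {
        "/contacts": {
            "get": {"tags": ["Contacts"], "summary": "List contacts"},
            "post": {"tags": ["Contacts"], "summary": "Create contact"},
        },
        "/contacts/{id}": {
            "delete": {"tags": ["Contacts"], "summary": "Delete contact"},
        },
    },
}

# One test case per (method, path) pair, one suite per tag.
suites = {}
for path, methods in swagger["paths"].items():
    for method, spec in methods.items():
        for tag in spec.get("tags", ["default"]):
            suites.setdefault(tag, []).append(f"{method.upper()} {path}")

print(suites)
```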

The dialog window will look something like this. Now, click on the Import button to proceed further.

Importer

The import process has done the following things so far:

1. It has generated a test case for each API spec available in the swagger file, and a test suite for each tag available in the swagger file.

generated-tc

2. It has automatically created a sample CSV file for each test case, with the desired columns according to your swagger file, as shown in the following image.

csv-file

We will discuss in detail how you may fill this sheet later in this post.

3. The generated CSV files are also automatically linked as shown in the following image.

linked-csv

So, before every test execution, the API test will read the data from the linked CSV file, convert it into JSON format, and store it in a variable named data. The test case then iterates over that data, running the iterations one by one. If you make a change in the CSV file, just run the test case again; it will always pick up the latest state of the CSV file. No need to import again and again.
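
The CSV-to-JSON step described above can be sketched with Python's standard library (the column names below are illustrative, matching the generated sample files):

```python
import csv
import io
import json

# Inline stand-in for a linked CSV file on disk.
csv_text = """iterationSummary,name,expectedStatusCode
empty name,,400
name too long,AVeryVeryVeryLongContactNameOver35Chars,400
"""

# Each CSV row becomes one JSON object; the full list is what the
# test case would receive in its `data` variable.
data = list(csv.DictReader(io.StringIO(csv_text)))
print(json.dumps(data, indent=2))
```

Note that every CSV cell arrives as a string, which is why a column like expectedStatusCode holds "400" rather than the number 400.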

4. It has automatically inserted some variables in the API request params as per the API definitions available in the swagger file. These variable values will be picked up from the linked CSV file automatically.

linked-csv

5. It has automatically added the response validation logic as well. The Status Code assertion validates the status code of the API response. The Text Body with Default Validator assertion compares the expected response body with the actual response body. The Text Body with Default Schema Validator assertion validates the API response against the JSON schema.
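
The three assertion types can be sketched roughly as below. This is not vREST NG's internal code, and the schema check is a deliberately tiny stand-in for a full JSON Schema validator; the response payload is made up.

```python
def assert_status_code(expected, actual):
    """Status code assertion: compare HTTP status codes."""
    assert actual == expected, f"expected status {expected}, got {actual}"

def assert_body(expected, actual):
    """Default validator: expected body must equal actual body."""
    assert actual == expected, f"body mismatch: {actual!r} != {expected!r}"

def assert_schema(schema, actual):
    """Minimal schema check: required keys must exist with the right types."""
    for key, expected_type in schema.items():
        assert key in actual, f"missing key: {key}"
        assert isinstance(actual[key], expected_type), f"wrong type for {key}"

# Hypothetical API response for an invalid Create Contact request.
response = {"status": 400, "body": {"error": "name is required", "code": 1001}}

assert_status_code(400, response["status"])
assert_body({"error": "name is required", "code": 1001}, response["body"])
assert_schema({"error": str, "code": int}, response["body"])
print("all assertions passed")
```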

The expected status code will be picked up from the linked CSV file.

linked-csv

And the expected response body will also be picked up from the linked CSV file.

response-body

And the expected schema name is also picked up from the linked CSV file.

expected-schema-name

6. It has imported all the swagger schema definitions in the Schemas section available in the Configuration tab.

configuration-tab

You may refer to these schema definitions in the Expected Schema tab, as discussed earlier. In the CSV file, you just need to specify the respective schema name for each test iteration in the expectedSchema column.

(b) Write Test Data in CSV Files

We have already seen the data file generated by the import process. Let me show you the generated file again for the Create Contact API:

csv-file

In this sample file, you may add test data for the various iterations of the Create Contact API. In the iterationSummary column, simply provide a meaningful summary for each iteration. This summary will show up in the Results tab of the vREST NG Application. You will need to fill in this test data yourself; you may even generate it through an external script.

Now, let’s add some test iterations in the linked CSV file.

iterations

With the above CSV file, we are checking two test conditions of our Create Contact API:

  1. When the name field is empty
  2. And when the name field length is greater than the limit of 35 characters.

In the above CSV file, we have intentionally left the expectedBody column blank. We don’t need to fill this column manually; we can fill its value via the vREST NG Application itself.
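
An illustrative reconstruction of those two iterations in CSV form can be generated as follows (the column names mirror the generated sample files; expectedBody is intentionally blank, to be filled in later):

```python
import csv
import io

# Two hypothetical iterations for the Create Contact API.
rows = [
    {"iterationSummary": "name is empty", "name": "",
     "expectedStatusCode": "400", "expectedBody": ""},
    {"iterationSummary": "name longer than 35 chars", "name": "A" * 36,
     "expectedStatusCode": "400", "expectedBody": ""},
]

# Write the rows out in CSV form (to a string here; to a file in practice).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```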

Before executing the test case, we need to configure the baseURL variable of your test application in the Configuration tab like this:

iterations

Now, let’s execute this test in the vREST NG Application. Both iterations fail because the expected response body doesn’t match the actual response body, as shown in the following image:

iterations

Now, click on the “Copy Actual to Expected” button for each iteration. vREST NG will directly copy the actual response body to the expectedBody column in the CSV file.

After this operation, if you look at the CSV file again, you can see that vREST NG has filled the expectedBody column for you, as shown in the following image.

summary

Note: If you have opened this CSV file in Microsoft Excel, you will need to close the file and open it again to see the changes. Some code editors, however, automatically detect changes on the file system and reflect them in real time.

Now, if you execute the test again, you can see that the tests are now passing.

summary

You may also see the expected vs actual response for the selected test iteration:

summary

And you may see the execution details of the selected iteration by going to Execution Tab:

execution-tab

So, in this way, you may add iterations in the CSV file. Just add iterations to your CSV file and run them in the vREST NG Application directly. No need to import again and again; it all just works seamlessly, which drastically increases your testing efficiency.

(c) Setup Environment

Before executing your tests, you may also need to set the initial application or DB state, so that you can perform regressions in an automated way. Some use cases for setting up the initial state are:

  1. Restoring the database state from backups
  2. Executing an external command or script
  3. Invoking a REST API to set up the initial state

In this section, let’s see how you may execute an external command before our tests run. Our sample test application is simple and built for demonstrating vREST NG; it stores all the contacts data in a JSON file. So, I already have the initial data in a JSON file, which I can copy into the test application’s project directory before executing the test cases.
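
The restore step amounts to copying a pristine data file over the application's live one. A minimal sketch, with hypothetical file names and a self-contained demo using throwaway directories:

```python
import shutil
import tempfile
from pathlib import Path

def restore_app_state(project_dir, app_dir):
    """Copy the pristine dump.json over the app's live data file."""
    src = Path(project_dir) / "dump.json"     # hypothetical backup location
    dst = Path(app_dir) / "contacts.json"     # hypothetical live data file
    shutil.copyfile(src, dst)
    return dst

# Demo: create a fake project dir with a dump.json and restore from it.
with tempfile.TemporaryDirectory() as proj, tempfile.TemporaryDirectory() as app:
    (Path(proj) / "dump.json").write_text('{"contacts": []}')
    restored = restore_app_state(proj, app)
    print(restored.read_text())
```

In vREST NG itself this is done with an external command rather than a script, as shown next.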

You may specify the command as shown in the following image:

external-command

The above command will restore the application state from the initial data already present in the dump.json file in the vREST NG project directory.

Note: You will also need to specify the cpCmd variable in the Environments section, because on Linux/macOS the copy command is named cp while on Windows it is named copy. So, for Windows, you may create another environment in the vREST NG application. Your API tests can then run on any machine by just switching the environment.

So, this is how easily you may perform data driven testing in vREST NG.

In conclusion, vREST NG offers a streamlined approach to data-driven API testing, harnessing the power of Swagger and Excel. By seamlessly integrating these tools, vREST NG enhances testing efficiency. For a robust API testing experience, explore vREST NG at Optimizory and elevate your testing capabilities effortlessly.

Have any queries?

Please send a mail to support@optimizory.com to get in touch with us.