Testing Methods of `test::nginx`: Configuration, Sending Requests, and Handling Responses

API7.ai

November 18, 2022

OpenResty (NGINX + Lua)

In the last article, we had our first glimpse of test::nginx and ran the simplest example. However, in real open-source projects, the test cases written in test::nginx are far more complex and harder to master than that sample code; otherwise, it wouldn't be called a roadblock.

In this article, I will take you through the most frequently used primitives and test methods in test::nginx, so that you can understand most of the test case sets in OpenResty projects and be able to write more realistic test cases. Even if you haven't contributed code to OpenResty yet, getting familiar with its testing framework will give you plenty of inspiration for designing and writing test cases in your own work.

Essentially, test::nginx generates an nginx.conf and starts an NGINX process based on each test case's configuration. It then simulates a client sending a request with the specified request body and headers. Next, the Lua code in the test case processes the request and produces a response. At this point, test::nginx parses the key information, such as the response body, response headers, and error log, and compares it against the expectations defined in the test case. If there is a discrepancy, the test fails with an error; otherwise, it passes.

test::nginx provides a lot of DSL (domain-specific language) primitives. I have grouped the common ones into four categories: configuring NGINX, sending requests, handling responses, and checking logs. This 20% of the functionality covers 80% of the application scenarios, so we must have a firm grasp of it. The more advanced primitives and usages will be introduced in the next article.

NGINX Configuration

Let's first look at the NGINX configuration. The test::nginx primitives whose names contain the keyword config are related to NGINX configuration, such as config, stream_config, and http_config.

Their functions are the same: inserting the specified NGINX configuration into different NGINX contexts. The configuration can consist of NGINX directives or Lua code wrapped in content_by_lua_block.

When doing unit tests, config is the most commonly used primitive: inside it, we load Lua libraries and call functions for white-box testing. Here is a test snippet that cannot run on its own. It comes from a real open-source project, so if you are interested, you can follow the link to see the complete test, or try to run it locally.

=== TEST 1: sanity
--- config
    location /t {
        content_by_lua_block {
            local plugin = require("apisix.plugins.key-auth")
            local ok, err = plugin.check_schema({key = 'test-key'})
            if not ok then
                ngx.say(err)
            end
            ngx.say("done")
        }
    }

The purpose of this test case is to check whether the check_schema function in the plugins.key-auth module works properly. It uses the content_by_lua_block NGINX directive in location /t to require the module under test and to call the function to be checked directly.

This is a common means of white-box testing in test::nginx. However, this configuration alone is not enough to complete the test, so let's move on and see how to send a client request.

Sending Requests

Simulating a client sending a request involves quite a few details, so let's start with the simplest one - sending a single request.

request

Continuing with the above test case: if we want the unit test code to run, we have to send an HTTP request to the /t location specified in config, as shown in the following test code:

--- request
GET /t

This code uses the request primitive to send a GET request to /t. Here, we don't specify the IP address, domain name, or port, nor whether the request uses HTTP 1.0 or HTTP 1.1. All these details are hidden by test::nginx, so we don't have to care about them. This is one of the benefits of a DSL: we only need to focus on the business logic without being distracted by the details.

At the same time, some flexibility is retained. For example, the default protocol is HTTP 1.1; if we want to test HTTP 1.0 instead, we can specify it explicitly:

--- request
GET /t  HTTP/1.0

Besides the GET method, POST requests are also supported. In the following example, we POST the string hello world to the specified address.

--- request
POST /t  
hello world

Here again, test::nginx automatically calculates the request body length and adds the Host and Connection request headers for you, to ensure that a well-formed request is sent.
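If you want to see what the server side actually receives, here is a minimal sketch, with a made-up /t location and Lua handler, that reads the POSTed body back and asserts on it:

=== TEST 1: echo the POSTed body
--- config
    location /t {
        content_by_lua_block {
            -- read the request body sent by the test and echo it back
            ngx.req.read_body()
            ngx.say(ngx.req.get_body_data())
        }
    }
--- request
POST /t
hello world
--- response_body
hello world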

Of course, we can add comments to make the test more readable: lines starting with # are treated as comments.

--- request
# post request
POST /t  
hello world

request also supports a more complex and flexible mode, which uses eval as a filter to embed Perl code directly, since test::nginx itself is written in Perl. If the DSL does not meet your needs, eval is the "ultimate weapon" that lets you execute Perl code directly.

For the usage of eval, let's look at a few simple examples here, and we'll continue with other more complex ones in the next article.

--- request eval
"POST /t
hello\x00\x01\x02
world\x03\x04\xff"

In the first example, we use eval to specify non-printable characters, which is one of its common uses. The content between the double quotes is treated as a Perl string and then passed to request as its argument.

Here is a more interesting example:

--- request eval
"POST /t\n" . "a" x 1024

To understand this example, we need to know a bit about strings in Perl, so let me briefly mention two points:

  • In Perl, a dot represents string concatenation. Isn't that somewhat similar to Lua's two dots?
  • A lowercase x indicates how many times a string is repeated. For example, "a" x 1024 above means that the character a is repeated 1024 times.

So, the second example sends a POST request to the /t address whose body consists of 1024 a characters.
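As a hedged sketch of how such an eval-generated payload could be verified, the following hypothetical test case reports the body length on the server side and expects 1024 (the /t location and its handler are made up for illustration):

=== TEST 1: the Perl-generated body is 1024 bytes long
--- config
    location /t {
        content_by_lua_block {
            -- report the length of the request body built by the Perl expression
            ngx.req.read_body()
            ngx.say(#ngx.req.get_body_data())
        }
    }
--- request eval
"POST /t\n" . "a" x 1024
--- response_body
1024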

pipelined_requests

After understanding how to send a single request, let's look at how to send multiple requests. In test::nginx, we can use the pipelined_requests primitive to send multiple requests in sequence within the same keep-alive connection:

--- pipelined_requests eval
["GET /hello", "GET /world", "GET /foo", "GET /bar"]

This example accesses four APIs sequentially over the same connection. Doing so has two advantages:

  • The first is that a lot of repetitive test code can be eliminated, and the four test cases can be compressed into one.
  • The second, and more important, is that pipelined requests let us detect whether the code logic misbehaves when it is accessed multiple times.

You may wonder: if I write multiple test cases in sequence, the code will also be executed multiple times during the run, so doesn't that already cover the second point above?

It comes down to the execution mode of test::nginx, which works differently from what you might expect. After each test case, test::nginx shuts down the current NGINX process, and all data in memory disappears. When the next test case runs, nginx.conf is regenerated and a new NGINX worker is started. This mechanism guarantees that test cases do not affect each other.

So, when we want to test multiple requests, we need to use the pipelined_requests primitive. Based on it, we can simulate rate-limiting, concurrency-limiting, and many other scenarios to test whether your system works properly with more realistic and complex scenarios. We'll leave this for the next article as well, as it will involve multiple commands and primitives.
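To make the difference concrete, here is a minimal sketch showing that state kept in the worker persists across pipelined requests within one test case, while the restart between test cases would reset it; the lua_shared_dict name hits and the /t location are made up for illustration:

=== TEST 1: state persists across pipelined requests
--- http_config
    lua_shared_dict hits 1m;
--- config
    location /t {
        content_by_lua_block {
            -- count how many times this location has been hit in this NGINX run
            local dict = ngx.shared.hits
            local n = (dict:get("count") or 0) + 1
            dict:set("count", n)
            ngx.say("hits: ", n)
        }
    }
--- pipelined_requests eval
["GET /t", "GET /t", "GET /t"]
--- response_body eval
["hits: 1\n", "hits: 2\n", "hits: 3\n"]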

repeat_each

We just mentioned the case of testing multiple requests, so how should we execute the same test multiple times?

For this, test::nginx provides a global setting: repeat_each. It is a Perl function that defaults to repeat_each(1), meaning each test case runs only once, which is why we didn't bother to set it in the previous test cases.

We call it before the run_tests() function; for example, we can change the argument to 2:

repeat_each(2);
run_tests();

Then, each test case is run twice, and so on.
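For context, here is a minimal sketch of where these calls usually sit in a .t file. The no_plan import and the sample block are only illustrative; many real projects compute an explicit plan instead, for example plan tests => repeat_each() * blocks() * N, where N matches the number of checks each block performs.

use Test::Nginx::Socket::Lua 'no_plan';

repeat_each(2);    # every test block below now runs twice
run_tests();

__DATA__

=== TEST 1: sanity
--- config
    location /t {
        content_by_lua_block {
            ngx.say("hello")
        }
    }
--- request
GET /t
--- response_body
hello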

more_headers

After covering the request body, let's look at the request headers. As mentioned above, test::nginx sends requests with the Host and Connection headers by default. What about the other request headers?

more_headers is specifically designed to do just that.

--- more_headers
X-Foo: blah

We can use it to set various custom headers. If we want to set more than one header, then set more than one line:

--- more_headers
X-Foo: 3
User-Agent: openresty
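Here is a minimal sketch, with a made-up /t location, showing how a header injected by more_headers becomes visible to the Lua code through ngx.req.get_headers():

=== TEST 1: read a custom request header
--- config
    location /t {
        content_by_lua_block {
            -- echo back the X-Foo header injected by more_headers
            ngx.say("x-foo: ", ngx.req.get_headers()["x-foo"])
        }
    }
--- request
GET /t
--- more_headers
X-Foo: blah
--- response_body
x-foo: blah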

Handling Responses

After sending the request, the most important part of test::nginx is handling the response, where we determine whether the response meets our expectations. We'll cover this in four parts: the response body, the response headers, the response status code, and the error log.

response_body

The counterpart of the request primitive is response_body; here is an example of the two used together:

=== TEST 1: sanity
--- config
    location /t {
        content_by_lua_block {
            ngx.say("hello")
        }
    }
--- request
GET /t
--- response_body
hello

This test case passes if the response body is hello and reports an error otherwise. But how do we test a long response body? Don't worry, test::nginx has already taken care of that: it supports matching the response body with a regular expression, like the following:

--- response_body_like
^he\w+$

This allows you to be very flexible with the response body. In addition, test::nginx also supports unlike operations:

--- response_body_unlike
^he\w+$

At this point, if the response body is hello, the test will not pass.

Along the same lines, after understanding how to check a single request, let's look at checking multiple requests. Here is an example used together with pipelined_requests:

--- pipelined_requests eval
["GET /hello", "GET /world", "GET /foo", "GET /bar"]
--- response_body eval
["hello", "world", "oo", "bar"]

Of course, the important thing to note here is that you need as many expected response bodies as the requests you send.

response_headers

Second, let's talk about response headers. They are similar to request headers: each line corresponds to one header's key and value.

--- response_headers
X-RateLimit-Limit: 2
X-RateLimit-Remaining: 1

Like the detection of the response body, response headers also support regular expressions and unlike operations, such as response_headers_like, raw_response_headers_like, and raw_response_headers_unlike.
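Here is a minimal sketch of a complete test case that sets the two headers above from Lua with ngx.header and then checks them; the /t location and the header values are made up for illustration:

=== TEST 1: set and check response headers
--- config
    location /t {
        content_by_lua_block {
            -- set two custom response headers before sending the body
            ngx.header["X-RateLimit-Limit"] = 2
            ngx.header["X-RateLimit-Remaining"] = 1
            ngx.say("ok")
        }
    }
--- request
GET /t
--- response_headers
X-RateLimit-Limit: 2
X-RateLimit-Remaining: 1
--- response_body
ok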

error_code

The third item is the response status code. Checking the status code supports both direct comparison and like operations, as the following two examples show:

--- error_code: 302
--- error_code_like: ^(?:500)?$
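As a minimal sketch of the direct-comparison case, a handler that issues a redirect can be asserted with error_code; the /t location and the redirect target /foo are made up for illustration:

=== TEST 1: redirect returns 302
--- config
    location /t {
        content_by_lua_block {
            -- ngx.redirect responds with 302 by default
            return ngx.redirect("/foo")
        }
    }
--- request
GET /t
--- error_code: 302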

When there are multiple requests, error_code needs to be checked once per request:

--- pipelined_requests eval
["GET /hello", "GET /hello", "GET /hello", "GET /hello"]
--- error_code eval
[200, 200, 503, 503]

error_log

The last test item is the error log. In most test cases, no error log should be generated, and we can use no_error_log to verify that:

--- no_error_log
[error]

In the above example, if the string [error] appears in NGINX's error.log, the test fails. This is a very commonly used feature, and it is recommended that you add an error-log check to all your normal tests.
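For example, a minimal sketch of a normal test with the error-log check appended might look like this (the /t location is made up for illustration):

=== TEST 1: normal test with error-log check
--- config
    location /t {
        content_by_lua_block {
            ngx.say("hello")
        }
    }
--- request
GET /t
--- response_body
hello
--- no_error_log
[error]

Conversely, sometimes we expect a specific message to appear in the error log, and that is what the error_log primitive checks: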

--- error_log
hello world

The above configuration checks for the presence of hello world in error.log. Of course, you can use eval to embed Perl code and match with a regular expression, like the following:

--- error_log eval
qr/\[notice\] .*?  \d+ hello world/
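Putting it together, here is a minimal sketch of a test whose handler writes the log line it then asserts on; the /t location is made up for illustration:

=== TEST 1: assert on a log line written by the handler
--- config
    location /t {
        content_by_lua_block {
            -- write an error-level log entry, then respond normally
            ngx.log(ngx.ERR, "hello world")
            ngx.say("done")
        }
    }
--- request
GET /t
--- response_body
done
--- error_log
hello world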

Summary

Today, we learned how to send requests and verify responses in test::nginx, covering the request body, request headers, response status code, and error log. By combining these primitives, we can implement a complete set of test cases.

Finally, here is a question to think about: what are the advantages and disadvantages of an abstract DSL like test::nginx? Feel free to leave a comment and discuss with me, and you are also welcome to share this article so more people can join the discussion.