Background
When I worked for Arity (an Allstate startup), Allstate had contracted with Pivotal Labs to facilitate its agile transformation. Pivotal's model was one of enablement - they would augment a cross-functional product team with their own cross-functional staff, instill processes and principles in the core team, then gradually peel away, leaving the core team newly capable of best (or better) agile practices.
During this time I was introduced by Pivotal engineers to consumer-driven contract testing. This is an API development methodology in which API consumers drive the development of new and existing API contracts. Here is an example workflow:
API consumer desires a new endpoint
API consumer writes a new contract test in the API codebase asserting that the new endpoint exists and that it accepts and returns the desired data
API consumer opens a pull request with the new contract test. The test immediately fails, since the endpoint isn't implemented yet
From the new and existing contract tests, stubs are generated for use in creating a mock server
API consumer uses the new stubs as a basis for new work. For example, a new view in a mobile app might require this endpoint before it exists in production.
API provider takes over the pull request with failing contract tests and implements the new endpoint. Contract tests now pass.
API provider merges changes and deploys to production.
API consumer turns on the new mobile app view.
There is immense value in this approach:
Expression of intent. The desired operation of the API is expressed in code in the form of a contract test. In the above workflow, this is usually in some contract testing-specific DSL or language, such as Groovy.
Validation of intent. The execution of contract tests validates that the API operates as intended.
Decoupling consumers and producers. With the creation of stubs, consumers are free to develop against a new endpoint before it's been implemented by the producer.
The approach I'll offer in this article is a similar one, focused on the expression and validation of intent. Where I'll deviate from the approaches supported by tools such as Spring Cloud Contract and Pact is that I suggest using an API specification (we'll use OpenAPI) as the cornerstone of contract testing instead of manually written tests.
Motivating Example
For the remainder of the article, I'll use an example API and its API spec to demonstrate how to generate and execute contract tests from that spec.
The example API we'll use is one of dinosaurs - a CRUD API allowing the storage and retrieval of dinos. I'll start with endpoints to add a new dinosaur and fetch all dinosaurs. The example includes an OpenAPI specification with these endpoints.
Then, I'll demonstrate how to generate and execute contract tests with Dredd before adding a new endpoint to fetch an individual dinosaur. In a subsequent article, I'll show how this can be incorporated into CI with GitHub Actions for continuous validation.
All code as it exists both at our starting point and at the conclusion of this article can be found on GitHub at christherama/dino-api. There are two ways to get to the starting point:
Clone or fork the above repo and check out the tag marking this article's starting point:
git clone git@github.com:christherama/dino-api.git
cd dino-api
git checkout v0.0.1
Download and extract an archive of the repo at the commit of our starting point at christherama/dino-api/archive/refs/tags/v0.0.1.zip
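If you take the archive route, something like the following should work (hypothetical commands; the name of the extracted directory may vary):
curl -LO https://github.com/christherama/dino-api/archive/refs/tags/v0.0.1.zip
unzip v0.0.1.zip
cd dino-api-*/  # move into the extracted directory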
Prerequisites
To follow along, you'll need Docker installed (every command below runs in a container), plus Git if you choose to clone the repo.
Exploring the Dino API
After you've cloned or forked the above repo, or have extracted an archive at the indicated tag, build the Docker image locally (this could take a few minutes):
docker build -t dino-api:local .
Then migrate the database:
docker run -v $(pwd)/db:/usr/app/db dino-api:local python manage.py migrate
And finally, start the app:
docker run -p 8000:8000 -v $(pwd)/db:/usr/app/db dino-api:local
With the app running, you can now visit the API documentation, which is available at http://localhost:8000/api/docs/.
You can use Postman or cURL to see that we have no dinosaurs in our system yet (output simplified):
curl -v http://localhost:8000/api/dinosaurs/
< HTTP/1.1 200 OK
< Content-Type: application/json
<
[]
To add a dino, use the POST endpoint (output simplified):
curl -v -X POST http://localhost:8000/api/dinosaurs/ -H "Content-Type: application/json" -d '{"common_name": "T-Rex", "scientific_name": "Tyrannosaurus Rex"}'
> POST /api/dinosaurs/ HTTP/1.1
> Content-Type: application/json
>
< HTTP/1.1 201 Created
< Content-Type: application/json
<
{"id":1,"common_name":"T-Rex","scientific_name":"Tyrannosaurus Rex"}
Generate Contract Tests with Dredd
We now turn to the OpenAPI specification already present in the repo, using it as a basis for generating contract tests to execute against the running API. Here are the general assumptions:
Each endpoint we wish to test is documented in the OpenAPI spec
Each endpoint includes at least one example (the Swagger guide on adding examples covers this; see the sketch after this list)
Any response that depends on previously added data is preceded in the spec by the request that adds that data. For example, the documented POST request that creates a dino should appear before the GET request that expects that dino to be present.
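As a point of reference, adding request and response examples to an OpenAPI operation looks roughly like this (a hypothetical excerpt; the actual spec at api/docs/openapi.yaml is the source of truth):
paths:
  /api/dinosaurs/:
    post:
      requestBody:
        required: true
        content:
          application/json:
            example:
              common_name: T-Rex
              scientific_name: Tyrannosaurus Rex
      responses:
        "201":
          description: Created
          content:
            application/json:
              example:
                id: 1
                common_name: T-Rex
                scientific_name: Tyrannosaurus Rex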
For generating and executing contract tests, we'll use Dredd. Before diving in, note the following:
The tests generated and executed by Dredd are ephemeral. That is, they will not be stored in any way that requires manual intervention or maintenance.
This is one of the key values of this approach - since only the API spec is used for generating contract tests, it is the only artifact requiring maintenance. There is no additional language or DSL (like Groovy or non-OpenAPI YAML as with Spring Cloud Contract). There is only the OpenAPI spec, which already integrates with a plethora of tools.
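For contrast, a standalone contract in Spring Cloud Contract's YAML flavor looks roughly like the following sketch (the exact schema varies by version) - a separate artifact you'd otherwise have to keep in sync with the API by hand:
# Hypothetical Spring Cloud Contract-style YAML contract (not used in this approach)
request:
  method: GET
  url: /api/dinosaurs/1/
response:
  status: 200
  headers:
    Content-Type: application/json
  body:
    id: 1
    common_name: T-Rex
    scientific_name: Tyrannosaurus Rex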
Running Dredd Against the Dino API
Though you can run Dredd via its npm package, we'll stick with the Docker image for portability. With the API still running, open a new terminal window or tab in the root of the repo and run the following:
docker run \
-v $(pwd)/api/docs:/api \
apiaryio/dredd \
dredd /api/openapi.yaml http://host.docker.internal:8000
Here I volume-mount the api/docs directory of our repo to the /api directory of the running dredd container. This ensures that our API spec at api/docs/openapi.yaml is available to the dredd container at /api/openapi.yaml, which I pass as the first argument to the dredd command. The second argument is the host and port of the API under test, http://host.docker.internal:8000. The hostname host.docker.internal is specific to Docker Desktop and resolves to an internal IP used by the host machine. We need this since the dredd container runs on its own network, so localhost would loop back to the container itself. The same could be accomplished by using --network host and localhost as the hostname.
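That alternative looks like the following sketch (host networking behaves differently outside Linux, so your mileage may vary):
docker run --network host \
  -v $(pwd)/api/docs:/api \
  apiaryio/dredd \
  dredd /api/openapi.yaml http://localhost:8000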
Assuming you have the API running on the same machine and you've migrated the database, either of the above commands should produce output indicating success:
docker run \
-v $(pwd)/api/docs:/api \
apiaryio/dredd \
dredd /api/openapi.yaml http://host.docker.internal:8000
pass: POST (201) /api/dinosaurs/ duration: 196ms
pass: GET (200) /api/dinosaurs/ duration: 42ms
complete: 2 passing, 0 failing, 0 errors, 0 skipped, 2 total
complete: Tests took 244ms
Now that we have passing contract tests, let's turn to adding a new endpoint.
Contract Test-Drive a New Endpoint
Now we want a new endpoint for fetching an individual dino. For this, I'll demonstrate a new workflow:
Add the endpoint to the OpenAPI spec
Generate and run contract tests (they should fail)
Implement the new endpoint
Repeat step 2; the tests should now pass.
Add the endpoint to the OpenAPI spec
To fetch an individual dino, we'll add a GET endpoint at /api/dinosaurs/{id}/, which we'll now document in api/docs/openapi.yaml. Note the example value of 1 on the id path parameter below; Dredd will use it to build the URL of the test request:
...
paths:
  ...
  /api/dinosaurs/{id}/:
    get:
      summary: Get a dino
      description: Retrieve a dinosaur by id
      parameters:
        - in: path
          name: id
          schema:
            type: integer
          required: true
          description: Unique identifier of dino
          example: 1
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  common_name:
                    type: string
                  scientific_name:
                    type: string
Generate and run contract tests
After saving the above changes, generating and running contract tests results in a failure (output simplified):
docker run \
-v $(pwd)/api/docs:/api \
apiaryio/dredd \
dredd /api/openapi.yaml http://host.docker.internal:8000
pass: POST (201) /api/dinosaurs/ duration: 129ms
pass: GET (200) /api/dinosaurs/ duration: 38ms
fail: GET (200) /api/dinosaurs/1/ duration: 87ms
info: Displaying failed tests...
fail: GET (200) /api/dinosaurs/1/ duration: 87ms
fail: headers: At '/content-type' No enum match for: "text/html; charset=utf-8"
body: Can't validate actual media type 'text/plain' against the expected media type 'application/json'.
statusCode: Expected status code '200', but got '404'.
complete: 2 passing, 1 failing, 0 errors, 0 skipped, 3 total
complete: Tests took 260ms
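You can reproduce the failing request directly with curl (output simplified):
curl -v http://localhost:8000/api/dinosaurs/1/
< HTTP/1.1 404 Not Found
< Content-Type: text/html; charset=utf-8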
Yay for failure! We are officially test-driving the API with our spec. The content-type failures in the output are an implementation detail of Django: with our current configuration, a 404 response returns HTML content when no URL pattern matches the requested path. Let's now turn to the implementation of that endpoint.
Implement the new endpoint
We'll start by adding a Django view to the bottom of api/views.py:
from rest_framework.generics import ListCreateAPIView, RetrieveAPIView

...

class DinosaurView(RetrieveAPIView):
    # Retrieves a single dinosaur by primary key (the pk URL kwarg)
    serializer_class = DinosaurSerializer
    permission_classes = []

    def get_queryset(self):
        return Dinosaur.objects.all()
Next we'll map a new URL to this view. In api/urls.py, add the following entry to urlpatterns:
urlpatterns = [
    ...
    # <int:pk> supplies the primary key that RetrieveAPIView uses to look up the dino
    path("dinosaurs/<int:pk>/", views.DinosaurView.as_view(), name="dinosaur"),
]
Rerun contract tests
Before rerunning tests, let's do the following:
Stop the Django container using CTRL+C
Rebuild the image
Delete the existing database. This ensures that a request to GET /api/dinosaurs/1/ will succeed. (If the dino with id 1 had been deleted, the next one added would get an id greater than 1, leading to a contract test failure.)
Create a new database and migrate
Start the container again
# Stop the container with CTRL+C
# Rebuild the Django image
docker build -t dino-api:local .
# Delete the existing database
rm db/db.sqlite3
# Create a new database and migrate
docker run -v $(pwd)/db:/usr/app/db dino-api:local python manage.py migrate
# Run the Django container
docker run -p 8000:8000 -v $(pwd)/db:/usr/app/db dino-api:local
Now let's rerun our contract tests:
docker run -v $(pwd)/api/docs:/api apiaryio/dredd dredd /api/openapi.yaml http://host.docker.internal:8000
pass: POST (201) /api/dinosaurs/ duration: 243ms
pass: GET (200) /api/dinosaurs/ duration: 46ms
pass: GET (200) /api/dinosaurs/1/ duration: 97ms
complete: 3 passing, 0 failing, 0 errors, 0 skipped, 3 total
complete: Tests took 392ms
Pass! Congratulations, you've now driven API development with tests.
Limitations
With the introduction of a test-driven API development workflow, it's important to understand its limits. These tests are contract tests, not functional tests. That is, we're testing only the shape of the API, and not the actual content. To clarify this distinction, consider the example of sending the following API request:
POST /api/dinosaurs/
{
  "common_name": "t-rex",
  "scientific_name": "tyrannosaurus rex"
}
If your API is expected to title-case the value "tyrannosaurus rex" into "Tyrannosaurus Rex", these tests won't catch a failure to do so. They are, after all, only contract tests; in this case they ensure that the response contains the expected property names and that the values are of the expected data types. Asserting that lowercase values are converted to title-case values is the subject of the repository's unit tests, not its contract tests.
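For illustration, such a functional test might look something like this sketch (hypothetical; it assumes the API is supposed to title-case names, which the example repo may not actually do):
# Hypothetical functional test - NOT a contract test
from rest_framework.test import APITestCase


class DinosaurTitleCasingTests(APITestCase):
    def test_scientific_name_is_title_cased(self):
        response = self.client.post(
            "/api/dinosaurs/",
            {"common_name": "t-rex", "scientific_name": "tyrannosaurus rex"},
            format="json",
        )
        self.assertEqual(response.status_code, 201)
        # A contract test only checks that scientific_name is a string;
        # asserting its exact value is functional-test territory
        self.assertEqual(response.data["scientific_name"], "Tyrannosaurus Rex")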
This is to say that contract tests are not meant to be a substitute for other kinds of testing, be that unit, component/integration, or end-to-end.
Recap and Next Steps
In this article I introduced and demonstrated a workflow for driving the development of an API:
Have an OpenAPI spec
Update the spec with a new endpoint, or a change to an existing endpoint
Generate and run contract tests with Dredd
Implement or fix if needed, according to test results
Ensuring an API specification stays in lockstep with implementation is critical for consumers of an API. As such, validating this in a continuous integration pipeline is the next natural step.
In my next article, I'll demonstrate how to add this to CI in GitHub Actions for continuous validation of an API against the expressed intent of the spec.