Note: Please use the updated version of this blog post that uses a better JMeter reports plugin and a cleaner pipeline.
I've recently been working with an app team to set up automated integration and performance testing with Apache JMeter for their suite of microservices hosted in OpenShift and orchestrated by a Jenkins pipeline. One of the challenges we faced was integrating these tools into one seamless process: creating all of the OpenShift objects JMeter needs and getting Jenkins and the cluster to communicate quickly. The eventual goal was to slip this into our main CI/CD pipeline.
The final result is a section of our Jenkins pipeline that:
- builds an image of the JMeter test suite in OpenShift
- spins up an OpenShift job with the latest image of the test suite
- runs the JMeter tests included in the container against all the services in the OpenShift project
- sends the results back to a file input in the Jenkins pipeline to parse with the Jenkins Performance Plugin
This proved to be a bit of a hassle to coordinate, so here are the steps I took, both for my future self and anyone else who might benefit. :)
Setting up the Test Suite Repo
Required files in the repo:
- JMeter JMX files all in one folder
- a Dockerfile -- we're using a Docker build strategy in our test suite's BuildConfig, so this is required to produce the test suite image
- OpenShift objects in YAML form (a BuildConfig, ImageStream, and Job)
- a Jenkinsfile for coordinating the OpenShift objects and parsing the test results
- a shell script that serves as the entrypoint for the test suite image
Most of these files can also be found here.
Creating the JMeter Base Image
Thanks to this awesome blog post, this part was super easy. This rhel-jmeter image contains Java and JMeter, and sets up the jmeter/tests and jmeter/results folders that will hold the application team's JMX test files and their results. This image should be built and stored separately in whatever registry you're using, or its instructions can be inlined at the top of the test suite Dockerfile instead of pulling it in with "FROM."
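If you'd rather roll your own, here's a minimal sketch of what such a base image Dockerfile could look like. The base image tag, JMeter version, and /jmeter paths are my assumptions, not the exact values from that post:

```dockerfile
# Sketch of a rhel-jmeter base image -- base image tag and JMeter version are assumptions
FROM registry.access.redhat.com/ubi8/ubi

# JMeter needs a Java runtime
RUN yum install -y java-1.8.0-openjdk-headless && yum clean all

# Download and unpack Apache JMeter, and put it on the PATH
ENV JMETER_VERSION=5.4.1
RUN curl -sSL https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-${JMETER_VERSION}.tgz | tar -xz -C /opt
ENV PATH="/opt/apache-jmeter-${JMETER_VERSION}/bin:${PATH}"

# Folders that will hold the team's JMX test files and the generated results
RUN mkdir -p /jmeter/tests /jmeter/results
WORKDIR /jmeter
```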
Creating the Test Suite Image
This just inherits the rhel-jmeter base image above and copies in the JMeter test files and scripts. (In the test suite repo, all the JMX test files are stored in the jmeter folder.) When we build the integration test suite image in OpenShift, the container will have Java, JMeter, and the JMX files, as well as the runjob.sh script that coordinates running the tests.
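Roughly, that Dockerfile only needs a few lines. The image reference and paths here are assumptions -- point FROM at wherever you pushed the rhel-jmeter image:

```dockerfile
# Sketch of the test suite Dockerfile -- image reference and paths are assumptions
FROM rhel-jmeter:latest

# Copy the team's JMX test files and the entrypoint script into the image
COPY jmeter/ /jmeter/tests/
COPY runjob.sh /jmeter/runjob.sh
RUN chmod +x /jmeter/runjob.sh
```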
Building the OpenShift Objects
To run these tests in OpenShift, we need a BuildConfig, an ImageStream, and a Job.
The BuildConfig pulls down the test suite repo and builds the image above with all the test files, using the repo's Dockerfile. The ImageStream organizes all of your test suite images over time, and gives you the most recent one with the "latest" tag. The Job runs a pod using the latest test suite image, with the runjob.sh script as its entrypoint.
We're using a Job here because it conveniently spins down after completing the runjob.sh script, but you could also use a DeploymentConfig with a single pod replica and then scale it down manually in the Jenkinsfile after receiving the test results.
The BuildConfig and ImageStream objects are saved in YAML form. To create/update these in OpenShift, you'll run oc apply -f file.yaml.
For building images from a Git repo, you'll also need a secret with your Git credentials saved in your OpenShift project. Run:
oc secrets new-basicauth gitsecret --username=<git-user> --password=<git-pass>
Then make sure your BuildConfig uses the source secret "gitsecret."
The Job is structured as a Template, so that we can pass in our own parameters (supplied with -p flags). It will be created by processing the template and piping the result into an "oc apply" command: oc process -f job-template.yaml | oc apply -f - -n <project>.
BuildConfig
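A sketch of the BuildConfig, assuming the repo URL and object names shown here (swap in your own). Note the sourceSecret pointing at the "gitsecret" created above:

```yaml
# Sketch of the BuildConfig -- the repo URL and names are assumptions
apiVersion: build.openshift.io/v1
kind: BuildConfig
metadata:
  name: jmeter-test-suite
spec:
  source:
    type: Git
    git:
      uri: https://github.com/<your-org>/jmeter-test-suite.git
    sourceSecret:
      name: gitsecret          # the basic-auth secret created above
  strategy:
    type: Docker
    dockerStrategy:
      dockerfilePath: Dockerfile
  output:
    to:
      kind: ImageStreamTag
      name: jmeter-test-suite:latest
```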
ImageStream
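The ImageStream is the simplest of the three -- it only needs a name matching the BuildConfig's output:

```yaml
# Sketch of the ImageStream -- the name just has to match the BuildConfig's output
apiVersion: image.openshift.io/v1
kind: ImageStream
metadata:
  name: jmeter-test-suite
```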
Job Template
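And a sketch of the Job template. The parameter names, image reference, and Job name here are assumptions; they just need to line up with whatever your runjob.sh and Jenkinsfile expect (see the next sections):

```yaml
# Sketch of the Job template -- parameter names, image reference, and Job name are assumptions
apiVersion: template.openshift.io/v1
kind: Template
metadata:
  name: jmeter-test-suite-job
parameters:
- name: TEST_NAME               # JMX file name, assumed here to be passed without the .jmx extension
  required: true
- name: JENKINS_RETURN_URL      # Jenkins input step URL that receives the JTL file
  required: true
- name: JENKINS_USER
  required: true
- name: JENKINS_TOKEN
  required: true
objects:
- apiVersion: batch/v1
  kind: Job
  metadata:
    name: jmeter-test-suite-${TEST_NAME}
  spec:
    template:
      spec:
        containers:
        - name: jmeter-test-suite
          # assumption: the internal registry path to the image built above
          image: docker-registry.default.svc:5000/<project>/jmeter-test-suite:latest
          command: ["/jmeter/runjob.sh"]
          # argument order matches the four arguments described in the next section
          args: ["${JENKINS_RETURN_URL}", "${TEST_NAME}", "${JENKINS_USER}", "${JENKINS_TOKEN}"]
        restartPolicy: Never
```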
Running the Tests
The Job runs whatever is under the "command" line of the template. In this case, it's a script that we've included in the test suite image -- runjob.sh. This script runs the specified JMeter test file in the container against the other services in the OpenShift project, puts the results into a matching JTL file, and curls the results file back to a specific input in a Jenkins pipeline. (A sketch of the script follows the argument list below.)
It takes four arguments (under the "args" line):
- Jenkins pipeline return URL -- the URL of the pipeline input step that will receive the JMeter results file
- file name -- name of the JMX file to run
- username -- Jenkins username for authentication
- password -- Jenkins API token for authentication
  - get this token in Jenkins by clicking on your username in the top right corner, then clicking "Configure" and "Show API Token"
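Here's a minimal sketch of what runjob.sh could look like with those four arguments. The JMeter flags are the standard non-GUI options; the /jmeter paths and the form-data layout Jenkins expects when a file is submitted to an input step are assumptions you'll likely need to adjust:

```bash
#!/bin/bash
# Sketch of runjob.sh -- the argument order matches the list above
JENKINS_RETURN_URL=$1   # URL of the Jenkins input step that will receive the results
TEST_NAME=$2            # name of the JMX file to run (assumed here to be passed without the .jmx extension)
JENKINS_USER=$3         # Jenkins username
JENKINS_TOKEN=$4        # Jenkins API token

# Run the test plan in non-GUI mode and write the results to a matching JTL file
jmeter -n -t /jmeter/tests/${TEST_NAME}.jmx -l /jmeter/results/${TEST_NAME}.jtl

# Curl the JTL file back to the waiting input step in the Jenkins pipeline.
# NOTE: the exact form-data layout Jenkins expects for a file parameter is an
# assumption here -- adjust it to your Jenkins version and input step setup.
curl -k -u ${JENKINS_USER}:${JENKINS_TOKEN} \
     -F "file0=@/jmeter/results/${TEST_NAME}.jtl" \
     -F "json={\"parameter\": [{\"name\": \"${TEST_NAME}.jtl\", \"file\": \"file0\"}]}" \
     "${JENKINS_RETURN_URL}"
```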
Coordinating with Jenkins
Our setup of Jenkins uses the Kubernetes plugin to dynamically spin up agents inside OpenShift, labeled 'jenkins-agent' in the node line below. Each of these agents has the oc client and git installed.
Here's the gist of the Jenkins side of things:
- spin up a Jenkins agent inside OpenShift and login to the OpenShift cluster
- checkout the test suite Git repo into the agent's workspace
- build the test suite image in your OpenShift project
- for every JMX file found in the test suite repo:
  - kick off a job in OpenShift that runs the test and reports back
  - create an input step in the pipeline that takes a file as a parameter
  - wait for the job to send back the reports JTL file to that input step
- use the Performance Plugin to parse the JMeter reports
In our full CI/CD pipeline Jenkinsfile, most of these steps are in methods and reused elsewhere, but for the purpose of this outline I've separated them out and commented each line.
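Here's a sketch of that outline as a scripted pipeline. The cluster URL, project name, credential IDs, and file paths are assumptions; findFiles comes from the Pipeline Utility Steps plugin and perfReport from the Performance Plugin:

```groovy
// Sketch only -- cluster URL, project name, credential IDs, and paths are assumptions
node('jenkins-agent') {

    stage('Log in and check out') {
        // log in to the OpenShift cluster from the agent
        withCredentials([string(credentialsId: 'openshift-token', variable: 'OC_TOKEN')]) {
            sh 'oc login https://openshift.example.com:8443 --token=$OC_TOKEN'
        }
        // pull the test suite repo into the agent's workspace
        checkout scm
    }

    stage('Build the test suite image') {
        // create/update the BuildConfig and ImageStream, then build the latest test suite image
        sh 'oc apply -f buildconfig.yaml -f imagestream.yaml -n my-project'
        sh 'oc start-build jmeter-test-suite --follow -n my-project'
    }

    stage('Run the JMeter tests') {
        // findFiles comes from the Pipeline Utility Steps plugin
        def tests = findFiles(glob: 'jmeter/*.jmx')
        withCredentials([usernamePassword(credentialsId: 'jenkins-api-token',
                                          usernameVariable: 'JENKINS_USER',
                                          passwordVariable: 'JENKINS_TOKEN')]) {
            for (test in tests) {
                def testName = test.name.replace('.jmx', '')

                // clean up any previous job, then kick off a new one that runs this test
                sh "oc delete job jmeter-test-suite-${testName} --ignore-not-found -n my-project"
                sh "oc process -f job-template.yaml -p TEST_NAME=${testName} " +
                   "-p JENKINS_RETURN_URL=${env.BUILD_URL}input/Results${testName}/submit " +
                   '-p JENKINS_USER=$JENKINS_USER -p JENKINS_TOKEN=$JENKINS_TOKEN ' +
                   '| oc apply -f - -n my-project'

                // wait here until runjob.sh curls the JTL file back to this input step
                // (assumption: the submit URL passed above matches this input id)
                input id: "Results${testName}",
                      message: "Waiting for ${testName} results",
                      parameters: [[$class: 'FileParameterDefinition', name: "${testName}.jtl"]]

                // parse the returned JTL file with the Performance Plugin
                perfReport sourceDataFiles: "${testName}.jtl"
            }
        }
    }
}
```

File parameters on input steps have been finicky across Jenkins versions, so treat the input/perfReport wiring in particular as a starting point rather than a drop-in.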
Output
The pipeline logs will look a little something like this:
While running this in Jenkins, you can view the test suite logs by hopping into OpenShift and looking for pods named jmeter-test-suite-* in your project. Once the jobs complete, you'll see a number of "Performance Report" links show up on the Jenkins build sidebar, and there are your test results!