GitLab CI Performance Testing Pipeline
Set up automated performance testing in GitLab CI/CD with k6 and JMeter, including Docker runners and result visualisation.
Mark
Performance Testing Expert
GitLab CI provides excellent support for performance testing integration, with built-in features for storing artifacts, visualising metrics, and triggering tests on schedules or deployments. Here’s how to set up a comprehensive performance testing pipeline.
Basic k6 Pipeline
Create .gitlab-ci.yml with a performance testing stage:
stages:
  - build
  - test
  - performance
  - deploy

performance_test:
  stage: performance
  image: grafana/k6:latest
  script:
    - k6 run tests/load-test.js --out json=results.json
  artifacts:
    paths:
      - results.json
    expire_in: 30 days
  only:
    - main
    - merge_requests
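GitLab now recommends rules over the older only/except syntax; the only block above is equivalent to:

  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

The complete pipeline example at the end of this article uses this form.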
k6 Test Script with Thresholds
Create tests/load-test.js:
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  stages: [
    { duration: '1m', target: 20 }, // ramp up to 20 virtual users
    { duration: '3m', target: 20 }, // hold steady load
    { duration: '1m', target: 0 },  // ramp down
  ],
  thresholds: {
    http_req_duration: ['p(95)<500'], // 95th percentile under 500ms
    http_req_failed: ['rate<0.01'],   // error rate below 1%
  },
};

const BASE_URL = __ENV.TARGET_URL || 'https://staging.example.com';

export default function () {
  const res = http.get(`${BASE_URL}/api/health`);
  check(res, {
    'status is 200': (r) => r.status === 200,
    'response time OK': (r) => r.timings.duration < 500,
  });
  sleep(1);
}
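Before wiring the script into CI, you can run it locally with the same image the pipeline uses. k6 exposes system environment variables through __ENV, so -e on the docker command sets TARGET_URL:

docker run --rm -i -e TARGET_URL=https://staging.example.com \
  grafana/k6 run - < tests/load-test.js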
Environment-Specific Testing
Use GitLab environments to test different targets:
variables:
  STAGING_URL: "https://staging.example.com"
  PRODUCTION_URL: "https://api.example.com"

.performance_template: &performance_template
  image: grafana/k6:latest
  script:
    - k6 run tests/load-test.js --out json=results.json --summary-export=load-performance.json
  artifacts:
    paths:
      - results.json
    reports:
      load_performance: load-performance.json

performance_staging:
  <<: *performance_template
  stage: performance
  variables:
    TARGET_URL: $STAGING_URL
  environment:
    name: staging
  only:
    - main

performance_production:
  <<: *performance_template
  stage: performance
  variables:
    TARGET_URL: $PRODUCTION_URL
    # Lower load for production; k6 environment variables override script options
    K6_STAGES: "1m:5,3m:5,1m:0"
  environment:
    name: production
  when: manual
  only:
    - tags
JMeter in GitLab CI
For JMeter tests, use a custom Docker image or install JMeter during the job:
jmeter_test:
  stage: performance
  image: justb4/jmeter:latest
  script:
    - jmeter -n -t tests/load-test.jmx -l results.jtl -e -o report
  artifacts:
    paths:
      - results.jtl
      - report/
    expire_in: 30 days
Or with a lighter setup:
jmeter_test:
  stage: performance
  image: openjdk:11-jdk-slim
  before_script:
    - apt-get update && apt-get install -y wget
    - wget -q https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-5.5.tgz
    - tar -xzf apache-jmeter-5.5.tgz
  script:
    - ./apache-jmeter-5.5/bin/jmeter -n -t tests/load-test.jmx -l results.jtl
  artifacts:
    paths:
      - results.jtl
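JMeter's bin/jmeter launcher reads a HEAP environment variable, so the JVM heap can be sized to the runner without editing the image; adding job-level variables to the job above is enough (the values here are illustrative):

jmeter_test:
  variables:
    HEAP: "-Xms2g -Xmx2g"

Keep the heap within the memory limit configured on the runner.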
Scheduled Performance Tests
Run tests on a schedule independent of deployments:
performance_nightly:
  stage: performance
  image: grafana/k6:latest
  script:
    - k6 run tests/soak-test.js --out json=results.json
  artifacts:
    paths:
      - results.json
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"
Configure the schedule in GitLab's UI (or via the API, as shown after this list):
- Go to CI/CD → Schedules
- Create a new schedule (e.g., daily at 2 AM)
- Set the target branch and any variables
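Schedules can also be created programmatically through the GitLab API, which is handy when provisioning many projects. A sketch, where the host, project ID, and token variable are placeholders:

curl --request POST \
  --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
  --data "description=Nightly performance run" \
  --data "ref=main" \
  --data "cron=0 2 * * *" \
  "https://gitlab.example.com/api/v4/projects/42/pipeline_schedules"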
GitLab Performance Metrics
GitLab can display performance metrics directly in merge requests via its load performance report, a Premium/Ultimate feature. The widget reads k6's --summary-export output rather than the raw --out json stream:

performance_test:
  stage: performance
  image: grafana/k6:latest
  script:
    - k6 run tests/load-test.js --summary-export=load-performance.json
  artifacts:
    reports:
      load_performance: load-performance.json

The load_performance report type enables GitLab's built-in performance comparison between the source and target branches of a merge request.
Using Docker Services
For tests requiring additional services (databases, mock servers):
performance_with_services:
  stage: performance
  image: grafana/k6:latest
  services:
    - name: wiremock/wiremock:latest
      alias: mock-api
  variables:
    TARGET_URL: "http://mock-api:8080"
  script:
    - k6 run tests/integration-test.js
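WireMock starts empty, so register the stubs the test expects before k6 runs, using WireMock's admin API. A sketch that extends the job above (it assumes curl is available in the job image, which may require installing it first):

performance_with_services:
  before_script:
    - |
      curl -X POST http://mock-api:8080/__admin/mappings \
        -H 'Content-Type: application/json' \
        -d '{"request": {"method": "GET", "urlPath": "/api/health"},
             "response": {"status": 200, "body": "OK"}}'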
Parallel Test Execution
Split tests across multiple jobs for faster feedback:
performance_test:
  stage: performance
  image: grafana/k6:latest
  parallel:
    matrix:
      - TEST_SCRIPT: [api-test.js, web-test.js, checkout-test.js]
  script:
    - k6 run tests/$TEST_SCRIPT --out json=results-$TEST_SCRIPT.json
  artifacts:
    paths:
      - results-*.json
Storing Results Long-Term
Upload results to external storage for trend analysis. Note that plain AWS S3 expects SigV4-signed requests or a pre-signed URL rather than a bearer token, so treat the Authorization header below as a stand-in for whatever auth your object store requires:

performance_test:
  stage: performance
  image: grafana/k6:latest
  script:
    - k6 run tests/load-test.js --out json=results.json
    - |
      curl -X PUT \
        -H "Authorization: Bearer $S3_TOKEN" \
        -T results.json \
        "https://s3.amazonaws.com/perf-results/${CI_COMMIT_SHA}/results.json"
  artifacts:
    paths:
      - results.json
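For richer trending, k6 can also stream metrics straight to a time-series backend instead of (or alongside) uploading JSON files, for example to an InfluxDB v1 instance that a Grafana dashboard queries (the URL is a placeholder):

performance_test:
  stage: performance
  image: grafana/k6:latest
  script:
    - k6 run tests/load-test.js --out influxdb=http://influxdb.example.com:8086/k6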
Performance Gates
Fail the pipeline if performance degrades. k6 exits with code 99 when any threshold defined in the script fails, and GitLab treats any non-zero exit code as a job failure, so the thresholds themselves act as the gate:

performance_test:
  stage: performance
  image: grafana/k6:latest
  script:
    - k6 run tests/load-test.js --out json=results.json  # exits 99 if a threshold fails

If you add a custom pass/fail check (for example, parsing the summary export), keep it in script rather than after_script: commands in after_script run as a separate step whose exit code does not affect the job result.
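To stop a degrading run early instead of letting it finish, k6 thresholds also accept an object form with abortOnFail; a sketch of the options block:

export const options = {
  thresholds: {
    http_req_duration: [
      // abort the whole run as soon as this threshold fails,
      // after giving it 30s of data to evaluate
      { threshold: 'p(95)<500', abortOnFail: true, delayAbortEval: '30s' },
    ],
  },
};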
Notifications
Send results to Slack or other channels:
performance_test:
  stage: performance
  image: grafana/k6:latest
  script:
    - k6 run tests/load-test.js --out json=results.json 2>&1 | tee output.log
  after_script:
    - |
      # Extract the summary and send it to Slack
      # (assumes curl is available in the image; on Alpine-based images,
      # install it first with: apk add --no-cache curl)
      SUMMARY=$(tail -20 output.log)
      curl -X POST -H 'Content-type: application/json' \
        --data "{\"text\":\"Performance Test Results:\n\`\`\`$SUMMARY\`\`\`\"}" \
        "$SLACK_WEBHOOK_URL"
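Rather than tailing raw logs, k6's handleSummary() hook can write a clean plain-text summary for the notification step; a sketch using the k6-summary jslib:

// add to the k6 test script
import { textSummary } from 'https://jslib.k6.io/k6-summary/0.0.2/index.js';

export function handleSummary(data) {
  // write the end-of-test summary to a file the after_script can post
  return {
    'summary.txt': textSummary(data, { indent: ' ', enableColors: false }),
  };
}

The after_script can then post summary.txt instead of the last 20 log lines.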
Runner Configuration
For consistent results, use dedicated runners for performance tests:
# /etc/gitlab-runner/config.toml
[[runners]]
  name = "performance-runner"
  executor = "docker"
  [runners.docker]
    memory = "4g"
    cpus = "2"
    privileged = false
  [runners.custom_build_dir]
    enabled = true
Tag the job to use specific runners:
performance_test:
  stage: performance
  tags:
    - performance
    - high-memory
  script:
    - k6 run tests/load-test.js
Complete Pipeline Example
stages:
  - build
  - test
  - performance
  - deploy

variables:
  STAGING_URL: "https://staging.example.com"

build:
  stage: build
  script:
    # assumes a runner that can run Docker (shell executor or a docker:dind service)
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA

unit_tests:
  stage: test
  script:
    - npm test

performance_test:
  stage: performance
  image: grafana/k6:latest
  variables:
    TARGET_URL: $STAGING_URL
  script:
    - k6 run tests/load-test.js --summary-export=load-performance.json
  artifacts:
    reports:
      load_performance: load-performance.json
    paths:
      - load-performance.json
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

deploy_production:
  stage: deploy
  script:
    - ./deploy.sh
  environment:
    name: production
  when: manual
  only:
    - main
GitLab CI’s native features for artifacts, environments, and metrics make it well-suited for performance testing pipelines. Start with simple threshold-based tests and expand to scheduled soak tests and multi-environment validation as your needs grow.