Problem Statement
Design a comprehensive continuous testing strategy for CI/CD. Include test types, execution frequency, quality gates, and feedback mechanisms.
Explanation
Continuous testing integrates testing throughout the development lifecycle with automated execution and rapid feedback. Key strategy components: a test pyramid implementation, quality gates, shift-left testing, and test automation at every level.
Test strategy by pipeline stage:
```yaml
stages:
  - validate
  - unit-test
  - build
  - integration-test
  - security-scan
  - deploy-staging
  - acceptance-test
  - deploy-production

# Stage 1: Pre-commit validation
lint:
  stage: validate
  script:
    - npm run lint
    - npm run format-check
  only:
    - merge_requests

# Stage 2: Unit tests (every commit)
unit-tests:
  stage: unit-test
  script:
    - npm test -- --coverage
  coverage: '/Statements\s*:\s*(\d+\.\d+)%/'
  artifacts:
    reports:
      coverage_report:
        coverage_format: cobertura
        path: coverage/cobertura-coverage.xml
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH == "main"'

# Stage 3: Build
build:
  stage: build
  script:
    - docker build -t myapp:$CI_COMMIT_SHA .
  dependencies:
    - unit-tests

# Stage 4: Integration tests
integration-tests:
  stage: integration-test
  services:
    - postgres:13
    - redis:6
  script:
    - npm run test:integration
  only:
    - main
    - merge_requests

# Stage 5: Security scanning
security-scan:
  stage: security-scan
  script:
    - npm audit --audit-level=high
    - trivy image myapp:$CI_COMMIT_SHA
    - semgrep --config=auto src/
  allow_failure: false

# Stage 6: Deploy to staging
deploy-staging:
  stage: deploy-staging
  script:
    - kubectl apply -f k8s/staging/
  environment:
    name: staging
  only:
    - main

# Stage 7: Acceptance/E2E tests
acceptance-tests:
  stage: acceptance-test
  script:
    - npm run test:e2e:staging
  needs:
    - deploy-staging
  retry:
    max: 2
    when: script_failure

performance-tests:
  stage: acceptance-test
  script:
    - k6 run performance/load-test.js
  only:
    - main
  allow_failure: true

# Stage 8: Production deployment
deploy-production:
  stage: deploy-production
  script:
    - kubectl apply -f k8s/production/
  environment:
    name: production
  when: manual
  only:
    - main

# Smoke tests after production deploy
smoke-tests:
  stage: deploy-production
  script:
    - npm run test:smoke:production
  needs:
    - deploy-production  # same-stage ordering: run only after the deploy job
  when: on_success
```
Quality gates block the pipeline when minimum standards are not met:
```javascript
// quality-gates.js
const qualityGates = {
  unitTests: {
    coverage: {
      statements: 80,
      branches: 75,
      functions: 80,
      lines: 80
    },
    passRate: 100 // All tests must pass
  },
  staticAnalysis: {
    criticalIssues: 0,
    highIssues: 0,
    codeSmells: 50,
    duplication: 3 // percent
  },
  security: {
    criticalVulnerabilities: 0,
    highVulnerabilities: 0,
    mediumVulnerabilities: 10
  },
  performance: {
    p95ResponseTime: 500, // ms
    errorRate: 1 // percent
  }
};

function evaluateQualityGates(results) {
  const failures = [];

  // Check coverage
  if (results.coverage.statements < qualityGates.unitTests.coverage.statements) {
    failures.push(`Statement coverage ${results.coverage.statements}% below threshold`);
  }

  // Check vulnerabilities
  if (results.security.critical > qualityGates.security.criticalVulnerabilities) {
    failures.push(`${results.security.critical} critical vulnerabilities found`);
  }

  if (failures.length > 0) {
    console.error('Quality gate failed:');
    failures.forEach(f => console.error(`  - ${f}`));
    process.exit(1); // non-zero exit fails the CI job
  }
  console.log('All quality gates passed');
}
```
Test execution frequency:
1. On every commit:
- Linting
- Unit tests
- Static analysis
2. On merge request:
- All commit checks
- Integration tests
- Security scans
- Code coverage
3. On main branch:
- All MR checks
- E2E tests (smoke suite)
- Container scanning
- Deploy to staging
4. Nightly:
- Full E2E test suite
- Performance tests
- Cross-browser tests
- Visual regression tests
- Dependency updates check
5. Weekly:
- Penetration testing
- Chaos engineering tests
- Disaster recovery drills
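In GitLab CI, the nightly and weekly tiers are typically driven by scheduled pipelines. A sketch (the `SCHEDULE_TYPE` variable and the npm script names are assumptions; `SCHEDULE_TYPE` would be set on each pipeline schedule):

```yaml
nightly-e2e:
  stage: acceptance-test
  script:
    - npm run test:e2e:full
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule" && $SCHEDULE_TYPE == "nightly"'

weekly-chaos:
  stage: acceptance-test
  script:
    - npm run test:chaos
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule" && $SCHEDULE_TYPE == "weekly"'
```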
Shift-left testing moves testing earlier:
```sh
#!/bin/sh
# .husky/pre-commit: pre-commit hook installed by Husky
. "$(dirname "$0")/_/husky.sh"

npm run lint
# --findRelatedTests expects file paths; pass the staged files (e.g. via lint-staged)
npm test -- --findRelatedTests --bail
npm run type-check
```
Feedback mechanisms:
1. Pull request annotations:
```yaml
test:
  script:
    - npm test -- --json --outputFile=test-results.json
  after_script:
    - node scripts/annotate-pr.js test-results.json
```
2. Slack notifications:
```yaml
# Shared notification template; two jobs give us the pipeline outcome,
# since a .post job cannot inline a shell ternary into the JSON payload.
.notify: &notify
  stage: .post
  script:
    - |
      curl -X POST "$SLACK_WEBHOOK" \
        -H 'Content-Type: application/json' \
        -d "{
          \"text\": \"Pipeline $PIPELINE_STATUS for $CI_COMMIT_REF_NAME\",
          \"attachments\": [{
            \"color\": \"$STATUS_COLOR\",
            \"fields\": [
              {\"title\": \"Project\", \"value\": \"$CI_PROJECT_NAME\"},
              {\"title\": \"Branch\", \"value\": \"$CI_COMMIT_REF_NAME\"},
              {\"title\": \"Coverage\", \"value\": \"$COVERAGE%\"}
            ]
          }]
        }"

notify-success:
  <<: *notify
  variables: { PIPELINE_STATUS: success, STATUS_COLOR: good }
  when: on_success

notify-failure:
  <<: *notify
  variables: { PIPELINE_STATUS: failed, STATUS_COLOR: danger }
  when: on_failure
```
3. Dashboard metrics:
```javascript
// Collect test metrics. testResults and coverage come from Jest's JSON
// output and coverage summary, loaded earlier in the script.
const metrics = {
  timestamp: new Date(),
  pipeline: process.env.CI_PIPELINE_ID,
  branch: process.env.CI_COMMIT_BRANCH,
  unitTests: {
    total: testResults.numTotalTests,
    passed: testResults.numPassedTests,
    failed: testResults.numFailedTests,
    duration: testResults.testResults.reduce(
      (sum, r) => sum + r.perfStats.end - r.perfStats.start, 0)
  },
  coverage: {
    statements: coverage.total.statements.pct,
    branches: coverage.total.branches.pct
  }
};

// Send to monitoring system
await sendMetrics(metrics);
```
4. Test trend analysis:
```python
# Analyze test trends
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('test-history.csv')
df['date'] = pd.to_datetime(df['date'])

# Plot coverage trend against the threshold line
plt.plot(df['date'], df['coverage'], label='Coverage')
plt.axhline(y=80, color='r', linestyle='--', label='Threshold')
plt.xlabel('Date')
plt.ylabel('Coverage %')
plt.title('Test Coverage Trend')
plt.legend()
plt.savefig('coverage-trend.png')
```
Best practices:
1. Fail fast (run fastest tests first)
2. Test in production-like environment
3. Automate everything
4. Make tests deterministic
5. Maintain test hygiene (remove obsolete tests)
6. Monitor test health metrics
7. Provide clear failure messages
8. Enable easy local reproduction
9. Balance speed and coverage
10. Continuous improvement (review metrics, optimize)
A well-designed continuous testing strategy builds quality into the entire development process and delivers rapid, automated feedback at every stage.