Performance tuning

If your test suite is slow, here is a checklist of options to speed things up.

Scenario parallelism

By default, scenarios within a feature run in parallel. The number of concurrent scenarios is:

scenarioExecutionParallelismFactor * number of CPUs + 1

The default factor is 1. If your scenarios are IO-bound (waiting on HTTP responses, databases, message brokers), increasing this factor lets more scenarios run while others wait:

cornichon {
  scenarioExecutionParallelismFactor = 2
}
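
To make the arithmetic concrete, here is a minimal plain-Scala sketch of the formula above; the object and method names are illustrative, not Cornichon internals:

```scala
// Sketch of the scenario-concurrency formula: factor * CPUs + 1.
object ScenarioConcurrency {
  def effectiveConcurrency(factor: Int, cpus: Int): Int =
    factor * cpus + 1

  def main(args: Array[String]): Unit = {
    // Default factor of 1 on an 8-core machine:
    println(effectiveConcurrency(1, 8)) // 9
    // Factor raised to 2 for an IO-bound suite:
    println(effectiveConcurrency(2, 8)) // 17
  }
}
```

On an 8-core machine, doubling the factor roughly doubles the number of scenarios in flight, which pays off only when those scenarios spend most of their time waiting on IO.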

See Execution model for details.

Feature parallelism

By default, features run sequentially. If your features are independent, enable parallel feature execution in your SBT build:

Test / parallelExecution := true

When using the CLI runner, use the --featureParallelism flag:

--featureParallelism 4

Step-level concurrency

Steps within a scenario run sequentially because each step can depend on session state written by the previous one. When steps are genuinely independent, use Concurrently to run them in parallel:

Concurrently(maxTime = 10.seconds) {
  When I get("/api/users")
  When I get("/api/products")
  When I get("/api/orders")
}
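
Conceptually, Concurrently behaves like running independent tasks through Futures with an overall time cap. Here is a plain-Scala sketch of that idea, using standard-library Futures rather than Cornichon's own machinery; the names are illustrative:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object ConcurrentlySketch {
  // Run independent tasks at the same time; fail if the whole batch
  // exceeds maxTime. Results come back in submission order.
  def runAll[A](maxTime: Duration)(tasks: (() => A)*): Seq[A] =
    Await.result(Future.sequence(tasks.map(t => Future(t()))), maxTime)

  def main(args: Array[String]): Unit = {
    val results = runAll(10.seconds)(
      () => "users",    // stand-ins for the three independent GET calls
      () => "products",
      () => "orders"
    )
    println(results.mkString(", "))
  }
}
```

The total wall-clock time is bounded by the slowest task rather than the sum of all of them, which is exactly the win over running the same steps sequentially.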

For load-testing a single operation, use RepeatConcurrently:

RepeatConcurrently(times = 50, parallelism = 10, maxTime = 30.seconds) {
  When I get("/api/health")
  Then assert status.is(200)
}
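
The parallelism parameter caps how many iterations run at once, so the 50 repetitions above flow through at most 10 concurrent slots. A plain-JVM sketch of that scheduling pattern, using a fixed thread pool (names are illustrative, not Cornichon internals):

```scala
import java.util.concurrent.{Executors, TimeUnit}
import java.util.concurrent.atomic.AtomicInteger

object RepeatConcurrentlySketch {
  // Run `times` copies of a task through a fixed pool of `parallelism`
  // threads, waiting at most maxTimeSeconds for everything to finish.
  // Returns how many iterations completed in time.
  def repeat(times: Int, parallelism: Int, maxTimeSeconds: Long)(task: () => Unit): Int = {
    val pool = Executors.newFixedThreadPool(parallelism)
    val completed = new AtomicInteger(0)
    (1 to times).foreach { _ =>
      pool.execute { () =>
        task()
        completed.incrementAndGet()
        ()
      }
    }
    pool.shutdown()
    pool.awaitTermination(maxTimeSeconds, TimeUnit.SECONDS)
    completed.get()
  }

  def main(args: Array[String]): Unit = {
    println(repeat(times = 50, parallelism = 10, maxTimeSeconds = 30)(() => ()))
  }
}
```

Keeping parallelism well below times gives the target service a steady, bounded load instead of a burst of 50 simultaneous requests.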

Both are documented in the DSL wrapper steps.

Request timeout

The default HTTP request timeout is 2 seconds. If your test target is slow, individual requests may time out and trigger retries or failures. Adjust it per feature:

override lazy val requestTimeout = 5.seconds

Or globally in application.conf:

cornichon {
  requestTimeout = 5 seconds
}

Conversely, if your target is fast, lowering the timeout lets failures surface more quickly.

Reducing test scope

When debugging or iterating on a specific scenario, avoid running the entire suite: