CTEST(1) | CMake
ctest - CTest Command-Line Reference
Run Tests
  ctest [<options>] [--test-dir <path-to-build>]

Build and Test Mode
  ctest --build-and-test <path-to-source> <path-to-build>
        --build-generator <generator> [<options>...]
        [--build-options <opts>...]
        [--test-command <command> [<args>...]]

Dashboard Client
  ctest -D <dashboard> [-- <dashboard-options>...]
  ctest -M <model> -T <action> [-- <dashboard-options>...]
  ctest -S <script> [-- <dashboard-options>...]
  ctest -SP <script> [-- <dashboard-options>...]

View Help
  ctest --help[-<topic>]
The ctest executable is the CMake test driver program. CMake-generated build trees created for projects that use the enable_testing() and add_test() commands have testing support. This program will run the tests and report results.
Some CMake-generated build trees can have multiple build configurations in the same tree. This option can be used to specify which one should be tested. Example configurations are Debug and Release.
When the output of ctest is being sent directly to a terminal, the progress through the set of tests is reported by updating the same line rather than printing start and end messages for each test on new lines. This can significantly reduce the verbosity of the test output. Test completion messages are still output on their own line for failed tests and the final test summary will also still be logged.
This option can also be enabled by setting the environment variable CTEST_PROGRESS_OUTPUT.
Test output is normally suppressed and only summary information is displayed. This option will show all test output.
Test output is normally suppressed and only summary information is displayed. This option will show even more test output.
This feature produces a large amount of output that is mostly useful for debugging dashboard problems.
This option allows CTest to resume a test set execution that was previously interrupted. If no interruption occurred, the -F option will have no effect.
This option tells CTest to run the tests in parallel using the given number of jobs. This option can also be set with the CTEST_PARALLEL_LEVEL environment variable.
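For example, either of the following runs up to four tests at a time (assuming the current directory is a CMake build tree with tests):

```shell
# run tests four at a time
ctest -j 4

# equivalent, via the environment
CTEST_PARALLEL_LEVEL=4 ctest
```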
This option can be used with the PROCESSORS test property.
See Label and Subproject Summary.
When ctest is run as a Dashboard Client this sets the ResourceSpecFile option of the CTest Test Step.
When ctest is run as a Dashboard Client this sets the TestLoad option of the CTest Test Step.
This option will suppress all the output. The output log file will still be generated if --output-log is specified. Options such as --verbose, --extra-verbose, and --debug are ignored if --quiet is specified.
This option tells CTest to write all its output to a <file> log file.
Write test results in JUnit format.
This option tells CTest to write test results to <file> in JUnit XML format. If <file> already exists, it will be overwritten. If using the -S option to run a dashboard script, use the OUTPUT_JUNIT keyword with the ctest_test() command instead.
This option tells CTest to list the tests that would be run but not actually run them. Useful in conjunction with the -R and -E options.
New in version 3.14: The --show-only option accepts a <format> value.
<format> can be one of the following values.
This option tells CTest to run only the tests whose labels match the given regular expression. When more than one -L option is given, a test will only be run if each regular expression matches at least one of the test's labels (i.e. the multiple -L labels form an AND relationship). See Label Matching.
This option tells CTest to run only the tests whose names match the given regular expression.
This option tells CTest to NOT run the tests whose names match the given regular expression.
This option tells CTest to NOT run the tests whose labels match the given regular expression. When more than one -LE option is given, a test will only be excluded if each regular expression matches at least one of the test's labels (i.e. the multiple -LE labels form an AND relationship). See Label Matching.
If a test in the set of tests to be executed requires a particular fixture, that fixture's setup and cleanup tests would normally be added to the test set automatically. This option prevents adding setup or cleanup tests for fixtures matching the <regex>. Note that all other fixture behavior is retained, including test dependencies and skipping tests that have fixture setup tests that fail.
This option causes CTest to run tests starting at number Start, ending at number End, and incrementing by Stride. Any additional numbers after Stride are considered individual test numbers. Start, End, or Stride can be empty. Optionally a file can be given that contains the same syntax as the command line.
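For example, ctest -I 2,10,2 (test numbers hypothetical) runs tests 2, 4, 6, 8, and 10. The selected numbers follow the same arithmetic progression that seq prints:

```shell
# Start=2, End=10, Stride=2 selects these test numbers:
seq -s ' ' 2 2 10
```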
When both -R and -I are specified, by default the intersection of the two sets of tests is run. By specifying -U, the union of tests is run instead.
This option tells CTest to perform only the tests that failed during its previous run. When this option is specified, CTest ignores all other options intended to modify the list of tests to run ( -L, -R, -E, -LE, -I, etc). In the event that CTest runs and no tests fail, subsequent calls to CTest with the --rerun-failed option will run the set of tests that most recently failed (if any).
Set the maximum width for each test name to show in the output. This allows the user to widen the output to avoid clipping the test name which can be very annoying.
This option causes CTest to run tests in either an interactive mode or a non-interactive mode. In dashboard mode (Experimental, Nightly, Continuous), the default is non-interactive. In non-interactive mode, the environment variable DASHBOARD_TEST_FROM_CTEST is set.
Prior to CMake 3.11, interactive mode on Windows allowed system debug popup windows to appear. Now, due to CTest's use of libuv to launch test processes, all system debug popup windows are always blocked.
This option tells CTest not to print summary information for each label associated with the tests run. If there are no labels on the tests, nothing extra is printed.
See Label and Subproject Summary.
This option tells CTest not to print summary information for each subproject associated with the tests run. If there are no subprojects on the tests, nothing extra is printed.
See Label and Subproject Summary.
Limit the output for passed tests to <size> bytes.
Limit the output for failed tests to <size> bytes.
Truncate tail (default), middle or head of test output once maximum output size is reached.
By default CTest uses configuration options from configuration file. This option will overwrite the configuration option.
By default CTest will run child CTest instances within the same process. If this behavior is not desired, this argument will enforce new processes for child CTest processes.
This option will run the tests in a random order. It is commonly used to detect implicit dependencies in a test suite.
This option effectively sets a timeout on all tests that do not already have a timeout set on them via the TIMEOUT property.
Set a real time of day at which all tests should timeout. Example: 7:00:00 -0400. Any time format understood by the curl date parser is accepted. Local time is assumed if no timezone is specified.
This option will not run any tests, it will simply print the list of all labels associated with the test set.
By default, if no tests were found, CTest always logs an error message but returns a non-zero exit code only in script mode. This option makes the behavior consistent: either always return an error code when no tests are found, or always ignore it.
New in version 3.26.
This option can also be set by setting the CTEST_NO_TESTS_ACTION environment variable.
To print version details or selected pages from the CMake documentation, use one of the following options:
Usage describes the basic command line interface and its options.
<keyword> can be a property, variable, command, policy, generator or module.
The relevant manual entry for <keyword> is printed in a human-readable text format. The output is printed to a named <file> if given.
Changed in version 3.28: Prior to CMake 3.28, this option supported command names only.
All manuals are printed in a human-readable text format. The output is printed to a named <file> if given.
The specified manual is printed in a human-readable text format. The output is printed to a named <file> if given.
The list contains all manuals for which help may be obtained by using the --help-manual option followed by a manual name. The output is printed to a named <file> if given.
The cmake-commands(7) manual entry for <cmd> is printed in a human-readable text format. The output is printed to a named <file> if given.
The list contains all commands for which help may be obtained by using the --help-command option followed by a command name. The output is printed to a named <file> if given.
The cmake-commands(7) manual is printed in a human-readable text format. The output is printed to a named <file> if given.
The cmake-modules(7) manual entry for <mod> is printed in a human-readable text format. The output is printed to a named <file> if given.
The list contains all modules for which help may be obtained by using the --help-module option followed by a module name. The output is printed to a named <file> if given.
The cmake-modules(7) manual is printed in a human-readable text format. The output is printed to a named <file> if given.
The cmake-policies(7) manual entry for <cmp> is printed in a human-readable text format. The output is printed to a named <file> if given.
The list contains all policies for which help may be obtained by using the --help-policy option followed by a policy name. The output is printed to a named <file> if given.
The cmake-policies(7) manual is printed in a human-readable text format. The output is printed to a named <file> if given.
The cmake-properties(7) manual entries for <prop> are printed in a human-readable text format. The output is printed to a named <file> if given.
The list contains all properties for which help may be obtained by using the --help-property option followed by a property name. The output is printed to a named <file> if given.
The cmake-properties(7) manual is printed in a human-readable text format. The output is printed to a named <file> if given.
The cmake-variables(7) manual entry for <var> is printed in a human-readable text format. The output is printed to a named <file> if given.
The list contains all variables for which help may be obtained by using the --help-variable option followed by a variable name. The output is printed to a named <file> if given.
The cmake-variables(7) manual is printed in a human-readable text format. The output is printed to a named <file> if given.
Tests may have labels attached to them. Tests may be included or excluded from a test run by filtering on the labels. Each individual filter is a regular expression applied to the labels attached to a test.
When -L is used, in order for a test to be included in a test run, each regular expression must match at least one label. Using more than one -L option means "match all of these".
The -LE option works just like -L, but excludes tests rather than including them. A test is excluded if each regular expression matches at least one label.
If a test has no labels attached to it, then -L will never include that test, and -LE will never exclude that test. As an example of tests with labels, consider five tests, with the following labels:

  test1 has labels tuesday and production
  test2 has labels tuesday and test
  test3 has labels wednesday and production
  test4 has label wednesday
  test5 has labels friday and test
Running ctest with -L tuesday -L test will select test2, which has both labels. Running CTest with -L test will select test2 and test5, because both of them have a label that matches that regular expression.
Because the matching works with regular expressions, take note that running CTest with -L es will match all five tests. To select the tuesday and wednesday tests together, use a single regular expression that matches either of them, like -L "tue|wed".
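The AND behavior of multiple -L options can be sketched with grep, using test2 from the text (which carries the labels tuesday and test): the test is selected only if every expression matches at least one of its labels.

```shell
# labels of test2 from the example in the text
labels="tuesday test"

# -L tuesday -L test: every regex must match at least one label (AND)
selected=yes
for re in tuesday test; do
  # one label per line, then check whether this regex matches any of them
  printf '%s\n' $labels | grep -q "$re" || selected=no
done
echo "test2 selected: $selected"
```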
CTest prints timing summary information for each LABEL and subproject associated with the tests run. The label time summary will not include labels that are mapped to subprojects.
New in version 3.22: Labels added dynamically during test execution are also reported in the timing summary. See Additional Labels.
When the PROCESSORS test property is set, CTest will display a weighted test timing result in label and subproject summaries. The time is reported with sec*proc instead of just sec.
The weighted time summary reported for each label or subproject j is computed as:
Weighted Time Summary for Label/Subproject j = sum(raw_test_time[j,i] * num_processors[j,i], i=1...num_tests[j]) for labels/subprojects j=1...total
where:

  raw_test_time[j,i] = wall-clock time for the i-th test for the j-th label or subproject
  num_processors[j,i] = value of the PROCESSORS test property for the i-th test for the j-th label or subproject
  num_tests[j] = number of tests associated with the j-th label or subproject
  total = total number of labels or subprojects that have at least one test run
Therefore, the weighted time summary for each label or subproject represents the amount of time that CTest gave to run the tests for each label or subproject and gives a good representation of the total expense of the tests for each label or subproject when compared to other labels or subprojects.
For example, if SubprojectA showed 100 sec*proc and SubprojectB showed 10 sec*proc, then CTest allocated approximately 10 times the CPU/core time to run the tests for SubprojectA than for SubprojectB (e.g. so if effort is going to be expended to reduce the cost of the test suite for the whole project, then reducing the cost of the test suite for SubprojectA would likely have a larger impact than effort to reduce the cost of the test suite for SubprojectB).
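As a small numeric sketch (all times and processor counts hypothetical), the weighted time for one label is the sum of each test's wall-clock time multiplied by its PROCESSORS value:

```shell
# Three hypothetical tests under one label:
#   10 s on 2 processors, 20 s on 1 processor, 5 s on 4 processors
total=0
for pair in "10 2" "20 1" "5 4"; do
  set -- $pair                      # $1 = raw_test_time, $2 = num_processors
  total=$((total + $1 * $2))
done
echo "${total} sec*proc"            # 10*2 + 20*1 + 5*4
```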
CTest provides a command-line signature to configure (i.e. run cmake on), build, and/or execute a test:
ctest --build-and-test <path-to-source> <path-to-build> --build-generator <generator> [<options>...] [--build-options <opts>...] [--test-command <command> [<args>...]]
The configure and test steps are optional. The arguments to this command line are the source and binary directories. The --build-generator option must be provided to use --build-and-test. If --test-command is specified then that will be run after the build is complete. Other options that affect this mode include:
If no --build-target is specified, the all target is built.
Skip the cmake step.
Directory where the programs will be located after they have been compiled.
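Putting the pieces together, a build-and-test invocation might look like the following (paths, generator, and test command are placeholders to adapt to your project):

```shell
# configure, build, and run one test executable in a single command
ctest --build-and-test path/to/source path/to/build \
      --build-generator "Unix Makefiles" \
      --build-options -DCMAKE_BUILD_TYPE=Debug \
      --test-command mytest --arg1
```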
CTest can operate as a client for the CDash software quality dashboard application. As a dashboard client, CTest performs a sequence of steps to configure, build, and test software, and then submits the results to a CDash server. The command-line signature used to submit to CDash is:
ctest -D <dashboard> [-- <dashboard-options>...]
ctest -M <model> -T <action> [-- <dashboard-options>...]
ctest -S <script> [-- <dashboard-options>...]
ctest -SP <script> [-- <dashboard-options>...]
Options for Dashboard Client include:
This option tells CTest to act as a CDash client and perform a dashboard test. The <dashboard> is of the form <Mode><Test>, where <Mode> can be Experimental, Nightly, or Continuous, and <Test> can be Start, Update, Configure, Build, Test, Coverage, or Submit.
If <dashboard> is not one of the recognized <Mode><Test> values, this will be treated as a variable definition instead (see the dashboard-options further below).
This option tells CTest to act as a CDash client, where the <model> can be Experimental, Nightly, or Continuous. Combining -M and -T is similar to -D.
This option tells CTest to act as a CDash client and perform some action, such as start, build, or test. See Dashboard Client Steps for the full list of actions. Combining -M and -T is similar to -D.
This option tells CTest to load in a configuration script which sets a number of parameters such as the binary and source directories. Then CTest will do what is required to create and run a dashboard. This option basically sets up a dashboard and then runs ctest -D with the appropriate options.
This option does the same operations as -S but it will do them in a separate process. This is primarily useful in cases where the script may modify the environment and you do not want the modified environment to impact other -S scripts.
The available <dashboard-options> are the following:
Pass in variable values on the command line. Use in conjunction with -S to pass variable values to a dashboard script. Parsing -D arguments as variable values is only attempted if the value following -D does not match any of the known dashboard types.
Submit the dashboard to the specified group instead of the default one. By default, the dashboard is submitted to the Nightly, Experimental, or Continuous group, but by specifying this option, the group can be arbitrary.
This replaces the deprecated option --track. Despite the name change its behavior is unchanged.
This option tells CTest to include a notes file when submitting the dashboard.
This is useful if the build will not finish in one day.
This option will submit extra files to the dashboard.
This option will force CTest to use HTTP 1.0 to submit files to the dashboard, instead of HTTP 1.1.
This flag will turn off automatic compression of test output. Use this to maintain compatibility with an older version of CDash which doesn't support compressed test output.
CTest defines an ordered list of testing steps of which some or all may be run as a dashboard client:
CTest defines three modes of operation as a dashboard client:
CTest can perform testing on an already-generated build tree. Run the ctest command with the current working directory set to the build tree and use one of these signatures:
ctest -D <mode>[<step>]
ctest -M <mode> [-T <step>]...
The <mode> must be one of the above Dashboard Client Modes, and each <step> must be one of the above Dashboard Client Steps.
CTest reads the Dashboard Client Configuration settings from a file in the build tree called either CTestConfiguration.ini or DartConfiguration.tcl (the names are historical). The format of the file is:
# Lines starting in '#' are comments.
# Other non-blank lines are key-value pairs.
<setting>: <value>
where <setting> is the setting name and <value> is the setting value.
In build trees generated by CMake, this configuration file is generated by the CTest module if included by the project. The module uses variables to obtain a value for each setting as documented with the settings below.
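For example, a generated DartConfiguration.tcl typically contains entries like these (values hypothetical):

```
# This file is configured by CMake automatically.
SourceDirectory: /path/to/source
BuildDirectory: /path/to/build
Site: mymachine
BuildName: Linux-g++
TimeOut: 1500
```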
CTest can perform testing driven by a cmake-language(7) script that creates and maintains the source and build tree as well as performing the testing steps. Run the ctest command with the current working directory set outside of any build tree and use one of these signatures:
ctest -S <script>
ctest -SP <script>
The <script> file must call CTest Commands to run testing steps explicitly as documented below. The commands obtain Dashboard Client Configuration settings from their arguments or from variables set in the script.
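A minimal dashboard script, run as ctest -S script.cmake, might look like the following sketch (site name, build name, paths, and generator are placeholders):

```
set(CTEST_SITE "mymachine")
set(CTEST_BUILD_NAME "Linux-g++-Experimental")
set(CTEST_SOURCE_DIRECTORY "/path/to/source")
set(CTEST_BINARY_DIRECTORY "/path/to/build")
set(CTEST_CMAKE_GENERATOR "Unix Makefiles")

ctest_start(Experimental)
ctest_configure()
ctest_build()
ctest_test()
ctest_submit()
```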
The Dashboard Client Steps may be configured by named settings as documented in the following sections.
Start a new dashboard submission to be composed of results recorded by the following steps.
In a CTest Script, the ctest_start() command runs this step. Arguments to the command may specify some of the step settings. The command first runs the command-line specified by the CTEST_CHECKOUT_COMMAND variable, if set, to initialize the source directory.
Configuration settings include:
In a CTest Script, the ctest_update() command runs this step. Arguments to the command may specify some of the step settings.
Configuration settings to specify the version control tool include:
The source tree is updated by git fetch followed by git reset --hard to the FETCH_HEAD. The result is the same as git pull except that any local modifications are overwritten. Use GITUpdateCustom to specify a different approach.
When this variable is set to a non-empty string, CTest will report the value you specified rather than using the update command to discover the current version that is checked out. Use of this variable supersedes UpdateVersionOnly. Like UpdateVersionOnly, using this variable tells CTest not to update the source tree to a different version.
Additional configuration settings include:
In a CTest Script, the ctest_configure() command runs this step. Arguments to the command may specify some of the step settings.
Configuration settings include:
See Label and Subproject Summary.
In a CTest Script, the ctest_build() command runs this step. Arguments to the command may specify some of the step settings.
Configuration settings include:
See Label and Subproject Summary.
In a CTest Script, the ctest_test() command runs this step. Arguments to the command may specify some of the step settings.
Configuration settings include:
See Resource Allocation for more information.
See Label and Subproject Summary.
To report extra test values to CDash, see Additional Test Measurements.
In a CTest Script, the ctest_coverage() command runs this step. Arguments to the command may specify some of the step settings.
Configuration settings include:
These options are the first arguments passed to CoverageCommand.
In a CTest Script, the ctest_memcheck() command runs this step. Arguments to the command may specify some of the step settings.
Configuration settings include:
Additional configuration settings include:
In a CTest Script, the ctest_submit() command runs this step. Arguments to the command may specify some of the step settings.
Configuration settings include:
New in version 3.14.
When the --show-only=json-v1 command line option is given, the test information is output in JSON format. Version 1.0 of the JSON object model is defined as follows:
CTest provides a mechanism for tests to specify the resources that they need in a fine-grained way, and for users to specify the resources available on the running machine. This allows CTest to internally keep track of which resources are in use and which are free, scheduling tests in a way that prevents them from trying to claim resources that are not available.
When the resource allocation feature is used, CTest will not oversubscribe resources. For example, if a resource has 8 slots, CTest will not run tests that collectively use more than 8 slots at a time. This has the effect of limiting how many tests can run at any given time, even if a high -j argument is used, if those tests all use some slots from the same resource. In addition, it means that a single test that uses more of a resource than is available on a machine will not run at all (and will be reported as Not Run).
A common use case for this feature is for tests that require the use of a GPU. Multiple tests can simultaneously allocate memory from a GPU, but if too many tests try to do this at once, some of them will fail to allocate, resulting in a failed test, even though the test would have succeeded if it had the memory it needed. By using the resource allocation feature, each test can specify how much memory it requires from a GPU, allowing CTest to schedule tests in a way that running several of these tests at once does not exhaust the GPU's memory pool.
Please note that CTest has no concept of what a GPU is or how much memory it has. It does not have any way of communicating with a GPU to retrieve this information or perform any memory management, although the project can define a test that provides details about the test machine (see Dynamically-Generated Resource Specification File).
CTest keeps track of a list of abstract resource types, each of which has a certain number of slots available for tests to use. Each test specifies the number of slots that it requires from a certain resource, and CTest then schedules them in a way that prevents the total number of slots in use from exceeding the listed capacity. When a test is executed, and slots from a resource are allocated to that test, tests may assume that they have exclusive use of those slots for the duration of the test's process.
The CTest resource allocation feature consists of at least two inputs:
When CTest runs a test, the resources allocated to that test are passed in the form of a set of environment variables as described below. Using this information to decide which resource to connect to is left to the test writer.
The RESOURCE_GROUPS property tells CTest what resources a test expects to use grouped in a way meaningful to the test. The test itself must read the environment variables to determine which resources have been allocated to each group. For example, each group may correspond to a process the test will spawn when executed.
Note that even if a test specifies a RESOURCE_GROUPS property, it is still possible for that test to run without any resource allocation (and without the corresponding environment variables) if the user does not pass a resource specification file. Passing this file, either through the --resource-spec-file command-line argument or the RESOURCE_SPEC_FILE argument to ctest_test(), is what activates the resource allocation feature. Tests should check the CTEST_RESOURCE_GROUP_COUNT environment variable to find out whether or not resource allocation is activated. This variable will always (and only) be defined if resource allocation is activated. If resource allocation is not activated, then the CTEST_RESOURCE_GROUP_COUNT variable will not exist, even if it exists for the parent ctest process. If a test absolutely must have resource allocation, then it can return a failing exit code or use the SKIP_RETURN_CODE or SKIP_REGULAR_EXPRESSION properties to indicate a skipped test.
The resource specification file is a JSON file which is passed to CTest in one of a number of ways. It can be specified on the command line with the ctest --resource-spec-file option, it can be given using the RESOURCE_SPEC_FILE argument of ctest_test(), or it can be generated dynamically as part of test execution (see Dynamically-Generated Resource Specification File).
If a dashboard script is used and RESOURCE_SPEC_FILE is not specified, the value of CTEST_RESOURCE_SPEC_FILE in the dashboard script is used instead. If --resource-spec-file, RESOURCE_SPEC_FILE, and CTEST_RESOURCE_SPEC_FILE in the dashboard script are not specified, the value of CTEST_RESOURCE_SPEC_FILE in the CMake build is used instead. If none of these are specified, no resource spec file is used.
The resource specification file must be a JSON object. All examples in this document assume the following resource specification file:
{
  "version": {
    "major": 1,
    "minor": 0
  },
  "local": [
    {
      "gpus": [
        { "id": "0", "slots": 2 },
        { "id": "1", "slots": 4 },
        { "id": "2", "slots": 2 },
        { "id": "3" }
      ],
      "crypto_chips": [
        { "id": "card0", "slots": 4 }
      ]
    }
  ]
}
The members are:
Each array element is a JSON object with members whose names are equal to the desired resource types, such as gpus. These names must start with a lowercase letter or an underscore, and subsequent characters can be a lowercase letter, a digit, or an underscore. Uppercase letters are not allowed, because certain platforms have case-insensitive environment variables. See the Environment Variables section below for more information. It is recommended that the resource type name be the plural of a noun, such as gpus or crypto_chips (and not gpu or crypto_chip).
Please note that the names gpus and crypto_chips are just examples, and CTest does not interpret them in any way. You are free to make up any resource type you want to meet your own requirements.
The value for each resource type is a JSON array consisting of JSON objects, each of which describe a specific instance of the specified resource. These objects have the following members:
Identifiers must be unique within a resource type. However, they do not have to be unique across resource types. For example, it is valid to have a gpus resource named 0 and a crypto_chips resource named 0, but not two gpus resources both named 0.
Please note that the IDs 0, 1, 2, 3, and card0 are just examples, and CTest does not interpret them in any way. You are free to make up any IDs you want to meet your own requirements.
In the example file above, there are four GPUs with IDs 0 through 3. GPU 0 has 2 slots, GPU 1 has 4, GPU 2 has 2, and GPU 3 has a default of 1 slot. There is also one cryptography chip with 4 slots.
See RESOURCE_GROUPS for a description of this property.
Once CTest has decided which resources to allocate to a test, it passes this information to the test executable as a series of environment variables. For each example below, we will assume that the test in question has a RESOURCE_GROUPS property of 2,gpus:2;gpus:4,gpus:1,crypto_chips:2.
The following variables are passed to the test process:
This variable will only be defined if ctest(1) has been given a --resource-spec-file, or if ctest_test() has been given a RESOURCE_SPEC_FILE. If no resource specification file has been given, this variable will not be defined.
In this example, group 0 gets 2 slots from GPU 0, group 1 gets 2 slots from GPU 2, and group 2 gets 4 slots from GPU 1, 1 slot from GPU 3, and 2 slots from cryptography chip card0.
<num> is a number from zero to CTEST_RESOURCE_GROUP_COUNT minus one. <resource-type> is the name of a resource type, converted to uppercase. CTEST_RESOURCE_GROUP_<num>_<resource-type> is defined for the product of each <num> in the range listed above and each resource type listed in CTEST_RESOURCE_GROUP_<num>.
Because some platforms have case-insensitive names for environment variables, the names of resource types may not clash in a case-insensitive environment. Because of this, for the sake of simplicity, all resource types must be listed in all lowercase in the resource specification file and in the RESOURCE_GROUPS property, and they are converted to all uppercase in the CTEST_RESOURCE_GROUP_<num>_<resource-type> environment variable.
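A test script can discover its allocation as in the sketch below, shown here against a simulated environment matching the RESOURCE_GROUPS example in the text (in a real run, CTest sets these variables itself and the export lines would be absent):

```shell
# Simulated allocation for the example RESOURCE_GROUPS property
# "2,gpus:2;gpus:4,gpus:1,crypto_chips:2" described in the text.
export CTEST_RESOURCE_GROUP_COUNT=3
export CTEST_RESOURCE_GROUP_0=gpus
export CTEST_RESOURCE_GROUP_0_GPUS="id:0,slots:2"
export CTEST_RESOURCE_GROUP_1=gpus
export CTEST_RESOURCE_GROUP_1_GPUS="id:2,slots:2"
export CTEST_RESOURCE_GROUP_2=crypto_chips,gpus
export CTEST_RESOURCE_GROUP_2_CRYPTO_CHIPS="id:card0,slots:2"
export CTEST_RESOURCE_GROUP_2_GPUS="id:1,slots:4;id:3,slots:1"

# A test script reads its groups like this:
if [ -n "${CTEST_RESOURCE_GROUP_COUNT:-}" ]; then
  i=0
  while [ "$i" -lt "$CTEST_RESOURCE_GROUP_COUNT" ]; do
    types=$(eval echo "\$CTEST_RESOURCE_GROUP_$i")
    echo "group $i resource types: $types"
    i=$((i + 1))
  done
else
  echo "resource allocation not active"
fi
```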
New in version 3.28.
A project may optionally specify a single test which will be used to dynamically generate the resource specification file that CTest will use for scheduling tests that use resources. The test that generates the file must have the GENERATED_RESOURCE_SPEC_FILE property set, and must have exactly one fixture in its FIXTURES_SETUP property. This fixture is considered by CTest to have special meaning: it's the fixture that generates the resource spec file. The fixture may have any name. If such a fixture exists, all tests that have RESOURCE_GROUPS set must have the fixture in their FIXTURES_REQUIRED, and a resource spec file may not be specified with the --resource-spec-file argument or the CTEST_RESOURCE_SPEC_FILE variable.
The following resources are available to get help using CMake:
The primary starting point for learning about CMake.
Links to available documentation and community resources may be found on this web page.
The Discourse Forum hosts discussion and questions about CMake.
Copyright 2000-2024 Kitware, Inc. and Contributors
April 15, 2024 | 3.28.3