Commit 3b8788d

Add support for skipping testitems (#117)
* Initial support for skipping testitems
* Add some tests
* Simplify module building
* Test skipped testitems have empty stats
* WIP integration tests for skipping testitems
* more tests
* more tests 2
* docs
* Test JUnit report for skipped test-items
* cleanup
* Fixup block expr test on v1.10
* Update README.md
  Co-authored-by: Nathan Daly <NHDaly@gmail.com>
* Update src/macros.jl
  Co-authored-by: Nathan Daly <NHDaly@gmail.com>
* Remove unused file
* Fix and test log alignment
* Print SKIP in warning color
* Emphasise difference between `skip` and filtering `runtests`
* fixup! Emphasise difference between `skip` and filtering `runtests`
* Bump version
* fixup! Fix and test log alignment

---------

Co-authored-by: Nathan Daly <NHDaly@gmail.com>
1 parent 597f7df commit 3b8788d

File tree

12 files changed: +455 -48 lines changed

Project.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 name = "ReTestItems"
 uuid = "817f1d60-ba6b-4fd5-9520-3cf149f6a823"
-version = "1.22.0"
+version = "1.23.0"
 
 [deps]
 Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"

README.md

Lines changed: 47 additions & 5 deletions
@@ -60,7 +60,17 @@ julia> runtests(
 )
 ```
 
-You can use the `name` keyword, to select test-items by name.
+For interactive sessions, all logs from the tests will be printed out in the REPL by default.
+You can disable this by passing `logs=:issues` in which case logs from a test-item are only printed if that test-items errors or fails.
+`logs=:issues` is also the default for non-interactive sessions.
+
+```julia
+julia> runtests("test/Database/"; logs=:issues)
+```
+
+#### Filtering tests
+
+You can use the `name` keyword to select test-items by name.
 Pass a string to select a test-item by its exact name,
 or pass a regular expression (regex) to match multiple test-item names.
 
@@ -70,12 +80,19 @@ julia> runtests("test/Database/"; name="issue-123")
 julia> runtests("test/Database/"; name=r"^issue")
 ```
 
-For interactive sessions, all logs from the tests will be printed out in the REPL by default.
-You can disable this by passing `logs=:issues` in which case logs from a test-item are only printed if that test-items errors or fails.
-`logs=:issues` is also the default for non-interactive sessions.
+You can pass `tags` to select test-items by tag.
+When passing multiple tags a test-item is only run if it has all the requested tags.
 
 ```julia
-julia> runtests("test/Database/"; logs=:issues)
+# Run tests that are tagged as both `regression` and `fast`
+julia> runtests("test/Database/"; tags=[:regression, :fast])
+```
+
+Filtering by `name` and `tags` can be combined to run only test-items that match both the name and tags.
+
+```julia
+# Run tests named `issue*` which also have tag `regression`.
+julia> runtests("test/Database/"; tags=:regression, name=r"^issue")
 ```
 
 ## Writing tests
@@ -130,6 +147,31 @@ end
 The `setup` is run once on each worker process that requires it;
 it is not run before every `@testitem` that depends on the setup.
 
+#### Skipping tests
+
+The `skip` keyword can be used to skip a `@testitem`, meaning no code inside that test-item will run.
+A skipped test-item logs that it is being skipped and records a single "skipped" test result, similar to `@test_skip`.
+
+```julia
+@testitem "skipped" skip=true begin
+    @test false
+end
+```
+
+If `skip` is given as an `Expr`, it must return a `Bool` indicating whether or not to skip the test-item.
+This expression will be run in a new module similar to a test-item immediately before the test-item would be run.
+
+```julia
+# Don't run "orc v1" tests if we don't have orc v1
+@testitem "orc v1" skip=:(using LLVM; !LLVM.has_orc_v1()) begin
+    # tests
+end
+```
+
+The `skip` keyword allows you to define the condition under which a test needs to be skipped,
+for example if it can only be run on a certain platform.
+See [filtering tests](#filtering-tests) for controlling which tests run in a particular `runtests` call.
+
 #### Post-testitem hook
 
 If there is something that should be checked after every single `@testitem`, then it's possible to pass an expression to `runtests` using the `test_end_expr` keyword.
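The new README section mentions platform-conditional skipping without showing a concrete condition. A minimal standalone sketch of the kinds of `Expr` one might pass to `skip` (the variable names are hypothetical; plain `eval` stands in here for the fresh-module evaluation ReTestItems performs):

```julia
# Hypothetical skip conditions - each is an Expr that must evaluate to a Bool.
skip_on_old_julia = :(VERSION < v"1.9")
skip_off_linux    = :(!Sys.islinux())

# ReTestItems evaluates the expression just before the test-item would run;
# plain `eval` suffices to show each condition yields a Bool:
@assert eval(skip_on_old_julia) isa Bool
@assert eval(skip_off_linux) isa Bool
```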

src/ReTestItems.jl

Lines changed: 37 additions & 0 deletions
@@ -861,6 +861,40 @@ end
 const GLOBAL_TEST_CONTEXT_FOR_TESTING = TestContext("ReTestItems", 0)
 const GLOBAL_TEST_SETUPS_FOR_TESTING = Dict{Symbol, TestSetup}()
 
+# Check the `skip` keyword, and return a `Bool` indicating if we should skip the testitem.
+# If `skip` is an expression, run it in a new module just like how we run testitems.
+# If the `skip` expression doesn't return a Bool, throw an informative error.
+function should_skip(ti::TestItem)
+    ti.skip isa Bool && return ti.skip
+    # `skip` is an expression.
+    # Give same scope as testitem body, e.g. imports should work.
+    skip_body = deepcopy(ti.skip::Expr)
+    softscope_all!(skip_body)
+    # Run in a new module to not pollute `Main`.
+    # Need to store the result of the `skip` expression so we can check it.
+    mod_name = gensym(Symbol(:skip_, ti.name))
+    skip_var = gensym(:skip)
+    skip_mod_expr = :(module $mod_name; $skip_var = $skip_body; end)
+    skip_mod = Core.eval(Main, skip_mod_expr)
+    # Check what the expression evaluated to.
+    skip = getfield(skip_mod, skip_var)
+    !isa(skip, Bool) && _throw_not_bool(ti, skip)
+    return skip::Bool
+end
+_throw_not_bool(ti, skip) = error("Test item $(repr(ti.name)) `skip` keyword must be a `Bool`, got `skip=$(repr(skip))`")
+
+# Log that we skipped the testitem, and record a "skipped" test result with empty stats.
+function skiptestitem(ti::TestItem, ctx::TestContext; verbose_results::Bool=true)
+    ts = DefaultTestSet(ti.name; verbose=verbose_results)
+    Test.record(ts, Test.Broken(:skipped, ti.name))
+    push!(ti.testsets, ts)
+    stats = PerfStats()
+    push!(ti.stats, stats)
+    log_testitem_skipped(ti, ctx.ntestitems)
+    return TestItemResult(ts, stats)
+end
+
+
 # assumes any required setups were expanded outside of a runtests context
 function runtestitem(ti::TestItem; kw...)
     # make a fresh TestSetupModules for each testitem run
@@ -879,6 +913,9 @@ function runtestitem(
     ti::TestItem, ctx::TestContext;
     test_end_expr::Expr=Expr(:block), logs::Symbol=:eager, verbose_results::Bool=true, finish_test::Bool=true,
 )
+    if should_skip(ti)::Bool
+        return skiptestitem(ti, ctx; verbose_results)
+    end
     name = ti.name
     log_testitem_start(ti, ctx.ntestitems)
     ts = DefaultTestSet(name; verbose=verbose_results)
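The `should_skip` logic above can be exercised standalone. A simplified sketch (the helper name `eval_skip_expr` is hypothetical, and it omits `softscope_all!` and the `TestItem` plumbing):

```julia
# Evaluate a `skip` value the way `should_skip` does: Bools pass through;
# Exprs are evaluated in a fresh, gensym-named module so they don't pollute Main.
function eval_skip_expr(skip::Union{Bool,Expr}, name::AbstractString)
    skip isa Bool && return skip
    mod_name = gensym(Symbol(:skip_, name))
    skip_var = gensym(:skip)
    # Store the result in a variable so we can fetch and type-check it.
    mod = Core.eval(Main, :(module $mod_name; $skip_var = $skip; end))
    result = getfield(mod, skip_var)
    result isa Bool || error("`skip` must return a `Bool`, got `skip=$(repr(result))`")
    return result::Bool
end

eval_skip_expr(:(1 + 1 == 2), "demo")  # true
```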

src/log_capture.jl

Lines changed: 28 additions & 16 deletions
@@ -55,7 +55,7 @@ function _print_scaled_one_dec(io, value, scale, label="")
     end
     print(io, label)
 end
-function time_print(io; elapsedtime, bytes=0, gctime=0, allocs=0, compile_time=0, recompile_time=0)
+function print_time(io; elapsedtime, bytes=0, gctime=0, allocs=0, compile_time=0, recompile_time=0)
     _print_scaled_one_dec(io, elapsedtime, 1e9, " secs")
     if gctime > 0 || compile_time > 0
         print(io, " (")
@@ -241,35 +241,47 @@ function _print_test_errors(report_iob, ts::DefaultTestSet, worker_info)
     return nothing
 end
 
-# Marks the start of each test item
-function log_testitem_start(ti::TestItem, ntestitems=0)
-    io = IOContext(IOBuffer(), :color => get(DEFAULT_STDOUT[], :color, false)::Bool)
+function print_state(io, state, ti, ntestitems; color=:default)
     interactive = parse(Bool, get(ENV, "RETESTITEMS_INTERACTIVE", string(Base.isinteractive())))
     print(io, format(now(), "HH:MM:SS | "))
     !interactive && print(io, _mem_watermark())
-    printstyled(io, "START"; bold=true)
     if ntestitems > 0
+        # rpad/lpad so that the eval numbers are all vertically aligned
+        printstyled(io, rpad(uppercase(state), 5); bold=true, color)
         print(io, " (", lpad(ti.eval_number[], ndigits(ntestitems)), "/", ntestitems, ")")
+    else
+        printstyled(io, uppercase(state); bold=true)
     end
-    print(io, " test item $(repr(ti.name)) at ")
+    print(io, " test item $(repr(ti.name)) ")
+end
+
+function print_file_info(io, ti)
+    print(io, "at ")
     printstyled(io, _file_info(ti); bold=true, color=:default)
+end
+
+function log_testitem_skipped(ti::TestItem, ntestitems=0)
+    io = IOContext(IOBuffer(), :color => get(DEFAULT_STDOUT[], :color, false)::Bool)
+    print_state(io, "SKIP", ti, ntestitems; color=Base.warn_color())
+    print_file_info(io, ti)
+    println(io)
+    write(DEFAULT_STDOUT[], take!(io.io))
+end
+
+# Marks the start of each test item
+function log_testitem_start(ti::TestItem, ntestitems=0)
+    io = IOContext(IOBuffer(), :color => get(DEFAULT_STDOUT[], :color, false)::Bool)
+    print_state(io, "START", ti, ntestitems)
+    print_file_info(io, ti)
     println(io)
     write(DEFAULT_STDOUT[], take!(io.io))
 end
 
-# mostly copied from timing.jl
 function log_testitem_done(ti::TestItem, ntestitems=0)
     io = IOContext(IOBuffer(), :color => get(DEFAULT_STDOUT[], :color, false)::Bool)
-    interactive = parse(Bool, get(ENV, "RETESTITEMS_INTERACTIVE", string(Base.isinteractive())))
-    print(io, format(now(), "HH:MM:SS | "))
-    !interactive && print(io, _mem_watermark())
-    printstyled(io, "DONE "; bold=true)
-    if ntestitems > 0
-        print(io, " (", lpad(ti.eval_number[], ndigits(ntestitems)), "/", ntestitems, ")")
-    end
-    print(io, " test item $(repr(ti.name)) ")
+    print_state(io, "DONE", ti, ntestitems)
     x = last(ti.stats) # always print stats for most recent run
-    time_print(io; x.elapsedtime, x.bytes, x.gctime, x.allocs, x.compile_time, x.recompile_time)
+    print_time(io; x.elapsedtime, x.bytes, x.gctime, x.allocs, x.compile_time, x.recompile_time)
     println(io)
     write(DEFAULT_STDOUT[], take!(io.io))
 end
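The `rpad(uppercase(state), 5)` in the diff above is what keeps the `(i/n)` counters vertically aligned across `START`/`DONE`/`SKIP` lines. A standalone sketch of the trick (the helper name is illustrative):

```julia
# Pad every state label to the width of the longest ("START": 5 chars) so the
# eval counter starts in the same column on every log line.
function state_prefix(state::AbstractString, i::Integer, n::Integer)
    return string(rpad(uppercase(state), 5), " (", lpad(i, ndigits(n)), "/", n, ")")
end

lines = [state_prefix(s, 1, 6) for s in ("start", "done", "skip")]
# The "(" of the counter sits at the same column in each line:
@assert allequal(findfirst('(', l) for l in lines)
```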

src/macros.jl

Lines changed: 23 additions & 5 deletions
@@ -120,6 +120,7 @@ struct TestItem
     setups::Vector{Symbol}
     retries::Int
     timeout::Union{Int,Nothing} # in seconds
+    skip::Union{Bool,Expr}
     file::String
     line::Int
     project_root::String
@@ -131,10 +132,10 @@ struct TestItem
     stats::Vector{PerfStats} # populated when the test item is finished running
     scheduled_for_evaluation::ScheduledForEvaluation # to keep track of whether the test item has been scheduled for evaluation
 end
-function TestItem(number, name, id, tags, default_imports, setups, retries, timeout, file, line, project_root, code)
+function TestItem(number, name, id, tags, default_imports, setups, retries, timeout, skip, file, line, project_root, code)
     _id = @something(id, repr(hash(name, hash(relpath(file, project_root)))))
     return TestItem(
-        number, name, _id, tags, default_imports, setups, retries, timeout, file, line, project_root, code,
+        number, name, _id, tags, default_imports, setups, retries, timeout, skip, file, line, project_root, code,
         TestSetup[],
         Ref{Int}(0),
         DefaultTestSet[],
@@ -145,7 +146,7 @@ function TestItem(number, name, id, tags, default_imports, setups, retries, time
 end
 
 """
-    @testitem "name" [tags=[] setup=[] retries=0 default_imports=true] begin
+    @testitem "name" [tags=[] setup=[] retries=0 skip=false default_imports=true] begin
         # code that will be run as tests
     end
 
@@ -228,13 +229,26 @@ Note that `timeout` currently only works when tests are run with multiple worker
     @testitem "Sometimes too slow" timeout=10 begin
         @test sleep(rand(1:100))
     end
+
+If a `@testitem` needs to be skipped, then you can set the `skip` keyword.
+Either pass `skip=true` to unconditionally skip the test item, or pass `skip` an
+expression that returns a `Bool` to determine if the testitem should be skipped.
+
+    @testitem "Skip on old Julia" skip=(VERSION < v"1.9") begin
+        v = [1]
+        @test 0 == @allocations sum(v)
+    end
+
+The `skip` expression is run in its own module, just like a test-item.
+No code inside a `@testitem` is run when a test-item is skipped.
 """
 macro testitem(nm, exs...)
     default_imports = true
     retries = 0
     timeout = nothing
     tags = Symbol[]
     setup = Any[]
+    skip = false
     _id = nothing
     _run = true # useful for testing `@testitem` itself
     _source = QuoteNode(__source__)
@@ -257,12 +271,16 @@ macro testitem(nm, exs...)
             setup = map(Symbol, setup.args)
         elseif kw == :retries
             retries = ex.args[2]
-            @assert retries isa Integer "`default_imports` keyword must be passed an `Integer`"
+            @assert retries isa Integer "`retries` keyword must be passed an `Integer`"
         elseif kw == :timeout
            t = ex.args[2]
            @assert t isa Real "`timeout` keyword must be passed a `Real`"
            @assert t > 0 "`timeout` keyword must be passed a positive number. Got `timeout=$t`"
            timeout = ceil(Int, t)
+        elseif kw == :skip
+            skip = ex.args[2]
+            # If the `Expr` doesn't evaluate to a Bool, throws at runtime.
+            @assert skip isa Union{Bool,Expr} "`skip` keyword must be passed a `Bool`"
         elseif kw == :_id
             _id = ex.args[2]
             # This will always be written to the JUnit XML as a String, require the user
@@ -287,7 +305,7 @@ macro testitem(nm, exs...)
     ti = gensym(:ti)
     esc(quote
         let $ti = $TestItem(
-            $Ref(0), $nm, $_id, $tags, $default_imports, $setup, $retries, $timeout,
+            $Ref(0), $nm, $_id, $tags, $default_imports, $setup, $retries, $timeout, $skip,
             $String($_source.file), $_source.line,
             $gettls(:__RE_TEST_PROJECT__, "."),
             $q,
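The macro now accepts `skip` as either a literal `Bool` or an unevaluated `Expr`, which is why the `@assert skip isa Union{Bool,Expr}` above cannot fully validate it at expansion time. A standalone sketch of what the macro actually receives (variable names are illustrative):

```julia
# Keyword-style macro arguments arrive as Expr(:(=), name, value).
# `skip=true` carries a literal Bool; `skip=(VERSION < v"1.9")` carries an
# Expr whose Bool-ness can only be checked when it is evaluated at runtime.
literal     = :(skip = true)
conditional = :(skip = (VERSION < v"1.9"))

kw1, val1 = literal.args
kw2, val2 = conditional.args

@assert kw1 == kw2 == :skip
@assert val1 isa Bool  # validated at macro-expansion time
@assert val2 isa Expr  # deferred to runtime evaluation
```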

test/integrationtests.jl

Lines changed: 27 additions & 0 deletions
@@ -1032,4 +1032,31 @@ end
     @test_throws expected_err runtests(file; nworkers=1, memory_threshold=xx)
 end
 
+@testset "skipping testitems" begin
+    # Test report printing has test items as "skipped" (which appear under "Broken")
+    using IOCapture
+    file = joinpath(TEST_FILES_DIR, "_skip_tests.jl")
+    results = encased_testset(()->runtests(file; nworkers=1))
+    c = IOCapture.capture() do
+        Test.print_test_results(results)
+    end
+    @test contains(
+        c.output,
+        r"""
+        Test Summary: \s* \| Pass Fail Broken Total Time
+        ReTestItems \s* \| 4 1 3 8 \s*\d*.\ds
+        """
+    )
+end
+
+@testset "logs are aligned" begin
+    file = joinpath(TEST_FILES_DIR, "_skip_tests.jl")
+    c1 = IOCapture.capture() do
+        encased_testset(()->runtests(file))
+    end
+    @test contains(c1.output, r"START \(1/6\) test item \"no skip, 1 pass\"")
+    @test contains(c1.output, r"DONE \(1/6\) test item \"no skip, 1 pass\"")
+    @test contains(c1.output, r"SKIP \(3/6\) test item \"skip true\"")
+end
+
 end # integrationtests.jl testset
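The integration test above matches the printed summary with a triple-quoted regex. A standalone sketch of `contains` with such a pattern (the summary text below is fabricated for illustration; `\s*` absorbs the variable column padding):

```julia
# Fabricated report output, shaped like a `Test.print_test_results` table.
output = """
Test Summary: | Pass  Fail  Broken  Total  Time
ReTestItems   |    4     1       3      8  0.8s
"""

# `\s*` between the counts tolerates whatever padding the real table uses.
pat = r"ReTestItems\s*\|\s*4\s*1\s*3\s*8"
@assert contains(output, pat)
```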

test/internals.jl

Lines changed: 43 additions & 1 deletion
@@ -169,7 +169,7 @@ end # `include_testfiles!` testset
 @testset "report_empty_testsets" begin
     using ReTestItems: TestItem, report_empty_testsets, PerfStats, ScheduledForEvaluation
     using Test: DefaultTestSet, Fail, Error
-    ti = TestItem(Ref(42), "Dummy TestItem", "DummyID", [], false, [], 0, nothing, "source/path", 42, ".", nothing)
+    ti = TestItem(Ref(42), "Dummy TestItem", "DummyID", [], false, [], 0, nothing, false, "source/path", 42, ".", nothing)
 
     ts = DefaultTestSet("Empty testset")
     report_empty_testsets(ti, ts)
@@ -281,4 +281,46 @@ end
     @test_throws ArgumentError("\"$nontest_file\" is not a test file") _validated_paths((nontest_file,), true)
 end
 
+@testset "skiptestitem" begin
+    # Test that `skiptestitem` unconditionally skips a testitem
+    # and returns `TestItemResult` with a single "skipped" `Test.Result`
+    ti = @testitem "skip" _run=false begin
+        @test true
+        @test false
+        @test error()
+    end
+    ctx = ReTestItems.TestContext("test_ctx", 1)
+    ti_res = ReTestItems.skiptestitem(ti, ctx)
+    @test ti_res isa TestItemResult
+    test_res = only(ti_res.testset.results)
+    @test test_res isa Test.Result
+    @test test_res isa Test.Broken
+    @test test_res.test_type == :skipped
+end
+
+@testset "should_skip" begin
+    should_skip = ReTestItems.should_skip
+
+    ti = @testitem("x", skip=true, _run=false, begin end)
+    @test should_skip(ti)
+    ti = @testitem("x", skip=false, _run=false, begin end)
+    @test !should_skip(ti)
+
+    ti = @testitem("x", skip=:(1 == 1), _run=false, begin end)
+    @test should_skip(ti)
+    ti = @testitem("x", skip=:(1 != 1), _run=false, begin end)
+    @test !should_skip(ti)
+
+    ti = @testitem("x", skip=:(x = 1; x + x == 2), _run=false, begin end)
+    @test should_skip(ti)
+    ti = @testitem("x", skip=:(x = 1; x + x != 2), _run=false, begin end)
+    @test !should_skip(ti)
+
+    ti = @testitem("x", skip=:(x = 1; x + x), _run=false, begin end)
+    @test_throws "Test item \"x\" `skip` keyword must be a `Bool`, got `skip=2`" should_skip(ti)
+
+    ti = @testitem("x", skip=:(x = 1; x + y), _run=false, begin end)
+    @test_throws UndefVarError(:y) should_skip(ti)
+end
+
 end # internals.jl testset
