Refactor files in test/ so they can be run independently #2279

Merged 1 commit on Jul 15, 2020
8 changes: 1 addition & 7 deletions Project.toml
@@ -22,16 +22,10 @@ ForwardDiff = "~0.5.0, ~0.6, ~0.7, ~0.8, ~0.9, ~0.10"
MathOptInterface = "~0.9.11"
MutableArithmetics = "0.2"
NaNMath = "0.3"
OffsetArrays = "≥ 0.2.13"
julia = "1"

[extras]
DualNumbers = "fa6b7ba4-c1ee-5f82-b5fc-ecf0adba8f74"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
OffsetArrays = "6fe1bfb0-de20-5000-8ca7-80f57d26f881"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["OffsetArrays", "LinearAlgebra", "DualNumbers", "Random", "SparseArrays", "Test"]
test = ["Test"]
17 changes: 8 additions & 9 deletions test/Containers/Containers.jl
@@ -1,13 +1,12 @@
using Test
using JuMP
using JuMP.Containers

@testset "Containers" begin
include("DenseAxisArray.jl")
include("SparseAxisArray.jl")
include("generate_container.jl")
include("vectorized_product_iterator.jl")
include("nested_iterator.jl")
include("no_duplicate_dict.jl")
include("macro.jl")
@testset "$(file)" for file in filter(f -> endswith(f, ".jl"), readdir(@__DIR__))
if file in [
"Containers.jl",
]
continue
end
include(joinpath(@__DIR__, file))
end
end
3 changes: 3 additions & 0 deletions test/Containers/DenseAxisArray.jl
@@ -1,3 +1,6 @@
using JuMP.Containers
using Test

@testset "DenseAxisArray" begin
@testset "undef constructor" begin
A = @inferred DenseAxisArray{Int}(undef, [:a, :b], 1:2)
3 changes: 3 additions & 0 deletions test/Containers/SparseAxisArray.jl
@@ -1,3 +1,6 @@
using JuMP.Containers
using Test

@testset "SparseAxisArray" begin
function sparse_test(d, sum_d, d2, d3, dsqr, d_bads)
sqr(x) = x^2
2 changes: 1 addition & 1 deletion test/Containers/generate_container.jl
@@ -3,9 +3,9 @@
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

using Test
using JuMP
using JuMP.Containers
using Test

macro dummycontainer(expr, requestedtype)
name = gensym()
2 changes: 1 addition & 1 deletion test/Containers/macro.jl
@@ -1,6 +1,6 @@
using Test
using JuMP
using JuMP.Containers
using Test

@testset "Macro" begin
@testset "Array" begin
3 changes: 3 additions & 0 deletions test/Containers/nested_iterator.jl
@@ -1,3 +1,6 @@
using JuMP.Containers
using Test

@testset "Nested Iterator" begin
iterators = (() -> 1:3, i -> 1:i)
condition(i, j) = j > i
3 changes: 3 additions & 0 deletions test/Containers/no_duplicate_dict.jl
@@ -1,3 +1,6 @@
using JuMP.Containers
using Test

@testset "Iterator with constant eltype" begin
f(ij) = ij => sum(ij)
g = Base.Generator(f, Iterators.product(1:2, 1:2))
3 changes: 3 additions & 0 deletions test/Containers/vectorized_product_iterator.jl
@@ -1,3 +1,6 @@
using JuMP.Containers
using Test

@testset "Vectorized Product Iterator" begin
I = [1 2
3 4]
5 changes: 2 additions & 3 deletions test/JuMPExtension.jl
@@ -1,12 +1,11 @@
module JuMPExtension

# Simple example of JuMP extension used in the tests to check that JuMP works well with extensions
# The main difference between `JuMP.Model` and `JuMPExtension.MyModel` is the fact that in `add_variable` (resp. `add_constraint`),
# `JuMP.Model` applies the modification to its `moi_backend` field while
# `JuMPExtension.MyModel` stores the `AbstractVariable` (resp. `AbstractConstraint`) in a list.

using MathOptInterface
const MOI = MathOptInterface
import JuMP
using JuMP

struct ConstraintIndex
value::Int # Index in `model.constraints`
10 changes: 10 additions & 0 deletions test/constraint.jl
@@ -1,3 +1,13 @@
using JuMP
using LinearAlgebra
using Test

include(joinpath(@__DIR__, "utilities.jl"))

@static if !(:JuMPExtension in names(Main))
Review comment (Member):
Why do we do this unusual check for JuMPExtension.jl but not utilities.jl?

@odow (Member, Author) replied, Jul 15, 2020:
Utilities just loads some functions, whereas JuMPExtension is a module, so we get annoying "WARNING: module JuMPExtension has been redefined" warnings.

The other PRs fix this by making each file a module, removing the need for the check. The module approach also prevents state leaking between files, and allows us to programmatically determine the tests to run within a particular file.
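For anyone skimming this thread, a rough sketch of the per-file module pattern described above (the module and function names here are made up for illustration, not taken from the follow-up PRs):

module TestConstraintSketch

using JuMP
using Test

function test_constraint_name_roundtrip()
    model = Model()
    @variable(model, x)
    c = @constraint(model, my_con, 2x <= 1)
    @test name(c) == "my_con"
end

function runtests()
    # Discover and run every function in this module whose name starts with "test_".
    for test_name in names(@__MODULE__; all = true)
        if startswith(string(test_name), "test_")
            @testset "$(test_name)" begin
                getfield(@__MODULE__, test_name)()
            end
        end
    end
end

end  # module TestConstraintSketch

TestConstraintSketch.runtests()

Keeping everything inside the module stops state from leaking between test files, and the runner can discover the test functions programmatically.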

include(joinpath(@__DIR__, "JuMPExtension.jl"))
end

function test_constraint_name(constraint, name, F::Type, S::Type)
@test name == @inferred JuMP.name(constraint)
model = constraint.model
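As context for the guard used above, here is a minimal standalone illustration (assumed to run as a top-level script, with made-up module names) of why checking names(Main) works: any module defined at the top level of Main, for example by include-ing test/JuMPExtension.jl, shows up in names(Main), so later files can detect it and skip the second include that would trigger the redefinition warning.

using Test

module FakeExtension end   # stands in for the module defined in JuMPExtension.jl

@test :FakeExtension in names(Main)     # already defined, so a re-include can be skipped
@test !(:NeverDefined in names(Main))   # not defined, so a real test file would include it

@static if !(:FakeExtension in names(Main))
    error("not reached; a real test file would include JuMPExtension.jl here")
end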
47 changes: 24 additions & 23 deletions test/derivatives.jl
@@ -1,8 +1,11 @@
using JuMP
using JuMP._Derivatives
using LinearAlgebra
using Test
using MathOptInterface

struct ΦEvaluator <: MathOptInterface.AbstractNLPEvaluator
const ForwardDiff = JuMP.ForwardDiff

struct ΦEvaluator <: MOI.AbstractNLPEvaluator
end

@testset "Derivatives" begin
@@ -282,18 +285,18 @@ test_linearity(:(1/ifelse(x[1] < 1, x[1],0)), NONLINEAR, Set([(1,1)]))
#Φ(x,y) = 1/3(y)^3 - 2x^2
# c(x) = cos(x)

function MathOptInterface.eval_objective(::ΦEvaluator,x)
function MOI.eval_objective(::ΦEvaluator,x)
@assert length(x) == 2
return (1/3)*x[2]^3-2x[1]^2
end
function MathOptInterface.eval_objective_gradient(::ΦEvaluator,grad,x)
function MOI.eval_objective_gradient(::ΦEvaluator,grad,x)
grad[1] = -4x[1]
grad[2] = x[2]^2
end
r = _Derivatives.UserOperatorRegistry()
register_multivariate_operator!(r,:Φ,ΦEvaluator())
register_univariate_operator!(r,:c,cos,x->-sin(x),x->-cos(x))
Φ(x,y) = MathOptInterface.eval_objective(ΦEvaluator(),[x,y])
Φ(x,y) = MOI.eval_objective(ΦEvaluator(),[x,y])
ex = :(Φ(x[2],x[1]-1)*c(x[3]))
nd,const_values = expr_to_nodedata(ex,r)
@test _Derivatives.has_user_multivariate_operators(nd)
@@ -313,11 +316,6 @@ reverse_extract(grad,reverse_storage,nd,adj,[],1.0)
true_grad = [cos(x[3])*(x[1]-1)^2, -4cos(x[3])*x[2], -sin(x[3])*Φ(x[2],x[1]-1)]
@test isapprox(grad,true_grad)



using DualNumbers
using ForwardDiff

# dual forward test
function dualforward(ex, x; ignore_nan=false)
nd,const_values = expr_to_nodedata(ex)
@@ -345,31 +343,34 @@ function dualforward(ex, x; ignore_nan=false)
@test isapprox(fval_ϵ[1], dot(grad,ones(length(x))))

# compare with running dual numbers
forward_dual_storage = zeros(DualNumbers.Dual{Float64},length(nd))
partials_dual_storage = zeros(DualNumbers.Dual{Float64},length(nd))
output_dual_storage = zeros(DualNumbers.Dual{Float64},length(x))
reverse_dual_storage = zeros(DualNumbers.Dual{Float64},length(nd))
x_dual = [DualNumbers.Dual(x[i],1.0) for i in 1:length(x)]
_epsilon(x::ForwardDiff.Dual{Nothing, Float64, 1}) = x.partials[1]

forward_dual_storage = zeros(ForwardDiff.Dual{Nothing, Float64, 1},length(nd))
Review comment (Member):
Great, it was always complaining that I didn't have DualNumbers because I was not running the tests with ] test.

partials_dual_storage = zeros(ForwardDiff.Dual{Nothing, Float64, 1},length(nd))
output_dual_storage = zeros(ForwardDiff.Dual{Nothing, Float64, 1},length(x))
reverse_dual_storage = zeros(ForwardDiff.Dual{Nothing, Float64, 1},length(nd))

x_dual = [ForwardDiff.Dual(x[i],1.0) for i in 1:length(x)]
fval = forward_eval(forward_dual_storage, partials_dual_storage, nd, adj,
const_values, [], x_dual, [], [], [], NO_USER_OPS)
reverse_eval(reverse_dual_storage,partials_dual_storage,nd,adj)
reverse_extract(output_dual_storage,reverse_dual_storage,nd,adj,[],DualNumbers.Dual(2.0))
reverse_extract(output_dual_storage,reverse_dual_storage,nd,adj,[],ForwardDiff.Dual(2.0, 0.0))
for k in 1:length(nd)
@test isapprox(epsilon(forward_dual_storage[k]), forward_storage_ϵ[k][1])
if !(isnan(epsilon(partials_dual_storage[k])) && ignore_nan)
@test isapprox(epsilon(partials_dual_storage[k]), partials_storage_ϵ[k][1])
@test isapprox(_epsilon(forward_dual_storage[k]), forward_storage_ϵ[k][1])
if !(isnan(_epsilon(partials_dual_storage[k])) && ignore_nan)
@test isapprox(_epsilon(partials_dual_storage[k]), partials_storage_ϵ[k][1])
else
@test !isnan(forward_storage_ϵ[k][1])
end
if !(isnan(epsilon(reverse_dual_storage[k])) && ignore_nan)
@test isapprox(epsilon(reverse_dual_storage[k]), reverse_storage_ϵ[k][1]/2)
if !(isnan(_epsilon(reverse_dual_storage[k])) && ignore_nan)
@test isapprox(_epsilon(reverse_dual_storage[k]), reverse_storage_ϵ[k][1]/2)
else
@test !isnan(reverse_storage_ϵ[k][1])
end
end
for k in 1:length(x)
if !(isnan(epsilon(output_dual_storage[k])) && ignore_nan)
@test isapprox(epsilon(output_dual_storage[k]), output_ϵ[k][1])
if !(isnan(_epsilon(output_dual_storage[k])) && ignore_nan)
@test isapprox(_epsilon(output_dual_storage[k]), output_ϵ[k][1])
else
@test !isnan(output_ϵ[k][1])
end
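A short aside on the DualNumbers-to-ForwardDiff swap in the hunk above: both dual types carry a value plus a single perturbation, so the one-line _epsilon helper recovers the same quantity that DualNumbers.epsilon used to. A standalone sketch (not part of the diff):

import ForwardDiff

# Same helper as in the diff: extract the single partial of an untagged dual.
_epsilon(x::ForwardDiff.Dual{Nothing,Float64,1}) = x.partials[1]

d = ForwardDiff.Dual(3.0, 1.0)                        # value 3.0, perturbation seed 1.0
@assert ForwardDiff.value(d) == 3.0
@assert _epsilon(d * d) == 6.0                        # derivative of x^2 at x = 3
@assert _epsilon(ForwardDiff.Dual(2.0, 0.0)) == 0.0   # constant seed, as in the reverse_extract call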
13 changes: 8 additions & 5 deletions test/derivatives_coloring.jl
@@ -1,10 +1,13 @@
using Test

import JuMP._Derivatives.Coloring: acyclic_coloring, recovery_preprocess,
reverse_topological_sort_by_dfs,
gen_adjlist, hessian_color_preprocess,
prepare_seed_matrix!, recover_from_matmat!,
seed_matrix
import JuMP._Derivatives.Coloring:
acyclic_coloring, recovery_preprocess,
reverse_topological_sort_by_dfs,
gen_adjlist,
hessian_color_preprocess,
prepare_seed_matrix!,
recover_from_matmat!,
seed_matrix

struct Graph
num_vertices::Int
12 changes: 10 additions & 2 deletions test/expr.jl
@@ -1,5 +1,13 @@
import MutableArithmetics
const MA = MutableArithmetics
using JuMP
using Test

const MA = JuMP._MA

include(joinpath(@__DIR__, "utilities.jl"))

@static if !(:JuMPExtension in names(Main))
include(joinpath(@__DIR__, "JuMPExtension.jl"))
end

# For "expression^3 and unary*"
struct PowVariable <: JuMP.AbstractVariableRef
4 changes: 4 additions & 0 deletions test/lp_sensitivity.jl
@@ -7,6 +7,10 @@
# An algebraic modeling language for Julia
# See http://github.com/jump-dev/JuMP.jl
#############################################################################

using JuMP
using Test

function test_lp_rhs_perturbation_range(model_string, primal_solution, basis_status, feasibility_ranges)
model = JuMP.Model()
MOIU.loadfromstring!(JuMP.backend(model), model_string)
12 changes: 10 additions & 2 deletions test/macros.jl
@@ -11,8 +11,16 @@
# Testing for macros
#############################################################################

import MutableArithmetics
const MA = MutableArithmetics
using JuMP
using Test

const MA = JuMP._MA

include(joinpath(@__DIR__, "utilities.jl"))

@static if !(:JuMPExtension in names(Main))
include(joinpath(@__DIR__, "JuMPExtension.jl"))
end

@testset "Check Julia generator expression parsing" begin
sumexpr = :(sum(x[i,j] * y[i,j] for i = 1:N, j in 1:M if i != j))
11 changes: 7 additions & 4 deletions test/mutable_arithmetics.jl
@@ -1,9 +1,12 @@
using LinearAlgebra, Test
using LinearAlgebra
using JuMP
using Test

import MutableArithmetics
const MA = MutableArithmetics
const MA = JuMP._MA

using JuMP
@static if !(:JuMPExtension in names(Main))
include(joinpath(@__DIR__, "JuMPExtension.jl"))
end

struct DummyVariableRef <: JuMP.AbstractVariableRef end
JuMP.name(::DummyVariableRef) = "dummy"
8 changes: 8 additions & 0 deletions test/nlp.jl
@@ -1,4 +1,12 @@
# TODO: Replace isapprox with ≈ everywhere.

using JuMP
using LinearAlgebra
using SparseArrays
using Test

include(joinpath(@__DIR__, "utilities.jl"))

@testset "Nonlinear" begin

import JuMP: _NonlinearExprData
1 change: 1 addition & 0 deletions test/nonnegative_bridge.jl
@@ -11,6 +11,7 @@
# This file contains an example bridge used for tests.

using JuMP

const MOIB = MOI.Bridges
const MOIBC = MOI.Bridges.Constraint

6 changes: 5 additions & 1 deletion test/objective.jl
@@ -1,5 +1,9 @@
using Test
using JuMP
using Test

@static if !(:JuMPExtension in names(Main))
include(joinpath(@__DIR__, "JuMPExtension.jl"))
end

struct DummyOptimizer <: MOI.AbstractOptimizer end
MOI.is_empty(::DummyOptimizer) = true
14 changes: 10 additions & 4 deletions test/operator.jl
@@ -1,9 +1,15 @@
using LinearAlgebra, Test
using JuMP
using LinearAlgebra
using SparseArrays
using Test

import MutableArithmetics
const MA = MutableArithmetics
const MA = JuMP._MA

using JuMP
include(joinpath(@__DIR__, "utilities.jl"))

@static if !(:JuMPExtension in names(Main))
include(joinpath(@__DIR__, "JuMPExtension.jl"))
end

# For "DimensionMismatch when performing vector-matrix multiplication with custom types #988"
import Base: +, *
11 changes: 9 additions & 2 deletions test/print.jl
@@ -12,8 +12,15 @@
#############################################################################

using JuMP
using LinearAlgebra, Test
import JuMP.REPLMode, JuMP.IJuliaMode
using LinearAlgebra
using Test

import JuMP.IJuliaMode
import JuMP.REPLMode

@static if !(:JuMPExtension in names(Main))
include(joinpath(@__DIR__, "JuMPExtension.jl"))
end

# Helper function to test IO methods work correctly
function io_test(mode, obj, exp_str; repl=:both)
11 changes: 4 additions & 7 deletions test/runtests.jl
@@ -10,16 +10,11 @@
# test/runtests.jl
#############################################################################

using JuMP

using LinearAlgebra # for dot and tr
using SparseArrays # for sparse
using Test

t = time()
include("Containers/Containers.jl")

include("utilities.jl")
include("JuMPExtension.jl")
println("Containers.jl took $(round(time() - t; digits = 1)) seconds.")

@testset "$(file)" for file in filter(f -> endswith(f, ".jl"), readdir(@__DIR__))
if file in [
@@ -31,7 +26,9 @@ include("JuMPExtension.jl")
]
continue
end
t = time()
include(file)
println("$(file) took $(round(time() - t; digits = 1)) seconds.")
end

# TODO: The hygiene test should run in a separate Julia instance where JuMP
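The net effect of the per-file headers plus this runtests.jl change is that a single test file can be exercised on its own, roughly as follows (a sketch, assuming a JuMP checkout whose dependencies are already instantiated):

# From a Julia session started in the root of the JuMP checkout:
import Pkg
Pkg.activate(".")                           # activate the JuMP project itself
include(joinpath("test", "constraint.jl"))  # the file now loads JuMP, Test, and its helpers itself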