This project is still in development; any feedback is much appreciated!
MHLib.jl is a collection of types and functions in Julia supporting the effective implementation of metaheuristics and certain hybrid optimization approaches, primarily for solving combinatorial optimization problems.
MHLib.jl emerged from the Python mhlib and the older C++ mhlib, to which it has certain similarities but also many differences.
The main purpose of the library is to support rapid prototyping and teaching as well as efficient implementations thanks to Julia's highly effective just-in-time compilation.
MHLib.jl has been developed primarily by the Algorithms and Complexity Group of TU Wien, Vienna, Austria, since 2020.
- Günther Raidl (primarily responsible)
- Nikolaus Frohner
- Thomas Jatschka
- Fabio Oberweger
- James Mulhern
Major versions of MHLib.jl can be installed from the Julia REPL via `] add MHLib`.
Development versions are available at https://github.com/ac-tuwien/MHLib.jl and can be installed via `] add https://github.com/ac-tuwien/MHLib.jl.git`.
- Make a copy of the MHLibDemos subpackage (see below) and rename it,
- choose one of the demo problems as a template,
- and adapt it to your own problem and solution approach.
The main file `src/MHLib.jl` provides the following types to represent candidate solutions and various functions for them:
- `Solution`: an abstract type that represents a candidate solution to an optimization problem.
- `VectorSolution`: an abstract solution encoded by a vector of some user-provided type.
- `BoolVectorSolution`: an abstract solution encoded by a boolean vector.
- `PermutationSolution`: an abstract solution representing permutations of a fixed number of elements.
- `SubsetVectorSolution`: a solution that is an arbitrarily large subset of a given set, represented in vector form. The front part holds the selected elements, the back part optionally the unselected ones.
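To define a problem-specific solution type, one typically subtypes one of these abstract types and implements a small set of methods such as the objective function. The following is only a minimal sketch: `MyInstance` and `MySolution` are hypothetical names, the field layout follows the MHLibDemos implementations, and `calc_objective` is assumed to be the function MHLib dispatches to when evaluating a solution; check the doc-strings for the exact requirements (further methods such as `Base.copy!` or `initialize!` may also need to be provided).

```julia
using MHLib

# Hypothetical problem data; MyInstance and MySolution are illustrative names only
struct MyInstance
    n::Int                    # number of binary decision variables
    weights::Vector{Float64}  # per-variable profits to be maximized
end

# Concrete solution type subtyping MHLib's BoolVectorSolution; the field layout
# (obj_val, obj_val_valid, x) follows the convention used in the MHLibDemos problems
mutable struct MySolution <: BoolVectorSolution
    inst::MyInstance
    obj_val::Float64          # cached objective value
    obj_val_valid::Bool       # is the cached value up to date?
    x::Vector{Bool}           # the actual binary solution vector
end

MySolution(inst::MyInstance) = MySolution(inst, -Inf, false, fill(false, inst.n))

# Problem-specific objective; assumed to be the function MHLib's generic obj() calls
MHLib.calc_objective(s::MySolution) = sum(s.inst.weights[s.x])
```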
Moreover, the main file provides:
- `git_version()`: function returning the abbreviated git version string of the current project. It is good practice to also write this information to a log file of an optimization run in order to be able to later reproduce results with the same program version.
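For example, this string might be recorded at the start of each run's log output (a tiny sketch; the log file name is arbitrary, and if `git_version` is not exported it can be qualified as `MHLib.git_version()`):

```julia
using MHLib

# Append the program version to a run-specific log file
open("run.log", "a") do io
    println(io, "program version: ", git_version())
end
```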
Remark: Earlier MHLib versions relied heavily on an extensible `settings` mechanism for various configuration parameters, which corresponded to a global dictionary compiled from command line arguments provided when starting Julia. With version 0.3.0, MHLib replaced this mechanism by classical keyword arguments with default values in functions or constructors of structs. Partly they are stored in separate `...Config` structs, partly directly in the respective structs representing certain metaheuristics.
In applications using MHLib, we no longer recommend such a global variable-based configuration parameter dictionary and/or heavy use of Julia command line arguments (`ARGS`) either.
In typical Julia development, it is better not to restart the whole Julia process for each optimization/test run. It is usually much more efficient to use Revise to automatically update the code in a continuously running Julia REPL session and to start individual optimization runs/tests from there, directly with the keyword arguments to be used.
Also when performing larger tests over many optimization runs in a batched fashion, it is often much simpler to do this directly in Julia instead of using a shell script that calls Julia many times. This partly also holds when using a compute cluster: better aggregate multiple runs, in particular when each run is relatively short, into fewer Julia processes to avoid or reduce Julia startup overhead.
Thus, for the various configuration parameters of the diverse metaheuristics realized in MHLib, see the doc-strings of the respective functions and structs.
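As an illustration of this workflow, a run of a demo solver might be started directly from the REPL as follows. The exact signature of `solve_maxsat` from MHLibDemos and the instance path are assumptions here; `titer` and `ttime` are Scheduler parameters documented below.

```julia
using Revise       # picks up code changes in the continuously running REPL session
using MHLibDemos

# A single run, started directly with keyword arguments instead of ARGS
solve_maxsat("data/maxsat-adv1.cnf"; titer=2000, ttime=10)

# A small batch of runs within the same Julia process, avoiding startup overhead
for i in 1:10
    solve_maxsat("data/maxsat-adv1.cnf"; titer=2000)
end
```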
An abstract framework for single-trajectory metaheuristics that rely on iteratively applying certain methods to a current solution.
Modules like GVNSs
and LNSs
extend this type towards more specific metaheuristics.
Keyword parameters that can be passed to the `Scheduler`:

- `checkit=false`: call `check` for each solution after each method application in order to test and ensure validity (often useful when debugging)
- `consider_initial_sol`: if set, consider the given solution as a valid initial solution, otherwise it is assumed to be uninitialized
- `log=true`: if true, write all log information, else none
- `lnewinc=true`: always write an iteration log entry when a new incumbent solution is found
- `lfreq=-1`: frequency of writing iteration logs (0: none, >0: every that many iterations, -1: iterations 1, 2, 5, 10, 20, ...)
- `titer=100`: maximum number of iterations (<0: turned off)
- `ttime=-1`: time limit in seconds (<0: turned off)
- `tciter=-1`: maximum number of iterations without improvement (<0: turned off)
- `tctime=-1`: maximum time in seconds without improvement (<0: turned off)
- `tobj=-1`: objective value at which to terminate when reached (<0: turned off)
A framework for local search, iterated local search, (general) variable neighborhood search, GRASP, etc.
A `GVNS` is provided an initial `Solution` and lists of construction methods, local improvement methods, and shaking methods.
Keyword parameters correspond to those of the `Scheduler` (see above).
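A minimal usage sketch based on the MAXSAT demo from MHLibDemos; the function names `construct!`, `local_improve!`, and `shaking!`, the instance file, and the exact `MHMethod`/`GVNS` constructor signatures follow earlier MHLib versions and should be verified against the doc-strings:

```julia
using MHLib, MHLib.GVNSs, MHLibDemos

inst = MAXSATInstance("data/maxsat-adv1.cnf")  # demo problem instance
sol = MAXSATSolution(inst)                     # solution object for this instance

# One construction, one local improvement, and two shaking methods;
# MHMethod wraps a function together with a name and an integer parameter
alg = GVNS(sol,
    [MHMethod("con", construct!, 0)],
    [MHMethod("li1", local_improve!, 1)],
    [MHMethod("sh1", shaking!, 1), MHMethod("sh2", shaking!, 2)];
    titer=1000, ttime=10)
run!(alg)   # perform the optimization
```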
A framework for different variants of large neighborhood search (LNS).
The selection of the destroy and repair methods is done in an extensible way by
means of the abstract type MethodSelector
and derived types in order to realize
different LNS variants.
An `LNS` is provided an initial `Solution` and lists of construction methods, destroy methods, and repair methods.
Keyword parameters that can be passed to the `LNS`:

- `meths_compat=nothing`: either `nothing` or a Boolean matrix indicating which destroy method can be applied in conjunction with which repair method
- `method_selector=UniformRandomMethodSelector()`: the technique used for selecting the destroy and repair methods
- `init_temp_factor=0.0`: factor for determining the initial temperature, i.e., the objective value of the initial solution multiplied by this factor is the initial temperature
- `temp_dec_factor=0.99`: factor by which the temperature is decreased in each iteration
- all keyword parameters of the `Scheduler` (see above)
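Construction is analogous to the GVNS above; a sketch with destroy methods of different strengths follows, where the demo function names `destroy!` and `repair!` and the instance file are again assumptions taken from the MAXSAT demo:

```julia
using MHLib, MHLib.LNSs, MHLibDemos

inst = MAXSATInstance("data/maxsat-adv1.cnf")
sol = MAXSATSolution(inst)

# Three destroy methods of increasing strength and a single repair method;
# without meths_compat, every destroy method may be combined with every repair method
alg = LNS(sol,
    [MHMethod("con", construct!, 0)],
    [MHMethod("de$i", destroy!, i) for i in 1:3],
    [MHMethod("re", repair!, 0)];
    init_temp_factor=0.01, temp_dec_factor=0.95, titer=1000)
run!(alg)
```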
Adaptive Large Neighborhood Search (ALNS). It is realized via `LNS` and `ALNSMethodSelector`.
An `ALNS` is provided an initial `Solution` and lists of construction methods, destroy methods, and repair methods.
Keyword parameters that can be passed to the `ALNS`:

- `segment_size=100`: size of segments for updating the method weights
- `gamma=0.025`: reaction factor for updating the method weights
- `sigma1=10`: score for a new global best solution
- `sigma2=9`: score for a solution better than the current one
- `sigma3=3`: score for a worse but accepted solution
- all keyword parameters of the `LNS` and the `Scheduler` (see above)
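An ALNS is set up like the plain LNS above; the following sketch assumes an `ALNS` convenience constructor that installs the `ALNSMethodSelector`, as in earlier MHLib versions, and again borrows the method names from the MAXSAT demo:

```julia
using MHLib, MHLib.ALNSs, MHLibDemos

inst = MAXSATInstance("data/maxsat-adv1.cnf")
sol = MAXSATSolution(inst)

# Same method lists as for the LNS, plus the ALNS-specific keyword parameters
alg = ALNS(sol,
    [MHMethod("con", construct!, 0)],
    [MHMethod("de$i", destroy!, i) for i in 1:3],
    [MHMethod("re", repair!, 0)];
    segment_size=50, gamma=0.05, titer=1000)
run!(alg)
```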
A trivial test problem to which the above algorithms are applied in the unit tests in `test`.
For demonstration purposes, the subdirectory MHLibDemos provides a Julia package (not separately registered at JuliaHub) with basic implementations of the following classical combinatorial optimization problems, to which some of MHLib's metaheuristics are applied:
- `GraphColoring`: graph coloring problem, based on `VectorSolution`
- `MAXSAT`: maximum satisfiability problem, based on `BoolVectorSolution`
- `TSP`: traveling salesperson problem, based on `PermutationSolution`
- `MKP`: multi-constrained knapsack problem, based on `SubsetVectorSolution`
- `MISP`: maximum independent set problem, based on `SubsetVectorSolution`
It is recommended to take the MHLibDemos package with one of the demos as a template for developing MHLib-based metaheuristics for your own problem. Remember to activate this package's own environment in order to run the demos.
Further smaller usage examples can also be found in the test directory of the MHLibDemos package.
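For example, a demo can be run as follows; `solve_maxsat` is assumed to be an exported entry function with a default instance, so the exact function names and arguments should be checked in MHLibDemos itself:

```julia
# In the Julia REPL, from the root of the MHLib.jl repository:
# ] activate MHLibDemos
using MHLibDemos

# Solve a MAXSAT demo instance with default settings
solve_maxsat()
```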
The subdirectory Tuning contains examples of how irace can be used for tuning algorithms implemented in Julia. See Tuning/README.md for details.
See CHANGELOG.md