@@ -29,155 +29,14 @@ Adjust the parameters set in the `algobattle/configs/config.ini` file to set
which hardware resources you want to assign. You can pass alternative
configuration files to the script using the `--config_file` option.
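For instance, such a configuration file might look like the sketch below; the section and key names here are purely illustrative assumptions, so consult the shipped `algobattle/configs/config.ini` for the actual ones.
```
; illustrative only -- see algobattle/configs/config.ini for the real keys
[run_parameters]
timeout_generator = 10
timeout_solver = 10
space_generator = 2048
space_solver = 2048
cpus = 1
```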
- To start a basic run on the `pairsum` problem, using the `solver` and `generator` that
+ To start a basic run on the `biclique` problem, using the `solver` and `generator` that
are part of the problem directory, execute
```
- battle algobattle/problems/pairsum
+ battle algobattle/problems/biclique
```
or provide any alternative problem folder path.
- Read the section *Creating a New Task* to learn about the expected
- structure of a problem.
-
The `battle` script offers several options, e.g. to give custom paths for
solvers and generators. Run `battle --help` for all options.
- # How Does a Battle Between Two Teams Work?
- There are currently two types of battles implemented. The first one is the
- *iterated* version, in which two teams battle against one another up to an
- instance size at which one of the teams fails. This is the default option.
- An alternative is the *averaged* battle, which is still relatively new and
- will thus likely be adjusted further in the future. In this battle, the
- solvers and generators of the students battle a fixed number of times on a
- fixed instance size and try to give good approximative solutions. The
- approximation factors are then averaged, with the team having the best
- average approximation ratio winning.
-
- Whenever we run the code as described above, we are supplied a generator and
- a solver for each team, either explicitly via options on the call or
- implicitly from the folders `path/to/problem/generator` and
- `path/to/problem/solver` if the option is not set.
-
- Log files are automatically generated for each run and saved to
- `~/.algobattle_logs` by default. You can change this path with the
- `--output_folder` option.
-
- ## The Iterated Battle (default)
- What we are interested in is how well each team's solver handles the
- instances generated by the other team. Thus, we start by letting the
- generator of one team generate an instance and a certificate for a small
- instance size (which we will call `n`) and plug this instance into the
- solver of the other team. If the solver is unable to solve this instance
- to our liking, the solver loses for this `n`. This could be because its
- solution size is smaller than that of the generator, or because it violates
- some other requirement of the problem description. When a solver actually
- solves an instance for an `n`, we have probably not seen the best the
- solver can do: We want to see if it can also solve bigger instances. Thus,
- we increment `n`, let the generator create a new instance and plug it into
- the solver again. This process is repeated until the solver fails.
- We then do the same with the roles of the two teams swapped.
-
- While this is the general idea of how a battle works, there are some details
- which were introduced to optimize the code and make it fairer.
- ### The Step Size Increment
- When we want to increment the instance size, we can hardly know in advance
- how big the biggest instances are that a solver can solve. This is because
- we rarely know how cleverly the instances of the generator are designed or
- how good the solver is at solving them. They are essentially black boxes
- for us.
- Thus, we do not want to simply increment `n` by one every time the solver
- wins, as we might otherwise wait a very long time for results.
-
- In this implementation, we increase the step size more aggressively: If the
- solver has already solved `i` increments in a row, the next step size
- increase is `i^2`. This usually leads to the solver overshooting its target
- and failing after a big increment. In this case, we set `i = 1`, take back
- the last, big increment and start incrementing by `i^2` again. This is done
- until the solver fails after an increment of `1`. To not overly favor
- randomized approaches, the biggest instance size reached may not exceed any
- instance size at which the solver has already failed.
-
- Finally, there is also a cutoff value set in the `config.ini` after which
- the incrementing stops automatically.
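The following is a minimal sketch of this increment schedule, not the project's actual implementation; `solver_succeeds` stands in for a full battle round, and `n_start` and `cutoff` for the configured bounds:
```
def run_iterated_battle(solver_succeeds, n_start, cutoff):
    n = n_start          # current instance size
    i = 1                # consecutive successes; next increment is i^2
    reached = 0          # biggest instance size solved so far
    failed = cutoff + 1  # smallest instance size at which the solver failed
    while n <= cutoff:
        if solver_succeeds(n):
            reached = max(reached, n)
            n += i ** 2  # increments grow: 1, 4, 9, ...
            i += 1
        else:
            failed = min(failed, n)
            if i == 1:
                break    # failed after an increment of 1: we are done
            n -= (i - 1) ** 2 - 1  # take back the big increment, step by 1
            i = 1
    # a randomized solver gets no credit beyond a size it already failed at
    return min(reached, failed - 1)
```
Capping the result at one below the smallest failed size is what keeps a single lucky run of a randomized solver from inflating the final score.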
- ### Averaging the Results Over Several Battles
- As the machines on which the code is executed cannot be guaranteed to be
- free of load at all times, and since randomized approaches may fail due to
- outliers, we execute several battles in a row and report back the results
- of each battle so that an average number of points can be computed.
- ### Handling Malformed Inputs
- We are very lenient with malformed instances: If an output line does not
- follow the problem specification, the parser is tasked with discarding it
- and logging a warning.
- If the verifier gets an empty instance or the certificate of the generator
- is not valid, the solver automatically wins for this instance size. This
- usually happens when the generator times out before writing out the
- instance and certificate.
- ### Optional Approximative Battles
- It is possible to accept approximative solutions up to a fixed factor as a
- solution criterion. This can easily be set using the `--approx_ratio`
- option, provided the problem is compatible with the notion of
- approximation. A solver then only fails if its solution is either invalid
- or outside the given approximation ratio.
-
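For example, a run that accepts 2-approximative solutions might be started as follows; the problem path and the ratio value are placeholders:
```
battle /path/to/problem --approx_ratio 2
```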
- ## The Averaged Battle
- In this alternative battle version, we are interested in the students'
- ability to write good approximation algorithms. For a fixed instance size
- set with the option `--approx_inst_size`, the students generate a number of
- instances set by the option `--approx_iterations`. The task is then to
- provide an approximative solution that is as close to the optimal solution
- as possible within the solver time limit.
- The approximation ratios are then averaged, and points are awarded relative
- to the averaged ratio of the other team.
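As a rough illustration of this relative scoring (the exact point formula is not specified here, so the split below is only an assumed example; `ratios_a` and `ratios_b` stand for the per-iteration approximation ratios of the two teams):
```
def averaged_points(ratios_a, ratios_b, total=100.0):
    # Average each team's approximation ratios (lower is better) and
    # split the points inversely proportional to those averages.
    avg_a = sum(ratios_a) / len(ratios_a)
    avg_b = sum(ratios_b) / len(ratios_b)
    points_a = total * (1 / avg_a) / (1 / avg_a + 1 / avg_b)
    return points_a, total - points_a

# e.g. team A averaging 1.2 beats team B averaging 1.5
print(averaged_points([1.1, 1.2, 1.3], [1.4, 1.5, 1.6]))
```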
-
- # Creating a New Task
- Tasks are created as packages and are automatically imported by supplying
- their path to the `battle` executable.
-
- The basic directory structure of a task is the following:
- <pre>
- newtask
- ├── generator
- │   └── Dockerfile
- ├── solver
- │   └── Dockerfile
- ├── parser.py
- ├── problem.py
- ├── verifier.py
- └── __init__.py
- </pre>
-
- The `problem.py` file is the easiest to fill. It defines a new subclass of
- the abstract `Problem` class, imports the verifier and parser, and sets the
- lowest instance size for which a battle is to be executed for the specific
- problem.
-
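A minimal sketch of such a subclass; the base-class module path and the attribute names are assumptions about the interface, not a verbatim excerpt:
```
from algobattle.problem import Problem  # assumed location of the base class

from .parser import NewTaskParser
from .verifier import NewTaskVerifier


class NewTask(Problem):
    name = "newtask"   # display name of the problem
    n_start = 3        # lowest instance size at which battles start
    parser = NewTaskParser()
    verifier = NewTaskVerifier()
```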
- The `parser.py` implements methods that are used for cleaning up whatever
- instances or solutions the solvers and generators produce, such that the
- verifier is able to semantically check them for correctness. Lines of the
- input that do not conform to the defined format are discarded, and a
- warning is logged.
-
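A sketch of what such a cleanup step could look like; the class and method names are illustrative, not the framework's actual interface:
```
import logging

logger = logging.getLogger(__name__)


class NewTaskParser:
    # Keep only lines that match the expected format, warn about the rest.
    def clean_lines(self, raw_lines, expected_length):
        cleaned = []
        for line in raw_lines:
            fields = line.split()
            if len(fields) != expected_length or not all(f.isdigit() for f in fields):
                logger.warning("Discarding malformed line: %r", line)
                continue
            cleaned.append(tuple(int(f) for f in fields))
        return cleaned
```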
- The `verifier.py`, as already mentioned, checks the input semantically. At
- least two functions need to be implemented: One that verifies that a
- solution is correct for an instance, and one that verifies that a solver's
- solution is of a required quality (usually that its size is at least equal
- to that of the generator's). The verifier should be able to handle the
- cases in which an empty instance is given (the solution is automatically
- valid) or an empty solution is given (the solution is automatically
- invalid).
-
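A sketch of the two checks; again, the names and the trivial semantic test are illustrative assumptions:
```
class NewTaskVerifier:
    def solution_is_valid(self, instance, solution):
        # Empty instance: automatically valid. Empty solution: invalid.
        if not instance:
            return True
        if not solution:
            return False
        # Placeholder semantic check: every solution element must occur
        # in the instance.
        return all(item in instance for item in solution)

    def solution_has_quality(self, generator_solution, solver_solution):
        # The solver passes if its solution is at least as big as the
        # generator's certificate.
        return len(solver_solution) >= len(generator_solution)
```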
- In order to integrate a problem file, an `__init__.py` is required that
- contains an import of the class created in the `problem.py` file, renaming
- it to `Problem`:
- ```
- from .problem import MyNewProblemClass as Problem
- ```
- In order to test the functionality of your problem implementation, it is
- recommended to create a dummy `solver` and `generator` in the problem
- directory. These can return the same instance and solution for each
- instance size, assuming the problem definition allows this (see the sketch
- after this paragraph). The `solver` and `generator` folders are also the
- default paths for solvers and generators of both teams if no other path is
- given.
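For illustration, such a dummy generator can be a few lines of Python; the I/O convention used here (instance size on stdin, instance and certificate on stdout) is an assumption, so adapt it to your problem specification:
```
import sys

# Read the requested instance size; a dummy generator ignores it and
# always emits the same fixed instance and certificate.
n = int(sys.stdin.read().strip())

print("1 2")    # hypothetical instance line
print("s 1 2")  # hypothetical certificate line
```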
-
- If you want to execute a run on your newly created problem, execute
- ```
- battle /path/to/newtask
- ```
- There are a few example tasks in the `algobattle/problems` directory of
- this repository with task descriptions, if you want a reference for
- creating your own tasks.
+ Check the [wiki](https://github.com/Benezivas/algobattle/wiki) for further documentation.