Open
Labels: bug (Something isn't working)
Description
Since #7, we've had evaluation metrics in the `pv211_utils.evaluation_metrics` module. Here are a number of issues with the implementation:

- Both `calc_map` and `mean_average_precision` implement the same functionality; see also a related comment from the review of Implement the MUNI/33/0769/2022 R&D project #8. Let's remove one of them and update the library to always use the single remaining implementation?
- Both `calc_map` and `mean_average_precision` pickle the system when `num_workers > 1`, which may easily fail or slow the evaluation down significantly when the system references a large object that is expensive to pickle.
  - For `calc_map`, this seems to be a bug, since the code does the heavy lifting of saving the system to a global and forking, only to pickle the system anyway; see also a related comment from the review of Implement the MUNI/33/0769/2022 R&D project #8.
  - For `mean_average_precision`, there is not even a pretense of avoiding pickling the system; see also a related comment from the review of Implement the MUNI/33/0769/2022 R&D project #8.
- The evaluation can take a while. Let's be friendly to users and give them an ETA? See also a related comment from the review of Implement the MUNI/33/0769/2022 R&D project #8.
After fixing the above, let's make `num_workers=None` the default? `Pool(processes=None)` uses all the available CPUs.
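If the code ever needs the resolved worker count itself (e.g. for progress reporting), the `processes=None` convention can be mirrored explicitly; `resolve_num_workers` below is a hypothetical helper, not part of the library:

```python
import os

def resolve_num_workers(num_workers=None):
    # Mirrors Pool(processes=None): None means "use os.cpu_count() workers",
    # falling back to 1 when the CPU count cannot be determined.
    return num_workers if num_workers is not None else (os.cpu_count() or 1)
```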