Replies: 4 comments
-
Hi, I'm also stuck on this issue. Here is what I've tried in order to add evaluation during training, but it's currently not working:
Thank you
-
Would like to know that too. Example needed :)
-
Here is my code: it runs evaluation after a predefined number of iterations. Just instantiate an object from my trainer class and enjoy.
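Roughly, the pattern looks like this (a sketch, not verbatim; `my_val_set` is a placeholder for your registered validation dataset, and `cfg` is your detectron2 config node): subclass `DefaultTrainer` and add an `EvalHook` that fires every `cfg.TEST.EVAL_PERIOD` iterations.

```python
from detectron2.data import build_detection_test_loader
from detectron2.engine import DefaultTrainer
from detectron2.engine.hooks import EvalHook
from detectron2.evaluation import COCOEvaluator, inference_on_dataset


class PeriodicEvalTrainer(DefaultTrainer):
    """DefaultTrainer that evaluates every cfg.TEST.EVAL_PERIOD iterations."""

    def build_hooks(self):
        hook_list = super().build_hooks()

        def run_eval():
            # "my_val_set" is a placeholder for your registered validation dataset
            evaluator = COCOEvaluator("my_val_set", output_dir=self.cfg.OUTPUT_DIR)
            val_loader = build_detection_test_loader(self.cfg, "my_val_set")
            return inference_on_dataset(self.model, val_loader, evaluator)

        # insert before the PeriodicWriter (the last default hook) so the
        # evaluation results get picked up and logged
        hook_list.insert(-1, EvalHook(self.cfg.TEST.EVAL_PERIOD, run_eval))
        return hook_list


# cfg is assumed to be an already-built CfgNode with cfg.TEST.EVAL_PERIOD set
trainer = PeriodicEvalTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```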
-
Hi @hakespear, all you have to do is define your own Trainer subclass, where you define the list of evaluators that will be used during training:

```python
import os

from detectron2.engine import DefaultTrainer
from detectron2.evaluation import COCOEvaluator, DatasetEvaluators


class MyTrainer(DefaultTrainer):
    @classmethod
    def build_evaluator(cls, cfg, dataset_name, output_folder=None):
        coco_evaluator = COCOEvaluator(dataset_name, output_dir=output_folder)
        evaluator_list = [coco_evaluator]
        return DatasetEvaluators(evaluator_list)
```

Set the evaluation interval and threshold:

```python
cfg.TEST.EVAL_PERIOD = 1000
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.50
```

Train with:

```python
trainer = MyTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```

Note: the evaluator runs on the datasets listed in cfg.DATASETS.TEST, so register your validation split and put its name there.

Post-training, test with:

```python
cfg.MODEL.WEIGHTS = os.path.join(cfg.OUTPUT_DIR, "model_final.pth")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.50
trainer = MyTrainer(cfg)
trainer.test(cfg, trainer.model)
```
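One more thing, since this is about a custom dataset: the dataset_name you pass to COCOEvaluator must already be registered with detectron2. A minimal sketch, assuming COCO-format annotations (names and paths are placeholders):

```python
from detectron2.data.datasets import register_coco_instances

# hypothetical names and paths -- adjust to your own data layout
register_coco_instances("my_train_set", {}, "annotations/train.json", "images/train")
register_coco_instances("my_val_set", {}, "annotations/val.json", "images/val")
```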
-
📚 Documentation Issue
Hi everyone,
I'm struggling to understand how detectron2's DefaultTrainer is supposed to handle a validation set. Since I just want to do basic testing on a custom dataset, I mostly looked for a way to insert a validation set in train_net.py rather than studying Hooks or plain_train_net.py. That way I could see from the validation losses when the model starts overfitting, and stop training accordingly.
So I found this training script example: https://github.com/facebookresearch/detectron2/blob/main/tools/train_net.py
and eventually came up with this script to add a validation set to the training:
Looking at the configs documentation (https://detectron2.readthedocs.io/en/latest/modules/config.html), I found that cfg.TEST.EVAL_PERIOD needs to be set for the Evaluator to be called during training. Since it's a .TEST config, I understood that the validation set has to go in cfg.DATASETS.TEST rather than cfg.DATASETS.TRAIN. However, that gave me an error:
This is where I misunderstand the docs. I looked at other basic training loops and tutorial examples but still couldn't understand how the validation set is taken into account (see the sketch below for the setup I mean). Is there anything obvious that I missed? Or did I understand the docs correctly, but something is a bit inaccurate?
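For reference, the setup I'm describing boils down to something like this (dataset names are placeholders, and the model config is just an example):

```python
from detectron2 import model_zoo
from detectron2.config import get_cfg

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml"))
cfg.DATASETS.TRAIN = ("my_train_set",)
cfg.DATASETS.TEST = ("my_val_set",)  # validation set placed here, per my reading of the docs
cfg.TEST.EVAL_PERIOD = 500           # call the Evaluator every 500 iterations
```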
Thanks for letting me know. Also, sorry if I mislabeled this post.