MMEngine implements commonly used hooks for training and testing. When users have customization requirements, they can follow the examples below. For example, if some hyper-parameter of the model needs to be changed during training, we can implement a new hook for it:
```python
# Copyright (c) OpenMMLab. All rights reserved.
from typing import Optional, Sequence

from mmengine.hooks import Hook
from mmengine.model import is_model_wrapper

from mmseg.registry import HOOKS


@HOOKS.register_module()
class NewHook(Hook):
    """Docstring for NewHook."""

    def __init__(self, a: int, b: int) -> None:
        self.a = a
        self.b = b

    def before_train_iter(self,
                          runner,
                          batch_idx: int,
                          data_batch: Optional[Sequence[dict]] = None) -> None:
        cur_iter = runner.iter
        # unwrap the model when it is in a wrapper (e.g. DDP)
        model = runner.model
        if is_model_wrapper(model):
            model = model.module
        model.hyper_parameter = self.a * cur_iter + self.b
```
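To make the mechanism concrete, the following self-contained sketch mimics how a training loop calls `before_train_iter` on each registered hook, so the hyper-parameter follows the `a * iter + b` schedule. `FakeRunner`, `FakeModel`, and the toy loop are illustrative stand-ins, not the real MMEngine `Runner` API:

```python
class FakeModel:
    """Stand-in for a model with a tunable hyper-parameter."""
    hyper_parameter = 0


class FakeRunner:
    """Stand-in exposing only the attributes the hook reads."""
    def __init__(self):
        self.iter = 0
        self.model = FakeModel()


class NewHook:
    def __init__(self, a, b):
        self.a = a
        self.b = b

    def before_train_iter(self, runner, batch_idx, data_batch=None):
        # update the model's hyper-parameter from the current iteration
        runner.model.hyper_parameter = self.a * runner.iter + self.b


runner = FakeRunner()
hooks = [NewHook(a=2, b=1)]
history = []
for runner.iter in range(3):  # toy training loop standing in for Runner
    for hook in hooks:
        hook.before_train_iter(runner, batch_idx=runner.iter)
    history.append(runner.model.hyper_parameter)

print(history)  # hyper-parameter follows 2 * iter + 1
```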
The module defined above needs to be imported into the main namespace first to ensure it is registered. We assume `NewHook` is implemented in `mmseg/engine/hooks/new_hook.py`; there are two ways to import it:

- Modify `mmseg/engine/hooks/__init__.py`. Modules should be imported in `mmseg/engine/hooks/__init__.py` so that these new modules can be found and added by the registry:

  ```python
  from .new_hook import NewHook

  __all__ = [..., 'NewHook']
  ```

- Use `custom_imports` in the config file to import it manually:

  ```python
  custom_imports = dict(imports=['mmseg.engine.hooks.new_hook'], allow_failed_imports=False)
  ```
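Why the import matters: `@HOOKS.register_module()` only runs when the defining module is imported, and that is what puts the class into the registry's lookup table. The minimal registry below is an illustrative re-implementation of this mechanism, not MMEngine's actual `Registry` class:

```python
class Registry:
    """Tiny stand-in for mmengine's Registry (illustrative only)."""

    def __init__(self):
        self._module_dict = {}

    def register_module(self):
        def decorator(cls):
            # executes at import time of the defining module,
            # which is why the module must be imported at all
            self._module_dict[cls.__name__] = cls
            return cls
        return decorator

    def build(self, cfg):
        cfg = dict(cfg)
        cls = self._module_dict[cfg.pop('type')]  # look up by 'type' key
        return cls(**cfg)                         # remaining keys are kwargs


HOOKS = Registry()


@HOOKS.register_module()  # runs as soon as this "module" is imported
class NewHook:
    def __init__(self, a, b):
        self.a, self.b = a, b


hook = HOOKS.build(dict(type='NewHook', a=1, b=2))
print(type(hook).__name__, hook.a, hook.b)
```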
Users can set and use customized hooks in training and testing with the method below. The execution priority of hooks registered at the same place in `Runner` can be referred to here; the default priority of a customized hook is `NORMAL`.

```python
custom_hooks = [
    dict(type='NewHook', a=a_value, b=b_value, priority='ABOVE_NORMAL')
]
```
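To illustrate the effect of `priority`, the sketch below sorts hooks the way MMEngine does: by numeric priority value, where a smaller number runs earlier. The numeric values (e.g. `ABOVE_NORMAL = 40`, `NORMAL = 50`) follow MMEngine's `Priority` enum as I understand it; double-check against your installed version:

```python
# assumed numeric priority values from mmengine's Priority enum;
# a lower value means the hook fires earlier at the same call site
PRIORITY = {'HIGHEST': 0, 'VERY_HIGH': 10, 'HIGH': 30, 'ABOVE_NORMAL': 40,
            'NORMAL': 50, 'BELOW_NORMAL': 60, 'LOW': 70, 'VERY_LOW': 90,
            'LOWEST': 100}

hooks = [
    dict(type='LoggerHook', priority='BELOW_NORMAL'),
    dict(type='NewHook', priority='ABOVE_NORMAL'),
    dict(type='ParamSchedulerHook', priority='NORMAL'),
]

# hooks registered at the same place run in ascending priority-value order
ordered = sorted(hooks, key=lambda h: PRIORITY[h['priority']])
print([h['type'] for h in ordered])
```

With `priority='ABOVE_NORMAL'`, `NewHook` runs before the default-priority hooks at the same call site.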
We recommend implementing the customized optimizer in `mmseg/engine/optimizers/my_optimizer.py`. Here is an example of a new optimizer `MyOptimizer`, which has parameters `a`, `b` and `c`:

```python
from mmseg.registry import OPTIMIZERS
from torch.optim import Optimizer


@OPTIMIZERS.register_module()
class MyOptimizer(Optimizer):

    def __init__(self, a, b, c):
        ...
```
The module defined above needs to be imported into the main namespace first to ensure it is registered. We assume `MyOptimizer` is implemented in `mmseg/engine/optimizers/my_optimizer.py`; there are two ways to import it:

- Modify `mmseg/engine/optimizers/__init__.py`. Modules should be imported in `mmseg/engine/optimizers/__init__.py` so that these new modules can be found and added by the registry:

  ```python
  from .my_optimizer import MyOptimizer
  ```

- Use `custom_imports` in the config file to import it manually:

  ```python
  custom_imports = dict(imports=['mmseg.engine.optimizers.my_optimizer'], allow_failed_imports=False)
  ```
Then modify `optimizer` in the `optim_wrapper` field of the config file. If users want to use the customized `MyOptimizer`, it can be modified as:

```python
optim_wrapper = dict(type='OptimWrapper',
                     optimizer=dict(type='MyOptimizer',
                                    a=a_value, b=b_value, c=c_value),
                     clip_grad=None)
```
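For intuition about what an optimizer object does with its parameters, here is a dependency-free sketch of the `param_groups` / `step` mechanics that `torch.optim.Optimizer` formalizes. The class is illustrative, not a drop-in `MyOptimizer`, and `a` is reused here as a toy learning rate:

```python
class ToyOptimizer:
    """Illustrative optimizer: each param group carries its own settings."""

    def __init__(self, params, a):
        # mirror the param_groups structure torch optimizers use:
        # a list of dicts, each holding parameters plus hyper-parameters
        self.param_groups = [{'params': list(params), 'lr': a}]

    def step(self, grads):
        # plain gradient descent: p <- p - lr * g
        for group in self.param_groups:
            lr = group['lr']
            group['params'][:] = [p - lr * g
                                  for p, g in zip(group['params'], grads)]


opt = ToyOptimizer(params=[1.0, 2.0], a=0.1)
opt.step(grads=[10.0, -10.0])
print(opt.param_groups[0]['params'])  # [0.0, 3.0]
```

A real subclass of `torch.optim.Optimizer` would receive gradients from `p.grad` after `loss.backward()` instead of taking them as arguments.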
The optimizer constructor is used to create the optimizer and optimizer wrapper for model training; it has powerful functions like specifying learning rate and weight decay for different model layers. Here is an example of a customized optimizer constructor:

```python
from mmengine.optim import DefaultOptimWrapperConstructor

from mmseg.registry import OPTIM_WRAPPER_CONSTRUCTORS


@OPTIM_WRAPPER_CONSTRUCTORS.register_module()
class LearningRateDecayOptimizerConstructor(DefaultOptimWrapperConstructor):

    def __init__(self, optim_wrapper_cfg, paramwise_cfg=None):
        ...

    def __call__(self, model):
        ...
        return my_optimizer
```

The default optimizer constructor is implemented here. It can also be used as the base class of new optimizer constructors.
The module defined above needs to be imported into the main namespace first to ensure it is registered. We assume `MyOptimizerConstructor` is implemented in `mmseg/engine/optimizers/my_optimizer_constructor.py`; there are two ways to import it:

- Modify `mmseg/engine/optimizers/__init__.py`. Modules should be imported in `mmseg/engine/optimizers/__init__.py` so that these new modules can be found and added by the registry:

  ```python
  from .my_optimizer_constructor import MyOptimizerConstructor
  ```

- Use `custom_imports` in the config file to import it manually:

  ```python
  custom_imports = dict(imports=['mmseg.engine.optimizers.my_optimizer_constructor'], allow_failed_imports=False)
  ```
Then modify `constructor` in the `optim_wrapper` field of the config file. If users want to use the customized `MyOptimizerConstructor`, it can be modified as:

```python
optim_wrapper = dict(type='OptimWrapper',
                     constructor='MyOptimizerConstructor',
                     clip_grad=None)
```
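What a constructor like `LearningRateDecayOptimizerConstructor` typically computes is a per-parameter-group learning rate, for example one that decays with layer depth. The sketch below is a dependency-free illustration of that idea; the `decay_rate ** (num_layers - layer_id)` formula is a common layer-wise decay scheme, not MMSegmentation's exact implementation:

```python
def layerwise_lr(base_lr, decay_rate, num_layers):
    """Assign each layer a lr scaled by decay_rate**(num_layers - layer_id).

    Deeper layers (larger layer_id) keep a lr close to base_lr, while
    shallow layers are decayed more strongly -- a common layer-wise
    decay scheme for fine-tuning pretrained backbones.
    """
    return {layer_id: base_lr * decay_rate ** (num_layers - layer_id)
            for layer_id in range(num_layers + 1)}


lrs = layerwise_lr(base_lr=0.01, decay_rate=0.9, num_layers=2)
print(lrs)  # the deepest layer keeps base_lr; shallower layers are decayed
```

A real constructor would build one param group per layer with these rates and hand them to the optimizer, instead of returning a plain dict.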