Utils¶
SaveLoadMixin¶
- class SaveLoadMixin[source]¶
- save_finetuned(finetuned_delta_path: Optional[Union[str, PathLike]] = './delta_checkpoints/', save_config: bool = True, state_dict: Optional[dict] = None, save_function: Callable = torch.save, push_to_dc: bool = False, center_args: Optional[Union[DeltaCenterArguments, dict]] = {}, center_args_pool: Optional[dict] = {}, list_tags: Optional[List] = [], dict_tags: Optional[Dict] = {}, delay_push: bool = False, test_result=None, usage: Optional[str] = '')[source]¶
Save a model and its configuration file to a directory, so that it can be re-loaded using the from_finetuned() class method.
- Parameters
finetuned_delta_path – (optional) path to the directory where the model and its configuration file will be saved. If not specified, the model will be saved in ./delta_checkpoints/, a subdirectory of the current working directory.
save_config – (optional) if True, the configuration file will be saved in the same directory as the model file. If False, only the state dict will be saved.
state_dict – (optional) a dictionary containing the model’s state_dict. If not specified, the state_dict is loaded from the backbone model’s trainable parameters.
save_function – (optional) the function used to save the model. Defaults to torch.save.
state_dict_only – (optional) if True, only the state_dict will be saved.
push_to_dc – (optional) if True, the model will be prepared for pushing to the DeltaCenter. This includes: creating a configuration file for the model, creating a directory for the model, saving the model’s trainable parameters, and pushing the model to the DeltaCenter.
center_args – (optional) the arguments used to distinguish between different delta models on the DeltaCenter.
center_args_pool – (optional) a dictionary containing the arguments used to distinguish between different delta models on the DeltaCenter.
list_tags – (optional) a list of tags that will be added to the model’s configuration file.
dict_tags – (optional) a dictionary of tags that will be added to the model’s configuration file.
delay_push – (optional) if True, the model will not be pushed to the DeltaCenter immediately. This is useful if you want to push the model later.
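The resulting directory layout can be sketched with stdlib stand-ins (json replaces torch.save; save_finetuned_sketch and the file names here are illustrative, not the library's actual names):

```python
import json
from pathlib import Path

def save_finetuned_sketch(state_dict, finetuned_delta_path="./delta_checkpoints/",
                          save_config=True, config=None):
    """Minimal sketch of the save layout: the trainable state dict goes into
    the target directory, with the config file written next to it when
    save_config is True."""
    path = Path(finetuned_delta_path)
    path.mkdir(parents=True, exist_ok=True)
    # json stands in for torch.save here.
    with open(path / "pytorch_model.json", "w") as f:
        json.dump(state_dict, f)
    if save_config and config is not None:
        with open(path / "config.json", "w") as f:
            json.dump(config, f)
    return path
```

The key point is that the delta checkpoint contains only the trainable (delta) parameters, not the full backbone, so it stays small.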
- load_checkpoint(path, load_func=torch.load, backbone_model=None)[source]¶
Simple method for loading only the checkpoint.
- save_checkpoint(path, save_func=torch.save, backbone_model=None)[source]¶
Simple method for saving only the checkpoint.
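The save/load round trip can be sketched with pickle standing in for torch.save and torch.load, and a plain dict standing in for a module and its load_state_dict (the _sketch names are illustrative, not the mixin's API):

```python
import pickle

def save_checkpoint_sketch(state_dict, path):
    """Sketch of save_checkpoint: serialize only the state dict."""
    with open(path, "wb") as f:
        pickle.dump(state_dict, f)

def load_checkpoint_sketch(path, backbone_model=None):
    """Sketch of load_checkpoint: read the state dict back and, when a
    backbone is given, merge the delta weights into it."""
    with open(path, "rb") as f:
        state_dict = pickle.load(f)
    if backbone_model is not None:
        backbone_model.update(state_dict)  # dict.update stands in for load_state_dict
    return state_dict
```

Only the keys present in the checkpoint are overwritten; the backbone's frozen parameters are left untouched.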
- classmethod from_finetuned(finetuned_delta_path: Optional[Union[str, PathLike]], backbone_model: Module, delta_config=None, cache_dir: Optional[Union[str, PathLike]] = None, state_dict: Optional[dict] = None, *model_args, force_download: Optional[bool] = False, check_hash: Optional[bool] = True, local_files_only: Optional[bool] = False, **kwargs)[source]¶
Instantiate a finetuned delta model from a path. The backbone_model is set in evaluation mode by default using model.eval() (Dropout modules are deactivated). To further train the model, use the freeze_module method.
- Parameters
finetuned_delta_path – (optional) path to the directory where the model and its configuration file are saved. If not specified, the model will be loaded from the cache directory (see cache_dir).
backbone_model – the backbone model that will be used to instantiate the finetuned delta model.
delta_config – (optional) the configuration file of the finetuned delta model. If not specified, the configuration file is loaded from the directory finetuned_delta_path.
cache_dir – (optional) path to the directory where the model and its configuration file are cached. If not specified, we first look in the current working directory, then in the cache directory of your system, e.g., ~/.cache/delta_center/.
state_dict – (optional) a dictionary containing the model’s state_dict. If not specified, the state_dict is loaded from finetuned_delta_path.
force_download – (optional) if True, the model will be downloaded from the internet even if it is already present in the cache directory.
check_hash – (optional) if True, check whether the hash of the model being loaded differs from the hash recorded when it was trained.
local_files_only – (optional) if True, the model will only be loaded from the local cache directory.
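The documented lookup order when no explicit path is given can be sketched as follows (resolve_delta_path is an illustrative name, not the library's; the exact precedence of an explicit cache_dir is an assumption):

```python
from pathlib import Path

def resolve_delta_path(name, cache_dir=None):
    """Sketch of the lookup order: an explicit cache_dir first (assumed),
    then the current working directory, then the system cache
    (~/.cache/delta_center/)."""
    candidates = [Path.cwd() / name,
                  Path.home() / ".cache" / "delta_center" / name]
    if cache_dir is not None:
        candidates.insert(0, Path(cache_dir) / name)
    for candidate in candidates:
        if candidate.exists():
            return candidate
    return None  # the caller would then download from the DeltaCenter
```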
Visualization¶
- class Visualization(plm: Module)[source]¶
Visualization tool for big pretrained models, providing:
- better representation of repeated blocks,
- clearer parameter positions,
- visible parameter states.
- Parameters
plm (torch.nn.Module) – The pretrained model; in fact, any PyTorch module works.
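The "repeated block" idea can be sketched without torch: parameter names that differ only in a layer index are collapsed into one line carrying an index range (collapse_repeats is an illustrative helper, not the class's API):

```python
import re
from itertools import groupby

def collapse_repeats(names):
    """Collapse consecutive parameter names that differ only in their
    first numeric path component, e.g. layer.0 .. layer.2 -> layer.0-2."""
    def blocked(name):
        # Mask the first numeric component: "layer.0." -> "layer.*."
        return re.sub(r"\.\d+\.", ".*.", name, count=1)
    collapsed = []
    for key, group in groupby(names, key=blocked):
        group = list(group)
        if len(group) == 1:
            collapsed.append(group[0])
        else:
            first = re.search(r"\.(\d+)\.", group[0]).group(1)
            last = re.search(r"\.(\d+)\.", group[-1]).group(1)
            collapsed.append(key.replace(".*.", "." + first + "-" + last + "."))
    return collapsed
```

For a 24-layer transformer this turns 24 near-identical subtrees into a single annotated entry, which is what makes big models readable at a glance.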
Structure Map¶
Utility Functions¶
Hashing¶
- gen_parameter_hash(generator, md5=None)[source]¶
Get the hash of the model parameters. Adapted from https://zhuanlan.zhihu.com/p/392942816
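The pattern can be sketched with hashlib: feed the raw bytes of each parameter into one running MD5 digest (here `generator` yields byte strings; the real function iterates over model parameters, so this is a simplified stand-in):

```python
import hashlib

def gen_parameter_hash_sketch(generator, md5=None):
    """Accumulate one MD5 digest over all parameter bytes, reusing a
    caller-supplied digest when given (mirroring the md5 argument)."""
    md5 = md5 or hashlib.md5()
    for param_bytes in generator:
        md5.update(param_bytes)
    return md5
```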
Signature¶
Named-based addressing¶
- superstring_in(str_a: str, list_b: List[str])[source]¶
Check whether any string in list_b contains str_a as a substring.
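A minimal sketch of this check, written directly from the docstring:

```python
from typing import List

def superstring_in(str_a: str, list_b: List[str]) -> bool:
    """Return True if any string in list_b contains str_a as a substring."""
    return any(str_a in str_b for str_b in list_b)
```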
- is_child_key(str_a: str, list_b: List[str])[source]¶
Check whether any string in list_b is a child key of str_a.
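One plausible reading of the child-key relation, under the assumption that a child key equals the parent or extends it by dotted components (this is a sketch of the idea, not the library's exact matching rule):

```python
from typing import List

def is_child_key_sketch(str_a: str, list_b: List[str]) -> bool:
    """Return True if some key in list_b equals str_a or extends it by
    one or more dotted name components (assumed child-key semantics)."""
    return any(str_b == str_a or str_b.startswith(str_a + ".")
               for str_b in list_b)
```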