Auto Classes¶
AutoDeltaConfig¶
- class AutoDeltaConfig(*args, **kwargs)[source]¶
This is a generic configuration class that will be instantiated as one of the configuration classes of the library when created with the from_finetuned() or from_dict() class method. This class cannot be instantiated directly using __init__() (it throws an error).
- classmethod from_dict(config_dict: Dict[str, Any], **kwargs)[source]¶
Instantiate a DeltaConfig according to the dict. Automatically loads the config class specified by delta_type.
- Parameters
config_dict (dict) – The dict of configs of the delta model.
kwargs – Other keyword arguments passed to initialize the config.
Examples:
config = AutoDeltaConfig.from_dict({"delta_type": "lora"})  # loads the default lora config
config = AutoDeltaConfig.from_dict({"delta_type": "lora", "lora_r": 5})  # loads the default lora config, with lora_r = 5
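The dispatch behind from_dict() can be pictured as a registry lookup keyed by delta_type: the key selects the concrete config class, and the remaining dict entries override that class's defaults. Below is a minimal, self-contained sketch of this pattern in plain Python; the class and registry names are illustrative only, not OpenDelta's actual internals:

```python
# Minimal sketch of AutoConfig-style dispatch keyed by delta_type.
# LoraConfig and CONFIG_REGISTRY are hypothetical stand-ins, not
# OpenDelta's real classes.

class LoraConfig:
    def __init__(self, lora_r=8, **kwargs):
        self.delta_type = "lora"
        self.lora_r = lora_r

CONFIG_REGISTRY = {"lora": LoraConfig}

def config_from_dict(config_dict):
    # Pop delta_type to pick the concrete config class; the remaining
    # keys override that class's default arguments.
    d = dict(config_dict)
    delta_type = d.pop("delta_type")
    return CONFIG_REGISTRY[delta_type](**d)

cfg = config_from_dict({"delta_type": "lora", "lora_r": 5})
# cfg.lora_r is now 5 instead of the default 8
```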
- classmethod from_finetuned(finetuned_delta_path, **kwargs)[source]¶
Instantiate one of the configuration classes of the library from a finetuned delta model configuration. The configuration class to instantiate is selected based on the delta_type property of the config object that is loaded.
- Parameters
finetuned_delta_path (str or os.PathLike, optional) – Can be either:
A string, the model id of a finetuned delta model configuration hosted inside a model repo on huggingface.co. Valid model ids can be located at the root level, like Davin/lora, or namespaced under a user or organization name, like DeltaHub/lora_t5-base_mrpc.
A path to a directory containing a configuration file saved using the save_finetuned() method, e.g., ./my_model_directory/.
A path or url to a saved configuration JSON file, e.g., ./my_model_directory/configuration.json.
cache_dir (str or os.PathLike, optional) – Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used.
Examples:
from transformers import AutoConfig
delta_config = AutoDeltaConfig.from_finetuned("thunlp/FactQA_T5-large_Adapter")
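The three accepted forms of finetuned_delta_path can be distinguished with ordinary filesystem checks. The following is a minimal sketch of such resolution logic (the function name and return labels are hypothetical, not OpenDelta's implementation):

```python
import os

def resolve_config_source(path):
    # Hypothetical helper: classify a finetuned_delta_path argument
    # into the three forms the docs describe.
    if os.path.isdir(path):
        return "directory"   # saved with save_finetuned()
    if path.endswith(".json"):
        return "json_file"   # direct path/url to a config JSON
    return "model_id"        # e.g. "DeltaHub/lora_t5-base_mrpc" on the hub
```

A real loader would then fetch a model id from the hub, while local directories and JSON files are read directly.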
AutoDeltaModel¶
- class AutoDeltaModel(*args, **kwargs)[source]¶
- classmethod from_config(config, backbone_model, **kwargs) → DeltaBase[source]¶
Automatically instantiates a delta model based on the config. The delta model corresponding to the delta config will be loaded and initialized using the arguments in config.
Note
Using the from_config() method alone will not load the finetuned weight file (e.g., pytorch_model.bin). Please use from_finetuned() directly.
- Parameters
config (BaseDeltaConfig) –
backbone_model (nn.Module) –
Examples:
config = AutoDeltaConfig.from_finetuned("DeltaHub/lora_t5-base_mrpc")
delta_model = AutoDeltaModel.from_config(config, backbone_model)
- classmethod from_finetuned(finetuned_delta_path, backbone_model, *model_args, **kwargs) → DeltaBase[source]¶
Automatically instantiates a delta model and loads the finetuned checkpoint based on the finetuned_delta_path, which can either be a string pointing to a local path or a url pointing to the delta hub. It will check the hash after loading the delta model to verify that the correct backbone and delta checkpoint are used.
- Parameters
finetuned_delta_path (str or os.PathLike, optional) – Can be either:
A string, the model name of a finetuned delta model configuration hosted inside a model repo on Delta Center, like thunlp/FactQA_T5-large_Adapter.
A path to a directory containing a configuration file saved using the save_finetuned() method, e.g., ./my_model_directory/.
A path or url to a saved configuration JSON file, e.g., ./my_model_directory/configuration.json.
The last two options are not tested but are inherited from huggingface.
backbone_model (nn.Module) – The backbone model to be modified.
model_args – Other arguments for initializing the model. See DeltaBase.from_finetuned for details.
kwargs – Other kwargs that will be passed into DeltaBase.from_finetuned. See DeltaBase.from_finetuned for details.
Example:
delta_model = AutoDeltaModel.from_finetuned("thunlp/FactQA_T5-large_Adapter", backbone_model=backbone_model)
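The hash check mentioned above can be pictured as comparing a digest of the loaded checkpoint file against one recorded at save time. A minimal sketch using hashlib follows; the function names are hypothetical and this is not OpenDelta's actual verification mechanism:

```python
import hashlib

def checkpoint_digest(path, chunk_size=1 << 20):
    # Stream the checkpoint file through SHA-256 so large weight
    # files do not need to fit in memory at once.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_checkpoint(path, expected_digest):
    # Raise if the delta checkpoint does not match the recorded hash,
    # which would indicate a wrong backbone/delta pairing.
    actual = checkpoint_digest(path)
    if actual != expected_digest:
        raise ValueError(f"hash mismatch: {actual} != {expected_digest}")
```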