Setting up your own configurations

The augmentation method relies on default parameters for the model, the training and the generation. Depending on your data, these parameters should be adjusted.

The model parameters

Each model implemented in Pyraug requires a ModelConfig inheriting from the BaseModelConfig class to be built. Hence, to build a basic model, you need to run the following:

>>> from pyraug.models.my_model.my_model_config import MyModelConfig
>>> from pyraug.models.my_model.my_model import MyModel
>>> config = MyModelConfig(
...    input_dim=10, # Setting the data input dimension is needed if you do not use your own autoencoding architecture
...    # your parameters go here
... )
>>> m = MyModel(model_config=config) # Build the model

Let us now say that you want to override the model's default parameters. The only thing you have to do is pass your arguments to the ModelConfig dataclass.

Say we want to change the temperature T of the metric in a RHVAE model, which defaults to 1.5, and raise it to 2. Simply run the following:

>>> from pyraug.models.rhvae.rhvae_config import RHVAEConfig
>>> from pyraug.models import RHVAE
>>> config = RHVAEConfig(input_dim=10, temperature=2)
>>> m = RHVAE(model_config=config)
>>> m.temperature
Parameter containing:
tensor([2.])

Check out the documentation to see the whole list of parameters you can amend.
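
Since the configuration objects are dataclasses, you can also inspect the available parameters directly from Python. The snippet below is only a minimal sketch relying on the standard library dataclasses module; it assumes RHVAEConfig behaves like a plain dataclass.

>>> from dataclasses import fields
>>> from pyraug.models.rhvae.rhvae_config import RHVAEConfig
>>> for f in fields(RHVAEConfig):
...     print(f.name, f.default)  # each field is a parameter you can pass to the config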

The sampler parameters

To generate from a Pyraug model, a ModelSampler inheriting from BaseSampler is used. A ModelSampler is instantiated with a Pyraug model and a ModelSamplerConfig. Hence, as with the VAE models, the sampler parameters can be easily amended as follows:

>>> from pyraug.models.my_model.my_model_config import MyModelSamplerConfig
>>> from pyraug.models.my_model.my_model_sampler import MyModelSampler
>>> config = MyModelSamplerConfig(
...    # your parameters go here
... )
>>> m = MyModelSampler(model=my_model, sampler_config=config) # Build the sampler

Let us now say that you want to override the sampler's default parameters. The only thing you have to do is pass your arguments to the ModelSamplerConfig dataclass.

Say we want to change the number of leapfrog steps in the RHVAESampler, which defaults to 15, and set it to 5. Your code should look like the following:

>>> from pyraug.models import RHVAE
>>> from pyraug.models.rhvae import RHVAESampler, RHVAESamplerConfig, RHVAEConfig
>>> custom_sampler_config = RHVAESamplerConfig(
...   n_lf=5
... ) # Set up sampler config
>>> custom_sampler = RHVAESampler(
...     model=model, sampler_config=custom_sampler_config
... ) # Build sampler
>>> custom_sampler.n_lf
tensor([5])
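
Once built, the custom sampler can be used to generate new data from the model. The call below is only a hypothetical sketch: the method name and its argument are assumptions made for illustration, so refer to the RHVAESampler documentation for the actual generation API.

>>> # hypothetical call: the method name and argument are assumptions, not the documented API
>>> generated = custom_sampler.sample(num_samples=10)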

Check out the documentation to see the whole list of parameters you can amend.

The Trainer parameters

As with the VAE models, the Trainer instance can be created with default parameters, or you can easily amend them the same way as for the models.

Say you want to train your model for 10 epochs, with no early stopping on the train set and a learning_rate of 0.1:

>>> from pyraug.trainers.training_config import TrainingConfig
>>> config = TrainingConfig(
...    max_epochs=10, learning_rate=0.1, train_early_stopping=None)
>>> config
TrainingConfig(output_dir=None, batch_size=50, max_epochs=10, learning_rate=0.1, train_early_stopping=None, eval_early_stopping=None, steps_saving=1000, seed=8, no_cuda=False, verbose=True)
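
To use this configuration, pass it to the Trainer along with your model and training data. The snippet below is only a sketch: my_model and train_data are placeholders, and the keyword names are assumptions, so check the Trainer documentation for the exact signature.

>>> from pyraug.trainers import Trainer
>>> trainer = Trainer(
...     model=my_model,            # placeholder for your Pyraug model
...     train_data=train_data,     # placeholder for your training data
...     training_config=config     # the custom training configuration built above
... )  # keyword names are assumptions, see the Trainer documentation
>>> trainer.train()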

You can find a comprehensive description of all the Trainer parameters you can set in TrainingConfig.