Module flowcon.flows.autoregressive
Implementations of autoregressive flows.
Classes
class MaskedAutoregressiveFlow (features, hidden_features, num_layers, num_blocks_per_layer, use_residual_blocks=True, use_random_masks=False, use_random_permutations=False, activation=F.relu, dropout_probability=0.0, batch_norm_within_layers=False, batch_norm_between_layers=False)
An autoregressive flow that uses affine transforms with masking.
Reference:
G. Papamakarios et al., Masked Autoregressive Flow for Density Estimation, Advances in Neural Information Processing Systems, 2017.
Constructor.
Args
transform
- A Transform object; it transforms data into noise.
distribution
- A Distribution object; the base distribution of the flow that generates the noise.
embedding_net
- An nn.Module which has trainable parameters to encode the context (condition). It is trained jointly with the flow.
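These arguments document the Flow base-class constructor, which MaskedAutoregressiveFlow fills in internally (see the source below); the class itself is instantiated with the hyperparameters from the signature at the top of this entry. A minimal construction sketch; the argument names come from that signature, while the values are purely illustrative:

    from flowcon.flows.autoregressive import MaskedAutoregressiveFlow

    # Illustrative hyperparameters for 10-dimensional data; only the argument
    # names come from the class signature, the values are made up.
    flow = MaskedAutoregressiveFlow(
        features=10,                 # dimensionality of the modelled data
        hidden_features=64,          # width of the MADE hidden layers
        num_layers=5,                # number of (permutation, MADE transform) pairs
        num_blocks_per_layer=2,      # blocks inside each MADE network
        use_residual_blocks=True,
        batch_norm_between_layers=True,
    )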
Source code

    class MaskedAutoregressiveFlow(Flow):
        """An autoregressive flow that uses affine transforms with masking.

        Reference:
        > G. Papamakarios et al., Masked Autoregressive Flow for Density Estimation,
        > Advances in Neural Information Processing Systems, 2017.
        """

        def __init__(
            self,
            features,
            hidden_features,
            num_layers,
            num_blocks_per_layer,
            use_residual_blocks=True,
            use_random_masks=False,
            use_random_permutations=False,
            activation=F.relu,
            dropout_probability=0.0,
            batch_norm_within_layers=False,
            batch_norm_between_layers=False,
        ):
            if use_random_permutations:
                permutation_constructor = RandomPermutation
            else:
                permutation_constructor = ReversePermutation

            layers = []
            for _ in range(num_layers):
                layers.append(permutation_constructor(features))
                layers.append(
                    MaskedAffineAutoregressiveTransform(
                        features=features,
                        hidden_features=hidden_features,
                        num_blocks=num_blocks_per_layer,
                        use_residual_blocks=use_residual_blocks,
                        random_mask=use_random_masks,
                        activation=activation,
                        dropout_probability=dropout_probability,
                        use_batch_norm=batch_norm_within_layers,
                    )
                )
                if batch_norm_between_layers:
                    layers.append(BatchNorm(features))

            super().__init__(
                transform=CompositeTransform(layers),
                distribution=StandardNormal([features]),
            )
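As the source shows, the flow composes alternating feature permutations and masked affine autoregressive transforms over a StandardNormal base, so its density can be evaluated exactly. A minimal maximum-likelihood training sketch, assuming flowcon keeps the nflows-style log_prob(inputs) method of the Flow/Distribution base classes:

    import torch
    from flowcon.flows.autoregressive import MaskedAutoregressiveFlow

    flow = MaskedAutoregressiveFlow(
        features=10, hidden_features=64, num_layers=5, num_blocks_per_layer=2
    )
    optimizer = torch.optim.Adam(flow.parameters(), lr=1e-3)

    for step in range(1000):
        x = torch.randn(256, 10)          # stand-in for a batch of training data
        loss = -flow.log_prob(x).mean()   # negative log-likelihood
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()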
Ancestors
- Flow
- Distribution
- torch.nn.modules.module.Module
Class variables
var call_super_init : bool
var dump_patches : bool
var training : bool
Inherited members
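The members inherited from Flow and Distribution provide the user-facing interface of the class. Assuming flowcon preserves the nflows Flow API, the most commonly used inherited methods are log_prob, sample, and transform_to_noise; a short usage sketch:

    import torch
    from flowcon.flows.autoregressive import MaskedAutoregressiveFlow

    flow = MaskedAutoregressiveFlow(
        features=10, hidden_features=64, num_layers=5, num_blocks_per_layer=2
    )

    with torch.no_grad():
        x = torch.randn(8, 10)           # stand-in data batch
        log_px = flow.log_prob(x)        # per-example log-density, shape (8,)
        z = flow.transform_to_noise(x)   # map data to the StandardNormal base
        samples = flow.sample(100)       # draw 100 samples; the inverse pass is
                                         # sequential, so sampling is slower than log_prob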