Reference

panndas.nn

class panndas.nn.AdditiveSkip(block)

A Module that applies an additive “skip” connection around the provided Module.

forward(xs)

Returns the sum of xs and the wrapped block applied to xs.

show()

Displays the Module.
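
Example (a minimal usage sketch; the two-feature DataFrame xs and the Identity block are hypothetical choices, not taken from the library docs):

    import pandas as pd
    from panndas import nn

    xs = pd.DataFrame([[1.0, -2.0, 0.5],
                       [0.0,  3.0, -1.0]],
                      index=["f0", "f1"])

    skip = nn.AdditiveSkip(nn.Identity())
    ys = skip(xs)  # assuming the skip computes xs + block(xs); with Identity, 2 * xs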

class panndas.nn.AlphaDropout(p, alpha=0.0)

A Dropout variant in which dropped entries are replaced by alpha (default 0.0).

forward(xs)

Applies alpha dropout to xs.

show()

Displays the Module.

class panndas.nn.Dropout(p)

A Module that randomly drops entries of its input with probability p.

class panndas.nn.Identity

A Module that returns its inputs unaltered.

forward(xs)

Returns xs unaltered.

class panndas.nn.LayerMaxNorm

Normalizes across the feature dimension with respect to the infinity norm.

forward(xs)

Normalizes xs across the feature dimension with respect to the infinity norm.
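
A pandas sketch of the computation (an assumption about the implementation, not taken from the library source), with features on the rows as elsewhere in panndas.nn:

    import pandas as pd

    def layer_max_norm(xs: pd.DataFrame) -> pd.DataFrame:
        # Divide each column (sample) by its largest absolute feature value,
        # i.e. by the infinity norm taken over the feature (row) dimension.
        return xs / xs.abs().max(axis="index")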

class panndas.nn.Linear(weights_df, bias_series=-1)

A Module that multiplies its inputs by the weights_df and adds the bias_series.

Input ‘tensors’ can be at most 2-D here: feature (rows) and batch/sequence (columns).

The weights dataframe should have the input feature space as its column index and the output feature space as its row index.

forward(xs)

Multiplies xs by the weights_df and adds the bias_series.

show()

Displays the Module.
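
Example (a minimal sketch; the feature names, weights, bias, and samples below are hypothetical):

    import pandas as pd
    from panndas import nn

    # Input feature space on the columns, output feature space on the rows.
    weights_df = pd.DataFrame([[1.0, 0.0],
                               [0.0, 1.0],
                               [1.0, -1.0]],
                              index=["h0", "h1", "h2"],   # output features
                              columns=["x0", "x1"])       # input features
    bias_series = pd.Series([0.0, 0.5, -0.5], index=["h0", "h1", "h2"])

    linear = nn.Linear(weights_df, bias_series)

    # Features on the rows, batch/sequence on the columns.
    xs = pd.DataFrame([[1.0, 2.0],
                       [3.0, 4.0]],
                      index=["x0", "x1"], columns=["s0", "s1"])
    ys = linear(xs)  # roughly weights_df @ xs, with bias_series added to each column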

class panndas.nn.LinearAttention(queries_df, keys_df, values_df)

The most basic version of an attention layer.

forward(xs)

Combines queries, keys, and values linearly.

class panndas.nn.Mish

Applies the Mish function, element-wise.

For details, see Mish: A Self-Regularized Non-Monotonic Neural Activation Function.

forward(xs)

Applies the Mish function, element-wise.
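
A sketch of the underlying math (not the library source): mish(x) = x * tanh(softplus(x)), with softplus(x) = ln(1 + exp(x)).

    import numpy as np
    import pandas as pd

    def mish(xs: pd.DataFrame) -> pd.DataFrame:
        # Element-wise Mish: x * tanh(ln(1 + exp(x))).
        return xs * np.tanh(np.log1p(np.exp(xs)))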

class panndas.nn.Module

An object that is callable via its .forward method.

abstract forward(xs)

Minimally, define this method to define a Module.

show()

Displays the Module.
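
Example of defining a custom Module (an illustrative sketch; the Double class below is not part of panndas):

    import pandas as pd
    from panndas import nn

    class Double(nn.Module):
        def forward(self, xs):
            return 2 * xs

    xs = pd.DataFrame([[1.0, 2.0], [3.0, 4.0]])
    ys = Double()(xs)  # equivalent to Double().forward(xs)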

class panndas.nn.ReLU

Ol’ ReLU-iable.

forward(xs)

Applies the ReLU function, element-wise.

class panndas.nn.Sequential(modules)

A Module that applies an iterable of Modules sequentially.

forward(xs)

Applies each Module in modules to xs, in order.

show()

Displays the Module.
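
Example (a minimal sketch; weights_df, bias_series, and xs are the hypothetical objects from the Linear entry above):

    from panndas import nn

    mlp = nn.Sequential([nn.Linear(weights_df, bias_series), nn.ReLU()])
    ys = mlp(xs)  # ReLU(Linear(xs))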

class panndas.nn.Sigmoid

Applies the sigmoid function, element-wise.

forward(xs)

Applies the sigmoid function, element-wise.
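
A sketch of the underlying math (not the library source): sigmoid(x) = 1 / (1 + exp(-x)), applied entry-wise.

    import numpy as np
    import pandas as pd

    def sigmoid(xs: pd.DataFrame) -> pd.DataFrame:
        return 1.0 / (1.0 + np.exp(-xs))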

class panndas.nn.Softmax

Applies the softmax function, column-wise.

forward(xs)

Applies the softmax function, column-wise.
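
A pandas sketch of a column-wise softmax (an illustration of the math, not necessarily the library source):

    import numpy as np
    import pandas as pd

    def softmax_columns(xs: pd.DataFrame) -> pd.DataFrame:
        # Subtract each column's max for numerical stability, exponentiate,
        # then normalize so that every column sums to one.
        zs = np.exp(xs - xs.max(axis="index"))
        return zs / zs.sum(axis="index")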

class panndas.nn.SoftmaxAttention(queries_df, keys_df, values_df)

The best-known version of an attention layer.

forward(xs)

Uses a softmax over the sequence dim to select which values to attend to.
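
A generic pandas sketch of softmax attention (an illustration of the idea, not the class internals; q, k, and v are already-projected DataFrames with features on the rows and sequence positions on the columns, and k and v share sequence labels):

    import numpy as np
    import pandas as pd

    def softmax_attention(q, k, v):
        scores = k.T @ q                               # sequence x sequence similarities
        weights = np.exp(scores - scores.max(axis="index"))
        weights = weights / weights.sum(axis="index")  # softmax over the sequence dim
        return v @ weights                             # weighted combination of the values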

class panndas.nn.Softplus

Applies the softplus function, element-wise.

forward(xs)

Applies the softplus function, element-wise.
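
A sketch of the underlying math (not the library source): softplus(x) = ln(1 + exp(x)), applied entry-wise.

    import numpy as np
    import pandas as pd

    def softplus(xs: pd.DataFrame) -> pd.DataFrame:
        return np.log1p(np.exp(xs))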