Positional encoding layer based on a sine/cosine matrix of different frequencies.
Usage
layer_pos_sinusoid_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  n = 10000,
  load_r6 = FALSE,
  embed_dim = 64
)
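A minimal usage sketch, assuming the wrapper returns a Keras layer object that can be applied to a tensor like any built-in layer; the input shape and surrounding model code below are illustrative and not taken from the package.

library(keras)  # plus the package that provides layer_pos_sinusoid_wrapper

# Hypothetical model: integer-encoded sequences of length 100 over a
# 4-character vocabulary. embed_dim = 64 adds a token embedding because
# the input is not one-hot encoded.
input <- layer_input(shape = c(100))
pos_enc <- layer_pos_sinusoid_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  n = 10000,
  load_r6 = FALSE,
  embed_dim = 64
)
output <- pos_enc(input)            # assumed to behave like a standard Keras layer
model <- keras_model(input, output)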
Arguments
- maxlen
Length of predictor sequence.
- vocabulary_size
Number of unique characters in the vocabulary.
- n
Controls the frequencies of the sine/cosine waves used for positional encoding. Only applied if pos_encoding = "sinusoid".
- load_r6
Whether to load the R6 layer class.
- embed_dim
Dimension of the token embedding; set to 0 for no embedding. An embedding should be used when the input is an integer sequence rather than one-hot encoded.
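For reference, the sine/cosine matrix produced by this kind of layer is the standard sinusoidal positional encoding. The standalone R sketch below shows one common way to compute it from maxlen, embed_dim and n; the helper name positional_encoding_matrix is made up for illustration and is not the layer's internal implementation.

# Illustration only: maxlen rows (positions), embed_dim columns (encoding dimensions).
positional_encoding_matrix <- function(maxlen = 100, embed_dim = 64, n = 10000) {
  pe <- matrix(0, nrow = maxlen, ncol = embed_dim)
  position <- 0:(maxlen - 1)
  for (i in seq(0, embed_dim - 2, by = 2)) {
    freq <- 1 / n^(i / embed_dim)        # larger n gives lower-frequency waves
    pe[, i + 1] <- sin(position * freq)  # sine for even dimension indices
    pe[, i + 2] <- cos(position * freq)  # cosine for odd dimension indices
  }
  pe
}

pe <- positional_encoding_matrix()
dim(pe)  # 100 64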