Positional encoding layer with learned embedding.
     
    
    Usage
layer_pos_embedding_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  load_r6 = FALSE,
  embed_dim = 64
)
 
    
    Arguments
- maxlen: Length of the predictor sequence.
- vocabulary_size: Number of unique characters in the vocabulary.
- load_r6: Whether to load the R6 layer class.
- embed_dim: Dimension of the token embedding. No token embedding is applied if set to 0.
  Use a non-zero value when the input is not one-hot encoded but given as an integer
  sequence (see the sketch after this list).
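
    A rough sketch of how embed_dim selects between the two input encodings; the
    argument values are illustrative and assume a working keras/tensorflow setup:

# Integer-encoded input: tokens are embedded first (embed_dim > 0),
# with the learned positional embedding applied on top.
pos_emb_int <- layer_pos_embedding_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  embed_dim = 64
)

# One-hot encoded input of shape (maxlen, vocabulary_size): set
# embed_dim = 0 so only the positional embedding is applied.
pos_emb_onehot <- layer_pos_embedding_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  embed_dim = 0
)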
 
    
    Value
    A keras layer implementing positional embedding.
     
    
    Examples
    if (FALSE) { # reticulate::py_module_available("tensorflow")
l <- layer_pos_embedding_wrapper() # layer with the default arguments shown under Usage
}
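
    A slightly fuller sketch that applies the layer to a keras input tensor; the input
    dtype and the output shape noted in the comments are assumptions for illustration,
    not taken from the package documentation:

if (FALSE) { # reticulate::py_module_available("tensorflow")
library(keras)
# Integer-encoded predictor sequence of length maxlen (dtype assumed).
inp <- layer_input(shape = c(100), dtype = "int32")
pos_emb <- layer_pos_embedding_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  embed_dim = 64
)
# Call the layer on the input tensor; the expected output shape
# (batch, 100, 64) is an assumption based on embed_dim = 64.
out <- pos_emb(inp)
}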