Enhanced QKNorm normalization for neural transformers with the Lp norm
arXiv:2602.05006v1 Announce Type: new
Abstract: Normalization of the query and key vectors is an essential part of the Transformer architecture. It keeps learning stable regardless of the scale of these vectors, and several normalization schemes have been proposed for this purpose. In this preliminary work, a generalization of the QKNorm normalization scheme is proposed. The approach is based on the Lp norm, allowing non-Euclidean norms to be employed. Experimental results demonstrate the suitability of the method on a simple problem.
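Since the abstract only describes the idea, the snippet below is a minimal, hypothetical sketch of what Lp-based query/key normalization could look like in PyTorch. The class name `LpQKNorm`, the learnable per-dimension scale `gamma`, and all hyperparameter defaults are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LpQKNorm(nn.Module):
    """Hypothetical sketch: QKNorm generalized to the Lp norm.

    Queries and keys are rescaled to (approximately) unit Lp norm along the
    head dimension before the attention dot product, with a learnable scale,
    in the spirit of QKNorm. Names and defaults are illustrative only.
    """

    def __init__(self, head_dim: int, p: float = 2.0, eps: float = 1e-6):
        super().__init__()
        self.p = p
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(head_dim))  # learnable scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., head_dim); divide by the Lp norm over the last dimension.
        norm = x.abs().pow(self.p).sum(dim=-1, keepdim=True).pow(1.0 / self.p)
        return self.gamma * x / (norm + self.eps)


# Usage: normalize queries and keys before scaled dot-product attention.
q_norm = LpQKNorm(head_dim=64, p=3.0)  # non-Euclidean choice, p != 2
k_norm = LpQKNorm(head_dim=64, p=3.0)
q = torch.randn(2, 8, 16, 64)          # (batch, heads, seq, head_dim)
k = torch.randn(2, 8, 16, 64)
v = torch.randn(2, 8, 16, 64)
out = F.scaled_dot_product_attention(q_norm(q), k_norm(k), v)
```

Setting p = 2 recovers the usual Euclidean QKNorm behavior, while other values of p give the non-Euclidean variants the abstract refers to.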