Encodage positionnel

under construction

Definition

Positional encoding is a technique used in transformer models to inject information about the position of each token into its representation, since the self-attention mechanism is otherwise insensitive to word order. It allows the model to take sequential order into account within a sentence and between sentences.

French

Encodage positionnel

English

Positional Encoding

PE

Positional encoding refers to a method in transformer models that helps to maintain the sequential order of words. This is a crucial component for understanding the context within a sentence and between sentences.
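
As an illustrative sketch (not drawn from the cited sources), the widely used sinusoidal positional encoding introduced with the original transformer architecture can be computed as below. The function name, array shapes and the toy dimensions are assumptions made for the example, not part of this entry.

<syntaxhighlight lang="python">
# Illustrative sketch of sinusoidal positional encoding:
#   PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
#   PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of positional encodings."""
    positions = np.arange(seq_len)[:, np.newaxis]        # (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]             # (1, d_model)
    # Frequency term: 10000^(2*floor(i/2) / d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                      # (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])  # even dimensions use sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])  # odd dimensions use cosine
    return pe

# Toy usage: 10 tokens with an embedding size of 16.
embeddings = np.random.randn(10, 16)
embeddings_with_position = embeddings + sinusoidal_positional_encoding(10, 16)
</syntaxhighlight>

The resulting matrix is simply added to the token embeddings, so two identical words appearing at different positions receive distinct representations.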

Sources

Source: kdnuggets (https://www.kdnuggets.com/generative-ai-key-terms-explained)

Source: Termium Plus

Contributors: Arianne, Patrick Drouin, wiki