AWE is an advanced approach to word embedding that applies a weight to each word in a sentence, circumventing the weaknesses of simple averaging.
Word embeddings are the preferred method of representing words in natural language processing tasks. Word embedding techniques create a numerical representation of words such that words with similar meanings have similar representations.
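To make "similarly represented" concrete, here is a minimal sketch using cosine similarity on toy vectors. The embedding values below are invented for illustration and are not taken from any trained model.

```python
import numpy as np

# Toy 3-dimensional embeddings (illustrative values, not from a trained model).
embeddings = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "dog": np.array([0.85, 0.75, 0.2]),
    "car": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: close to 1.0 when the
    # vectors point in similar directions, i.e., the words are
    # similarly represented.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should score higher than unrelated ones.
cat_dog = cosine_similarity(embeddings["cat"], embeddings["dog"])
cat_car = cosine_similarity(embeddings["cat"], embeddings["car"])
```

With these toy values, `cat_dog` comes out much higher than `cat_car`, mirroring the idea that "cat" and "dog" share more meaning than "cat" and "car".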
Sentence representations are often obtained by averaging word embeddings because averaging is fast to compute. Although averaging works well when all words carry roughly equal importance, random outliers and extreme values cause problems. As a result, it is not well suited to applications such as text classification.
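The contrast between simple and weighted averaging can be sketched as follows. The word vectors and weights here are illustrative assumptions; in practice the weights might come from TF-IDF scores or a learned attention mechanism, as AWE-style methods propose.

```python
import numpy as np

def average_embedding(vectors):
    # Simple mean: every word contributes equally, so a single
    # outlier vector can dominate the sentence representation.
    return np.mean(np.asarray(vectors, dtype=float), axis=0)

def weighted_embedding(vectors, weights):
    # Weighted mean: each word's vector is scaled by its weight
    # (e.g., a TF-IDF or attention score) before summing,
    # damping the influence of unimportant or outlier words.
    v = np.asarray(vectors, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize weights to sum to 1
    return np.sum(v * w[:, None], axis=0)

# Two informative words plus one extreme outlier (toy 2-d vectors).
sentence = [[1.0, 0.0], [0.0, 1.0], [10.0, 10.0]]
weights = [0.45, 0.45, 0.10]  # low weight assigned to the outlier

simple = average_embedding(sentence)     # pulled toward the outlier
weighted = weighted_embedding(sentence, weights)
```

Here `simple` is roughly `[3.67, 3.67]`, dragged toward the outlier, while `weighted` stays near `[1.45, 1.45]`, closer to the two informative words.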