The fast-moving field of Natural Language Processing (NLP) requires capturing the complexities of human language—context, emotion, and semantics—to build intelligent applications. Word2Vec, introduced by Mikolov et al. in 2013, transformed how we represent words by learning embeddings from co-occurrence patterns. Nonetheless, it is limited in its ability to convey contextual meaning and emotional nuance.
The Contextual and Emotional Word Embeddings (CEWE) algorithm is a proposed method that mathematically incorporates both context and emotion into word embeddings. It introduces features such as emotion-enhancement functions, adaptive weighting factors, and multidimensional context to capture richer linguistic associations. CEWE is intended to complement Word2Vec rather than fully replace it.
Note that this is a theoretical framework that still needs validation on real-world data.
The context window defines the set of terms that surround a particular word $w$. Unlike Word2Vec's fixed window, we dynamically adjust the context window size $k(w)$ based on linguistic cues:
$C(w) = \{ w_{-k(w)}, \dots, w_{-1}, w_{+1}, \dots, w_{+k(w)} \}$
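The dynamic window $C(w)$ can be sketched in a few lines. The heuristic below for choosing $k(w)$ (widening the window for longer, typically more content-bearing words) is an illustrative placeholder, since the framework does not yet specify which linguistic cues to use; `window_size` and `context` are hypothetical names.

```python
def window_size(word, base_k=2, max_k=4):
    """Illustrative linguistic cue: widen the window for longer words.
    This heuristic is an assumption, not part of the CEWE definition."""
    return min(max_k, base_k + (1 if len(word) > 6 else 0))

def context(tokens, i):
    """Return C(w) for the word at position i: up to k(w) tokens on
    each side, clipped at sentence boundaries."""
    k = window_size(tokens[i])
    left = tokens[max(0, i - k):i]
    right = tokens[i + 1:i + 1 + k]
    return left + right

sentence = "embeddings capture context emotion and semantics".split()
print(context(sentence, 2))
```

Clipping at sentence boundaries keeps the definition well-formed for words near the start or end of a sequence, where fewer than $k(w)$ neighbors exist.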