https://medium.com/@manansuri/a-dummys-guide-to-word2vec-456444f3c673
Encoding text as a vector of unordered word counts can be done with a bag-of-words representation. However, bag-of-words captures no similarity between words, which is why vector embeddings such as word2vec come into play.
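A minimal sketch of the bag-of-words limitation (the vocabulary and sentences are made-up examples): two sentences that differ only in the near-synonyms "good" and "great" share no overlap on those dimensions, because each word is an independent axis.

```python
from collections import Counter

def bag_of_words(sentence, vocab):
    # Count each vocabulary word's occurrences; word order is discarded.
    counts = Counter(sentence.lower().split())
    return [counts[w] for w in vocab]

vocab = ["the", "movie", "was", "good", "great"]
v1 = bag_of_words("the movie was good", vocab)   # [1, 1, 1, 1, 0]
v2 = bag_of_words("the movie was great", vocab)  # [1, 1, 1, 0, 1]

# "good" and "great" occupy separate dimensions, so they contribute
# nothing to the dot product: bag-of-words sees them as unrelated.
overlap = sum(a * b for a, b in zip(v1, v2))
print(overlap)  # 3 -- only the shared words "the movie was" overlap
```

A word2vec model, by contrast, learns dense vectors from word contexts, so "good" and "great" would end up close together (high cosine similarity) because they appear in similar surroundings.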