Word embedding is a core concept in NLP, and it's important to understand what it is and what it is used for.
So what is word embedding? Simple: all machine learning algorithms and deep learning models operate on numbers, but in NLP we deal with text. We need a way to convert that text into numbers in the best possible way, so that our deep learning models produce the best results.
A more technical way of putting this: we map our words into a vector space so that we can process them further in our pipelines. So in NLP, the first step is usually "text preprocessing" (which we covered in an earlier blog post) and the second step is "vectorization".
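To make the idea of vectorization concrete, here is a minimal sketch of the simplest possible scheme, one-hot encoding, written in plain Python. The function names (`build_vocab`, `one_hot`) and the toy sentences are made up for illustration; real pipelines use learned embeddings or library vectorizers rather than hand-rolled code like this.

```python
def build_vocab(sentences):
    """Map each unique word to an integer index."""
    vocab = {}
    for sentence in sentences:
        for word in sentence.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab)
    return vocab

def one_hot(word, vocab):
    """Return a vector of zeros with a 1 at the word's index."""
    vec = [0] * len(vocab)
    vec[vocab[word.lower()]] = 1
    return vec

# Toy corpus: 7 unique words -> 7-dimensional one-hot vectors.
sentences = ["NLP models need numbers", "numbers come from text"]
vocab = build_vocab(sentences)
print(one_hot("numbers", vocab))  # -> [0, 0, 0, 1, 0, 0, 0]
```

One-hot vectors show why text-to-number conversion matters, but they are sparse and carry no notion of meaning; word embeddings improve on this by placing similar words close together in a dense vector space.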