Question : Why Recurrent Neural Networks, when we already have Word2Vec?
Answer : Word2Vec is able to capture the inter-relationship between words; where it fails is in capturing the intra-relationship across a sequence. RNNs are able to capture this intra-relationship. e.g.
Word2Vec: Given a document “Capital of US is Washington. It’s capital since 1970.”
Word2Vec can capture the relationship that “Washington” is related to “Capital” and “US”.

However, Word2Vec cannot find the intra-relationship between the two sentences above, i.e. it is not able to predict the second sentence given the first. This is where an RNN’s true potential lies.
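A minimal sketch of the limitation, using made-up 3-dimensional vectors (not real Word2Vec output): the common trick of averaging word vectors into a sentence vector throws away word order entirely, so no intra-sentence relationship survives.

```python
# Toy hand-made vectors -- placeholders, not trained Word2Vec embeddings.
vectors = {
    "capital":    [1.0, 0.25, 0.0],
    "of":         [0.25, 0.25, 0.25],
    "us":         [1.0, 0.5, 0.25],
    "is":         [0.25, 0.5, 0.25],
    "washington": [1.0, 0.5, 0.25],
}

def sentence_vector(words):
    """Average the word vectors (a common bag-of-vectors sentence trick)."""
    dims = len(next(iter(vectors.values())))
    return [sum(vectors[w][d] for w in words) / len(words) for d in range(dims)]

original = sentence_vector(["capital", "of", "us", "is", "washington"])
shuffled = sentence_vector(["washington", "is", "capital", "of", "us"])

# Both orderings collapse to the identical vector: the representation
# keeps the inter-word relationships but no sequence information.
print(original == shuffled)  # True
```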

Question : What are some cases where this intra-relationship is useful?
Answer : Some cases where the intra-relationship is extremely useful are:
1. Language Translation: e.g. find the intra-relationship between the source and target languages.
2. Speech Recognition: find the intra-relationship between the voice and the words.
[Screenshot: language translation example]

Question : Well, I see in the screenshot that, for two three-word sentences, the translated outputs are of varying length, i.e. 2 and 3 words respectively. Isn’t this varying length a problem?
Answer: Yes, it is a problem, since it implies that an n-word sentence, when translated, can be of arbitrary length n - m, where m varies in both the positive and negative direction.
This is extremely problematic for Word2Vec-based approaches, since they require a fixed length; RNNs, however, are very effective at handling such varying lengths.
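How an RNN sidesteps the fixed-length requirement can be sketched as follows (placeholder weights, not a trained model): the same cell is simply applied once per token, so sentences of any length fold into a hidden state of one fixed size, which downstream layers can consume uniformly.

```python
import math

def rnn_encode(sequence, hidden_size=3):
    """Fold a sequence of any length into one fixed-size hidden state.
    The 0.5 recurrence weight is an arbitrary placeholder."""
    h = [0.0] * hidden_size
    for x in sequence:
        # each hidden unit mixes its previous value with the current input
        h = [math.tanh(0.5 * h[i] + x) for i in range(hidden_size)]
    return h

short = rnn_encode([0.1, 0.7])        # 2-token "sentence"
longer = rnn_encode([0.1, 0.7, 0.3])  # 3-token "sentence"

# Both summaries have the same dimensionality regardless of input length,
# so e.g. a decoder producing a translation sees a fixed-size input.
print(len(short), len(longer))  # 3 3
```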