While training a Long Short-Term Memory (LSTM) deep neural network, it is very important to ensure that the model does not over-fit. An over-fitting model performs poorly when tested against real-world data: it has effectively memorised the training set, so it achieves high accuracy on that data, but it fails to generalise when faced with new data it has not seen before.
The following figure illustrates how to judge whether the model is over-fitting.
Figure: Explaining how training accuracy can be used to gauge model over-fitting
The training-set and test-set accuracy can also be used to gauge over-fitting: if a model yields high accuracy on the training set but performs poorly on the test data, this is a further indication that the model is over-fitting.
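This accuracy-gap check can be sketched in a few lines of Python. The function name and the gap threshold below are illustrative assumptions, not part of any particular library:

```python
# Hypothetical sketch: flagging over-fitting from the gap between
# training accuracy and test accuracy. The 0.10 threshold is an
# illustrative assumption and would be tuned per problem.

def is_overfitting(train_acc, test_acc, gap_threshold=0.10):
    """Flag a model as over-fitting when its training accuracy
    greatly exceeds its test accuracy, suggesting the model has
    memorised the training set rather than generalised from it."""
    return (train_acc - test_acc) > gap_threshold

# A model that memorised its training data: near-perfect on the
# training set, much worse on unseen test data.
print(is_overfitting(0.99, 0.72))  # True: large gap, over-fitting

# A model that generalises: similar accuracy on both sets.
print(is_overfitting(0.85, 0.83))  # False: small gap
```

In practice the same comparison is made continuously during training by tracking training and validation accuracy per epoch, which is what the figure above depicts.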
Reference 1: http://cs231n.github.io/neural-networks-3/