Finding words (or sentences, or documents) with the same meaning is a general problem in NLP (Natural Language Processing), and deep learning has helped advance this field.
For example, the word2vec approach lets you derive relationships from a text corpus such as "man is to king as woman is to ?", where "?" should be replaced by "queen". It's amazing stuff. In addition, you can train not only word-to-word similarity but also word-to-phrase similarity.
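The "man : king :: woman : ?" analogy is usually answered by vector arithmetic: take vec(king) - vec(man) + vec(woman) and find the nearest word vector by cosine similarity. Here is a minimal sketch of that arithmetic using tiny made-up 3-dimensional vectors; real word2vec embeddings are learned from a large corpus (such as Google News) and typically have hundreds of dimensions.

```python
import math

# Toy hand-crafted vectors purely for illustration; real embeddings
# would come from training word2vec on a large corpus.
vectors = {
    "man":   [1.0, 0.0, 0.0],
    "woman": [1.0, 1.0, 0.0],
    "king":  [1.0, 0.0, 1.0],
    "queen": [1.0, 1.0, 1.0],
}

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    return dot / (math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v)))

def analogy(a, b, c):
    """Answer 'a is to b as c is to ?' via vec(b) - vec(a) + vec(c)."""
    target = [vectors[b][i] - vectors[a][i] + vectors[c][i] for i in range(len(vectors[a]))]
    # Pick the most similar word, excluding the three query words.
    candidates = (w for w in vectors if w not in (a, b, c))
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(analogy("man", "king", "woman"))  # queen
```

With a real pretrained model the same query is a one-liner in libraries such as gensim (`most_similar(positive=["king", "woman"], negative=["man"])`); the toy version above just makes the underlying vector arithmetic explicit.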
Here are some examples from a model trained on the Google News corpus:
The paper with the full description: "Distributed Representations of Words and Phrases and their Compositionality".