Distantly supervised relation extraction aims to extract relational facts from unstructured text, which benefits many downstream tasks. Although distant supervision can automatically generate labeled training instances, it inevitably suffers from the wrong-label problem. Most current works focus on the denoising process, trying to generate a more effective bag-level representation by selecting valid sentences. Nevertheless, there is a large amount of entity knowledge that can help the model understand the relationships between entities, and this knowledge has not been fully utilized. Based on this observation, in this paper we propose a novel distantly supervised relation extraction approach that exploits external entity knowledge to enhance the model’s expressive ability. In the model, knowledge-aware word embeddings are generated to enrich the sentence-level representations by introducing both structural knowledge from external knowledge graphs and semantic knowledge from corpora. The experimental results demonstrate that our proposed approach outperforms state-of-the-art methods on both versions of the large-scale New York Times benchmark dataset. Besides, further comparative experiments investigate the differences between the two versions of the dataset, showing that the version with no entity intersection more effectively reflects model performance.