Knowledge Graphs (KGs) are among the most widely used knowledge representation paradigms, lying at the core of tasks such as question answering and recommendation. Knowledge Graph Completion (KGC) is one of the key tasks concerning KGs, where the goal is to infer missing facts from the information already present in the graph. Different approaches have been proposed over the years to tackle this challenge. Among them, two complementary categories can be distinguished: rule learning and Knowledge Graph Embeddings (KGE). Several methods have subsequently been proposed to unify both types under a single framework, so that the benefits of both can be exploited. However, most of these methods use rule-learning models as a boosting agent for KGE models, not as an explainability tool. This work presents GEnI (https://github.com/oeg-upm/GEnI), a framework capable of generating insights and explanations for KGE models. GEnI follows a three-phase sequential process that produces a feasible explanation for a given prediction; possible outcomes are rules, correlations, and influence detection. Moreover, the output is expressed in natural language to further extend the explainability of the proposal. GEnI has been successfully evaluated under three criteria: coherence, meaningfulness of the output, and reliability. It can also be applied to both translational and bilinear KGE models, offering broad coverage. Furthermore, this work presents an in-depth review of existing approaches that integrate rule learning and embedding models, providing a comparative framework for them.