How to concatenate or weighted-average knowledge graph embeddings with textual embeddings using Keras embedding layers?

I have generated knowledge graph embeddings with RDF2Vec (from a `.ttl` file of `<s, p, o>` RDF triples with DBpedia URIs) of shape (7536, 384), and I have embeddings for my text dataset generated by SBERT of shape (14169, 384). How do I concatenate these two sets of embeddings using Keras embedding layers, and then call `model.fit` on the combined embeddings with the target labels to train the model? Could anyone help?
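A sketch of one way to approach this, under an assumption the question leaves open: since both sets of vectors are already precomputed (RDF2Vec and SBERT), a trainable Keras `Embedding` layer is not strictly needed; the vectors can be combined in NumPy and the result passed directly to `model.fit`. The missing piece is an alignment from each of the 14,169 text samples to one of the 7,536 KG entities, here represented by a placeholder `entity_idx` array that would have to be built from your own data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the precomputed embeddings (shapes from the question):
# 7536 KG entities and 14169 text samples, both 384-dimensional.
kg_emb = rng.normal(size=(7536, 384)).astype("float32")     # RDF2Vec
text_emb = rng.normal(size=(14169, 384)).astype("float32")  # SBERT

# Hypothetical alignment: for each text sample, the index of the
# DBpedia entity it refers to (must come from your own dataset).
entity_idx = rng.integers(0, kg_emb.shape[0], size=text_emb.shape[0])
kg_per_text = kg_emb[entity_idx]  # (14169, 384), one KG vector per text

# Option 1: concatenate along the feature axis -> (14169, 768)
X_concat = np.concatenate([text_emb, kg_per_text], axis=1)

# Option 2: weighted average, keeping 384 dimensions
alpha = 0.5  # mixing weight, a tunable hyperparameter
X_avg = alpha * text_emb + (1 - alpha) * kg_per_text

print(X_concat.shape, X_avg.shape)  # (14169, 768) (14169, 384)
```

Either `X_concat` or `X_avg` can then be fed to an ordinary Keras model whose input dimension matches (768 or 384), e.g. a stack of `Dense` layers trained with `model.fit(X_concat, y, ...)`. A Keras `Embedding` layer would only come into play if you wanted to pass entity *indices* into the model and initialize the layer's weights from `kg_emb` (via the `weights`/`embeddings_initializer` arguments with `trainable=False`).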



Sources

Source: Stack Overflow, licensed under CC BY-SA 3.0 (attribution required).