updated: 2022-12-13
Embedding alignment maps two independently trained embeddings into a shared space, which enables comparison and knowledge transfer between them. For instance, one can build a time series of historical word embeddings by aligning embeddings trained on text from different years.
A simple approach is to find a rotation that maps one embedding onto the other as closely as possible. This is known as the orthogonal Procrustes problem, and it has a closed-form solution via the SVD of the cross-covariance of the two embeddings.
Orthogonal Procrustes problem - Wikipedia
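A minimal sketch of the closed-form solution in NumPy, assuming two row-aligned embedding matrices `X` and `Y` over the same vocabulary (the function name and the toy data are illustrative, not from any particular library; SciPy's `scipy.linalg.orthogonal_procrustes` offers a ready-made version):

```python
import numpy as np

def procrustes_align(X, Y):
    """Find the orthogonal W minimizing ||XW - Y||_F and return X mapped into Y's space."""
    # X, Y: (n_words, dim) embeddings for the same vocabulary, rows aligned by word.
    U, _, Vt = np.linalg.svd(X.T @ Y)   # SVD of the cross-covariance matrix
    W = U @ Vt                          # optimal rotation (closed-form solution)
    return X @ W

# Toy usage: align a randomly rotated copy of an embedding back to the original.
rng = np.random.default_rng(0)
Y = rng.normal(size=(1000, 50))                 # "target" embedding
R, _ = np.linalg.qr(rng.normal(size=(50, 50)))  # random orthogonal matrix
X = Y @ R.T                                     # "independent" rotated embedding
aligned = procrustes_align(X, Y)
print(np.allclose(aligned, Y))                  # True up to numerical error
```

Because the solution is constrained to be orthogonal, it preserves distances and angles within the rotated embedding, which is what makes the aligned spaces directly comparable.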
For other methods, see Closed Form Word Embedding Alignment.