State-of-the-art count-based word embeddings for low-resource languages with a special focus on historical languages.
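For context on what "count-based" means here, below is a minimal sketch of the general technique (co-occurrence counts, PPMI weighting, truncated SVD). It is an illustration under assumed inputs, not the repository's actual implementation; the toy corpus, window size, and embedding dimension are hypothetical.

```python
# Minimal sketch of count-based word embeddings:
# co-occurrence counts -> PPMI weighting -> truncated SVD.
# The toy corpus and window size are hypothetical.
import numpy as np

corpus = [["the", "king", "rules", "the", "land"],
          ["the", "queen", "rules", "the", "realm"]]
window = 2

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within the context window.
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[idx[w], idx[sent[j]]] += 1

# Positive pointwise mutual information (PPMI) weighting.
total = counts.sum()
row = counts.sum(axis=1, keepdims=True)
col = counts.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts * total) / (row * col))
ppmi = np.maximum(pmi, 0)
ppmi[~np.isfinite(ppmi)] = 0.0

# Truncated SVD of the PPMI matrix yields dense embeddings.
U, S, _ = np.linalg.svd(ppmi)
dim = 2  # embedding dimension, chosen arbitrarily for this toy example
embeddings = U[:, :dim] * S[:dim]
print({w: embeddings[idx[w]].round(3) for w in vocab})
```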
Sequence Models coding assignments
Course materials for "Meaningful Text Analysis with Word Embeddings," taught at the Digital Humanities Summer Institute, June 2021.
A non-neural-network approach to word embeddings.
Systems that can visualize word embedding vectors in 3D and 2D spaces.
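As an illustration of what such a visualization tool does, the sketch below projects word vectors to 2D with PCA and plots them with matplotlib; the random vectors and word labels are placeholders, not taken from any of the listed projects.

```python
# Sketch: project word vectors to 2D with PCA and plot them.
# The vectors here are random placeholders; real use would load trained embeddings.
import numpy as np
import matplotlib.pyplot as plt

words = ["king", "queen", "land", "realm", "rules"]
rng = np.random.default_rng(0)
vectors = rng.normal(size=(len(words), 50))  # stand-in for 50-dim embeddings

# PCA via SVD on mean-centered vectors.
centered = vectors - vectors.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
coords = centered @ Vt[:2].T  # first two principal components

fig, ax = plt.subplots()
ax.scatter(coords[:, 0], coords[:, 1])
for word, (x, y) in zip(words, coords):
    ax.annotate(word, (x, y))
ax.set_title("Word embeddings projected to 2D")
plt.show()
```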
Includes projects and assignments from the Sequence Models course of the Deep Learning Specialization on Coursera. Covers building an RNN with plain Python functions and no framework, plus other projects such as LSTM concepts and implementation, machine translation, trigger word detection, and word vector representation.
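As a rough illustration of what a framework-free RNN can look like, here is a minimal vanilla RNN forward pass in NumPy. The function names, shapes, and toy weights are hypothetical and are not taken from the course assignments.

```python
# Minimal vanilla RNN forward pass in NumPy, no framework.
# Shapes: x_t is (input_dim,), h is (hidden_dim,); all weights are hypothetical.
import numpy as np

def rnn_step(x_t, h_prev, Wx, Wh, b):
    """One step: h_t = tanh(Wx @ x_t + Wh @ h_prev + b)."""
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

def rnn_forward(xs, h0, Wx, Wh, b):
    """Run the cell over a sequence and collect hidden states."""
    h = h0
    states = []
    for x_t in xs:
        h = rnn_step(x_t, h, Wx, Wh, b)
        states.append(h)
    return np.stack(states)

# Toy example with random weights.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 4, 3, 5
Wx = rng.normal(size=(hidden_dim, input_dim)) * 0.1
Wh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b = np.zeros(hidden_dim)
xs = rng.normal(size=(seq_len, input_dim))
hs = rnn_forward(xs, np.zeros(hidden_dim), Wx, Wh, b)
print(hs.shape)  # (5, 3): one hidden state per time step
```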