One-hot encoding text data in PyTorch

I am wondering how to one-hot encode text data in PyTorch.

For numeric data you could do this:

import torch
import torch.nn.functional as F

t = torch.tensor([6, 6, 7, 8, 6, 1, 7], dtype=torch.int64)
one_hot_vector = F.one_hot(t, num_classes=9)
print(one_hot_vector.shape)
# Out > torch.Size([7, 9])
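The encoding is invertible with argmax, which is a quick way to sanity-check it:

```python
import torch
import torch.nn.functional as F

t = torch.tensor([6, 6, 7, 8, 6, 1, 7], dtype=torch.int64)
one_hot_vector = F.one_hot(t, num_classes=9)

# Each row has a single 1 at the original index, so argmax recovers t
recovered = one_hot_vector.argmax(dim=1)
print(torch.equal(recovered, t))  # True
```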

But what if you have text data instead?

from torchtext.data.utils import get_tokenizer
corpus = ["The cat sat the mat", "The dog ate my homework"]
tokenizer = get_tokenizer("basic_english")
tokens = [tokenizer(doc) for doc in corpus]

But how do I one-hot encode this vocabulary using PyTorch?

With something like scikit-learn I could do the following; is there a similar way to do it in PyTorch?

import numpy as np
from spacy.lang.en import English
from spacy.tokenizer import Tokenizer
from sklearn.preprocessing import OneHotEncoder

corpus = ["The cat sat the mat", "The dog ate my homework"]
nlp = English()
tokenizer = Tokenizer(nlp.vocab)
tokens = np.array([[token.text for token in tokenizer(doc)] for doc in corpus])
one_hot_encoder = OneHotEncoder(sparse = False)
one_hot_encoded = one_hot_encoder.fit_transform(tokens)


Solution 1:[1]

You can do the following:

from typing import Union, Iterable

import torch
import torch.nn.functional as F
from torchtext.data.utils import get_tokenizer
from torchtext.vocab import build_vocab_from_iterator

corpus = ["The cat sat the mat", "The dog ate my homework"]
tokenizer = get_tokenizer("basic_english")
tokens = [tokenizer(doc) for doc in corpus]

voc = build_vocab_from_iterator(tokens)

def my_one_hot(voc, keys: Union[str, Iterable]):
    # Accept either a single token or an iterable of tokens
    if isinstance(keys, str):
        keys = [keys]
    # voc(keys) maps tokens to their integer indices; one-hot to the vocab size
    return F.one_hot(torch.tensor(voc(keys)), num_classes=len(voc))
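If torchtext is unavailable (the project is no longer actively developed), the same idea works with a plain dict vocabulary and core PyTorch only. A minimal sketch, where the whitespace tokenizer and the stoi mapping stand in for get_tokenizer("basic_english") and build_vocab_from_iterator:

```python
import torch
import torch.nn.functional as F

corpus = ["The cat sat the mat", "The dog ate my homework"]

# Minimal lowercase/whitespace tokenizer standing in for basic_english
tokens = [doc.lower().split() for doc in corpus]

# Build a token -> index mapping in order of first appearance
stoi = {}
for doc in tokens:
    for tok in doc:
        if tok not in stoi:
            stoi[tok] = len(stoi)

def one_hot_tokens(stoi, keys):
    # Accept either a single token or an iterable of tokens
    if isinstance(keys, str):
        keys = [keys]
    idx = torch.tensor([stoi[k] for k in keys], dtype=torch.int64)
    return F.one_hot(idx, num_classes=len(stoi))

encoded = one_hot_tokens(stoi, tokens[0])
print(encoded.shape)  # torch.Size([5, 8]) — 5 tokens, 8-word vocab
```

Note that, as with the torchtext version, tokens missing from the vocabulary will raise an error; handling them would require reserving an index for an unknown token.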

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
