Training a spaCy model with multiprocessing

I'm training my model with the update function:

from spacy.util import minibatch
from spacy.training import Example

losses = {}
for batch in minibatch(TRAIN_DATA, size=10):
    for text, annotations in batch:
        doc = nlp.make_doc(text)
        example = Example.from_dict(doc, annotations)
        nlp.update([example], drop=0.35, sgd=optimizer, losses=losses)
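As an aside, calling `nlp.update` once per example leaves performance on the table even on a single core: passing the whole batch of `Example` objects to one `update` call is faster. A minimal self-contained sketch of that variant, using hypothetical toy NER data (the data, labels, and blank pipeline here are illustrative assumptions, not part of the original question):

```python
import spacy
from spacy.util import minibatch
from spacy.training import Example

# Hypothetical toy data in spaCy's dict annotation format.
TRAIN_DATA = [
    ("Apple is a company", {"entities": [(0, 5, "ORG")]}),
    ("London is a city", {"entities": [(0, 6, "GPE")]}),
]

nlp = spacy.blank("en")
ner = nlp.add_pipe("ner")
ner.add_label("ORG")
ner.add_label("GPE")

optimizer = nlp.initialize()
losses = {}
for batch in minibatch(TRAIN_DATA, size=10):
    # Build all Examples for the batch, then make a single update call.
    examples = [
        Example.from_dict(nlp.make_doc(text), annotations)
        for text, annotations in batch
    ]
    nlp.update(examples, drop=0.35, sgd=optimizer, losses=losses)
print(losses)
```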

With spaCy 3.2.3, this training only uses one CPU core. What can be done to train with multiprocessing?

As far as I know, the training loop is iterative, but I know that spaCy has multiprocessing support: when using `nlp.pipe`, the number of processes can be set. Is there an equivalent for training?
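For reference, the `nlp.pipe` feature mentioned above looks roughly like this (a blank English pipeline is used here only so the sketch runs without a downloaded model; with a trained model you would use `spacy.load(...)` instead):

```python
import spacy

# Blank pipeline as a stand-in for a real model.
nlp = spacy.blank("en")
texts = ["First document.", "Second document.", "Third document."]

# n_process parallelizes *inference* across OS processes;
# training has no equivalent knob in spaCy v3.
docs = list(nlp.pipe(texts, n_process=2, batch_size=2))
print(len(docs))
```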



Solution 1:[1]

It looks like aab is right. Here is an older issue from the GitHub repo: https://github.com/explosion/spaCy/issues/3507

That's fine with me; I'll try training on a GPU instead to speed up the process.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 mirArnold