How to predict results for 20 million records using a Hugging Face model in minimum time

I am trying to predict sentiment for 20 million records using a model available on Hugging Face:

https://huggingface.co/finiteautomata/beto-sentiment-analysis

This model takes 1 hour and 20 minutes to predict 70,000 records.

The model is saved locally and loaded from that local copy.

Can anyone please suggest how I can use it efficiently to predict 20 million records in minimum time?

Also, I am using a zero-shot classification model on the same data, and it takes 7 minutes to predict 1,000 records.

Kindly suggest a way to reduce prediction time for this model as well.
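For scale, a back-of-the-envelope extrapolation from the two measured throughputs (assuming the per-record rate stays constant, which is an assumption on my part) shows why row-by-row prediction is impractical at 20 million records:

```python
# Extrapolate total runtime from the measured rates, assuming constant throughput.
sentiment_rate = 70_000 / 80        # records per minute (70k records in 1h20m)
zero_shot_rate = 1_000 / 7          # records per minute (1k records in 7 min)

total = 20_000_000
sentiment_days = total / sentiment_rate / (60 * 24)
zero_shot_days = total / zero_shot_rate / (60 * 24)

print(round(sentiment_days, 1))     # ~15.9 days for the sentiment model
print(round(zero_shot_days, 1))     # ~97.2 days for the zero-shot model
```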

from transformers import pipeline

model_path = 'path where model is saved'  # local copy of the model
classifier = pipeline("zero-shot-classification",
                      model="Recognai/bert-base-spanish-wwm-cased-xnli")

def predict(row):
    topics = [...]  # five candidate labels here
    res = classifier(row, topics)
    return res

df['Predict'] = df['Message'].apply(lambda x: predict(x))  # This df contains 70k records
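One hardware-independent speedup is to stop calling the pipeline once per row: Hugging Face pipelines accept a list of inputs and return a list of results, so the texts can be fed in chunks. Below is a minimal sketch of such a batching wrapper; the helper name `predict_batch` and the chunk size are my own illustration, not from the original post:

```python
from typing import Callable, List

def predict_batch(classifier: Callable, texts: List[str],
                  topics: List[str], chunk_size: int = 64) -> List[dict]:
    """Call the classifier on chunks of texts instead of one row at a time.

    Works with callables like the Hugging Face zero-shot pipeline, which
    accepts a list of inputs and returns a list of result dicts (or a single
    dict when given a single input).
    """
    results: List[dict] = []
    for start in range(0, len(texts), chunk_size):
        chunk = texts[start:start + chunk_size]
        out = classifier(chunk, topics)
        if isinstance(out, dict):  # a single input can come back as a bare dict
            out = [out]
        results.extend(out)
    return results
```

With the classifier above, `df['Predict'] = predict_batch(classifier, df['Message'].tolist(), topics)` would replace the per-row `.apply`; running the pipeline on a GPU (`device=0` in the `pipeline(...)` call) typically helps far more than batching alone.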


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
