"from summarizer import Summarizer" is not running in Colab
I'm trying to use the BERT extractive text summarizer in Colab, but the following import fails:
from summarizer import Summarizer
The error I get is:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-12-ebcc505c0b7d> in <module>()
----> 1 from summarizer import Summarizer
1 frames
/usr/local/lib/python3.7/dist-packages/summarizer/model_processors.py in <module>()
2
3 import numpy as np
----> 4 from transformers import (AlbertModel, AlbertTokenizer, BartModel, BigBirdModel, BigBirdTokenizer,
5 BartTokenizer, BertModel, BertTokenizer,
6 CamembertModel, CamembertTokenizer, CTRLModel,
ImportError: cannot import name 'BartModel' from 'transformers' (/usr/local/lib/python3.7/dist-packages/transformers/__init__.py)
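This ImportError usually means the transformers release installed in the Colab runtime predates some of the symbols that summarizer (bert-extractive-summarizer) tries to import, so the two packages are out of sync. As a quick diagnostic (a sketch, not a fix; the `has_symbol` helper below is hypothetical, not part of either library), you can check whether a name is actually importable from a module before blaming your own code:

```python
import importlib

def has_symbol(module_name, symbol):
    """Return True if `symbol` can be imported from `module_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # The module itself is missing entirely.
        return False
    return hasattr(module, symbol)

# In the failing Colab runtime, has_symbol("transformers", "BartModel")
# would return False, matching the traceback above. A stdlib sanity check:
print(has_symbol("math", "sqrt"))  # → True
```

If the check comes back False, upgrading both packages so their versions match is the usual remedy.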
Solution 1
According to the Hugging Face documentation, the easiest way to use the summarization pipeline is:
from transformers import pipeline

# BART in PyTorch (the default model and framework)
summarizer = pipeline("summarization")
ptorch = summarizer("An apple a day, keeps the doctor away", min_length=5, max_length=20)

# T5 in TensorFlow
summarizer = pipeline("summarization", model="t5-base", tokenizer="t5-base", framework="tf")
tflow = summarizer("An apple a day, keeps the doctor away", min_length=5, max_length=20)

print(ptorch, tflow)
Output for PyTorch and TensorFlow, respectively:
[{'summary_text': ' An apple a day, keeps the doctor away from your doctor away, says Dr.'}] [{'summary_text': 'an apple a day, keeps the doctor away from the doctor .'}]
But if you want to use a BERT model instead, you have two options:
- Break the pipeline down into tokenization ==> encoding ==> decoding (do the work manually).
- Pass a BERT-based model such as bert-base-cased as the model argument to the pipeline.
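The first option (doing the work manually) amounts to extractive summarization: embed each sentence, then keep the sentences most similar to the document as a whole. The sketch below is a toy illustration of that flow, not the library's implementation; a bag-of-words counter stands in for a real BERT sentence embedding, so in practice you would replace `embed` with pooled BERT outputs:

```python
import math
from collections import Counter

def embed(sentence):
    """Toy stand-in for a BERT sentence embedding: a bag-of-words count vector."""
    return Counter(sentence.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def summarize(text, num_sentences=1):
    """Keep the sentences closest to the whole-document vector, in original order."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    vectors = [embed(s) for s in sentences]
    # Document vector: the sum of all sentence vectors.
    doc = Counter()
    for v in vectors:
        doc.update(v)
    # Rank sentences by similarity to the document, keep the top ones in order.
    ranked = sorted(range(len(sentences)),
                    key=lambda i: cosine(vectors[i], doc), reverse=True)
    keep = sorted(ranked[:num_sentences])
    return ". ".join(sentences[i] for i in keep) + "."
```

The ranking step is where a real BERT embedding earns its keep: count vectors only match overlapping words, while BERT vectors also match paraphrases.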
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | meti |
