How to use neuralcoref in spaCy
I have been trying to use the neuralcoref library (state-of-the-art coreference resolution based on neural nets and spaCy). I am using Ubuntu 16.04, Python 3.7.3 in conda 1.9.7, and spaCy 2.2.4.
My code (from https://spacy.io/universe/project/neuralcoref):
```python
import spacy
import neuralcoref

nlp = spacy.load('en_core_web_sm')
neuralcoref.add_to_pipe(nlp)

doc1 = nlp('My sister has a dog. She loves him.')
print(doc1._.coref_clusters)

doc2 = nlp('Angela lives in Boston. She is quite happy in that city.')
for ent in doc2.ents:
    print(ent._.coref_cluster)
```
I get this error:
```
/home/daniel/anaconda3/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: spacy.morphology.Morphology size changed, may indicate binary incompatibility. Expected 104 from C header, got 112 from PyObject
  return f(*args, **kwds)
/home/daniel/anaconda3/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: spacy.vocab.Vocab size changed, may indicate binary incompatibility. Expected 96 from C header, got 104 from PyObject
  return f(*args, **kwds)
/home/daniel/anaconda3/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: spacy.tokens.span.Span size changed, may indicate binary incompatibility. Expected 72 from C header, got 80 from PyObject
  return f(*args, **kwds)
```
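These "size changed" warnings usually mean that neuralcoref's compiled Cython extension was built against a different spaCy ABI than the one installed: the prebuilt neuralcoref wheels target spaCy 2.1.x, while spaCy 2.2.4 is installed here. A minimal sketch of that compatibility rule (`neuralcoref_wheel_compatible` is a hypothetical helper for illustration, not part of either library):

```python
def neuralcoref_wheel_compatible(spacy_version: str) -> bool:
    """Hypothetical check: the prebuilt neuralcoref wheels were compiled
    against spaCy 2.1.x, so any other major.minor version risks the
    'size changed, may indicate binary incompatibility' warnings."""
    major, minor = spacy_version.split(".")[:2]
    return (major, minor) == ("2", "1")

print(neuralcoref_wheel_compatible("2.1.0"))  # True: matches the compiled ABI
print(neuralcoref_wheel_compatible("2.2.4"))  # False: the setup in the question
```

Rebuilding neuralcoref from source against the installed spaCy (as Solution 2 below does with `--no-binary`) sidesteps the mismatch.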
I tried downgrading spaCy to 2.1.0 as suggested by this link:
```shell
conda config --append channels conda-forge
conda install spacy=2.1.0
```
However, the install fails:
```
PackagesNotFoundError: The following packages are not available from current channels:

  - spacy=2.1.0

Current channels:

  - https://conda.anaconda.org/conda-forge/linux-64
  - https://conda.anaconda.org/conda-forge/noarch
  - https://repo.anaconda.com/pkgs/main/linux-64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/linux-64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.
```
How can I solve this issue without downgrading? Is there a newer, updated version of neuralcoref?
Solution 1:[1]
Do precisely what Raqib said. I used Google Colab, so skip step (1) if you are also on Colab. Run the following commands:
1) Create a new environment (replace myenv with whatever you want to call the environment):

```shell
conda create --name myenv
```

Select that environment:

```shell
conda info --envs
conda activate myenv
```
2) Install Python 3.7 in that environment:

```shell
!apt-get install python3.7
```
3) Install spaCy and neuralcoref at supported versions:

```shell
!pip install spacy==2.1.0
!pip install neuralcoref
!pip install https://github.com/explosion/spacy-models/releases/download/en_core_web_lg-2.1.0/en_core_web_lg-2.1.0.tar.gz
```
```python
import pandas as pd
import re
import spacy
import neuralcoref
import en_core_web_lg

nlp = en_core_web_lg.load()
neuralcoref.add_to_pipe(nlp)
```
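Once the pipeline loads without warnings, the snippet from the question should work. Conceptually, neuralcoref groups mentions into clusters, and resolution then replaces each pronoun with its cluster's main mention. A toy, spaCy-free sketch of that substitution step (hypothetical hand-written clusters and naive string replacement, purely to illustrate what resolution produces):

```python
def resolve_toy(text: str, clusters: dict) -> str:
    # clusters maps a mention to its cluster's main mention; real
    # neuralcoref works on token spans, not raw string matching.
    for mention, main in clusters.items():
        text = text.replace(mention, main)
    return text

print(resolve_toy("My sister has a dog. She loves him.",
                  {"She": "My sister", "him": "a dog"}))
# -> My sister has a dog. My sister loves a dog.
```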
Solution 2:[2]
Downgrade to spaCy 2:
```shell
pip uninstall -y neuralcoref
pip install --no-cache-dir neuralcoref --no-binary neuralcoref
pip install -U spacy==2.3.7
python -m spacy download en
```
Restart the kernel if you are using Jupyter, and then:

```python
import logging
logging.basicConfig(level=logging.INFO)
import neuralcoref
```
Update:

```shell
pip install -U spacy==2.3.7
pip install neuralcoref
python -m spacy download en
```
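After reinstalling, it is worth confirming that the interpreter actually picks up the downgraded versions. A quick stdlib-only check (`installed_version` is a hypothetical helper; `importlib.metadata` needs Python 3.8+):

```python
from importlib import metadata

def installed_version(pkg: str):
    """Return the installed version string of pkg, or None if absent."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

print("spacy:", installed_version("spacy"))
print("neuralcoref:", installed_version("neuralcoref"))
```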
Solution 3:[3]
This is not a direct answer to the question, but if what you need is coreference resolution, another option is the original CoreNLP implementation, which is written in Java. The stanza library has a CoreNLP client, which can access the original CoreNLP models. This doesn't require you to use (an outdated version of) spaCy.
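A sketch of the stanza route, assuming stanza is installed (`pip install stanza`), a Java runtime is on the PATH, and a local CoreNLP download is pointed to by the `CORENLP_HOME` environment variable; `coref_chains` is a hypothetical wrapper, not a stanza function:

```python
import os
import shutil

def coref_chains(text: str):
    # Deferred import so the sketch loads even without stanza installed.
    from stanza.server import CoreNLPClient

    # The client starts a local CoreNLP server, annotates, and shuts it down.
    with CoreNLPClient(
        annotators=["tokenize", "ssplit", "pos", "lemma", "ner", "parse", "coref"],
        timeout=60000,
        memory="4G",
    ) as client:
        ann = client.annotate(text)
        return ann.corefChain  # protobuf list of coreference chains

# Only attempt a run when the prerequisites are actually present.
if os.environ.get("CORENLP_HOME") and shutil.which("java"):
    print(coref_chains("Angela lives in Boston. She is quite happy in that city."))
```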
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | |
| Solution 3 | rusho |
