Installation with spaCy 3.1 (Archived)

How to install dependency libraries

conda create -n dackar_libs python=3.9

conda activate dackar_libs

pip install spacy==3.1 textacy matplotlib nltk coreferee beautifulsoup4 networkx pysbd tomli numerizer autocorrect pywsd openpyxl quantulum3[classifier] numpy scikit-learn==1.2.2 pyspellchecker
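To confirm the pinned versions were picked up, a quick optional check from Python (a minimal sketch; only spot-checks two of the packages listed above):

import sklearn
import spacy

print(spacy.__version__)     # expected to start with 3.1
print(sklearn.__version__)   # expected 1.2.2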

Download the language model from spaCy

python -m spacy download en_core_web_lg

python -m coreferee install en

Alternative approach if you encounter an SSLError

  1. Download en_core_web_lg-3.1.0.tar.gz, then run:

python -m pip install ./en_core_web_lg-3.1.0.tar.gz

  2. Download the coreferee English model (coreferee_model_en.zip), then run:

python -m pip install ./coreferee_model_en.zip
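Whichever route was used, a short Python session like the following (a minimal sketch; the sample sentence is only illustrative) can verify that the model loads and the coreferee component can be added:

import spacy

nlp = spacy.load('en_core_web_lg')   # should load without errors
nlp.add_pipe('coreferee')            # adds the coreference resolution component
doc = nlp('The engineer inspected the pump because she suspected it was leaking.')
doc._.coref_chains.print()           # prints the resolved coreference chains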

You may need to install the stemming package for some of the unit parsing

pip install stemming
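Unit parsing is presumably handled by quantulum3 from the list above; a minimal check (the sample sentence is illustrative):

from quantulum3 import parser

quants = parser.parse('The tank holds 3.5 litres of water at 75 degrees Fahrenheit.')
for q in quants:
    print(q.value, q.unit.name)   # prints each detected quantity and its unit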

Windows machines may have an issue with pydantic

pip install typing_extensions==4.5.*

Required libraries and nltk data for similarity analysis

conda install -c conda-forge pandas
python -m nltk.downloader all
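Once the nltk data is in place, a short check of the tokenizer, tagger, and WordNet-based similarity functions (a minimal sketch; the word pair and sentence are arbitrary):

from nltk import pos_tag, word_tokenize
from nltk.corpus import wordnet as wn

print(pos_tag(word_tokenize('The valve was replaced.')))   # uses punkt and the perceptron tagger
pump = wn.synsets('pump')[0]
valve = wn.synsets('valve')[0]
print(pump.wup_similarity(valve))                           # Wu-Palmer similarity, between 0 and 1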

Alternative approach if you encounter an SSLError

Please check installing_nltk_data for instructions on how to manually install nltk data. For this project, users can try these steps:

cd ~
mkdir nltk_data
cd nltk_data
mkdir corpora
mkdir taggers
mkdir tokenizers
Download wordnet, averaged_perceptron_tagger, and punkt, then copy them into place:
cp -r wordnet ~/nltk_data/corpora/
cp -r averaged_perceptron_tagger ~/nltk_data/taggers/
cp -r punkt ~/nltk_data/tokenizers/
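After copying, nltk.data.find can confirm each resource is discoverable (a quick check, assuming the default ~/nltk_data search path):

import nltk

for resource in ('corpora/wordnet', 'taggers/averaged_perceptron_tagger', 'tokenizers/punkt'):
    print(resource, '->', nltk.data.find(resource))   # raises LookupError if a resource is missing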

Required library for preprocessing

pip install contextualSpellCheck
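contextualSpellCheck plugs into the spaCy pipeline; a minimal usage sketch (the sample sentence is illustrative, and the component downloads a transformer model on first use):

import contextualSpellCheck
import spacy

nlp = spacy.load('en_core_web_lg')
contextualSpellCheck.add_to_pipe(nlp)   # adds the 'contextual spellchecker' component
doc = nlp('The pump was replased last month.')
print(doc._.performed_spellCheck)       # True if a correction was attempted
print(doc._.outcome_spellCheck)         # corrected text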