Hello, I am running EASSE in a PyCharm virtual environment with Python 3.7, and all metrics except SAMSA are working. I have already installed tupa, and I fixed the following error message by running the command pip install protobuf==3.20.*:
TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
1. Downgrade the protobuf package to 3.20.x or lower.
2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).
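For reference, a quick sanity check like this (plain protobuf, nothing EASSE-specific) confirms inside the venv that the downgrade is what the interpreter actually picks up; the comment shows workaround 2 from the same error message, which would have to be set before protobuf is imported:

# Quick sanity check: confirm the interpreter in this venv picks up the
# pinned protobuf release (the downgrade from workaround 1 above).
import google.protobuf
print("protobuf version:", google.protobuf.__version__)  # expect 3.20.x

# Workaround 2 from the same error message would instead be, before any
# protobuf import:
#   import os
#   os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"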
Now I can start the SAMSA evaluation, but it still fails. This is my console output:
"G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\venv\Scripts\python.exe" "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\run_dennis.py"
Warning: SAMSA metric is long to compute (120 sentences ~ 4min), disable it if you need fast evaluation.
Loading spaCy model 'en_core_web_md'... Done (33.254s).
Loading from 'G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\ucca-bilstm-1.3.10\models\ucca-bilstm.json'.
[dynet] random seed: 1
[dynet] allocating memory: 512MB
[dynet] memory allocation done.
[dynet] 2.1
Loading from 'G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\ucca-bilstm-1.3.10\models\ucca-bilstm.enum'... Done (0.121s).
Loading model from 'G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\ucca-bilstm-1.3.10\models\ucca-bilstm': 23param [02:14, 5.86s/param]
Loading model from 'G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\ucca-bilstm-1.3.10\models\ucca-bilstm': 100%|██████████| 23/23 [02:06<00:00, 5.51s/param]
Loading from 'G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\ucca-bilstm-1.3.10\models\ucca-bilstm.nlp.json'.
tupa --hyperparams "shared --lstm-layers 2" "amr --max-edge-labels 110 --node-label-dim 20 --max-node-labels 1000 --node-category-dim 5 --max-node-categories 25" "sdp --max-edge-labels 70" "conllu --max-edge-labels 60" --log parse.log --max-words 0 --max-words-external 249861 --vocab G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\ucca-bilstm-1.3.10\vocab\en_core_web_lg.csv --word-vectors ../word_vectors/wiki.en.vec
Loading 'G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\ucca-bilstm-1.3.10\vocab\en_core_web_lg.csv': 1340694 rows [00:06, 218144.04 rows/s]
2 passages [00:01, 1.05 passages/s, en ucca=1_0]
Starting server with command: java -Xmx5G -cp G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\stanford-corenlp-full-2018-10-05/* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 60000 -threads 40 -maxCharLength 100000 -quiet True -serverProperties corenlp_server-21b4f872deb94b0d.props -preload tokenize,ssplit,pos,lemma,ner,depparse
Traceback (most recent call last):
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\run_dennis.py", line 23, in <module>
sys_sents=["About 95 you now get in.", "Cat on mat."])
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\samsa.py", line 305, in corpus_samsa
return np.mean(get_samsa_sentence_scores(orig_sents, sys_sents, lowercase, tokenizer, verbose))
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\samsa.py", line 281, in get_samsa_sentence_scores
verbose=verbose,
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\samsa.py", line 30, in syntactic_parse_ucca_scenes
verbose=verbose,
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\aligner\corenlp_utils.py", line 144, in syntactic_parse_texts
raw_parse_result = client.annotate(text)
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\venv\lib\site-packages\stanfordnlp\server\client.py", line 398, in annotate
r = self._request(text.encode('utf-8'), request_properties, **kwargs)
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\venv\lib\site-packages\stanfordnlp\server\client.py", line 311, in _request
self.ensure_alive()
File "G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\venv\lib\site-packages\stanfordnlp\server\client.py", line 137, in ensure_alive
raise PermanentlyFailedException("Timed out waiting for service to come alive.")
stanfordnlp.server.client.PermanentlyFailedException: Timed out waiting for service to come alive.
Process finished with exit code 1
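To narrow down whether the timeout comes from the CoreNLP server never starting at all (rather than from the client giving up too early), I can run the java command from the log in a separate terminal and poll the port with a small stdlib script. This is only a diagnostic sketch; it assumes the default localhost:9000 endpoint shown in the logged command:

import socket
import time

def wait_for_port(host="localhost", port=9000, timeout=120):
    """Return True as soon as something accepts a TCP connection on host:port."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            time.sleep(2)
    return False

print("CoreNLP server reachable:", wait_for_port())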
Could the problem be that the folder "My Drive" contains a space? I haven't renamed it because changing it would be quite a hassle.
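To test that hypothesis without renaming the folder, a sketch like the one below (hypothetical, not how EASSE or stanfordnlp actually launch the server) starts CoreNLP with the classpath passed as a single argument, so the space in "My Drive" cannot split the -cp value, and then reuses the port check above:

import subprocess

# Hypothetical diagnostic: launch CoreNLP directly with the classpath as one
# argv element, so the space in "My Drive" cannot break the -cp argument.
corenlp_dir = r"G:\My Drive\M5\Masterarbeit\implementation_metrics\easse\easse\resources\tools\stanford-corenlp-full-2018-10-05"
cmd = [
    "java", "-Xmx5G",
    "-cp", corenlp_dir + r"\*",
    "edu.stanford.nlp.pipeline.StanfordCoreNLPServer",
    "-port", "9000", "-timeout", "60000",
]
proc = subprocess.Popen(cmd)
# ...then run wait_for_port() from the snippet above; if the server now comes
# up, the unquoted space in the generated command line is the likely culprit.
# Call proc.terminate() when done.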