
Fail at import: dependencies not installed #11

Open
mbrhd opened this issue Feb 7, 2022 · 1 comment

Comments

@mbrhd

mbrhd commented Feb 7, 2022

Hi,

I'm exploring Chronos for time series and decided to start from this example notebook.

When I run

from zoo.chronos.data import TSDataset

I get the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/zoo/__init__.py", line 17, in <module>
    from zoo.common.nncontext import *
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/zoo/common/__init__.py", line 17, in <module>
    from .utils import *
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/zoo/common/utils.py", line 16, in <module>
    from bigdl.util.common import Sample as BSample, JTensor as BJTensor,\
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/bigdl/__init__.py", line 18, in <module>
    prepare_env()
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/bigdl/util/engine.py", line 155, in prepare_env
    __prepare_spark_env()
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/bigdl/util/engine.py", line 53, in __prepare_spark_env
    if exist_pyspark():
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/bigdl/util/engine.py", line 26, in exist_pyspark
    import pyspark
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/pyspark/__init__.py", line 51, in <module>
    from pyspark.context import SparkContext
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/pyspark/context.py", line 31, in <module>
    from pyspark import accumulators
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/pyspark/accumulators.py", line 97, in <module>
    from pyspark.serializers import read_int, PickleSerializer
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/pyspark/serializers.py", line 72, in <module>
    from pyspark import cloudpickle
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/pyspark/cloudpickle.py", line 145, in <module>
    _cell_set_template_code = _make_cell_set_template_code()
  File "/opt/anaconda3/envs/analytics-zoo-test/lib/python3.8/site-packages/pyspark/cloudpickle.py", line 126, in _make_cell_set_template_code
    return types.CodeType(
TypeError: an integer is required (got type bytes)
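For context, this particular TypeError is a known incompatibility between the cloudpickle bundled with older pyspark releases and Python 3.8: Python 3.8 inserted a new positional parameter, posonlyargcount, into the types.CodeType constructor, so old cloudpickle code that builds code objects with purely positional arguments passes each value one slot too early and the constructor rejects a bytes object where it now expects an int. A minimal sketch that checks for the new parameter (assuming a Python 3.8+ interpreter with docstrings enabled):

```python
import sys
import types

# Python 3.8 added "posonlyargcount" as the second positional parameter of
# the code-object constructor. cloudpickle versions predating this change
# call types.CodeType(...) positionally, so on 3.8 every argument after the
# first lands in the wrong slot and the constructor raises
# "TypeError: an integer is required (got type bytes)".
if sys.version_info >= (3, 8):
    print("posonlyargcount" in types.CodeType.__doc__)  # True on 3.8+
```

Upgrading to a pyspark release whose bundled cloudpickle knows about the new signature avoids the crash, which is why the conda install below worked.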

I fixed this by installing PySpark with: conda install pyspark

After that, the same import, from zoo.chronos.data import TSDataset, still failed because pandas, packaging, and tsfresh were not installed. Installing those three packages resolved it.
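Rather than discovering each missing package via a fresh ImportError, the check can be done up front. A minimal sketch using the standard library (the package list here is just what this issue ran into, not an official requirement set):

```python
import importlib.util

# Packages this issue ended up needing before the Chronos import succeeded.
required = ["pyspark", "pandas", "packaging", "tsfresh"]

# find_spec returns None for an importable name that is not installed,
# without actually importing (and thus without triggering side effects).
missing = [name for name in required if importlib.util.find_spec(name) is None]

if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All import-time dependencies found.")
```

This only confirms the packages are present, not that their versions are compatible; the extras-based installs suggested below handle versioning as well.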

@shanyu-sys
Contributor

You could install analytics-zoo with the automl extra, pip install --pre --upgrade analytics-zoo[automl], as mentioned in our Chronos User Guide.

If you have switched to BigDL, you could install Chronos with pip install --pre --upgrade bigdl-chronos[all] and find more installation details here.

@liu-shaojun liu-shaojun transferred this issue from intel-analytics/BigDL-2.x Mar 5, 2024