
Before installing scikit-learn, you should install its dependencies. Pandas installs fine with pip install pandas. However, when I tried to import it in my Jupyter Lab notebook, it crashed with this error:

ValueError Traceback (most recent call last)
~/miniforge3/envs/tf25/lib/python3.9/site-packages/pandas/__init__.py in <module>
> 29 from pandas._libs import hashtable as _hashtable, lib as _lib, tslib as _tslib
  30 except ImportError as e:  # pragma: no cover

~/miniforge3/envs/tf25/lib/python3.9/site-packages/pandas/_libs/__init__.py in <module>
> 13 from pandas._libs.interval import Interval

pandas/_libs/interval.pyx in init pandas._libs.interval()

ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject

This StackOverflow post recommends upgrading numpy to 1.20+, but since I am using TensorFlow, I am stuck with 1.19.5. Another person suggests using older packages. They mention a GitHub ticket which expands on the solutions. One simple solution that could work is to run pip install with some additional flags (--no-cache-dir --no-binary :all:) that supposedly compiles the package you are trying to install using the local version of numpy.

So I tried with conda install:

% conda install pandas=1.1.2
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
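
Spelled out, the flag-based pip reinstall mentioned above would look something like this (a sketch, reusing the same 1.1.2 version pin as the conda attempt):

% pip install --no-cache-dir --no-binary :all: pandas==1.1.2

The --no-binary :all: flag makes pip build pandas from source instead of pulling a prebuilt wheel, the idea being that its C extensions then get compiled against the numpy already present in the environment rather than the newer numpy the wheel was built with.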

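Whichever route ends up working, it is worth double-checking that the numpy actually installed in the tf25 environment still matches the version TensorFlow pins (a quick sanity check):

% python -c "import numpy; print(numpy.__version__)"

This should print 1.19.5; if a pandas or scikit-learn install silently upgrades numpy past that, the TensorFlow side can break instead.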