pip 19.0 fails to install packages that import to-be-installed package from CWD

Created on 23 Jan 2019  ·  89 comments  ·  Source: pypa/pip

Environment

  • pip version: 19.0
  • Python version: 3.6
  • OS: MacOS

Description
When running pip install pyinstaller==3.4 with pip 19.0 we are getting an install error. ModuleNotFoundError: No module named 'PyInstaller'

Expected behavior
Expect pyinstaller to be installed, as it is with pip 18.1

How to Reproduce
Using python3:
pip install pyinstaller==3.4

Output

pip install pyinstaller==3.4
Collecting pip
  Using cached https://files.pythonhosted.org/packages/60/64/73b729587b6b0d13e690a7c3acd2231ee561e8dd28a58ae1b0409a5a2b20/pip-19.0-py2.py3-none-any.whl
Installing collected packages: pip
  Found existing installation: pip 9.0.3
    Uninstalling pip-9.0.3:
      Successfully uninstalled pip-9.0.3
Successfully installed pip-19.0
(BuildVEnv) jlaroche-mbp:TrackSense$ pip install pyinstaller
Collecting pyinstaller
  Using cached https://files.pythonhosted.org/packages/03/32/0e0de593f129bf1d1e77eed562496d154ef4460fd5cecfd78612ef39a0cc/PyInstaller-3.4.tar.gz
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  Complete output from command /Users/jlaroche/Dev/uapkg/packages/system/algo/BuildVEnv/bin/python3 /Users/jlaroche/Dev/uapkg/packages/system/algo/BuildVEnv/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py get_requires_for_build_wheel /var/folders/j6/7t8sg1vj4q97zhh9z5cdmxbm4rz935/T/tmps3z6flnv:
  Traceback (most recent call last):
    File "/Users/jlaroche/Dev/uapkg/packages/system/algo/BuildVEnv/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 207, in <module>
      main()
    File "/Users/jlaroche/Dev/uapkg/packages/system/algo/BuildVEnv/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 197, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "/Users/jlaroche/Dev/uapkg/packages/system/algo/BuildVEnv/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 54, in get_requires_for_build_wheel
      return hook(config_settings)
    File "/private/var/folders/j6/7t8sg1vj4q97zhh9z5cdmxbm4rz935/T/pip-build-env-lo_ir5_f/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 115, in get_requires_for_build_wheel
      return _get_build_requires(config_settings, requirements=['wheel'])
    File "/private/var/folders/j6/7t8sg1vj4q97zhh9z5cdmxbm4rz935/T/pip-build-env-lo_ir5_f/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 101, in _get_build_requires
      _run_setup()
    File "/private/var/folders/j6/7t8sg1vj4q97zhh9z5cdmxbm4rz935/T/pip-build-env-lo_ir5_f/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 85, in _run_setup
      exec(compile(code, __file__, 'exec'), locals())
    File "setup.py", line 20, in <module>
      from PyInstaller import __version__ as version, HOMEPATH, PLATFORM
  ModuleNotFoundError: No module named 'PyInstaller'

Maintainer note on timeline: See https://github.com/pypa/pip/issues/6163#issuecomment-460563963

Labels: PEP 517 impact, auto-locked, bug

Most helpful comment

[...] could someone please check if --no-use-pep517 fixes this for them?

PyInstaller installs fine with --no-use-pep517.

All 89 comments

This seems to be an issue with how pyinstaller imports itself during installation.

It might be a good idea to file an issue over at the PyInstaller folks.

We currently use 18.1, and upgrading to 19.0 causes this problem for us as well. There is a related issue on the PyInstaller repo; it happens because under pip 19.0, '' (the current directory) is no longer on sys.path.

https://github.com/pyinstaller/pyinstaller/issues/2730

I think this is a pretty common workflow. You put __version__ = "1.2.3" in foo/__init__.py and then do import foo in setup.py so that you don't have to specify the version in two places. Any user of the library can then inspect the version according to PEP 396.

# foo/__init__.py
__version__ = "1.2.3"
# setup.py
from setuptools import setup

import foo

setup(..., version=foo.__version__)

Also, this only happens if you have a pyproject.toml file (and setup.py). Remove it and the installation works fine. So there seems to be some differences in behaviour there. Maybe the traditional way modifies sys.path/PYTHONPATH?

Ah, I think I get what is happening. By using a pyproject.toml file, you're basically telling pip you want to use PEP 517/518.

# pyproject.toml
[build-system]
requires = ["setuptools", "wheel"]

The above tells pip that it needs setuptools and wheel to build PyInstaller. But in the case of PyInstaller, it's also got this in its setup.py:

# setup.py
from PyInstaller import __version__

From a PEP 517 perspective, aside from setuptools and wheel, it means it needs itself to build. Which is of course a bit weird.

# pyproject.toml
[build-system]
requires = ["setuptools", "wheel", "PyInstaller"]

As @cjerdonek mentioned in https://github.com/pypa/pip/issues/6175#issuecomment-456769285, could someone please check if --no-use-pep517 fixes this for them?

I suspect the cause of this issue is that build isolation or the PEP 517 code isn't making sure that the root of the package directory is on the sys.path, because pandas has a versioneer.py sitting next to setup.py. I recall this coming up at some point, but I don't remember off the top of my head what that discussion was. This might be considered an issue with the setuptools build backend instead of pip, or it might be the fault of pip's isolation mechanism.

[...] could someone please check if --no-use-pep517 fixes this for them?

PyInstaller installs fine with --no-use-pep517.

Ok, then that's certainly an issue with the new PEP 517 code, and I'm pretty sure the issue is just that the directory containing the project root hasn't been added to sys.path. Maybe @pfmoore will have a better sense of whether that should be pip's responsibility or setuptools'.

If it helps, another example of this (via apache-airflow): pip install pendulum==1.4.4 fails, but pip install --no-use-pep517 pendulum==1.4.4 works.

The stack trace we get is similar:

Collecting pendulum==1.4.4
  Using cached https://files.pythonhosted.org/packages/85/a5/9fc15751f9725923b170ad37d6c61031fc9e941bafd5288ca6ee51233284/pendulum-1.4.4.tar.gz
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  Complete output from command /Users/ash/.virtualenvs/clean-airflow/bin/python3.7 /Users/ash/.virtualenvs/clean-airflow/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py get_requires_for_build_wheel /var/folders/lr/9jc9vkgn025fn6jmwm4mv4_w0000gn/T/tmprosed3kj:
  Traceback (most recent call last):
    File "/Users/ash/.virtualenvs/clean-airflow/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py", line 207, in <module>
      main()
    File "/Users/ash/.virtualenvs/clean-airflow/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py", line 197, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "/Users/ash/.virtualenvs/clean-airflow/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py", line 54, in get_requires_for_build_wheel
      return hook(config_settings)
    File "/private/var/folders/lr/9jc9vkgn025fn6jmwm4mv4_w0000gn/T/pip-build-env-g__m0jh6/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 115, in get_requires_for_build_wheel
      return _get_build_requires(config_settings, requirements=['wheel'])
    File "/private/var/folders/lr/9jc9vkgn025fn6jmwm4mv4_w0000gn/T/pip-build-env-g__m0jh6/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 101, in _get_build_requires
      _run_setup()
    File "/private/var/folders/lr/9jc9vkgn025fn6jmwm4mv4_w0000gn/T/pip-build-env-g__m0jh6/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 85, in _run_setup
      exec(compile(code, __file__, 'exec'), locals())
    File "setup.py", line 47, in <module>
      from build import *
    File "/Users/ash/.virtualenvs/clean-airflow/lib/python3.7/site-packages/pip/_vendor/pep517/build.py", line 7, in <module>
      from pip._vendor import pytoml
  ModuleNotFoundError: No module named 'pip'

Also, the installation of the following doesn't work with pip 19.0 but does with --no-use-pep517:
pendulum==1.5.0 (AttributeError: module 'enum' has no attribute 'IntFlag')
pendulum==1.5.1 (AttributeError: module 'enum' has no attribute 'IntFlag')
pendulum==2.0.0 (AttributeError: module 'enum' has no attribute 'IntFlag')
pendulum==2.0.1 (AttributeError: module 'enum' has no attribute 'IntFlag')
pendulum==2.0.2 (AttributeError: module 'enum' has no attribute 'IntFlag')

Whereas 2.0.3 and 2.0.4 install fine.

cartopy (at least their latest release) also fails to install since 19.0, failing to import its versioneer.py that sits next to setup.py.

This is also an issue with some projects that I deal with. We use a pyproject.toml to define our Python black parameters, and do a similar from project.version import __version__ in our setup.py.

At the very least I feel like being able to disable project isolation in the pyproject.toml would be sufficient. It seems unreasonable to make anybody wanting to install the project use --no-build-isolation or --no-use-pep517.

The failure appears to be in get_requires_for_build_wheel, and the setuptools backend runs setup.py to do some sort of introspection to determine build requirements (the specific code is here). That code appears weird to me, and I don't understand why it's required. My initial instinct is that this is a bug in the setuptools backend that should be addressed by them.

PEP 517 does not state that frontends should run hooks in an environment that adds the build directory to sys.path, and there's a potential concern that if we did that, it could break isolation (if the build directory contained a copy of some required but not specified package, for instance). So my preference would be to not add the build directory to sys.path. But it may be expedient to do so if that offers a quick fix for this regression. I don't think projects should rely on this, though.

Summary:

  1. This should be reported to setuptools for review as a backend issue. I'd consider fixing it in the setuptools backend (possibly just by them adding the build directory to sys.path) as the ideal resolution.
  2. If setuptools doesn't do it, pip could add the build directory to sys.path, but I don't think that PEP 517 views that as the frontend's responsibility.
  3. Requiring the build directory to be visible to hooks on sys.path would require at least a PEP clarification.

I don't think this scenario was considered when PEP 517 was being developed. Maybe because it's setuptools-specific (or rather, specific to backends that run arbitrary Python code as part of the build).

I think it's fairly common for people to import something from the current directory into a setup.py, and just generally treat things as if setup.py is in $PWD.

I think it's reasonable to push this responsibility onto setuptools, since that's probably the only project that really needs it.

Yep, thinking about this some more, I'm certain it's a setuptools backend responsibility. Pre-PEP 517, pip ran setup.py as a script, so standard Python rules put the script directory onto sys.path. Under PEP 517, invocation of setup.py is replaced with calls to the backend hooks, so those hooks need to preserve the semantics. Because setuptools runs setup.py in-process from the hooks, it needs to manage sys.path itself. Hopefully, it's not a big fix for them. @jeanlaroche (or someone else hitting this issue) could you raise an issue on the setuptools tracker, referring back to this thread?

[...] could someone please check if --no-use-pep517 fixes this for them?

I can confirm that --no-use-pep517 allows pip install pandas to succeed.

I can also confirm that using --no-use-pep517 works for all of my broken packages

success for me too

pip install pyinstaller --no-use-pep517
Collecting pyinstaller
  Using cached https://files.pythonhosted.org/packages/03/32/0e0de593f129bf1d1e77eed562496d154ef4460fd5cecfd78612ef39a0cc/PyInstaller-3.4.tar.gz
Requirement already satisfied: setuptools in c:\python37\lib\site-packages (from pyinstaller) (39.0.1)
Collecting pefile>=2017.8.1 (from pyinstaller)
  Downloading https://files.pythonhosted.org/packages/ed/cc/157f20038a80b6a9988abc06c11a4959be8305a0d33b6d21a134127092d4/pefile-2018.8.8.tar.gz (62kB)
    100% |████████████████████████████████| 71kB 1.0MB/s
Collecting macholib>=1.8 (from pyinstaller)
  Downloading https://files.pythonhosted.org/packages/41/f1/6d23e1c79d68e41eb592338d90a33af813f98f2b04458aaf0b86908da2d8/macholib-1.11-py2.py3-none-any.whl
Collecting altgraph (from pyinstaller)
  Downloading https://files.pythonhosted.org/packages/0a/cc/646187eac4b797069e2e6b736f14cdef85dbe405c9bfc7803ef36e4f62ef/altgraph-0.16.1-py2.py3-none-any.whl
Collecting pywin32-ctypes (from pyinstaller)
  Using cached https://files.pythonhosted.org/packages/9e/4b/3ab2720f1fa4b4bc924ef1932b842edf10007e4547ea8157b0b9fc78599a/pywin32_ctypes-0.2.0-py2.py3-none-any.whl
Collecting future (from pefile>=2017.8.1->pyinstaller)
  Downloading https://files.pythonhosted.org/packages/90/52/e20466b85000a181e1e144fd8305caf2cf475e2f9674e797b222f8105f5f/future-0.17.1.tar.gz (829kB)
    100% |████████████████████████████████| 829kB 1.6MB/s
Installing collected packages: future, pefile, altgraph, macholib, pywin32-ctypes, pyinstaller
  Running setup.py install for future ... done
  Running setup.py install for pefile ... done
  Running setup.py install for pyinstaller ... done
Successfully installed altgraph-0.16.1 future-0.17.1 macholib-1.11 pefile-2018.8.8 pyinstaller-3.4 pywin32-ctypes-0.2.0

I don't think this is a bug in pip/setuptools: from my reading of PEP 517's Build Environment section, nothing about the environment should be assumed except that the dependencies declared in pyproject.toml are available.

Also, isn't importing the package being installed from setup.py a bad practice? There are much better ways of maintaining the package version in one place, e.g. as described in this packaging guide from PyPA.
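One such single-source approach from that guide is to read the version out of the file with a regex instead of importing the package; a minimal sketch, assuming the version string lives in foo/__init__.py (the foo name and read_version helper are illustrative, not from any project in this thread):

```python
# setup.py -- read the version without importing the package (sketch)
import pathlib
import re

def read_version(init_path="foo/__init__.py"):
    """Extract __version__ from a file without executing or importing it."""
    text = pathlib.Path(init_path).read_text()
    match = re.search(r'^__version__\s*=\s*["\']([^"\']+)["\']', text, re.M)
    if match is None:
        raise RuntimeError("Unable to find __version__ in " + init_path)
    return match.group(1)
```

Because nothing is imported, this works regardless of whether the project directory is on sys.path under build isolation.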

In the discussion in pypa/setuptools#1642, @uranusjr and I both were hoping this could be an opportunity to get people to stop relying on the fact that . is in sys.path when you execute a python script, and start moving people into more explicit semantics.

The major problem here is that the mere presence of pyproject.toml is opting people into both PEP 518 and PEP 517, so even if you haven't specified a build backend you are suddenly getting the new semantics.

Is this decision on pip's part irreversible at this point? Maybe we can have the presence of pyproject.toml opt you in to PEP 518, but PEP 517 is not triggered unless you actually specify a build backend?

Honestly, it's a tough situation all around, but I think it's easier for pip to warn about this than it is for setuptools. If we make PEP 517 opt-in for now and say that the existence of a pyproject.toml will start to trigger PEP 517 after 20.0 or 21.0, we can create a migration guide and start issuing warnings in pip to that effect - "build-backend missing from pyproject.toml, be aware that after version 21.0, build isolation will default to using setuptools.build_meta, see the migration guide at ..."

Also, isn't importing the package being installed from setup.py a bad practice? There are much better ways of maintaining the package version in one place, e.g. as described in this packaging guide from PyPA.

To be fair, that guide defines importing the package being installed from setup.py as strategy 6, so it's definitely somewhat common. At this point, if we are moving away from that kind of strategy, that page should be updated to not include it anymore.

The decision to trigger PEP 517 on the existence of pyproject.toml was a deliberate choice (the discussion was probably on the PEP 517 implementation issue or the PR, but I don't have time right now to locate it). Obviously it could be changed in the light of what we see here, but we shouldn't do so without due consideration of the reasons we made the decision we did.

Pip itself doesn't (shouldn't, IMO) take a view on whether setup.py is right to assume that the project directory is on sys.path, so I'm somewhat reluctant to change pip simply because setuptools wants to push a different default when the backend gets used. While I agree that importing the project being built in setup.py has a number of difficulties, it's not like it's something that has triggered warnings up to now, so I'd assumed that the backend should maintain that semantics. To put it another way, even if pip did warn as you suggest, why would users read "build isolation will default to using setuptools.build_meta" as implying "you won't be able to import your project from setup.py"? The two facts seem unrelated to me...

Personally, I agree with the current approach of relying on the existence of the pyproject.toml file. The issue IMO stems from people using pyproject.toml for things other than packaging. The correct way out is therefore to push non-packaging tools to offer another way of configuration, so people can choose whether to use pyproject.toml or not.

Pip itself doesn't (shouldn't, IMO) take a view on whether setup.py is right to assume that the project directory is on sys.path, so I'm somewhat reluctant to change pip simply because setuptools wants to push a different default when the backend gets used.

True, it's just that pip is making the assumption that invoking the setuptools.build_meta hooks is and should be perfectly equivalent to calling setup.py. We've already seen cases where it's not, and I think it is still to be established whether we (setuptools) want the contract of setuptools.build_meta to be "this is equivalent to calling python setup.py" or if instead we want it to be "this is a more locked-down and isolated version of invoking setup.py directly".

Of course setuptools could say "that's not the contract of the function so we're not going to fix it" and pip could say, "our decision was that it defaults to PEP 517" and we can both say the bug is in the other project, but it's probably a good idea to coordinate.

To put it another way, even if pip did warn as you suggest, why would users read "build isolation will default to using setuptools.build_meta" as implying "you won't be able to import your project from setup.py"? The two facts seem unrelated to me...

They may or may not be related, but there may also be other changes to the semantics. The point of a warning like that is to say, "Please be explicit about how you want this build to happen, because soon we're going to opt you into something that might break your build." Projects can add the build backend to their pyproject.toml ahead of time and proactively fix any breakages that might happen.

We could also possibly create a "dummy" PEP 517 backend in setuptools like setuptools.build_meta_legacy that just chdirs into the root directory and invokes setuptools.build_meta, that way people can opt in to the old behavior only if they need it before it starts breaking.

Personally, I agree with the current approach of relying on the existence of the pyproject.toml file. The issue IMO stems from people using pyproject.toml for things other than packaging.

I think we need to separate PEP 517 and PEP 518. PEP 518 explicitly lists what the "default values" are for the PEP, whereas PEP 517 does not specify anything about what the default backend is.

I remember not feeling terribly averse to the whole "the existence of pyproject.toml opts you in to build isolation", but seeing it in practice, I also don't like the idea that specifying my isolated build's dependencies also would opt me in to setuptools.build_meta.

Maybe the solution is to split the difference and have the backend default to an undocumented setuptools.build_meta_legacy (which does attempt to maintain the semantics of setup.py). That way we'll at least have a way to tell if a user made an affirmative choice to use the new semantics or if they just didn't think about it.

A build_meta_legacy with a warning message sounds like a reasonable solution to me. It is likely better to make the warning very prominent (e.g. during installation, encouraging users to file this as a bug with the maintainer), with clear instructions on how the migration should be done.

I should also note that pip's intention (that's the "corporate" pip ;-) - what I mean is that the pip devs discussed this a bit and reached a general consensus that it sounded like a reasonable idea, but it's not a firm plan yet and it relies on someone actually writing code for it to happen) is that we relatively quickly switch to passing all projects through PEP 517, and dropping our legacy code path through setup.py altogether. Making the setuptools backend the default only in pip 21.0 (to use the suggested release from above) pushes that off at least an additional 2 years.

whereas PEP 517 does not specify anything about what the default backend is

True. But at some point, pip will drop the special case support of setuptools. That is after all, the whole point (for us, at least) of PEP 517, to decouple the frontend from the backend, and to put all backends on an equal footing. So whenever we do that, we have to either error if there's no backend, or choose a default (and we'll go for defaulting to setuptools, for legacy reasons). The debate here is when we do that, not if.

Pip currently has 2 code paths for installs - the PEP 517 path, and the legacy setup.py path. That's a source of maintenance issues, and potential bugs. We opted to make PEP 517 the default if pyproject.toml was present to ensure usage of the PEP 517 path (it's unlikely projects will rush to add build-backend = setuptools.build_meta, so without the current behaviour, the odds are that testing of both pip's PEP 517 code, and setuptools' backend, would remain near-zero for an extended period). There's an opt-out in the form of --no-use-pep517 precisely to cater for the (assumed rare) cases where the setuptools backend was unsuitable.

I don't think anyone anticipated that setuptools would want to have semantic differences between setup.py and the backend, so the possibility that --no-use-pep517 would be needed to work around semantic differences so often that it should be the default was never even considered.

We could also possibly create a "dummy" PEP 517 backend in setuptools like setuptools.build_meta_legacy that just chdirs into the root directory and invokes setuptools.build_meta, that way people can opt in to the old behavior only if they need it before it starts breaking.

That may be a reasonable solution. But it would have to be at least partially documented - at a minimum, pip would document that this was the default value we'd assume. Whether setuptools chose to leave the backend undocumented, is their choice I guess.

I'm not sure how useful any further theoretical discussion is. I don't think I have anything further to add, certainly. I'd suggest that if someone wants to take this forward, the best way would be to create a PR switching the default, and discussion on whether we want to accept it can move over there.

For package maintainers looking to resolve this issue for your users: I’ve published a shim that implements the sys.path fix.

https://pypi.org/project/setuptools-localimport/

Hopefully this can work as a stopgap so we can ponder how this should be moved forward without rushing into a solution or unnecessarily slowing down the adoption of pip 19.0 (which contains many more goodies than just PEP 517).

I’ve published a shim that implements the sys.path fix.

That's awesome! Regardless of the ultimate fix, this is a really nice example of the flexibility of the PEP 517 hook system :-)

I fixed it by downgrading my pip version below 19.0; after that the install succeeded.

There's a third use case: what about packages that provide their own build backend? E.g. setuptools itself only lists wheel as a build requirement:

[build-system]
requires = ["wheel"]
build-backend = "setuptools.build_meta"

This of course will fail if pip's code for handling PEP 517 does not add the source directory to sys.path.

From PEP 517:

When importing the module path, we do not look in the directory containing the source tree, unless that would be on sys.path anyway (e.g. because it is specified in PYTHONPATH). Although Python automatically adds the working directory to sys.path in some situations, code to resolve the backend should not be affected by this.

That pretty clearly (to me) says that projects should not expect to be able to see their own project directory when resolving build-backend - so setuptools needs to add itself to requires IMO. And yes, I understand that doing so is circular. But build backends that build themselves are by their nature pretty circular anyway - they certainly aren't the normal case.

That same section also seems to me to confirm that build tools shouldn't expect the project directory to be in the build environment's sys.path.

How would that work with --no-binary :all:?

@pfmoore A variant of the situation is that a package supplies a custom build system. The build system is not installed (and not part of the bdist), but supplied with the sdist, maybe to customise some build process. Is this a valid use case, or must the maintainer publish the custom build system as a separate package?

Edit: Something like

project/
    custom_build.py
    src/
        my_package/
            __init__.py
            ...
    pyproject.toml

# pyproject.toml
[build-system]
requires = []
build-backend = "custom_build"

# Maybe the custom backend specifies metadata like this…
[tool.custom_build.metadata]
name = "my-package"
dependencies = []
packages = ["my_package"]
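For illustration, the hooks such a custom_build.py would need to expose are fairly small. A minimal sketch, reusing the hypothetical layout above (the wheel written here is deliberately bare-bones - RECORD hashes are omitted, and metadata is hard-coded rather than read from pyproject.toml):

```python
# custom_build.py -- hypothetical minimal in-tree PEP 517 backend (sketch)
import os
import zipfile

NAME, VERSION = "my_package", "0.1.0"  # would really come from pyproject.toml

def get_requires_for_build_wheel(config_settings=None):
    return []  # nothing beyond [build-system].requires

def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    tag = "py3-none-any"
    wheel_name = NAME + "-" + VERSION + "-" + tag + ".whl"
    dist_info = NAME + "-" + VERSION + ".dist-info"
    with zipfile.ZipFile(os.path.join(wheel_directory, wheel_name), "w") as zf:
        # A real backend would copy the tree under src/my_package/ here.
        zf.writestr(NAME + "/__init__.py", '__version__ = "' + VERSION + '"\n')
        zf.writestr(dist_info + "/METADATA",
                    "Metadata-Version: 2.1\nName: " + NAME +
                    "\nVersion: " + VERSION + "\n")
        zf.writestr(dist_info + "/WHEEL",
                    "Wheel-Version: 1.0\nGenerator: custom_build\n"
                    "Root-Is-Purelib: true\nTag: " + tag + "\n")
        zf.writestr(dist_info + "/RECORD", "")  # hashes omitted for brevity
    return wheel_name  # PEP 517: return the basename of the wheel built
```

The catch discussed in this thread is that the frontend must be able to import this module in the first place, which is exactly what requires the source directory to be on sys.path.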

Maybe a solution would be to add a new optional config, to indicate where to find modules during build?

[build-system]
requires = []
build-backend = "custom_build"
build-backend-findpath = ["build_systems"]   # Put custom_build.py above in a subdirectory.

The config defaults to [] (empty list), meaning no paths are added (i.e. the same as the current behaviour), but projects can add paths to find the build system locally.

If build-system is omitted entirely, the section defaults to:

requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
build-backend-findpath = [""]

pip can display a warning/info message to tell the user to migrate (with a link to a documentation page, likely).

An additional benefit of this solution is that everything can be done in pip (the vendored pep517 module, actually). Nothing needs to change in setuptools or existing broken projects.
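On the frontend side, honoring such a key could be a few lines in the hook-runner subprocess, executed before it imports the backend. A sketch, assuming the build-backend-findpath key proposed above (which is not an existing option) and the "" default for an omitted build-system section:

```python
# Sketch: how a frontend's hook-runner subprocess might apply the proposed
# build-backend-findpath key before importing the build backend.
import os
import sys

def apply_findpath(pyproject, source_dir, path=None):
    """Prepend the configured backend search paths to `path` (defaults to
    sys.path). Returns the list of entries that were added."""
    if path is None:
        path = sys.path
    if "build-system" not in pyproject:
        # Section omitted entirely: legacy behaviour, source dir is visible.
        entries = [""]
    else:
        entries = pyproject["build-system"].get("build-backend-findpath", [])
    added = [os.path.join(source_dir, e) if e else source_dir for e in entries]
    for p in reversed(added):  # reversed so the list order is preserved
        path.insert(0, p)
    return added
```

Called with the parsed pyproject.toml dict, this keeps the default isolated behaviour (an empty list adds nothing) while letting projects opt in to an in-tree backend location.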

A variant of the situation is that a package supplies a custom build system.

I honestly don't know. It's not a situation I've thought about myself. I don't know if the PEP authors considered it - I don't recall any discussion of the matter when the PEP was being developed.

Actually, having checked, PEP 517 does explicitly say (in the section on build-backend) "When importing the module path, we do not look in the directory containing the source tree, unless that would be on sys.path anyway" which does imply to me that it's explicitly discouraged to have in-tree backends. Whether someone could work around this with sufficiently devious code, I'm not sure (but I suspect not).

The expected use case when we developed PEP 517 was that backends would be shipped as wheels on PyPI (or a custom index) which would be downloaded and installed into the build environment. How backends were built was explicitly out of scope - my personal presumption was that they would not use the PEP 517 mechanisms, but rather would use a lower-level command (setup.py bdist_wheel or flit build, for example). Recursively using a PEP 517 backend to build itself seemed like a step too far in complexity. It was considered as part of the PEP 518 implementation in pip (there's a potential fork bomb exploit if a backend is shipped as a sdist, and uses itself to build itself, that we had to prevent before we could even support backends not distributed as wheels) but only in the context of downloading the backend from PyPI.

tl;dr: All I can offer is my recollection of the discussions at the time - you may be better off searching the archives for the actual background.

I don't know if the PEP authors considered it - I don't recall any discussion of the matter when the PEP was being developed.

I just started looking at the archives, and it appears this question was indeed discussed in detail towards the end of the process (or at least well into it). I don't know which website is best for viewing the archives, but here's one point where the discussion starts talking about this question again (July 28, 2017):
https://mail.python.org/pipermail/distutils-sig/2017-July/031109.html
https://www.mail-archive.com/[email protected]/msg26624.html

For example, this particular email ends with--

Given how much trouble we're having with PEP 517 already, it might make more sense to have PEP 517 just mandate that the directory not be on sys.path, and make a little follow-up PEP just for python-path.

I'll let you know if I come across what the final verdict was, but I encourage others to read for themselves.

Nice! Thanks for finding that :-)

So I skimmed a bunch of the emails, and I think you can at least skip ahead to August 29, where Nick seems to be suggesting there is consensus on leaving the source directory _out_ of sys.path:
https://mail.python.org/pipermail/distutils-sig/2017-August/031413.html
(One by one, people were becoming convinced of Nathaniel's arguments.)

However, in this same email linked above, Nick does say the following :)

  1. If omitting it is genuinely a problem, we'll likely find out soon enough as part of implementing a setup.py backend

Here is a fuller paragraph from the email:

So I think we can deem this one resolved in favour of "Frontends must ensure the current directory is NOT on sys.path before importing the designated backend", as starting there will mean we maximise our chances of learning something new as part of the initial rollout of the provisionally accepted API.

I don't know yet if there are later emails that alter this summary, but there aren't a whole lot of emails on the topic after.

Thanks for trawling the archives for us @cjerdonek. Stating my current understanding of the problem:

  1. a lot of real world setup.py files currently assume the current directory is on sys.path when they run, creating weird bootstrapping issues as projects treat themselves as an install-time dependency
  2. PEP 517 explicitly says that build front ends should not implicitly add the current directory to sys.path (it doesn't say anything about backends)
  3. pip 19.0 considers pyproject.toml without a build-system section to be equivalent to a build-system section that specifies the setuptools backend
  4. the setuptools PEP 517 backend doesn't currently add the current directory to sys.path either (since getting away from point 1 above is considered a desirable future goal)
  5. Two interim workarounds are available for existing projects while the default behaviours are improved: pinning pip to less than 19.0, and explicitly setting the --no-use-pep517 option. However, folks only discover those after first discovering that upgrading pip breaks their builds.

From a functional perspective, I think the key change we want to introduce is that in the "pyproject.toml without a build-system section" case, then the directory containing setup.py should end up on sys.path again. There are two main ways to handle that:

  1. Get pip to do it. This has the unfortunate side effect of making the behaviour front-end dependent, and thus potentially see existing projects fail to install unless all frontends implement the same workaround
  2. Get the setuptools PEP 517 backend to do it, by inspecting pyproject.toml for a build-system section, and injecting the setup.py directory into sys.path if both the section and path entry are missing. A new pip is still needed in that situation, since it has to specify a minimum version of setuptools that correctly handles the "no build-system section" case.

In the interests of getting things working for end users as quickly as possible, without painting ourselves into any unfortunate design corners, I'd actually propose a 3 step resolution:

  1. Do a new pip release (19.0.1?) now that adds the extra sys.path entry for the "no build-system entry" case. At this point, the compatibility issue should largely go away for end users.
  2. Do a new setuptools release that handles the sys.path addition if the front end hasn't already done so.
  3. In a later pip release, change the "no build-system entry" case to drop the special casing, and instead set a stricter minimum version for setuptools.

This proposal is based on the fact that I think the setuptools PEP 517 backend is the "right" place to handle setup.py backwards compatibility issues, but I also think that tweaking pip directly is likely to be a much simpler change in the near term, and it's a change that will contain the problem while the more architecturally preferable fix is worked on.

I've created the build_meta_legacy backend in pypa/setuptools#1652. I would really prefer it if pip would switch to using setuptools.build_meta_legacy as the default backend, but I think creating a built-in "legacy shim" backend is about as far as I'm comfortable going in setuptools. I do not want to get stuck indefinitely supporting full "python setup.py install emulation" in the main setuptools PEP 517 backend.

I'm confused a bit why setuptools doesn't want to emulate running the script natively here TBH. It seems like it's asking for end users to be confused when python setup.py bdist_wheel works correctly but pip wheel does not (assuming they've opted in to the non-legacy build backend).

What's the rationale?

What's the rationale?

My long-term plan is to deprecate all direct invocation of setup.py in favor of indirect invocation by either PEP 517 frontends or something equivalent (for other commands). If we're moving to an "all isolated environments" world, I don't want to be forced to keep the semantics of setuptools command invocations compatible with invoking python setup.py, for the same reason PEP 517 specifically said that frontends shouldn't do this - it has the potential to break the build isolation and generally create undesirable situations.

I think it's fairly rare for this to be needed except as part of an anti-pattern. Anyone with a legitimate need for it can pretty easily manipulate sys.path in their setup script.
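As a sketch of that manual workaround (a hypothetical helper, not anything setuptools or pip provides), a setup.py that legitimately needs to import its own package can pin its own directory onto sys.path before the import, making it independent of frontend behaviour:

```python
import os
import sys

def ensure_self_importable(setup_py_path):
    """Insert the directory containing setup.py at the front of sys.path,
    so a self-import works whether or not the frontend put it there."""
    here = os.path.dirname(os.path.abspath(setup_py_path))
    if here not in sys.path:
        sys.path.insert(0, here)
    return here

# In a real setup.py this would be called as: ensure_self_importable(__file__)
```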

Just as a note, I don't think PEP 517 ever had as a design goal that all build tool access would be possible via the hooks. For example, flit has its own set of commands flit build etc, and there's no intention that I'm aware of to deprecate those. So it is possible that you could find edge cases where the intention behind PEP 517 was that tool-specific commands needed to be used. An obvious example is building the backend itself (as I mentioned above), but other cases (not currently standardised, although future PEPs may add hooks) such as editable installs and in-place builds (reusing artefacts from previous builds) exist.

Getting agreement that all tools could support "build sdist" and "build wheel" was hard enough that I don't anticipate extending the list of standard operations any further will happen soon.

Just as a note, I don't think PEP 517 ever had as a design goal that all build tool access would be possible via the hooks.

This is not what I suggested, and it's a bit of a digression. My point was that the problems that necessitated PEP 518 also exist more generally for all setup.py commands, and those are best solved by moving people away from invoking setup.py at all. No standard is necessary in order to do this in the same way flit does not need a PEP to add subcommands.

Ah, OK. Sorry for the misunderstanding.

@ncoghlan's proposed transition seems good to me. If no one has issues with that, let's go ahead with it.

Thanks for digging up the archives @cjerdonek!

Given that there's already a PR that fixes this in setuptools (through the introduction of the legacy backend). Is there any reason not to skip step 1 in the above? We're installing setuptools into an isolated build environment, so depending on a very recent version shouldn't be an issue.

Is there any reason not to skip step 1 in the above?

Nope. We can directly bump the current minimum required version.

My transition plan is based on the assumption that folks will need more time to work out the long-lived transition mechanism on the setuptools side; while @pganssle's dedicated transitional backend looks promising on that front, I'm not sure it makes sense to leave the status quo in place while that change is being reviewed.

That said, I'm not familiar with the details of how the PEP 517 implementation in pip works, so my assumption that working around the issue in the front end would be straightforward may be incorrect.

Because backends are run in a subprocess (and in fact, using the vendored pep517 project, which adds an extra level of indirection) it's not at all obvious to me how we'd set sys.path for a backend hook. At a minimum we'd need to involve PYTHONPATH which means worrying about cases where the user already has PYTHONPATH set, and how the isolation code uses it.

Essentially, setting sys.path in pip is, I suspect, distinctly non-trivial. I'm unlikely to have time to look into the details, much less write a patch (and ensure it's properly tested!) myself.

I think getting a fixed setuptools backend is the fastest way forward.
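For illustration only (this is not what pip does), the PYTHONPATH route mentioned above might look like this in a frontend, including the care needed when the user already has PYTHONPATH set:

```python
import os
import subprocess
import sys
import tempfile

# Prepend the source directory to PYTHONPATH for the backend subprocess,
# preserving any user-supplied value.
source_dir = tempfile.mkdtemp()
env = os.environ.copy()
existing = env.get("PYTHONPATH")
env["PYTHONPATH"] = (
    source_dir if not existing else source_dir + os.pathsep + existing
)

# The child interpreter (standing in for the pep517 hook runner) now sees
# the source directory on sys.path.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(%r in sys.path)" % source_dir],
    env=env,
    capture_output=True,
    text=True,
)
child_sees_source_dir = result.stdout.strip()
```

Even this simple sketch shows the interaction problems Paul mentions: it silently changes behaviour for users who rely on their own PYTHONPATH, and it has to cooperate with however the build isolation code manipulates the same variable.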

My long-term plan is to deprecate all direct invocation of setup.py in favor of indirect invocation by either PEP 517 frontends or something equivalent (for other commands).

@pganssle ouch. Procedural setup.py must die. It should not exist for any reason. Because this "flexibility" is an endless source of pain and breakdowns. It is impossible to migrate setup.py and hence all the problems with Python packaging.

I would replace setup.py with package.json and just add Python section for it instead of yet another package description format.

At least then I can use ==1.x version specifiers.

@techtonik Please go away. Your non-constructive contributions remain unwelcome in any project I am a participant in.

Working on #6210 has let me answer my own question from above about how hard it will be to address this at the pip level: the challenge is that the information about whether or not the source directory should be inserted as sys.path[0] needs to be tunnelled from the code that reads pyproject.toml through to the PEP 517 hook caller, and then from there into the actual in-process wrapper script.

Those are doable without major architectural changes (a pip._implicit. prefix on the build backend name for the first part, and a PEP517_SYS_PATH_0 env var for the latter), but it means temporarily modifying the vendored pep517 code until the proper fix in setuptools is ready.

My transition plan is based on the assumption that folks will need more time to work out the long-lived transition mechanism on the setuptools side; while @pganssle's dedicated transitional backend looks promising on that front, I'm not sure it makes sense to leave the status quo in place while that change is being reviewed.

Yes, I agree with this. I suggested just this in a distutils post, there are two things that I'd say are important:

  1. Getting a fix out there for users ASAP
  2. Getting the right transition in place.

I personally think that the right thing to do is to remove the "pyproject.toml's existence opts you in to PEP 517" logic until we have the transition logic in place. Considering that the changes on setuptools have consequences for its public-facing API and in pip it would just be delaying what has turned out to be a backwards-incompatible change, it makes sense to me to do the quick fix in pip while we're reviewing the way forward for setuptools.

With both approaches operational on my local machine, I tested both #6210 and #6212 against the original problematic PyInstaller==3.4 requirement.

I don't know whether the #6212 failure is a test setup problem, or if there's actually a further problem in the setuptools pre-release, I only know that there's further integration work to be done before we can be confident in that solution, whereas I'm confident in #6210 now - the only thing wrong with it is that it's a horrible compatibility hack that we don't want pip to have to carry forever.

It looks like --no-cache is responsible for that failed assert on #6212. Testing with --cache-dir instead gave a successful local test: https://github.com/pypa/pip/pull/6212#issuecomment-458166386

(I'm going to be offline for ~18 hours for sleep & work, so folks should feel free to run with whichever of #6210 or #6212 makes sense in the meantime)

I've updated the title to better reflect the situation. Thanks @ncoghlan for the PRs!


Moderation note: I've also gone ahead and hidden the non-productive comments (and responses to them) as "Off Topic", and will do so for any future comments along those lines. If anyone wants to have a discussion about the moderation, file a new issue or ping me over email; this thread is not the right place for bringing up any issues with moderation you may have.

I've also gone ahead and pinned this, to avoid people creating duplicates and to clearly signal that we know about this.

Even if we make pep517 opt-in, this is a behavior change that was not announced and will therefore break libraries _even if they already opted in_ (see PyInstaller, for example). In the case where a library declares an opt-in to pep517 builds _and_ imports things locally that it expects to find on sys.path, there is no assumption being made by pip (because the library is explicit) but the library is still suddenly broken.

In that case I really don't see an alternative besides just including the cwd in setuptools, because this is just broken. Unless the proposal is to tell people to go back and fix the releases they've cut between PEP 517 and pip 19 (releases which, if any user has them strictly pinned, may suddenly stop being installable), I really feel we should consider the impact of these decisions on the user experience. Based on this discussion and the current proposals, some of these libraries will not be installable with new versions of pip + setuptools going forward using the defaults unless PEP 517 builds are explicitly disabled.

This is pretty impactful if you're just trying to install a package with the tooling that is provided to you by Python, only to find that it seemingly can't install things at random. I say this to draw the focus off of the technical aspects for a moment and onto the impact to the end user, who may get frustrated with the tooling, the ecosystem, the libraries, or the language itself, because suddenly (and yes, only under specific circumstances), things they could install just fine now can't be installed. I really do think we should close this gap in any solution that is implemented.

We currently use a failure stack to handle failed installations in pipenv and I'm adding --no-use-pep517 when available to handle failures as a result of these changes. I'm not sure that will be intuitive to the average user, since it's probably not even immediately clear what the cause of the problem is. I say this just to point out that we have a workaround, but it feels important to try and close this gap to help users out a bit on this one

(edit: also big thanks to pganssle, cjerdonek, pfmoore, pradyunsg, ncoghlan, and everyone else who has been putting in a bunch of time and effort on this)

The expected use case when we developed PEP 517 was that backends would be shipped as wheels on PyPI (or a custom index) which would be downloaded and installed into the build environment. How backends were built was explicitly out of scope - my personal presumption was that they would not use the PEP 517 mechanisms, but rather would use a lower-level command (setup.py bdist_wheel or flit build, for example). Recursively using a PEP 517 backend to build itself seemed like a step too far in complexity. It was considered as part of the PEP 518 implementation in pip (there's a potential fork bomb exploit if a backend is shipped as a sdist, and uses itself to build itself, that we had to prevent before we could even support backends not distributed as wheels) but only in the context of downloading the backend from PyPI.

Just to circle back to this - the new setuptools.build_meta_legacy default solves the problem for using PEP 517 for everything except PEP 517 backends, which is a problem for setuptools itself. If we don't solve that, then as @benoit-pierre points out in pypa/setuptools#1644, it will not be possible for people to use pip install --no-binary :all: for any project that depends on setuptools (or presumably any PEP 517 backend provider).

Should we discuss that in this thread, or create a new thread to discuss it?

Should we discuss that in this thread, or create a new thread to discuss it?

I'd split that out. My immediate feeling is that the problem is that --no-binary :all: has significant unintended consequences here (similar in practice to the impact of using that flag in the presence of a project that only distributes wheels, not sdists) and I'd like to avoid digressions into (for example) the advisability of using --no-binary :all: further distracting from this thread.

I do not think that fixing the issue with --no-binary :all: building build backends is anywhere near as critical as this one. If a user is already specifying --no-binary :all:, they can relatively easily add --no-use-pep517.

New PR #6229 that:

  1. Implements @pganssle's suggested interim workaround of making a build-system section in pyproject.toml the initial requirement for automatically opting in to PEP 517 (deferring the "any pyproject.toml file" opt-in to a later release)
  2. Adds 3 dedicated test cases covering importing an adjacent package from setup.py

I'm going to close the other 2 PRs in favour of that one, since it's the minimal fix that should get things working again for end users, and doesn't require any horrible hacks like #6210 or a new setuptools release like #6212

So where does that leave setuptools.build_meta_legacy? Is the proposal now to require projects that need "import adjacent package" functionality to explicitly specify it in pyproject.toml? If so, I'd strongly suggest that needs documenting somewhere, along with the fact that importing adjacent packages without that specification is deprecated and will be removed in pip 19.X (we need to agree what X is). We can't make that a programmatic deprecation (I don't think), but that's all the more reason to document it clearly, so we don't get accused (again) of removing functionality without sufficient notice.

Edit: But otherwise, thanks for the new PR and summary.

Edit 2: I see you closed the setuptools.build_meta_legacy proposal. I'm not sure I like that, as it loses us the opportunity to say now what our longer-term plan is, so extending any deprecation period, as I mentioned above...

@pfmoore No, it means that getting back to the desired behaviour should be covered by a new issue associated with removing the #6163 workaround (probably by updating to a setuptools that provides setuptools.build_meta_legacy).

Edit: for now I've just reopened #6212, but retitled it to make it clear that I don't think we should wait for the whole build_meta_legacy discussion to be resolved before fixing currently failing install commands.

My proposal was actually that the opt-in for PEP 517 is specifying build-system.build-backend not the existence of build-system at all, and that between now and the 19.1 release, setuptools would add build_meta_legacy and pip would use it as the default backend.

I agree that in 19.1, probably if pip can't find setuptools.build_meta_legacy, it should fall back to the old code path. That will give us minimal breaking changes while opting in the maximum number of people.

My proposal was actually that the opt-in for PEP 517 is specifying build-system.build-backend

... which can be handled by simply setting use_pep517 = False in the fallback case (rather than setting it to has_pyproject, which is what we do now).

in 19.1, probably if pip can't find setuptools.build_meta_legacy, it should fall back to the old code path

I don't think this is worth doing. We'll be specifying a sufficiently recent setuptools version that we are sure we'll be getting the legacy backend, and there's no need to cater for the possibility that setuptools remove that backend in a future version (or rather, if they do, we simply blame them for the resulting problems ;-))

Note: the use_pep517 = False default is what #6229 started with, but it caused failures in the PEP 518 tests.

  1. The "build-system.requires is set" case has to use build isolation, so that it can install the requested dependencies without affecting the parent environment. The easiest way to do that given the current code structure is to set use_pep517 = True in this case, so I did that, and got the test case passing again.
  2. A missing [build-system] table indicates pyproject.toml is just being used to store settings in the [tools] table, so that has to result in use_pep517 = False to be an effective workaround for the originally reported issue, so I marked this test case as an expected failure.
  3. That leaves the "empty [build-system] table" case, which I think can reasonably be resolved in either direction. However, as I don't think this particular case is going to come up very often in practice (why would anyone go to the trouble of adding a build-system table without setting either requires or build-backend?), I chose to resolve it in a way that meant the previously defined PEP 518 test case passed, rather than by adding a second expected failure marker.

To resolve 3 in the other direction, we'd need to change the line:

```python
use_pep517 = build_system is not None
```

to instead be:

```python
use_pep517 = build_system is not None and build_system.get('requires', None)
```

That way only a non-empty requires would opt in to the PEP 517 build isolation (which would also mean adding a 4th test case, since empty and non-empty build-system.requires fields would now behave differently).
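The decision table being debated above can be sketched as a hypothetical helper (this is not pip's actual code), with case 3 resolved "the other direction":

```python
def should_use_pep517(pyproject):
    """Decide PEP 517 opt-in from a parsed pyproject.toml dict
    (None means the file is absent). Illustrative sketch only."""
    if pyproject is None:
        return False  # no pyproject.toml at all: legacy code path
    build_system = pyproject.get("build-system")
    if build_system is None:
        return False  # file only carries [tool] settings (case 2)
    if build_system.get("build-backend"):
        return True   # explicit backend: unambiguous opt-in
    # Case 3, resolved the other way: an empty [build-system] table (or one
    # with an empty requires list) does not opt in; a non-empty requires does.
    return bool(build_system.get("requires"))
```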

Sorry for the uninvited contribution here, but I can’t help noticing that this all feels like a pretty elaborate way to avoid having the cwd on sys.path and ultimately will leave things broken that used to work, which feels quite disruptive from a UX perspective.

A nontrivial number of users and packages are impacted by this. At least some of those have a [build-system] section defined and _also_ rely on the old behavior, and will therefore remain broken for anyone who has those versions pinned.

@techalchemy Yeah, the original assumption in pip was that setuptools.build_meta worked the same way from a sys.path perspective that direct invocation of setup.py does, and that assumption proved incorrect. Once a setuptools release containing the setuptools.build_meta_legacy backend defined in https://github.com/pypa/setuptools/pull/1652 is available then #6212 can be completed, and that would be the long term resolution. However, there's no current ETA for such a release, so we need to continue exploring pip-only resolutions to get the source directory back on sys.path for packages that aren't explicitly "PEP 517 native".

In #6229 I implemented @pganssle's suggestion of simply postponing the adoption of "PEP 517 by default", but it looks like that causes more problems than it solves due to another change in pip 19: the processing of build-system.requires is now tied to the processing of build-system.build-backend, so --no-use-pep517 disables PEP 518 as well (which causes test suite failures, and may also cause real world install failures if the build dependencies aren't preinstalled).

In #6210, I instead locally patch pep517 to support injecting sys.path[0] into the subprocess, effectively doing the same thing that setuptools.build_meta_legacy is expected to do in a future setuptools release. This appears to behave the way we want - build-system.requires is still processed, and the source directory is sys.path[0] when setup.py gets executed. It's also very similar to what I proposed in https://discuss.python.org/t/pep-517-backend-bootstrapping/789/29?u=ncoghlan and @takluyver has drafted in https://github.com/pypa/pep517/pull/42/ to handle self-bootstrapping backends in a way that's compatible with --no-binary :all:

Sorry for being the AWOL RM. Things happened that I hadn't anticipated.

I obviously prefer the setuptools-side fix for this but #6210 is cool with me too -- as a short term fix.

I agree with @techalchemy and @pradyunsg - I think the setuptools-side fix is the correct approach here. While I appreciate the work on trying to find a fast fix within pip, wouldn't such time be better spent expediting a new release of setuptools with build_meta_legacy? I haven't been watching what's happening on setuptools, so I'm not at all clear why there's a problem with releasing a quick fix in setuptools (the setuptools release cycles are way faster than pip's).

I'm OK with a short-term pip-side fix, but I'd like to clarify when we can expect a setuptools fix.

Hi all!

I have the same problem:

```
Collecting pyinstaller==3.4
  Using cached https://files.pythonhosted.org/packages/
a0cc/PyInstaller-3.4.tar.gz
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  Complete output from command "c:\program files (x86)\
gram files (x86)\microsoft visual studio\shared\python3
requires_for_build_wheel C:\Users\ASUS\AppData\Local\Te
  Traceback (most recent call last):
    File "c:\program files (x86)\microsoft visual studi
process.py", line 207, in <module>
      main()
    File "c:\program files (x86)\microsoft visual studi
process.py", line 197, in main
      json_out['return_val'] = hook(**hook_input['kwarg
    File "c:\program files (x86)\microsoft visual studi
process.py", line 54, in get_requires_for_build_wheel
      return hook(config_settings)
    File "C:\Users\ASUS\AppData\Local\Temp\pip-build-en
, line 115, in get_requires_for_build_wheel
      return _get_build_requires(config_settings, requi
    File "C:\Users\ASUS\AppData\Local\Temp\pip-build-en
, line 101, in _get_build_requires
      _run_setup()
    File "C:\Users\ASUS\AppData\Local\Temp\pip-build-en
, line 85, in _run_setup
      exec(compile(code, __file__, 'exec'), locals())
    File "setup.py", line 20, in <module>
      from PyInstaller import __version__ as version, H
  ModuleNotFoundError: No module named 'PyInstaller'
```

Does anyone know if the problem is solved? Or how to solve it temporarily?

Thanks all!

@jce94, use pip<19 for now.

@altendky thanks for the info!

I was unable to resolve this issue with the suggested workarounds while working with pipenv. Freezing pip to 18.1 in the Pipfile seems to have no effect, as pipenv keeps forcing the latest pip version. I can manually set pip to 18.1, but when I recreate the pipenv virtual environment, pipenv upgrades to the latest pip no matter what... Any recommendations to make it stick?

@altendky Sadly, using a predefined version of pip is not possible at this time for pipenv (and also for poetry, I think) users. Both use the latest version. So I'm guessing many people are stuck with broken pipelines for now.

What's even more weird, it's not happening consistently. I've rerun an Appveyor job, the first one passed, the second one failed although they are strictly identical

In case anyone's wondering about the timeline, I expect we'll be able to have a fix ready by the end of this week or early next week and make a subsequent bug-fix release of pip soon after that.

The new version of setuptools, version 40.8.0 is now available with the build_meta:__legacy__ backend.

Thanks! And is this something we should point PyInstaller to use? They were
quite unhappy with the changes... Any documentation or PEP I can present
them with to support the changes?

On Tue, Feb 5, 2019, 10:24 Paul Ganssle <[email protected]> wrote:

The new version of setuptools, version 40.8.0 is now available with the
build_meta:__legacy__ backend.


@AlmogCohen No, this is not something you should use directly, it is only for use by PEP 517 front-ends. The next step is for pip to start using build_meta:__legacy__ as its default backend. This is not actionable from an end-user's perspective.
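For context on the `build_meta:__legacy__` spelling: a PEP 517 frontend resolves that backend string by importing the module before the colon and then walking the dotted attribute path after it. A minimal resolver sketch (demonstrated with a stdlib object, since setuptools may not be installed; not pip's actual implementation):

```python
import importlib
import os

def resolve_backend(spec):
    """Resolve a PEP 517 backend spec such as 'setuptools.build_meta:__legacy__':
    import the module named before ':', then follow any dotted attributes."""
    module_path, _, obj_path = spec.partition(":")
    backend = importlib.import_module(module_path)
    for attr in obj_path.split(".") if obj_path else []:
        backend = getattr(backend, attr)
    return backend

# Self-contained demonstration using a stdlib path instead of setuptools.
join_func = resolve_backend("os.path:join")
```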

Any ETA on the new release that will integrate the fix ?

In a few hours. :)

See the pinned issue.

pip 19.0.2 has been released with a fix for this.

I am unable to install pyinstaller with the latest version of pip, even when using --no-use-pep517

```
pip install pyinstaller --no-cache-dir --no-use-pep517
Collecting pyinstaller
  Downloading https://files.pythonhosted.org/packages/03/32/0e0de593f129bf1d1e77eed562496d154ef4460fd5cecfd78612ef39a0cc/PyInstaller-3.4.tar.gz (3.5MB)
     |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 3.5MB 273kB/s
  Installing build dependencies ... done
    ERROR: Complete output from command python setup.py egg_info:
    ERROR: Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\Users\Raffaele\AppData\Local\Temp\pip-install-5e9w2p2c\pyinstaller\setup.py", line 20, in <module>
        from PyInstaller import __version__ as version, HOMEPATH, PLATFORM
    ModuleNotFoundError: No module named 'PyInstaller'
    ----------------------------------------
ERROR: Command "python setup.py egg_info" failed with error code 1 in C:\Users\Raffaele\AppData\Local\Temp\pip-install-5e9w2p2c\pyinstaller\
```

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
