Pip: Un-deprecate `--process-dependency-links` until an alternative is implemented

Created on 19 Dec 2016  ·  72 Comments  ·  Source: pypa/pip

  • Pip version: 8.1.1
  • Python version: 3.5.2
  • Operating System: Ubuntu 16.04

Description:

The --process-dependency-links flag was re-added to pip a while ago because there are a number of valid use cases for it: https://github.com/pypa/pip/pull/1955

For this reason, it should probably be un-deprecated until an implementation of PEP 508/440 is in pip. The dependency_links argument to setuptools's setup is also still documented and not deprecated: http://setuptools.readthedocs.io/en/latest/setuptools.html#dependencies-that-aren-t-in-pypi

What I've run:

$ pip install request --process-dependency-links
Collecting request
  Downloading request-0.0.12.tar.gz
  DEPRECATION: Dependency Links processing has been deprecated and will be removed in a future release.
... 


All 72 comments

Agreed. Current behaviour of pip for dependency-links really sucks.

#4295

#3610

[Wrote in #3939, moving my comment here]

So, here's the main issue: If I don't want to use a requirements.txt file (because, say, I want declarative dependencies all specified in setup.cfg), how am I supposed to specify a URL as dependency?

This doesn't work:

[options]
install_requires = 
  requests==2.18.4
  https://github.com/example/example_lib/archive/master.zip

I also think dependency links are weird and fine to drop, but canonically how is this use case served if not with those?

Another issue is that dependency_links apparently cannot be specified in setup.cfg, only as setup.py arguments. If they are to be undeprecated, that should be fixed.
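For reference, once PEP 508 URL support landed (pip 18.1+, setuptools with direct-reference support), this use case can be expressed in setup.cfg by naming the requirement and attaching the URL with @. A sketch, with example_lib and the URL as placeholders:

```ini
[options]
install_requires =
    requests==2.18.4
    example_lib @ https://github.com/example/example_lib/archive/master.zip
```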

Is there any news regarding this? Is there any alternative in progress? dependency_links seems to be the only option for internal/private distribution of libraries.

Instead of deprecating it, it should be fixed. A lot of people get confused because they forget to put both the name-version of the library in install_requires and the dependency link that contains the tarball together with the egg. As an alternative, I would love to see what @jleclanche has suggested: a simple link to the tarball in install_requires (specifying the egg could be optional).

I don't think we are alone: https://stackoverflow.com/questions/3472430/how-can-i-make-setuptools-install-a-package-thats-not-on-pypi

@robertour tools like devpi have supported a cleaner approach for years now

@dstufft btw - I would like to argue that the existence of devpi demonstrates a cleaner solution to the problem, making dependency links unnecessary

IMHO index inheritance + whitelisting/blacklisting is more consistent and more comprehensible at an ops level

@RonnyPfannschmidt, thanks for the suggestion; it seems I am missing something important (discussions about this deprecation have been going on for too long), but generally speaking I cannot agree more with this comment from reddit:

devpi could be a solution, but bringing another service into the game that needs maintenance, and having to wait for some administrator to set it up, will not get me working very soon. Also, we would still need to build packages and then put them onto devpi (or does devpi have a feature to track git repositories?)

Ideally, I would like an alternative that won't take me more than adding a file in my library, or few lines in the setup.py.

I basically don't understand the underlying decision to remove it. I can see how it can lead to bad practices and insecurities, but as long as it is kept for internal use, it should not be a major issue.

#4175 means that PEP 508 URL support will be there in pip 10. I think this can be closed.

Thoughts @pfmoore @dstufft?

I thought https://github.com/pypa/pip/pull/4175/commits/86b07792e7ae762beeaeb471ab9db87e31bc285d meant that the @ syntax cannot be used for dependencies, so this means #4175 isn't a solution for this issue. The comments over there were that we shouldn't implement @ support for dependencies until PyPI was able to block uploads that used it (because we don't want installing something from PyPI to be able to grab stuff from arbitrary URLs, presumably).

It shouldn't be deprecated until there is an alternative in place, i.e. PEP 508 support. Until then, too many people are using it.

What's the proper workflow for this?

Let's say there's a library that I need to add a feature to. I fork it on GitHub. The feature isn't going to be merged any time soon, so I have a tool that I set to use my GitHub fork instead of PyPI.

Now all of my users have to add --process-dependency-links when installing my tool with pip -- a flag that is deprecated and was even removed at one point?

Is there some option in setup.py that I'm missing or is there really no way around this? It seems like the only viable way to do this is to push up a forked PyPI package. That is just going to add to users' confusion once your pull request does get merged, anyway.

Let's come around to this.

--process-dependency-links does not really have an alternative today. So, I'll suggest that we go ahead and un-deprecate it.

However, the flag does mean that pip would reach out to arbitrary URLs -- it should have a warning attached to it about this.

@pradyunsg unless there is an actual plan to kill it off for good, I strongly suggest letting that zombie rest or be removed

I don't think the people that actually need it have any incentive to fix the situation unless pip says "no, I won't take on your debt"

As a pip maintainer, my concern is not allowing pip to be a vector for exploits. Pip assumes PyPI as the default index, so there's implicit trust in PyPI there. So my first questions are:

  1. Do packages exist on PyPI that have dependency_links metadata?
  2. Could PyPI be modified to refuse upload of projects with dependency_links (or does it do that already)?
  3. Alternatively, could pip simply refuse to process dependency links (regardless of whether the option is set) for packages downloaded from PyPI?

If we could be sure that there was no way for pip users to be bitten by packages on PyPI with malicious dependency_links, I'd be less worried. Users opting to use a custom index have to decide for themselves whether to trust it. (I'd still prefer to retain the flag even then - this is potentially a remote exploit and should always be explicitly opt-in).

So, with the following changes:

  1. Pip will never process dependency links from PyPI (either explicitly, or just because we know they aren't there)
  2. The --process-dependency-links flag is un-deprecated, but only works for custom indexes.

I'd be OK with accepting that as the status quo. People needing a dependency link style feature can then thrash out amongst themselves how (if at all) they want to move forward.

If someone wanted to, we could put all of that in a standard defining dependency link metadata, and the current deprecation would then become "the current behaviour will be dropped in favour of the behaviour specified in the standard". But from a pip-centric point of view, I'd rather just do it - let the people who need dependency links take on the work of standardising (it's to their benefit after all, as they then have a route - getting a standard change - to modifying pip's behaviour with full community consensus).

Basically we have a feature that is meant to replace dependency links, but pip hasn't started respecting it yet, because we don't yet have a way to block uploads to PyPI that use it.

Do you mean #4175? Which is blocked because of 86b0779? Maybe we could modify https://github.com/pypa/pip/blob/master/src/pip/_internal/req/req_install.py#L167 to only give the error if comes_from is PyPI?

Then we could change the deprecation warning for --process-dependency-links to say that users should switch to the @ syntax, and drop --process-dependency-links after a reasonable deprecation period (we'd have to start the clock ticking from the point where @ syntax becomes available).

@pfmoore Yea that is exactly it, and that seems like a possible path forward.

OK. If no-one else gets to it first, I'll see about writing a PR.

From what I gather, the user issue here is that they need to pass an option that explicitly says "hey, I'm deprecated" for functionality that doesn't have an alternative.

I'm in favor of dropping dependency_links support entirely over a deprecation period, and in the meantime adding PEP 508 URL dependency support, which I think pip should still be loud about when used and which should still be opt-in.

That would mean:

  • put the ability to process PEP 508 url-deps behind a flag

    • aside: should name the flag such that a security concerned user would check the docs

  • start refusing to install PEP 508 url-deps when they come from a PyPI package
  • dependency-links stay deprecated, but we have a date when we'll drop them

    • honestly, I don't want to give this a deprecation period any longer than a single release printing the alternative.

I typed that yesterday but didn't send it. I now realize that it's basically the same thing as @pfmoore's last comment.

yay! :)

Should URL deps require an opt-in flag? PEP 508 doesn't say so, and the current implementation doesn't do that. I can see the logic, but I don't trust my judgement on security vs ease of use questions.

Also, I'd like to see some clear documentation on how projects should switch from dependency links to @ syntax before we start pushing the switch. It's hard enough already to reach people who will be affected by deprecations, and warnings that they can't work out how to address won't help. Ideally, the warning should include a pointer to a "how to change over" document, IMO.
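As a sketch of what such a change-over document might show, here is the old dependency-links form next to the PEP 508 direct-reference form. The package foo and the GitHub URLs are placeholders, and these are illustrative setup() keyword arguments, not a complete setup.py:

```python
# Old style (deprecated): the name-version pair goes in install_requires,
# and the URL (with an #egg fragment) goes in dependency_links.
old_style = dict(
    install_requires=["foo==9999"],
    dependency_links=[
        "https://github.com/example/foo/tarball/master#egg=foo-9999",
    ],
)

# New style (PEP 508 direct reference): the URL is attached to the
# requirement itself with "@"; no dependency_links entry is needed.
new_style = dict(
    install_requires=["foo @ https://github.com/example/foo/tarball/master"],
)
```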

I don't trust my judgement on security vs ease of use questions.

I'm inclined toward: "be secure by default" and opt-in for less secure behavior.

clear documentation on how projects should switch from dependency links to @ syntax before we start pushing the switch

Sounds good to me.

I'm inclined toward: "be secure by default" and opt-in for less secure behavior.

As someone who sits behind hard-to-deal-with firewall and security constraints at my work, I'm very aware that we don't have much understanding of the sort of environments where this feature is likely to be most useful (internal development, closed source, tightly controlled workflows and network layouts, etc.). So my inclination is to be secure for the default workflow (PyPI) but not put additional roadblocks in the way of people with use cases that are clearly important to their workflow, but where we don't properly understand the trade-offs they have to make.

To my mind, "if the package is coming from a server that you explicitly opted into using" is sufficient opt in to allow URL links.

See? It's not that I don't have opinions, just that I'm not sure my biases aren't showing :smile:

not put additional roadblocks in the way of people with use cases that are clearly important to their workflow, but where we don't properly understand the trade-offs they have to make.

Fair enough. If some users could provide insights into their workflow, that would be nice.

"if the package is coming from a server that you explicitly opted into using" is sufficient opt in to allow URL links.

I'm not sure here. :/

See? It's not that I don't have opinions, just that I'm not sure my biases aren't showing 😄

Hehe.

Fair enough. If some users could provide insights into their workflow, that would be nice.

To my mind, "if the package is coming from a server that you explicitly opted into using" is sufficient opt in to allow URL links.

Just allowing sources from GitHub would solve 99% of the issue for me. Upstream packages have bugs or missing features. I fork and fix them, issue a PR, then wait for a possibly really long time to get them merged and then released on PyPI. In the meantime my packages rely on the fixes in those packages.

This can be opt-in, as long as it can be done in one line.

It's not really a security issue to use direct links (assuming HTTPS/etc), it's just an availability one. Since we're disallowing them from PyPI then I'd say that's good enough and we don't need another flag.

At my work we would fall in some of the use cases @pfmoore mentioned, namely, internal development before a package is open sourced, closed source, etc. (always internal packages depending on other internal packages, all of them in a version control server).
Although, I also see the possible use case of someone referring to a file system location...
Would it make sense to provide a list of whitelisted hosts/locations? The name of the flag should IMO, as @pradyunsg suggested, be descriptive enough to make the user aware of the risks involved. Maybe --whitelisted-host?

@masdeseiscaracteres I think you may have misunderstood - I was suggesting that if a package was retrieved from PyPI, we would fail if any of its dependencies were specified via an @ reference. But there shouldn't be any such cases on PyPI anyway (we're expecting to reject them at some point, we just haven't been able to set that up yet). All other uses of @ references would be OK.

Looks like we're going to go ahead with a PR that does what's described by @pfmoore in https://github.com/pypa/pip/issues/4187#issuecomment-389846189.

PRs welcome. :)

Note that I haven't had time to get to this - not because the change is difficult (as noted, it appears to simply be a check against comes_from) but rather because I don't know how to provoke the error myself (and more importantly, I don't know how to write a test that does so). If anyone can provide an example of such a test case that would be immensely helpful.

If anyone can provide an example of such a test case that would be immensely helpful.

There's an existing test test_install_pep508_with_url_in_install_requires that demonstrates this.

As for erroring out when installing from PyPI, I don't see a better option than actually uploading something on PyPI that has a URL requirement. 😞 I uploaded a package on PyPI for this purpose. https://pypi.org/project/pep-508-url-deps/


Another thing is -- comes_from is not a URL or path; it's a string along the lines of 'box==0.1.0 from file:///private/tmp/box'. Whoever's looking to fix this issue now has to figure out a better way to error out, so that we have information about where a package originates. :)

@pfmoore this issue is near and dear to my heart 😄 Does @pradyunsg 's upload give you what you need, and are you still planning to tackle this? If not, I could take a swing at it.

@bstrdsmkr Nope, and as @pradyunsg says, it's not as simple as I thought, as comes_from isn't the source URL. So I don't know when I'll have time now (I have no personal use for this feature, so it's not high on my priority list).

As noted above, PRs welcome :-)

I'll add that the uploaded package doesn't help in any way for implementation, it's only helpful for testing the functionality.

My understanding is that the desired fix is something like:

if req.url and is_like(req.url, PYPI.url):
    raise

in https://github.com/pypa/pip/blob/master/src/pip/_internal/req/req_install.py#L172
where is_like() returns True if the urls root at the same domain. Is that correct?
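A rough sketch of such an is_like() helper, comparing only the host part of the two URLs; is_like is a hypothetical name from the snippet above, not an existing pip function, and a real implementation might also need to handle mirrors, ports, and scheme differences:

```python
from urllib.parse import urlsplit

def is_like(url, index_url):
    """Return True when both URLs point at the same host (case-insensitive)."""
    return urlsplit(url).netloc.lower() == urlsplit(index_url).netloc.lower()
```

For example, is_like("https://pypi.org/simple/requests/", "https://pypi.org/simple") would be True, while a URL rooted at any other host would not match.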

Yeah. I think so. That'll be the code change.

And, we'll need to add/update tests and a NEWS entry.

I also think that this change is important enough to warrant changing the deprecation message to provide a link to the docs, plus an additional section in the user guide telling people how to switch to the alternative.

As a pip maintainer, my concern is not allowing pip to be a vector for exploits. Pip assumes PyPI as the default index, so there's implicit trust in PyPI there.

No, there is explicit trust. And adding safeguards against using external sources in dependencies IMHO won't improve the situation: it's more convenient to hide malware in pypi than in a publicly available VCS.

IMHO a better approach would be to use developer's VCSs as primary sources and keep pypy as a registry pointing to them and a caching proxy with some strong cryptographical proof that the content in VCS is identical to the one got from pypy. I mean

0 registration

dev -- public key --> pypa

1 uploading

setuptools -- git+https:/.... --> pypa
pypi --> Tor --> give me that commit --> vcs
pypi <-- Tor <-- here -- vcs
pypi checks the signature matches the dev

2 idle:

pypi --> Tor --> give me that commit --> vcs
pypi <-- Tor <-- here -- vcs
pypi checks it

2 retrieving

flips a coin
if coin == 1:
  pip -- give me package git+https:/... --> pypi
  pip <-- signature || content -- pypi
  pip -- give me the signature --> vcs
  pip <-- signature -- vcs
else:
  pip -- give signature of package git+https:/... --> pypi
  pip <- signature -- pypi
  pip --> Tor --> give me that commit --> vcs
  pip <-- Tor <-- here -- vcs


pip checks if the signature matches the public key and signature from pypa
pip -- give me public key --> keyserver
pip <-- PK -- keyserver
pip checks signature given by VCS against the sdist given by pypy
pip caches public key and repo location

1 the sources installed match the ones in VCS because of signature
2 the author matches too
3 pypa can cheat with new users, but cheating may be detected by old ones
4 author can cheat by showing in web interface branch other than given to pypy and send to pypy another branch. It won't work well if we make the branch a mandatory part of URI.

@KOLANICH I think you mean PyPI (Python Packaging Index) when you say pypy/pypa. PyPy is an alternative implementation of Python. PyPA (Python Packaging Authority) is a group of volunteers who maintain major projects in the Python Packaging space. Please, do get the acronyms correct or refrain from using them.

You're suggesting fundamentally changing the design of an existing well established service. If you wish to have such major changes, you're welcome to provide at least a (feasible) transition plan and a POC implementation, for managing/changing the architecture to allow transition/minimize breakage of existing workflows. Note that depending on external hosting is something that was explicitly removed from PyPI in the past in PEP 470, however that was fairly different case from what you're suggesting.

PyPI is maintained by volunteers, running on donated/sponsored infrastructure. You're suggesting that it connect through Tor, another volunteer-maintained/run service. Both of these are major projects and they have a cost associated with keeping them running, even though it is not directly borne by, or visible to, their users.


Nonetheless, this is not the right place for this discussion. If you do wish to propose a redesign of PyPI, I suggest you get a discussion started over at https://github.com/pypa/warehouse.

Thank you for the suggestions.

See #5571 -- the PR for this -- for why it has been pushed to a later release.

The warning in the pip installation log gives this URL, but there is no solution to the problem, neither here nor in the other tickets mentioned.

Moreover, this is even more confusing: what do you mean when you say PyPI? Do you mean any server that implements the PyPI interface (e.g. Artifactory), or specifically pypi.org?

Now, obviously, someone who wants to support both installing a package through setuptools (a.k.a. running setup.py install) and through pip install will now be in a pickle. Specifying dependency links is the only way for someone in this situation to deal with multiple interdependent packages.

Now, correct me if I'm wrong, but so far it seems like if you take this away based on whatever decision PyPA has made about uploads to their servers, you basically make pip useless for Artifactory users / companies who have private repositories with interdependent packages.

Please tell me that I'm wrong and missed something in this whole story (I was following it on and off for a while). I read PEP 508, but it really makes no difference in this regard; at least, I cannot see how it would make things better or worse.

@wvxvw-traiana I think you missed PR #5571
Before that PR (and in the current release -- 18.0), pip will refuse to install any dependency specified via PEP 508 URL syntax.

After that PR (already merged so should be in 18.1+) pip will only refuse such dependencies if the package that depends on them comes from pypi.org.

If you install a package from your private repo which depends on stuff from PyPI, obviously that's fine.
If you install a package from pypi.org which depends on stuff from your private repo, that fails.
If you install a package from your private repo that depends on stuff in your private repo, that's fine.

Hope that helps clear things up

@bstrdsmkr Is it same-origin or is PyPI a special case? I.e.

If you install a package from your private repo which depends on stuff from a different private repo.

To add some further context on the reasons behind this:

  1. Direct URL links allow a package to trigger downloading and running arbitrary code from anywhere on the internet. That's OK if you're taking responsibility for the package doing that, so we allow direct URL links on that understanding.
  2. People expect to install from PyPI (specifically PyPI, not package indexes in general) and trust the packages they download from there. To prevent a compromise of a PyPI package exposing a large number of Python users, we do not allow packages coming from PyPI to depend on direct URL links (because it imposes too much of a burden on our users to insist that they have to audit everything from PyPI that they use for links to external code).
  3. Dependency links are a setuptools-specific mechanism, and are processed by setuptools' internal machinery, not by pip. So unlike direct URL links, we don't have any control over what they do. That's why we deprecated them in favour of the standard direct URL form, which we do handle ourselves.

someone who wants to support both installing a package through setuptools (a.k.a. running setup.py install) and through using pip install will be now in a pickle.

If true, that's because setuptools hasn't implemented support for direct URL links, which is an agreed standard. Feel free to raise that with them.

If you install a package from your private repo which depends on stuff from a different private repo.

That works fine. PyPI isn't involved.

OK, so, on one hand, I'm happy that this doesn't affect me.

On the other hand, this "fix" seems, how do I put it... too easy to work around.

echo "not-pypi 151.101.61.63" >> /etc/hosts
pip install --index-url not-pypi

Not my business, really, but seems like a really surface-level approach. (The other attack vector was mentioned in other comments, where you can simply download whatever you want in setup.py).

It's not designed to be hard for users to work around. It's a means of offering a (limited) solution to the fact that PyPI doesn't yet have a way to block people from uploading packages with direct URL dependencies. See https://github.com/pypa/pip/pull/4175#issuecomment-266305694 for some context.

@dpwrussell pypi.org is a special case. Any private repo to any private repo works fine after the change.

@wvxvw-traiana it isn't meant to prevent you from doing that yourself. It is meant to keep someone else from doing that to you when you think you're just installing a package from pypi.org

Unrelated to the current discussion, I'm reopening this since we haven't actually updated the deprecation warning for this.

#5726 adds language suggesting the use of PEP 508 URLs, which IMO is the last bit needed for this.

Alrighty then. Let me summarize:

  • dependency links are still deprecated and are now scheduled for removal in pip 19.0.
  • standard backed replacement for it is PEP 508 URL dependencies
  • When installing from PyPI, if a package has PEP 508 URL dependencies, it will result in pip aborting the installation.

@pfmoore has elaborated the reasons for these decisions here: https://github.com/pypa/pip/issues/4187#issuecomment-415067034

@pradyunsg When will PEP 508 URL dependencies be allowed in install_requires in the setup.py? Is there a date set?

In the next release of pip -- that's 18.1, scheduled for October. You can read more about pip's 3 month release cycle at https://pip.pypa.io/en/latest/development/release-process/. :)

https://github.com/pypa/wheel/issues/249 needs to be addressed before PEP 508 URL dependencies become a viable alternative.

pypa/wheel#249 needs to be addressed before PEP 508 URL dependencies become a viable alternative.

This has been addressed.

In the pip 18.1 release notes it says

Allow PEP 508 URL requirements to be used as dependencies.

As a security measure, pip will raise an exception when installing packages from PyPI if those packages depend on packages not also hosted on PyPI. In the future, PyPI will block uploading packages with such external URL dependencies directly. (#4187)

So this basically means dependencies can be specified using URLs, but if those are not PyPI URLs the package cannot be installed using pip? Maybe I am completely getting it wrong, but how are the URL dependencies supposed to be used then?

@JarnoRFB packages that are hosted at PyPI cannot have url dependencies.

Packages that are NOT hosted at PyPI can have url dependencies. If you install the package directly from github (for example) the url dependencies will be resolved and installed. An example of such an installation:

pip install git+https://github.com/bstrdsmkr/some_package.git

Essentially, if you install from a url, it can depend on urls, otherwise it can't. And just for clarity, it can also have both url and non-url dependencies
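For illustration, such a mixed dependency list might look like the following; some_package and the GitHub URL are placeholders, not real projects:

```python
# A package NOT hosted on PyPI can mix PyPI and PEP 508 URL dependencies.
install_requires = [
    "requests>=2.18",  # a normal requirement, resolved from the index
    "some_package @ git+https://github.com/example/some_package.git",  # direct URL
]
```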

A minor addition:

...if you install from a url, it can depend on urls

...if you install from a URL (VCS or otherwise), a local file, or a package index that is not PyPI, it can...

So is there a version of pip which processes install_requires as per the above descriptions? I can't work out from the tags above what the final state was, and the current pip documentation points to the install_requires documentation in setuptools, which still says to use dependency_links.

Can't speak to the docs myself, but this "relaxation" to allow non-PyPI packages to install dependencies from urls was released in pip 18.0

AFAIK, URL dependencies in install_requires are supported since pip 18.1:

Allow PEP 508 URL requirements to be used as dependencies.

Source: release notes

Ugh, typo on my part -- @pietrodn is obviously correct

I want to leave a small but successful example here for those who first came upon this issue terrified (like me) about what to do without --process-dependency-links. Using this workflow, my users have been able to pip install from GitHub or from local sources (not PyPI), to pip install -r requirements.txt, to python setup.py install, and to work on Travis-CI, with pip version 18+, so I think it covers a lot of bases. I hope it is useful to someone, and my apologies if this seems off-topic to others:

In your requirements.txt file, assuming you want people to be able to depend on the GitHub dev branch of package "foo", e.g.:

scipy>=0.17
matplotlib>=2.0
foo @ git+https://github.com/foo-organization/foo@dev#egg=foo-9999

In your setup.py file:

import os, sys
from setuptools import setup, find_packages

def read_requirements():
    """Parse requirements from requirements.txt."""
    reqs_path = os.path.join('.', 'requirements.txt')
    with open(reqs_path, 'r') as f:
        requirements = [line.rstrip() for line in f]
    return requirements

setup(
    ..., # Other stuff here
    install_requires=read_requirements(),
    )

Some would say that conflating install_requires and requirements.txt is ill-advised, and for a released version, I agree, but I think this works well for development.
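A slightly more defensive variant of that helper, which also skips blank lines and # comments so they don't end up in install_requires (a sketch, not the original poster's exact code):

```python
def read_requirements(path="requirements.txt"):
    """Parse requirements from a file, skipping blank lines and # comments."""
    with open(path) as f:
        return [
            line.strip()
            for line in f
            if line.strip() and not line.lstrip().startswith("#")
        ]
```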

Ah, neat. So if say packages "A" and "B" both use this method, and package "A" lists "B" as a dependency, when you go to install "A" it actually ends up processing the requirements.txt for "B" (which it doesn't normally) - is that right?

I also read this morning that install_requires itself was kinda bad because those installs were done by setuptools, which means any pip options were ignored, but I don't know if that info is outdated ...

I also read this morning that install_requires itself was kinda bad because those installs were done by setuptools, which means any pip options were ignored, but I don't know if that info is outdated ...

You're confusing install_requires with setup_requires.

Ah, neat. So if say packages "A" and "B" both use this method, and package "A" lists "B" as a dependency, when you go to install "A" it actually ends up processing the requirements.txt for "B" (which it doesn't normally) - is that right?

@stevebrasier Yes, I think it would, which might be a problem if you've pinned different versions of other required packages in A than in B.

Hi guys, I'd just like to note that the deprecation pathway in this case was way too short. I know that dependency links had been marked as deprecated for quite some time, but PEP 508 URLs, which can replace them, weren't implemented until 18.1. As a result there was only a 3-month window to switch from dependency links to URL requirements, which is a very short time for large projects.

@rgerkin Hi, I am trying to follow your instructions to no avail,

Searching for PACKAGE@ git+ssh://git@bitbucket.org:OWNER/PACKAGE.git@BRANCH
Reading https://pypi.org/simple/PACKAGE/
Couldn't find index page for 'PACKAGE' (maybe misspelled?)
Scanning index of all packages (this may take a while)
Reading https://pypi.org/simple/

PACKAGE@ git+ssh://git@bitbucket.org:OWNER/PACKAGE.git@BRANCH, this is in install_requires.

Would you have an idea why I am getting the above?

@KevinMars There are a few differences between my example and what you have, including the use of git+ssh, Bitbucket, a .git suffix, a named branch, and no version tag. Maybe one or more of those things is leading pip to search on PyPI instead of at your URL. What version of pip are you using?

To remark something that I've found: installing the package with python setup.py install still requires declaring external dependencies in dependency_links.

In your setup.py file:

import os, sys
from setuptools import setup, find_packages

def read_requirements():
    """Parse requirements from requirements.txt."""
    reqs_path = os.path.join('.', 'requirements.txt')
    with open(reqs_path, 'r') as f:
        requirements = [line.rstrip() for line in f]
    return requirements

setup(
    ..., # Other stuff here
    install_requires=read_requirements(),
    )

@rgerkin Thanks for sharing this solution. But what if I'm using pbr to set up my Python package? How can I adapt this to fit pbr?

@KevinMars I have the same exact issue. Did you ever figure out the fix? I am trying to require a specific branch of a private bitbucket repo over SSH.

I just realized --process-dependency-links doesn't exist anymore. I'm grateful for all the work the community does. Trying to justify the decision in never-ending discussions and a maze of issue closing and redirection was the chosen solution, but I still think leaving this option in would have harmed nobody.
