Pipenv: Updating only one locked dependency

Created on 24 Oct 2017  ·  82 Comments  ·  Source: pypa/pipenv

Sometimes I'm doing a PR and I want to update a specific dependency but I don't want to deal with updates of all my dependencies (aiohttp, flake8, etc…). If any breaking change was introduced in those dependencies, I want to deal with it in another PR.

As far as I know, the only way to do that would be to pin all the dependencies that I don't want to update in the Pipfile. But I find that this defeats the purpose of Pipenv in the first place :).

So my feature request would be to be able to do something like:

$ pipenv lock --only my-awesome-dep

That would generate a Pipfile.lock with updates for only my-awesome-dep and its dependencies.

I can probably make a PR for that, but I would like to get some feedback first.


All 82 comments

That could also be useful for pipenv install, as sometimes I want to install a new dependency without updating others.

There's a little thing to take into account here: Changing a single dependency could change the overall set of requirements.
Ex: Updating foo from 1.0 to 2.0 could require updating bar to >=2.0 (while it was <2.0 before), and so on.

I know that in the context of pip-tools itself (from which pipenv takes its dependency resolution algorithm), running the dependency resolution will only "update" the required packages when "re-locking" if there's an existing lock file. It does so by checking whether the existing pins in the lockfile are valid candidates first when selecting candidates during resolution. pipenv could probably do the same.

I think it's a reasonable idea. Otherwise, if you want to update absolutely only one dependency, pipenv would have to have a mode to block if changing a dependency causes other changes, or else you would lose the guarantee of a valid environment.

I hope this helps!

Indeed, that was what I meant by:

That would generate a Pipfile.lock with updates for only my-awesome-dep and its dependencies.

Agree 100% - and I'll go a bit farther: this should be the default.

That is, pipenv install foo should never touch anything besides foo and its dependencies. And pipenv lock should certainly never upgrade anything - it should just lock what's already installed.

AFAICT, this is how npm, yarn, gem, etc. work; it makes no sense to have a lockfile that doesn't actually lock packages, but trusts package authors to not break things in patch releases, and therefore upgrades them without being asked. I can see the use of allowing upgrades, but that should be opt-in, since it's more surprising than not upgrading them.

I apologize if I'm hijacking this issue for something else, but since this is so closely related to an issue I was about to create, I thought I'd start the conversation here. Feel free to tell me I should make a new one.

Just found this related issue as well: https://github.com/kennethreitz/pipenv/issues/418

Being able to specify pipenv install --upgrade-strategy=only-if-needed seems like what I'm looking for, though of course as I mentioned I think that should be the default, as it's becoming in pip 10. But being able to specify it semi-permanently via env var would be something, anyway.

I would be surprised if that change breaks anyone's workflow (famous last words), since it's more conservative than --upgrade-strategy=eager.

Tried to work around this by setting export PIP_UPGRADE_STRATEGY=only-if-needed in my shell config. This doesn't work, and pipenv lock exhibits these surprising behaviors:

  1. It "upgrades" packages that don't need to be upgraded (but...)
  2. It actually doesn't upgrade the installed versions! i.e. pip freeze and Pipfile.lock show different versions!

Guessing pipenv is delegating to pip for the install, and pip respects its environment variable settings, but pipenv lock doesn't.

@k4nar What happens right now that you are finding undesirable? Because if you upgrade a dependency that has cascading requirements obviously it will have consequences for other dependencies. Are you suggesting some kind of resolver logic to determine the most current version of a specific package _in the context of the current lockfile_? I am hesitant to encourage too many hacks to resolution logic, which is already complicated and difficult to debug.

@brettdh I think I can shed some light because you have most of the pieces. pipenv lock doesn't install anything, and it doesn't claim to. It only generates the lockfile given your host environment, python version, and a provided Pipfile. If you manipulate your environment in some other way, use pip directly or manipulate pip settings outside of pipenv, or run pip freeze outside of pipenv run or a pipenv subshell, it is quite easy for a lockfile to be out of sync with pip freeze. The two aren't really related.

To be clear:

  1. Pipfile.lock is a strictly-pinned dependency resolution using the pip-tools resolver based on the user's Pipfile
  2. If you want to maintain strict pins of everything while upgrading only one package, I believe you can do this by strictly pinning everything in your Pipfile except for the one thing you want to upgrade (correct me if I'm wrong @vphilippon)

As for your lockfile and pip freeze disagreeing with one another, I'd have to know more information, but I believe we have an open issue regarding our lockfile resolver when using non-system versions of python to resolve.

@techalchemy : If I have a Pipfile.lock with A, B and C where B is a dependency of A, I would like to be able to update A and B without updating C, or C without updating A and B.
Again of course I can pin all my dependencies & their dependencies in my Pipfile in order to do that, but that would be a burden to maintain (like most requirements.txt are).

I concur with everything @k4nar wrote. Sure, I could even just pin everything in requirements.txt and not use pipenv. The point of pipenv is to have one tool that makes that (and the virtualenv stuff, of course) simpler to manage; i.e. all packages are locked by default to a version that's known to work, but it should be straightforward to upgrade a select few (without unexpectedly upgrading others).

Hm I see what you guys are saying. The premise of passing a setting to pip is not what I’m worried about, it’s resolving with pip-tools that concerns me. What does this behavior look like right now?

@techalchemy I mentioned the pip freeze difference as a shorthand for "the package versions that pipenv install installs differ from the package versions that pipenv lock saves to Pipfile.lock."

True, this only happens when I've changed pip's default args via environment variable; I was just pointing out that it was surprising that pipenv delegated to pip for installation but not for version locking; i.e. rather than locking what's installed, it locks what it thinks should be installed, potentially with unrequested upgrades.

Could you clarify your question a bit? I think "resolving with pip-tools" is referring to what pipenv lock is doing, and the reason it's not affected when I set pip defaults? And could you be more specific about what you mean by "this behavior"?

@brettdh The locking mechanism includes a notion of "dependency resolution" that does not exist in pip. It's handled by pip-tools (or rather, a patched version of it, integrated in a special way by pipenv that brings a few differences from the original tool). In short, the locking mechanism reads the Pipfile and performs a full dependency resolution to select a full set of packages that will meet every constraint defined by the required packages and their dependencies.

@techalchemy

[...] it’s resolving with pip-tools that concerns me.

I'm not sure how those --upgrade-strategy options would affect pip-tools, because it works on some low-level internals of pip. I have the feeling this would not give the expected result, as these options take into account what's installed, and that's not what's being dealt with in that mechanism. But pip-tools has another approach to this that could be adopted here.

The "original" pip-tools behavior is that it only updates what's is needed in the lockfile (in its context, its the requirements.txt), but this was "lost" in the way the resolver was integrated in pipenv. Let me explain why.

Pointing back to my summary of how pip-tools works: https://github.com/kennethreitz/pipenv/issues/875#issuecomment-337717817
Remember the "select a candidate" part? That's done by querying the Repository object.
In pipenv, we directly configure a PyPIRepository for the Resolver, but pip-tools does something else: it uses a LocalRequirementsRepository object, which keeps the existing pins from the previously existing requirements.txt (if found) and falls back on PyPIRepository.

So in pip-tools, the following happens when selecting a candidate:

  1. Query LocalRequirementsRepository for a candidate that matches foobar>=1.0,<2.0.
  2. Check if an existing pin meets that requirement:

    • If yes, return that pin as the candidate.

    • If not, query the proxied_repository (PyPIRepository) for the candidate.

  3. Use the candidate returned.

Effectively, it means that existing pins are given a "priority" as candidate to try first.

But in pipenv, currently, it simply:

  1. Query PyPIRepository (directly) for a candidate that matches foobar>=1.0,<2.0.
  2. Use the candidate returned.

So, I think the same behavior for the locking in pipenv could be achieved by parsing the Pipfile.lock to get the existing pins and using a LocalRequirementsRepository, like pip-tools does in its pip-compile command.
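To make that pin-priority lookup concrete, here is a minimal sketch in Python using the packaging library. The existing_pins dict and the best_candidate method are illustrative names only, not the actual pip-tools API:

from packaging.requirements import Requirement
from packaging.version import Version

def find_best_match(requirement, existing_pins, pypi_repo):
    # Prefer the version already pinned in the lockfile when it still
    # satisfies the requirement; otherwise fall back to the index.
    pinned = existing_pins.get(requirement.name.lower())
    if pinned is not None and pinned in requirement.specifier:
        return pinned  # keep the existing pin: no implicit upgrade
    return pypi_repo.best_candidate(requirement)  # query PyPI as usual

# Example: foobar is pinned to 1.3 in the lockfile and the Pipfile asks
# for foobar>=1.0,<2.0 -- the old pin wins even if 1.9 is on PyPI.
pins = {"foobar": Version("1.3")}
print(find_best_match(Requirement("foobar>=1.0,<2.0"), pins, None))  # 1.3

Removing a package's entry from existing_pins would then be exactly the "upgrade this one thing" switch discussed below.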

@vphilippon do you have a sense of how difficult implementation on that would be?

@techalchemy

  • Parsing the Pipfile.lock to extract the existing pins: Haven't looked at that. Depends on how things are structured in pipenv. We need a set of InstallRequirements that represents the pins in the Pipfile.lock.
  • Using LocalRequirementsRepository: Fairly easy: change our current PyPIRepository for a LocalRequirementsRepository.

But, as I'm looking into this, and following @brettdh comments, I realize a few things:

  1. The current default pipenv install behavior doesn't match the pipenv lock behavior. Doing pipenv install requests alone won't update requests if a new version comes out (much like straight pip install). However, doing pipenv lock will update the Pipfile.lock with the latest version of requests that matches the Pipfile specifier, and the dependency constraints.
    There are two main ways to see this:

    • A) The Pipfile.lock should stay as stable as possible by default, not changing pins unless required, in order to mirror the current environment, and only change in the event that we change the environment.

    • B) The Pipfile.lock should get the newest versions that respect the environment constraints/dependencies in order to freely benefit from the open ranges in the Pipfile and lib dependencies, allowing you to continuously acquire new compatible versions in your environment. You can then run pipenv update to benefit from the fresh lock.

IMHO, I would align the default behavior, which would be to go with A) by default. Because right now, every time a lock is performed (i.e. after each installation), new versions can come in, which makes the lockfile *drive the update of the environment*, which seems weird. But this is arguable, of course. While in development, I might want to continuously update my requirements to not get stale, like with B), so that should also be easily doable.
  2. Even if we use LocalRequirementsRepository to avoid updating correct existing pins, and end up aligning the default behaviors, we then need to address the equivalent of --upgrade and --upgrade-strategy for the locking part. Currently, defining some environment variables (like PIP_UPGRADE and PIP_UPGRADE_STRATEGY) will affect the pipenv install behavior, but will not affect pipenv lock, as it doesn't affect the behavior of pip-tools (I confirmed that, as I was unsure at first).
    Otherwise, there will be no way to update the environment without either deleting the Pipfile.lock (feels clunky, and "all or nothing") or requiring a newer version (I mean doing an explicit pipenv install requests>2.18.4, which requires you to know that a new version is out, and changes the specifier in the Pipfile itself, increasing the lower bound), which is wrong. As the "original pip-tools" doesn't defer to pip to deal with this (it's not related to what is currently installed), it offers an option to specify the dependencies to update in the lockfile, and simply removes the pins for those packages (or all of them) from the existing_pins list, effectively falling back to querying PyPI. I'm not sure how we can match the notion of "--upgrade-strategy" with this.

@techalchemy
So while I was saying it was fairly easy to just "align the default behavior", I now realize that this would cause some major issues with being able to update the packages (as in: just fetch the latest version that matches my current constraints).

If there's something unclear, ask away, a lot of editing went on when writing this.

(Dependency resolution is not easy. Good and practical dependency resolution is even worse 😄 )

@vphilippon that's exactly what I meant. Keeping the things that pip installs in sync with the things that pip-tools resolves is non-trivial unless you drive the process backwards, using the resolved lockfile to do the installation. I'm pretty sure that was why things were designed the way they were.

B) The Pipfile.lock should get the newest versions that respect the environment constrains/dependencies in order to freely benefit from the open ranges in the Pipfile and lib dependencies, allowing to continuously acquire new compatible versions in your environment. You can then run pipenv update to benefit from the fresh lock.

This workflow can possibly work with the current configuration. You can use pipenv lock to generate a lockfile, but pipenv update will reinstall the whole environment. I'm pretty sure we can use one of our various output formats to resolve the dependency graph (we already have a json format as you know) and only reinstall things that don't align with the lockfile. This might be more sensible, but I would be curious about the input of @nateprewitt or @erinxocon before making a decision.

@vphilippon Totally agree that A and B are desirable workflows in different situations. Some of your phrasing around B confused me a bit, though, seeming to say that pipenv lock might result in a lockfile that doesn't actually match the environment - I particularly heard this in that one would need to "run pipenv update to benefit from the fresh lock" - as if the lock is "ahead" of the environment rather than matching it.

Regardless of whether you are in an A workflow or a B workflow, a few things seem constant to me, and I think this squares with what @techalchemy is saying as well:

  • The result of pipenv lock should always be a lockfile that matches the environment.
  • The result of pipenv install should always be an environment that matches the lockfile.

I'm ignoring implementation details, but that's kind of the baseline behavior I expect from a package manager with a lockfile feature.

Running pipenv update periodically allows you to stay in B mode as long as you want everything to be fresh, and having the ability to pipenv install --upgrade requests would allow specific updates of one package and its dependencies, without affecting packages that don't need to be upgraded unnecessarily.

Am I missing any use cases? I can think of optimizations for B - e.g. a flag or env var that tells it to always update eagerly - but I think that covers the basics. I also know I'm retreading ground you've already covered; it's just helpful for me to make sure I understand what you're talking about. :)

Some of your phrasing around B confused me a bit, though, seeming to say that pipenv lock might result in a lockfile that doesn't actually match the environment

@brettdh this is correct -- the pip-tools resolver we use to generate Pipfile.lock doesn't ask the virtualenv for a list of which packages have been installed. Instead, it compiles a list of packages that meet the criteria specified in the list of pins from the Pipfile. Because the resolver itself runs using the system or outer python / pipenv / pip-tools install, we are doing some supreme fuckery to convince it to resolve packages with the same version of python used in the virtualenv. The assumption would be that pip install would resolve things similarly, but that isn't always the case, although even I'm not 100% sure about that. But yes, pipenv lock is not generated based on the virtualenv, it is generated based on the Pipfile. It is a dependency resolution lockfile, not an environment state pin.

As a potential resolution to this: something that pip itself currently supports, but pip-compile doesn't, is the notion of a constraints file.

A constraints file differs from a requirements file in that it says "If this component is installed, then it must meet this version constraint". However, if a particular package in the constraints file doesn't show up in the dependency tree anywhere, it doesn't get added to the set of packages to be installed.

This is the feature that's currently missing from pipenv, as the desired inputs to the Pipfile.lock generation are:

  1. The updated Pipfile contents as a new requirements input file
  2. The full set of existing dependencies from Pipfile.lock as a constraints file, excluding the packages specifically named in the current command

Constraints file support at the pip-tools resolver level would then be enough for pipenv to support a mode where attempted implicit upgrades of dependencies would fail as a constraint violation, allowing the user to decide whether or not they wanted to add that package to the set being updated.
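To illustrate input 2, here is a rough sketch of deriving a pip-style constraints file from an existing Pipfile.lock (the JSON layout matches real lockfiles; the function itself is hypothetical, not pipenv API):

import json

def constraints_from_lockfile(lock_path, exclude=frozenset()):
    # Emit "name==version" lines for every pin in Pipfile.lock,
    # skipping the packages the user explicitly asked to update.
    with open(lock_path) as f:
        lock = json.load(f)
    lines = []
    for section in ("default", "develop"):
        for name, entry in lock.get(section, {}).items():
            if name in exclude or "version" not in entry:
                continue  # e.g. VCS/editable entries have no pinned version
            lines.append(name + entry["version"])  # "version" already includes "==", e.g. "==2.18.4"
    return "\n".join(sorted(lines)) + "\n"

Fed to pip through its existing -c/--constraint option (pip install -c constraints.txt foo), such a file restricts versions without forcing installation, which is the "fail instead of implicitly upgrading" behaviour described above.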

currently not supported, thanks for the feedback

@kennethreitz

Do you mean:

  1. This behavior should be changed, but it's not currently a priority,
  2. This behavior should be added as something optional, but it's not currently a priority, or
  3. This behavior should not be added?

Given how inconsistent this is with the way other locking package managers work, it is a sufficient inconvenience that it would be good to keep this issue open as a solicitation for PRs.

If instead it's (3), and this will not be added, then I think a number of us on the issue will need to adjust our plans for our choice of Python package management tools.

I mean that this is currently not supported, and I appreciate the feedback.

I understand that it's not supported. Are you also saying that you would not accept PRs either changing this behavior or adding this as an option?

I have no idea.

@k4nar still interested in doing a PR for this? Specifically, something like pipenv install --only <dep-to-update>, which prevents unrelated deps from being updated. Since @kennethreitz seems uninterested in discussing further, it seems to me that that's the only way to find out whether that behavior addition/change could be acceptable (and, by extension, whether folks like @taion and I can continue using pipenv).

I'm interested, but I'm not sure what the best way to implement this would be. There are a lot of components in action (pip, pip-tools, pipfile, pipenv…) and probably a lot of possible solutions.

Per https://github.com/kennethreitz/pipenv/issues/966#issuecomment-339707418, it should be relatively straightforward. That dep resolution logic is largely just from pip-tools. I was planning on submitting a PR, but I can't justify spending the work if we're not willing to talk about how we want the API to look before we spend time writing code.

I'm currently looking at taking an alternative approach – as Pipfile is a standard, interactions with it don't need to go through pipenv, and I'd like to work around some of the other odd semantics here like wiping existing virtualenvs per https://github.com/kennethreitz/pipenv/issues/997.

Sorry to comment on a closed issue, but I'd like to point out that, to my understanding, using pipenv in my projects currently requires a workflow like this:

pipenv install foo
vim Pipfile.lock  # Manually remove all the unwanted updates
git add && git commit && git push

I find it really annoying having to communicate this to my team members. The alternative seems to be to pin everything to exact versions in Pipfile, but that defeats much of the purpose of using pipenv in the first place.

IIUC, this behavior is the equivalent of apt performing an implicit apt dist-upgrade whenever you run apt install foo.

This is made worse by the fact that pipenv install updates stuff in Pipfile.lock, but does not install the updates into the local virtualenv. If the developer does not carefully examine the diff of Pipfile.lock, they are still using the older versions locally, but once they share the code, all other environments see the surprising updates. People have a tendency to ignore the diff of Pipfile.lock because it's considered an auto-generated file.

I am strongly convinced that "update everything to the latest version allowed by Pipfile" should be an explicitly requested operation that is separate from "install foo".

should be fixed in master

The behaviour is still present; I tested it in pipenv 11.8.3, @kennethreitz.

@marius92mc The "fixed in master" comment is referring to the --selective-upgrade and --keep-outdated options added in recent releases: https://docs.pipenv.org/#cmdoption-pipenv-install-keep-outdated

That allows folks that need or want more control over exactly when upgrades happen to opt in to that behaviour, while the default behaviour continues to respect OWASP A9 and push for eager upgrades at every opportunity.

@ncoghlan I think one thing that is needed (easy to ask for, not as easy to do) is an FAQ on _how_ those options behave (at least it's still confusing for me).

For instance: using --selective-upgrade and --keep-outdated will still cause outdated libraries in the Pipfile.lock to be updated, even if they're not directly related to the "selected" package being updated.

It sounds like there may be implementation bugs, then.

They are intended to leave the Pipfile.lock as-is, except for the new change.

Let me know if it's helpful to provide a test Pipfile+.lock.

I think you've provided enough information for us to investigate. I'll try to do that now.

Actually, your Pipfile/lockfile would be great, if it contains outdated results.

@ncoghlan, thank you for providing the details.
I tried again with your mentioned options and the result seems to be the same: it still updates the other packages as well, changing them in the Pipfile.lock file.

Are there any updates on this issue, @kennethreitz?

Sorry for the slow answers on this. We haven’t nailed down the root cause for the regression here yet (I know I personally have been handling a data center migration this weekend so I’ve been kinda slow) but we will get this sorted in the next few days.

Contributions welcome as always!

I think there is a missing use case that can use this same change: when I am developing an application I often need to upgrade a single dependency's version. The steps I would like to follow are:

  1. Update the version restriction for the dependency in setup.py
  2. Run either pipenv lock --selective-upgrade ; pipenv sync or pipenv install --selective-upgrade "-e ."

@wichert If Pipfile has been edited in a way that increases the minimum required version beyond what's in the current lock file, then --keep-outdated should already cover what you need. --selective-upgrade is for the case where Pipfile hasn't changed, but you want to update to a new pinned version anyway.

@ncoghlan Pipfile has not changed in this scenario, only setup.py by changing the minimum version requirement for a dependency, typically to something more recent and currently in Pipfile.lock.

@wichert pipenv doesn't capture changes to your setup.py automatically because it isn't setuptools. You have to run pipenv lock if you want that to happen.

What's the current status of this? On March 25th someone said they thought implementation issues would be resolved "in the next couple days", and other bug reports have been closed due to being tracked here; but as of 2018.7.1 I still see the bug reported by Simon Percivall (indirect dependencies are always updated) and that bug hasn't been discussed since the original report. Is the problem still being tracked?

(I'm currently living in a second-tier city in Senegal so my Internet is terrible and it would be a game changer not to blow my data cap on updating indirect dependencies if possible :P )

PS: Thanks for making Pipenv, it's awesome <3

Yes for sure. We are rewriting the resolver to support this right now. Whether it lands in this release or next release remains to be seen

I’m not confident enough in my coding skill to estimate when the resolver would land :p Seriously, this is a completely volunteer project, and we don’t have a deadline mechanism as you would in commercial settings (we don’t even have a boss or a project manager or whatever you have in your company that decides when a thing needs to be done). If you want a thing to be done in the timeframe you desire, you need to do it yourself, or at least provide real motivation for others to do it.

@uranusjr FWIW, I didn't see any demands for expediency in @benkuhn 's comment above - just a question about where things are at; i.e. what work has been done, so that outside observers can make their own estimates/decisions.

I understand that pipenv is a volunteer project and that non-contributors cannot ask for a thing to be done by a date without signing up to make it happen. I do wonder whether there is room for more transparency in the project's development process, or if I'm just not looking in the right places. Usually the answer is either "if the issue hasn't been updated, there's been no movement" or "look at this WIP pull request," but this issue in particular seems to have triggered a much larger effort, so the dots can be difficult to connect for those not directly involved.

As always, much thanks to you and everyone who gives their valuable time towards the improvement of pipenv. 👏

For sure, this one doesn’t have activity or a work in progress PR because it is a lot more complicated than that. We are talking internally mostly about how we even want to structure this with respect to the larger project, and working iteratively to establish an approach that might even begin to work properly. Once we can sort that out we can build resolution logic.

In the meantime the resolver stack in pipenv is super convoluted and I wouldn’t be comfortable asking people to invest too much effort trying to untangle it for this purpose. Even the simplest use case here will take a significant refactor. We’d be happy to review / discuss any proposed refactor if someone is interested in helping tackle this, but the two things are tightly coupled.

If someone has expertise in dependency resolution and SAT solving we would certainly be interested in input, but there just isn’t a single concrete idea yet. We’ve been through several iterations that we never planned to carry forward as more than proof of concept. Not all code becomes a PR, and not all code organization decisions happen on the issue tracker. Sometimes we chat synchronously and propose and scrap ideas in real time.

Something I was _going_ to suggest as an alternative workflow that might address this is making it easy to pin to a specific version in the _Pipfile_ when installing.

I think it's slightly surprising but not completely unreasonable that pipenv interprets foo = "*" to mean "I just need to make sure _some_ version of foo is installed, the user doesn't care which". To that end, having something like pipenv install --pin foo which results in foo = "==1.2.3" instead of foo = "*" in the Pipfile (where 1.2.3 is the current latest version of foo) seems like it might help.

The issue with this though is that the behavior of a lot of packages can change a lot based on their dependencies (e.g., the same version of pylint can do totally different things depending on what version of astroid it's using), and packages don't pin their own deps exactly. So I don't think this actually gets anyone very far. :/

(Just realised I have been commenting on the wrong issue. Sorry for the mess-up, please ignore me) 😞

An actual use case that I've struggled with for some hours now: I want to measure test coverage in a Django 2.0 project. Even pipenv install --keep-outdated --selective-upgrade --dev coverage insists on updating the non-dev Django package to version 2.1, which because of breakage elsewhere I absolutely cannot use yet. There really must be a way to change the set of installed packages without upgrading completely unrelated packages to possibly breaking versions. The possibility of breakage in the latest version will always exist.

I'll try @rfleschenberg's workaround, but I don't know whether having a presumably incorrect _meta hash property will break anything.

@l0b0 If your application genuinely cannot handle a particular version of Django, I think it makes sense to state that restriction in your Pipfile?

@AlecBenzer That sounds like something for setup.py to me.

@wichert That might make sense too (I'm actually not totally clear on the circumstances in which you'd want to have both a setup.py and a Pipfile), but if you have a line in your Pipfile that says:

Django = "*"

you're telling pipenv that you want it to install _any_ version of Django. If what you really want it to do is install 2.0 or lower, tell it that instead:

Django = "<=2.0.0"

While in this particular case pipenv is upgrading Django for no real reason, it could be that somewhere down the line you try to install a package that requires Django >2.0.0, at which point pipenv will happily install it if you haven't told it that you need <=2.0.0.

If what you really want it to do is install 2.0 or lower, tell it that instead

@AlecBenzer on reflection, it now occurs to me that this is what npm/yarn do by default when you install a package; they find the latest major.minor version and specify ^major.minor.0 in package.json, which prevents unexpected major version upgrades, even when an upgrade-to-latest is explicitly requested. I wonder if Pipenv should do the same - but that would be a separate issue.

Of course, their lock file also prevents accidental upgrades of even minor and patch versions, which is what's being requested here.
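For reference, PEP 440's compatible-release operator is the closest Python analogue to those npm ranges; hypothetical Pipfile entries (alternatives, since duplicate keys aren't valid):

Django = "~=2.0"    # >=2.0, <3.0: roughly npm's ^2.0.0
Django = "~=2.0.4"  # >=2.0.4, <2.1: roughly npm's ~2.0.4

A pipenv install --pin style flag, as suggested earlier, could plausibly write such a specifier instead of "*".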

I think it's been discussed above and elsewhere, but there is a tension/tradeoff in the design space between npm/yarn and pipenv. Any package manager ostensibly has these goals, with some relative priority:

  • Make it easy to install and upgrade packages
  • Make it hard to accidentally break your app with an errant dependency upgrade

The trouble with pinning an exact version in the Pipfile is that it's then harder to upgrade packages; this is why pip-tools exists (though it's for requirements.txt).

The --keep-outdated flag does not seem to be working as intended, as was stated when the issue was re-opened. Whether that behavior should or should not be the default and how it aligns with other package managers is not really the central issue here. Let's fix the thing first.

@brettdh

on reflection, it now occurs to me that this is what npm/yarn do by default when you install a package; they find the latest major.minor version and specify ^major.minor.0 in package.json, which prevents unexpected major version upgrades, even when an upgrade-to-latest is explicitly requested. I wonder if Pipenv should do the same - but that would be a separate issue.

Yeah that's along the lines of what I was trying to suggest in https://github.com/pypa/pipenv/issues/966#issuecomment-408420493

Really excited to hear this is being worked on!

In the mean time, does anyone have a suggested workaround that's less laborious and error-prone than running pipenv lock and hand-reverting the resulting lockfile changes that I don't want to apply?

@benkuhn Not that I know of - I do the same lock & revert dance all the time.

Ah ok, you can at least sometimes avoid hand-reverting:

  1. pipenv lock
  2. git commit -m "FIXME: revert"
  3. pipenv install packagename
  4. git commit -m 'Add dependency on packagename'
  5. git rebase -i
  6. Drop the FIXME: revert commit

Unfortunately it's still possible to create an inconsistent Pipfile.lock if your Pipfile.lock starts out containing a version of a package that's too old to satisfy the requirements of packagename, but perhaps pipenv will complain about this if it happens?
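A shorter variant of the same dance, assuming Pipfile.lock was committed before the install, is to let pipenv rewrite the lockfile and then interactively revert the unrelated hunks with git's patch-mode checkout:

pipenv install packagename
git diff Pipfile.lock            # review which pins got bumped
git checkout -p -- Pipfile.lock  # keep packagename's hunks, discard the rest

The same consistency caveat applies: hand-reverted pins are never re-checked against packagename's requirements.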

--keep-outdated seems to systematically keep outdated only the explicit dependencies that are specified (unpinned) in Pipfile, while all the implicit dependencies are updated.

Am I correct that it is not possible to update/install single dependency using pipenv==2018.7.1 without updating other dependencies? I tried different combinations of --selective-upgrade and --keep-outdated with no success.

Editing Pipfile.lock manually is no fun...

Same as @max-arnold: it's my first day using the tool in an existing project, and I have to say I'm really disappointed. Before I started to use it, I checked the doc site and the video demo, and it looked impressive to me. And now this: in a real project, working with pip or pipenv is almost the same, and I don't see the point. Like many others have said in this thread: if I have a lock file, why are you updating my other dependencies when there is no need to update them?

Of course, if the update is mandatory, it's OK to update all the necessary dependencies, but just those, not all the outdated ones.

Also, it's not clear what the options --selective-upgrade and --keep-outdated are useful for; there is another issue highlighting this (#1554), and nobody is able to explain what these options do, which is incredible.

But my major disappointment is that this package was recommended by the official Python documentation itself; these recommendations should be conducted more carefully. I know this can be a great project in the future, and it has a lot of potential, but simple things like this (we are not talking about a bug or a minor feature) make this project not eligible for production environments. Yet suddenly, because it was recommended by the Python docs, everybody is trying to use it, instead of looking for other tools that may work better, or just sticking with pip, which doesn't solve these issues either, but at least is very minimalist and is included in almost every environment (it does not add extra dependencies).

@mrsarm Thank you for your opinion. Sorry things don’t work for you. I don’t understand where the disappointment comes from, however. Nobody is forcing Pipenv on anyone; if it doesn’t work for you, don’t use it. That is how recommendations work.

Your rant also has nothing particularly to do with this issue. I understand it requires a little self-control not to dump trash on people when things don’t go your way, but please show some respect, and refrain from doing so.

@uranusjr it's not trash, it's an opinion, and sometimes not using it isn't an option, like in my case, where somebody chose pipenv for a project I have just started working on, and I have to deal with this.

But things got worse just now, and what I'm going to say is not an opinion, it's a fact.

After dropping one dependency just to avoid dealing with this issue (it's a dev dependency, so I created a second environment with pip and the old requirements-dev.txt approach, just for that tool), I needed to add another dependency.

The new dependency is PyYAML, let's say the latest version. If you install it in any new environment with pip, you will see that the library does not pull in any dependencies, so only PyYAML is installed; it's that simple in these cases with pip. But when adding the dependency with Pipenv (because a project that I didn't create is managed with Pipenv), the same issue happened: even though PyYAML has no dependencies, and no older version of it was previously installed in the project, pipenv updated all my dependencies in the lock file and the virtual environment. But I don't want to update the other dependencies; I just want to add a single new module without any dependencies.

So the conclusion (and again an opinion, unlike the fact that pipenv broke all my dependencies) is that Pipenv, instead of helping me deal with dependency management, turns it into hell.

I've followed this thread for months, and I think any real project will ultimately stumble upon this issue, because the behavior is unexpected, counter-intuitive, and yes: dangerous.

About a month ago I tried out a more-comprehensive alternative to pipenv, poetry; it solved the problems _I_ needed to solve:
1) managing one set of dependencies (setup.py, setup.cfg, pip, and pipfile -> pyproject.toml)
2) future oriented, backwards-compatible (again pyproject.toml)
3) fairly un-opinionated (no, I'm really not asking to install redis)
4) and the solution to the classic Pipenv problem: "Also, you have to explicitly tell it [pipenv] to not update the locked packages when you install new ones. This should be the default." [[1](https://github.com/sdispater/poetry#what-about-pipenv)] [[2](https://github.com/pypa/pipenv/issues/966#issuecomment-339117289)]

I weighed sharing these thoughts on the pipenv issue, but as @uranusjr said, "no one is forcing Pipenv on anyone", and I'm not forcing Poetry. I like it, it works well, and it solves my problems, but I'm just sharing an alternative, more-comprehensive solution to the problem I was having. Just take all that as my 2¢.

  • as a disclaimer, I am not a member of the Poetry team or affiliated with them.

p.s. I think the concern about Pipenv being the "official" solution is due to its first-class integrations – something that you, @uranusjr, might see as a simple recommendation – the industry at large is taking it as the "blessed approach going forward". Frankly, that recommendation is more authoritative in the community than certain PEPs that have been around for more than a year.

Nobody is forcing you to participate in our issue tracker; if you don’t have a productive comment please find a forum that is not for triaging issues.

For users who are interested in trying the alternative resolver @uranusjr and myself have been implementing for several weeks now, please try out https://github.com/sarugaku/passa which will generate compatible lockfiles. Poetry does a lot of different things, but it also has limitations and issues itself, and we have a design philosophy disagreement about scope.

This is a project we manage in our spare time; if you want to see something fixed or you have a better approach, we are happy to accept contributions. If you are here to simply tell us we ruined your day and your project, I will ask you only once to see yourself out.

We have not forgotten or ignored this issue; we have a full implementation of a fix in the resolver linked above. Have patience, be courteous, or find somewhere else to talk. To those who have been waiting patiently for a fix, please do try the resolver mentioned above – we are eager to see if it meets your needs. It implements proper backtracking and resolution and shouldn't have this upgrade-strategy problem.

In the shorter term I think we can get a band aid for this into pipenv if we don’t wind up cutting over first.

@dfee I am not really sure that blurring lines between applications and libraries is the correct answer to dependency management, so I don’t see poetry’s approach as an advantage. I wasn’t involved in whatever your issue was with the recommendation engine, but we took that out some time ago...

@techalchemy

I am not really sure that blurring lines between applications and libraries is the correct answer to dependency management, so I don’t see poetry’s approach as an advantage.

Why though? I never understood this idea that you should manage the dependencies of a library and an application differently. The only difference between the two is the lock file, which is needed for an application to ensure a reproducible environment. Other than that it's the same thing. This is the standard in most other languages, and Python seems to be the exception here for some reason, which is bad from a user-experience standpoint, since it makes things more complex than they should be.

it also has limitations and issues itself

Which ones? I am really curious about the issues or limitations you encountered while using Poetry.

My apologies for being so rude. Reading my comments now, I realize that although the info I provided and some of my opinions are still valid (IMHO), the way I wrote what I wanted to say wasn't appropriate.

I understand that the issue tracker is mostly a place to discuss bugs and improvements, and whether this is a bug or an error by design is not made clear in the thread, but again, my apologies.

I think there are two strong topics here:

  • Should pipenv update all your outdated dependencies when you are just trying to install a new dependency: the ones that don't need to be updated because the new package/version we are trying to install works with the existing dependencies, and even the ones that aren't dependencies of the new package at all? Maybe this is out of scope for this ticket, but it's a really important topic to discuss.
  • Does either of the parameters --keep-outdated and --selective-upgrade allow us to avoid this behaviour? It's not clear what these options do; there is a lack of documentation about them, and even in the related issue (#1554) nobody has answered that.

In case it's a bug in one of these params (--keep-outdated, --selective-upgrade), I still think that not making whichever param avoids the unnecessary dependency updates the default is a really bad idea.

To compare with a similar scenario, imagine that you execute apt-get install vim just to install the vim tool on your system (and vim's necessary dependencies or updates, where applicable), but imagine also that in this situation apt updates all the other dependencies of your system: python, the Qt system, the Linux kernel... and so on. It's not that apt shouldn't allow us to update other dependencies, but there is a clear command to do that: apt-get upgrade, while apt-get install PACKAGE just installs/updates PACKAGE and its dependencies.

@sdispater the distinction is at the heart of every disagreement we've ever had and it's incredibly subtle but I'd point you at https://caremad.io/posts/2013/07/setup-vs-requirement/ or a good article for the elixir use case: http://blog.plataformatec.com.br/2016/07/understanding-deps-and-applications-in-your-mixfile/

pyproject.toml isn't really supported for library definition metadata -- and not at all, as an authoritative library declaration file, by any version of pip that doesn't implement PEPs 517 and 518 (both of which are still having implementation details worked out). setup.cfg exists for that purpose (the actual successor to setup.py), and IMHO both of those should really be supported. A library is published and intended for consumption with abstract dependencies so that they can play nice in the sandbox with others; applications are usually large, complex beasts with sometimes hundreds of direct dependencies. So one of our main divergences is that when we design and build our tooling, we take this into account also.

@mrsarm For your first question, the update behavior was intentional (it was discussed extensively at the time, /cc @ncoghlan, and related to OWASP security concerns). On the second point, the behavior is currently not properly implemented, which is why the issue is still open, and which led us to rewrite the backing resolver behind pipenv, as I mentioned above. It simply didn't support this behavior. --selective-upgrade is supposed to selectively upgrade only things that are dependencies of the new package, while --keep-outdated would hold back anything that already satisfies the dependencies required by a new package. Slightly different, but I am fairly sure neither works correctly right now.

pyproject.toml isn't really supported for library definition metadata -- and not at all by any version of pip that doesn't implement peps 517 and 518 (both of which are still having implementation details worked out) as an authoritative library declaration file. setup.cfg exists for that purpose (the actual successor to setup.py ) and IMHO both of those should really be supported.

Well this is certainly off topic but it's an important discussion so I can't help myself.

There is actually no standard around setup.cfg right now other than the conventions established by distutils and setuptools. pyproject.toml is absolutely for library metadata as the successor to setup.py or the community would have placed build requirements in setup.cfg instead.

pyproject.toml describes how to build a project (PEP 518), and part of building is describing metadata. I'm NOT saying that pyproject.toml needs a standard location for this metadata, but PEP 518 uses this file to install a build tool and from there it's very reasonable to expect that the build tool will use declarative configuration from somewhere else in the file to determine how to build the project.

Anyway, going back to pipenv vs poetry - there seems to be some idea floating around that applications don't need certain features that libraries get, like entry points, and this is just incorrect. It should be straightforward for an application to be a python package.

The only true difference between an application and a library in my experience with python and with other ecosystems is whether you're using a lockfile or not. Of course there's a third case where you really just want a requirements.txt or Pipfile and no actual code and that seems to be all that pipenv has focused on so far (pipenv install -e . falls into this category as pipenv is still afraid to try and support the package metadata). Unfortunately, while the design of pipenv is cleaner with this approach, it's also way less useful for most applications because PEP 518 decided to punt on how to install projects into editable mode so in order to continue using pipenv we will be stuck on setuptools quite a while longer as you cannot use pyproject.toml to switch away from setuptools and still use pipenv install -e ..

There is actually no standard around setup.cfg right now other than the conventions established by distutils and setuptools. pyproject.toml is absolutely for library metadata as the successor to setup.py or the community would have placed build requirements in setup.cfg instead.

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly. Not to mention it uses the standard outlined in PEP 345 for metadata, among others, and can also be used to specify build requirements.

the community would have placed build requirements in setup.cfg instead.

Do you mean the pep authors? You can ask them why they made their decision, they outline it all in the pep.

pyproject.toml describes how to build a project (PEP 518), and part of building is describing metadata. I'm NOT saying that pyproject.toml needs a standard location for this metadata, but PEP 518 uses this file to install a build tool and from there it's very reasonable to expect that the build tool will use declarative configuration from somewhere else in the file to determine how to build the project.

This came up on the mailing list recently -- nothing anywhere has declared a standard around pyproject.toml other than that it will be used to declare build system requirements. Anything else is an assumption; you can call that "library definition metadata", but it isn't. Try only defining a build system with no additional information about your project (i.e. no pep-345 compliant metadata) and upload it to pypi and let me know how that goes.

Anyway, going back to pipenv vs poetry - there seems to be some idea floating around that applications don't need certain features that libraries get, like entry points, and this is just incorrect. It should be straightforward for an application to be a python package.

Who is saying that applications don't require entry points? Pipenv has an entire construct to handle this.

so in order to continue using pipenv we will be stuck on setuptools quite a while longer as you cannot use pyproject.toml to switch away from setuptools and still use pipenv install -e .

Not following here... we are not going to leave pip vendored at version 10 forever, I've literally been describing our new resolver, and the actual installer just falls back to pip directly... how does this prevent people from using editable installs?

This came up on the mailing list recently -- nothing anywhere has declared a standard around pyproject.toml

That's correct, it is not a "standard"; yet in that same thread it was recognised that, by calling it pyproject.toml, they effectively invited people to use this file for other project-related settings/config.

So by the same logic you invoked here:

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly.

pyproject.toml is a standard, and the community has adopted it as the standard location to place information related to the build system, and other parts of a Python project.

Not following here... we are not going to leave pip vendored at version 10 forever, I've literally been describing our new resolver, and the actual installer just falls back to pip directly... how does this prevent people from using editable installs?

PEP 517 punted on editable installs... which means there is no standard way to install a project in editable mode if you are not using setuptools (which has a concept known as develop mode that installs the project in editable mode).

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly. Not to mention it uses the standard outlined in pep 345 for metadata, among others, and can also be used to specify build requirements.

Yes, the build system is expected to output the PKG-INFO file described in PEP 345. This is a transfer format that goes in an sdist or wheel and is generated from a setup.py/setup.cfg, it is not a replacement as such for the user-facing metadata. PEP 518's usage of pyproject.toml is about supporting alternatives to distutils/setuptools as a build system, no one is trying to replace the sdist/wheel formats right now. Those replacement build systems need a place to put their metadata and fortunately PEP 517 reserved the tool. namespace for these systems to do so. It's not an assumption - both flit and poetry have adopted this namespace for "library definition metadata".

Try only defining a build system with no additional information about your project (i.e. no pep-345 compliant metadata) and upload it to pypi and let me know how that goes.

How constructive.

Who is saying that applications don't require entry points? Pipenv has an entire construct to handle this.

Where is this construct? I cannot even find the word "entry" on any page of the pipenv documentation at https://pipenv.readthedocs.io/en/latest/ so "an entire construct" sounds pretty far fetched? If you mean editable installs then we have reached the point I was making above - with pipenv deciding to couple itself to pipenv install -e . as the only way to hook into and develop an application as a package, for the foreseeable future pipenv's support here is coupled to setuptools. I think the entire controversy boils down to this point really and people (certainly me) are frustrated that we can now define libraries that don't use setuptools but can't develop on them with pipenv. To be perfectly clear this isn't strictly pipenv's fault (PEP 518 decided to punt on editable installs), but its refusal to acknowledge the issue has been frustrating in the discourse as poetry provides an alternative that does handle this issue in a way that's compliant with the pyproject.toml format. Pipenv keeps saying that poetry makes bad decisions but does not actually attempt to provide a path forward.

https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts

Please read the documentation.

@bertjwregeer:

pyproject.toml is a standard, and the community has adopted it as the standard location to place information related to the build system, and other parts of a Python project.

Great, and we are happy to accommodate sdists and wheels built using this system, and until there is a standard for editable installs we will continue to pursue using pip to build sdists and wheels and handle dependency resolution that way. Please read my responses in full. The authors and maintainers of pip, of the PEPs in question, and myself and @uranusjr are pretty well versed in the differences between editable installs and the implications of building them under the constraints of PEP 517 and 518. So far all I'm seeing is that the PEPs in question didn't specifically address how to build them because they leave it up to the tooling, which for some reason everyone takes to mean that pipenv will never be able to use anything but setuptools?

I've said already this is not correct. If you are actually interested in the implementation and having a productive conversation I'm happy to have that. If you are simply here to say that we don't know what we're doing, but not interested in first learning what it is we are doing, this is your only warning. We are volunteers with limited time and I am practicing a 0 tolerance policy for toxic engagements. I do not pretend my work is perfect and I don't pretend that pipenv is perfect. I will be happy to contribute my time and effort to these kinds of discussions; in exchange I ask that they be kept respectful, that they stick to facts, and that those who participate also be willing to learn, listen, and hear me out. If you are here just to soapbox you will have to find another platform; this is an issue tracker. I will moderate it as necessary.

This discussion is wildly off topic. If anyone has something constructive to say about the issue at hand, please feel free to continue that discussion. If anyone has issues or questions about our build system implementations, please open a new issue. If you have issues with our documentation, we accept many pull requests around documentation and we are aware it needs work. Please defer all of that discussion to new issues for those topics. And please note: the same rules will still apply -- this is not a soapbox, it is an issue tracker.

https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts
Please read the documentation.

Entry points are a more general concept than just console scripts and this link is completely erroneous in addressing those concerns. <soapbox>Ban away - you're not the only maintainer of large open source projects on here and none of my comments have been a personal attack on you or the project. People commenting here are doing so because they want to use pipenv and appreciate a lot of what it does. My comment was not the first off topic post on this thread, yet is the only one marked. Your snarky comments indicating that you think I don't know what I'm talking about are embarrassing and toxic.

In the project we maintain, we can soapbox. And yes, pip will support all compliant build systems which you both yourselves seem to full well understand will produce consumable metadata, and as pipenv uses pip as the backing tool to drive its installation process, as I described, yes, pipenv will support all compliant tooling. I already said this.

So yeah, please take your toxicity somewhere else. Your attitude is not welcome here. Final warning. Persistent attempts to incite conflict won’t be tolerated.
