PyGithub: Async support

Created on 24 May 2020  ·  10 Comments  ·  Source: PyGithub/PyGithub

What do you think about async support for PyGithub? If I implement async support for PyGithub, would you accept it?

Most helpful comment

I don't think this issue should be closed. Using asyncio for IO-bound operations is the perfect use-case.

All 10 comments

In what way would you add in async support? Given most operations are done remotely on GitHub and our code is waiting for a response or a JSON blob back, how would it help?

Given most operations are done remotely on GitHub and our code is waiting for a response

That's a good reason to introduce async! I mean... PyGithub should handle API access as coroutines; otherwise most of the time it's waiting for the server's response instead of doing something useful. It's a common pattern in Node.js, and it's already supported by @octokit/rest.js.
I'm surprised nobody has asked for this feature before...

Note: maybe I've misunderstood how PyGithub currently works, but I think a sequential, synchronous API wrapper is less efficient than an async one.

I could add a new AsyncRequester class and give all the other classes asynchronous methods for interacting with it, while preserving the existing logic.
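
Roughly what I have in mind, as a minimal sketch only: the AsyncRequester name and its request_json method are made up here, and I'm assuming aiohttp as the transport:

```python
import aiohttp

class AsyncRequester:
    """Hypothetical async counterpart of PyGithub's Requester (sketch only)."""

    def __init__(self, token, base_url="https://api.github.com"):
        self._base_url = base_url
        self._headers = {
            "Authorization": f"token {token}",
            "Accept": "application/vnd.github.v3+json",
        }

    async def request_json(self, verb, path, **kwargs):
        # One coroutine per HTTP call: while we wait for GitHub's answer,
        # the event loop is free to run other requests.
        async with aiohttp.ClientSession(headers=self._headers) as session:
            async with session.request(verb, self._base_url + path, **kwargs) as response:
                response.raise_for_status()
                return await response.json()
```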

Asynchronous code could help in tasks where you need to issue a large number of queries, for example in a search. It would also let you work with multiple GitHub accounts quickly.
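
For the search case, fanning several queries out concurrently could look like this (again only a sketch, reusing the hypothetical AsyncRequester from above):

```python
import asyncio

async def search_many(token, queries):
    requester = AsyncRequester(token)
    # All searches are started at once; total wall-clock time is roughly
    # that of the slowest single request instead of the sum of all of them.
    return await asyncio.gather(
        *(requester.request_json("GET", "/search/repositories", params={"q": q})
          for q in queries)
    )

# results = asyncio.run(search_many("<token>", ["org:PyGithub", "topic:asyncio"]))
```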

Which sounds like a complete redesign, along with adding support for utilising multiple accounts. I love your enthusiasm, but I think it's an awful lot of work for not enough gain.

It might involve a lot of work, but I'd love to see it implemented. Currently I'm stuck with JS for my research because @octokit/rest.js's performance beats PyGithub's. If any help is wanted, I would be glad to work on this too.

By the way, I think multiple-account support would be too much! Isn't async alone a tremendous first step towards performance gains?

Maybe I can make the changes and show them in a pull request, as a test?

Asyncio sounds like a good idea given my use case. I am trying to read all files in a repository recursively, and synchronous requests are just too slow (I might be hitting something like rate limiting on GitHub's API, but we could definitely make such operations faster).
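
For context, with the current synchronous PyGithub API that traversal looks roughly like this; every directory costs one blocking round trip (and counts against the rate limit), which is why deep trees feel slow:

```python
from github import Github

def read_all_files(token, full_name):
    repo = Github(token).get_repo(full_name)             # e.g. "PyGithub/PyGithub"
    queue, files = [""], {}
    while queue:
        for item in repo.get_contents(queue.pop()):       # one HTTP request per directory
            if item.type == "dir":
                queue.append(item.path)
            else:
                files[item.path] = item.decoded_content   # file contents as bytes
    return files
```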

I would strongly suggest using something like GitPython for that rather than requesting everything via the GitHub API.
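
A minimal sketch of that approach with GitPython: a single shallow clone replaces the per-directory API calls, and the tree is then read locally (the repository URL and paths below are placeholders):

```python
from git import Repo

def read_all_files(clone_url, workdir):
    repo = Repo.clone_from(clone_url, workdir, depth=1)   # shallow clone, no history
    files = {}
    for item in repo.head.commit.tree.traverse():
        if item.type == "blob":                           # blobs are files, trees are dirs
            files[item.path] = item.data_stream.read()
    return files

# files = read_all_files("https://github.com/PyGithub/PyGithub.git", "/tmp/pygithub")
```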

Thanks for the interesting suggestion, I'll give it a try as it does make sense to do it that way.

I don't think this issue should be closed. Using asyncio for IO-bound operations is the perfect use-case.
