
358 most popular Python packages have wheels – Packaging


Vanity metric, but still cool and interesting in my opinion.

358 out of the 360 most popular Python packages now have wheels available. https://pythonwheels.com/

The 2 non-wheel packages are future and pyspark.

I’m not even sure those could ever be wheels without changing packaging standards.




“future” appears in quite a lot of dependency closures in my experience.

It’s a bit of a shame that “future” is not a wheel. In theory, you can certainly write a lot of useful software if your dependency closure only includes those top 358 packages.

If all 358 have PEP 658 metadata, it should be possible to do a very fast dependency resolve purely from static metadata, without any intermediate package downloads or builds. It could even be done without a Python interpreter, à la GitHub – prefix-dev/rip: Solve and install Python packages quickly with rip (pip in Rust).
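One way to check whether a package exposes static metadata is PyPI’s JSON Simple API (PEP 691): a file entry carries a “core-metadata” key when its METADATA file can be fetched separately under PEP 658. A minimal sketch along those lines – the helper names are my own, not from the thread:

```python
import requests

# PEP 691 content type for PyPI's JSON Simple API
SIMPLE_JSON = "application/vnd.pypi.simple.v1+json"


def wheels_with_metadata(files):
    """From a Simple API "files" list, return the wheel filenames whose
    METADATA can be fetched separately (PEP 658 "core-metadata" key).

    The key may be absent, false, or a hash dict, so a truthiness
    check covers all three cases.
    """
    return [
        f["filename"]
        for f in files
        if f["filename"].endswith(".whl") and f.get("core-metadata")
    ]


def fetch_files(pkg):
    """Fetch a package's file list from PyPI's JSON Simple API."""
    rsp = requests.get(
        f"https://pypi.org/simple/{pkg}/",
        headers={"Accept": SIMPLE_JSON},
    )
    rsp.raise_for_status()
    return rsp.json()["files"]
```

For example, `wheels_with_metadata(fetch_files("requests"))` lists the wheels of requests that a resolver could read metadata from without downloading the whole archive.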




It’s annoying that the list of wheels on that page can’t be copied and pasted; otherwise I could check relatively quickly. But I assume that most, probably all, of those 358 will have PEP 658 metadata, at least for their latest versions. Unfortunately, many resolves need to look at older versions, and I don’t think the “backfill” exercise to add metadata for older wheels has been done yet.

But yes, it’s great news that we’re reaching a point where a significant number of real-world installs can be completed with static data, only downloading what needs to be installed.




Maybe this can help: https://pythonwheels.com/results.json

Ah, cool. I didn’t spot a mention of that file. There are actually 77 of the 358 that don’t have static metadata – presumably because they haven’t had a release since PyPI started extracting metadata from wheels.

A few spot checks suggest that’s the case.

This is the script I used to get upload times.

import requests

def get_upload_times(pkg):
    """Return the upload times of all wheel files for *pkg*,
    via PyPI's JSON Simple API (PEP 691)."""
    ACCEPT = "application/vnd.pypi.simple.v1+json"
    url = f"https://pypi.org/simple/{pkg}/"
    rsp = requests.get(url, headers={"Accept": ACCEPT})
    rsp.raise_for_status()
    data = rsp.json()
    return [f["upload-time"] for f in data["files"] if f["filename"].endswith(".whl")]
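The spot check itself can be sketched as a comparison of the newest wheel’s upload time against the point when PyPI began extracting metadata from new uploads. The mid-2023 cutoff below is my rough approximation, not an official date, and the helper name is mine:

```python
from datetime import datetime, timezone

# Approximate point at which PyPI began serving PEP 658 metadata for
# newly uploaded wheels (assumed cutoff, not an official date).
PEP_658_CUTOFF = datetime(2023, 6, 1, tzinfo=timezone.utc)


def last_release_predates_backfill(upload_times):
    """Given ISO-8601 upload times (as returned by get_upload_times above),
    report whether the newest wheel predates PyPI's metadata extraction."""
    newest = max(
        # Pre-3.11 fromisoformat doesn't accept a trailing "Z" suffix.
        datetime.fromisoformat(t.replace("Z", "+00:00"))
        for t in upload_times
    )
    return newest < PEP_658_CUTOFF
```

A package for which this returns True would be expected to show up in the no-metadata list below.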

This is the list of packages with no metadata:

adal
aioitertools
aiosignal
appdirs
asn1crypto
asynctest
azure-common
backoff
backports-zoneinfo
cinemagoer
colorama
coloredlogs
contextlib2
crashtest
decorator
entrypoints
et-xmlfile
gast
google-crc32c
google-pasta
h11
httplib2
humanfriendly
imdbpy
iniconfig
installer
isodate
itsdangerous
jeepney
jmespath
matplotlib-inline
mccabe
mdurl
msrest
msrestazure
multidict
mypy-extensions
oauth2client
oauthlib
openpyxl
oscrypto
parso
pkginfo
pkgutil-resolve-name
ply
ptyprocess
py
py4j
pyasn1-modules
pycparser
pynacl
pyproject-hooks
pysocks
python-dateutil
python-dotenv
python-json-logger
pytzdata
requests-aws4auth
requests-file
requests-oauthlib
requests-toolbelt
rfc3339-validator
rsa
scramp
secretstorage
six
sniffio
sortedcontainers
sqlparse
tabulate
toml
tomli
toolz
uritemplate
webencodings
xlrd
xmltodict




There’s no reason why future, at least, couldn’t be one; there’s an open issue with multiple linked PRs. AFAIK, it’s merely due to the project being mostly unmaintained in the past few years, and usage will continue to drop as its original purpose – making code cross-compatible with Python 2 and Python 3 – fades away. But if the maintainer pops up again, it seems like it would be a fairly straightforward matter.

PySpark appears to have a non-trivially complex build process that requires building/running against the Spark JARs of the existing Spark version, which may or may not be straightforward to incorporate into a wheel.


