Performance of `_repr_png_` with larger images #7241
I've created PR #7242 to resolve this.
That was quick, thanks for that! It occurred to me later that it might be considered reasonable to resample very large images down as well. This would obviously be a change in semantics, but given this ostensibly is for "display purposes" it might be a reasonable thing to do. I can't find much in the way of guidance on this in either:
What to use when resampling a "large" image seems to bring policy awkwardly into the mix, but maybe something like:

```python
THRESHOLD_SIZE = 1200

factor = max(image.size) // THRESHOLD_SIZE
if factor > 1:
    image = image.resize(
        (image.width // factor, image.height // factor), Resampling.BOX
    )
```

in an attempt to reduce artifacts by keeping to integer scaling. Having to pick a threshold feels nasty and makes me think this shouldn't be done. Exposing it at a module level would let users control the behaviour, but I can predict confused users submitting bug reports when their image gets resampled. A similar argument could be made for reducing higher bit-depth images down to 8BPC. Thought it worth writing this up in case anybody else has ideas / for future reference.
Pillow only supports 8BPC or single-channel 32-bit floats.
What are you referring to here? I've personally used JPEG2000 images with unusual bit-depths which got exposed as
Pillow only stores multichannel image modes with up to 8 bits per channel, or single channel images with up to 32 bits. |
Okay, that makes more sense. It still makes sense (to me and my use cases anyway) to reduce higher bit depth images down (to |
I've just been using PIL to work with some larger images in IPython/Jupyter (~3k by 2k pixel images) and noticed that it can take a long time (nearly 10 seconds) for them to appear in the output. I had a poke in the code and found that it's due to `Image._repr_png_` using the default compression level, which causes it to take a significant amount of time. I tried adding `compress_level=1` to the `save` call, which reduces the wait to a couple of seconds, much nicer for interactive use. I see it's had a recent refactor (in #7135) to also include JPEG support, which is nice, but I could put a quick patch together if making this sort of change is considered sensible.
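As a rough illustration of the suggestion (assuming Pillow is installed; the solid-colour test image here is a stand-in for a real photo), a low zlib compression level trades output size for encode speed:

```python
from io import BytesIO

from PIL import Image

# Hypothetical large image standing in for a real ~3k x 2k photo.
img = Image.new("RGB", (3000, 2000), "navy")

buf = BytesIO()
# compress_level ranges from 0 (no compression) to 9 (slowest, smallest);
# Pillow's PNG default is 6.  Level 1 encodes noticeably faster.
img.save(buf, format="PNG", compress_level=1)
png_bytes = buf.getvalue()
```

The resulting bytes are still a valid PNG, just less tightly compressed, which is usually a fine trade for an inline notebook preview.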