Handling of very large images

Tomaso
Board Warrior
Posts: 1622
Joined: 2015-07-23, 16:09
Location: Norway

Handling of very large images

Unread post by Tomaso » 2017-03-19, 09:57

Reference:
https://github.com/MoonchildProductions ... ssues/932/

Since you've already decided not to support viewing of very large images in Pale Moon, could you instead change the way such images are handled?

Example:
http://imgsrc.hubblesite.org/hu/db/2006 ... ll_jpg.jpg

Currently, the above image flat out crashes Pale Moon.
Would it be possible for PM to pop up the save window instead, whenever images with a resolution beyond 16k are detected?
If the resolution can't be determined by reading the EXIF data in advance, then perhaps it can be done simply by detecting the first line of pixels or something?
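Tomaso's idea is actually feasible for JPEG without decoding any pixels: the dimensions live in the SOF (Start of Frame) marker segment near the top of the file, so a few hundred bytes are enough to know the resolution up front. A minimal sketch of the idea (illustrative only, not Pale Moon code; real files can contain quirks like fill bytes that this doesn't handle):

```python
import struct

def jpeg_dimensions(data: bytes):
    """Read (width, height) from a JPEG's SOF marker without decoding pixels."""
    if data[:2] != b"\xff\xd8":          # must start with the SOI marker
        raise ValueError("not a JPEG")
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            raise ValueError("corrupt marker stream")
        marker = data[i + 1]
        # SOF0..SOF15 carry the frame dimensions (0xC4/0xC8/0xCC are not SOFs)
        if 0xC0 <= marker <= 0xCF and marker not in (0xC4, 0xC8, 0xCC):
            height, width = struct.unpack(">HH", data[i + 5:i + 9])
            return width, height
        seg_len = struct.unpack(">H", data[i + 2:i + 4])[0]
        i += 2 + seg_len                  # skip this segment, go to next marker
    raise ValueError("no SOF marker found")

# Synthetic header: SOI + a SOF0 segment claiming 15000x15000, 1 component
sof0 = b"\xff\xc0" + struct.pack(">HBHHB", 11, 8, 15000, 15000, 1)
print(jpeg_dimensions(b"\xff\xd8" + sof0 + b"\x00" * 4))
```

With such a check, a browser could in principle decide before decoding whether an image exceeds its limits.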

JustOff

Re: Handling of very large images

Unread post by JustOff » 2017-03-19, 16:51

+1, I think Pale Moon should at least not crash in such cases.

gracious1
Keeps coming back
Posts: 891
Joined: 2016-05-15, 05:00
Location: humid upstate NY

Re: Handling of very large images

Unread post by gracious1 » 2017-03-19, 21:18

This not only crashed PM, it caused my entire GUI to freeze. I had to drop to console and reboot. :thumbdown: What is this supposed to be a picture of?

PM 27.1.2 x86_64
Ubuntu 14.04.5
20 July 1969 🌗 Apollo 11 🌓 "One small step for [a] man, one giant leap for mankind." 🚀

Moonchild
Pale Moon guru
Posts: 35633
Joined: 2011-08-28, 17:27
Location: Motala, SE

Re: Handling of very large images

Unread post by Moonchild » 2017-03-19, 22:24

Such large images are simply out-of-scope. What do you want me to do, restrict display resolutions more?
Crashes like these have to be checked individually, because they may each be hitting different limits. The limit check was also imposed on PNG images, not JPG, which uses a different decoder altogether. I'd have to check what the crash point is here. There is no simple "wallpaper" solution for these things, unless you'd rather I restrict images to very small sizes.

(as an aside, it doesn't crash my browser :P)
Attachments
Image2.jpg
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite


Re: Handling of very large images

Unread post by Moonchild » 2017-03-19, 23:00

All this seems to be directly related to the HQ image downscaling code. You likely won't run into these crashes if you disable it (image.high_quality_downscaling.enabled) or enable "downscale-during-decode" (which only works for JPEG right now). For normal use, the HQ downscaler is required to keep images downscaled past 33.3% of their original size from looking ugly.
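For anyone who wants to try this: the pref can be flipped in about:config, or persisted via a user.js file in the profile folder. A minimal example (only the pref name given in the post above; setting it to false disables HQ downscaling):

```javascript
// user.js -- workaround for OOM crashes on very large images
// Pref name taken from the post above; the default is true.
user_pref("image.high_quality_downscaling.enabled", false);
```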

The point remains that this is out of scope for a web browser. I understand that you'd prefer the browser not to crash, but it's directly tied to the resources available to the browser, so there is no real "safe smallest size" that can be set. I'd rather let the browser decode/display "as much as it can".


Re: Handling of very large images

Unread post by Moonchild » 2017-03-19, 23:39

Further details, as garnered from some Mozilla discussion:

The problem appears to be that we're simply running out of memory. This image is about 15000x15000; at 4 bytes per pixel, that means it will take 900 MB of memory. For reasons beyond my knowledge we choose to de-optimize this image (this is where the crashes occur here). When de-optimizing we create a VolatileBuffer the size of the surface, but in order to get our data in there we -also- need to create a D3D software buffer to read back to. This brings us to requiring two blocks of 900 MB of contiguous memory, which we simply don't have at our disposal.
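The arithmetic checks out: a 15000x15000 surface at 4 bytes per pixel (one byte each for R, G, B, alpha) is 900 MB, and the de-optimization path described above briefly needs two such blocks:

```python
width = height = 15_000        # approximate dimensions of the Hubble image
bytes_per_pixel = 4            # RGBA surface, one byte per channel

surface_bytes = width * height * bytes_per_pixel
print(surface_bytes)           # 900000000 -> 900 MB for one decoded surface
print(2 * surface_bytes)       # 1800000000 -> ~1.8 GB during de-optimization
```

Finding two contiguous 900 MB blocks is hopeless in a 32-bit address space and far from guaranteed even on 64-bit systems under memory pressure.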

Now, I can look into whether it's possible to port bug #1146663 across, but that would require that all the image decoders we have (including the new jxr and WebP ones) support downscaling during decode, so it's really not trivial and will involve a lot of code changes.
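For readers unfamiliar with the term: "downscale-during-decode" means the decoder hands rows to a downscaler as they are produced, so the full-size bitmap is never allocated; only the small target surface lives in memory. A toy box-filter illustration of the concept (plain Python, nothing to do with the actual Gecko Downscaler API):

```python
def downscale_during_decode(decode_rows, src_w, src_h, dst_w, dst_h):
    """Consume source rows one at a time, producing a dst_w x dst_h image.
    Only the small output accumulator is held in memory, never the full image."""
    acc = [[0.0] * dst_w for _ in range(dst_h)]
    counts = [[0] * dst_w for _ in range(dst_h)]
    for y, row in enumerate(decode_rows()):      # rows arrive incrementally
        dy = y * dst_h // src_h                  # destination row for this source row
        for x, px in enumerate(row):
            dx = x * dst_w // src_w
            acc[dy][dx] += px
            counts[dy][dx] += 1
    return [[acc[y][x] / counts[y][x] for x in range(dst_w)] for y in range(dst_h)]

# Simulate decoding an 8x8 all-white grayscale image, scaled down to 2x2
rows = lambda: ([255] * 8 for _ in range(8))
print(downscale_during_decode(rows, 8, 8, 2, 2))
```

With this scheme the 900 MB full-resolution surface from the previous post never needs to exist; memory use is bounded by the display-sized output.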


Re: Handling of very large images

Unread post by Moonchild » 2017-03-20, 00:44

Compromise reached: Images <64Mpix will be HQ scaled, and can be viewed at 1:1 zoom. (example: http://testserver.palemoon.org/huge_images/full_2_8k.jpg) (51Mpix, 10MB download)

Larger images won't be HQ scaled and, depending on available resources and D2D drawing surfaces, will at some point start to display black at 1:1 zoom.
These images can still be saved to disk and opened in a dedicated viewer. (example: http://testserver.palemoon.org/huge_images/full_2_16k.jpg) (61MB download)
Even larger resolutions will simply refuse to display. (example: http://testserver.palemoon.org/huge_images/full_jpg.jpg) (25MB download)

This approach will avoid the OOM crashes while not requiring an immediate full rewrite of the image frame structures.
I'll build and push a new unstable version so y'all can torture it with more Hubble images. Just keep in mind that this kind of use remains far outside our scope and this will simply have to be "good enough".
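The tiers above boil down to a pixel-count check. A sketch of the decision (the 64 Mpix HQ threshold is from the post; the 256 Mpix refusal cut-off below is purely a hypothetical placeholder, since the exact upper limit isn't stated):

```python
HQ_LIMIT_MPIX = 64        # stated above: images under 64 Mpix get HQ scaling
REFUSE_LIMIT_MPIX = 256   # hypothetical placeholder; the real cut-off isn't stated

def classify(width, height):
    """Map an image's dimensions to the display behaviour described above."""
    mpix = width * height / 1_000_000
    if mpix < HQ_LIMIT_MPIX:
        return "display with HQ scaling"
    if mpix < REFUSE_LIMIT_MPIX:
        return "display without HQ scaling (may go black at 1:1)"
    return "refuse to display (still saveable via right-click)"

print(classify(8192, 6222))     # ~51 Mpix, like the full_2_8k.jpg example
print(classify(18000, 12000))   # 216 Mpix
print(classify(30000, 15000))   # 450 Mpix
```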


Re: Handling of very large images

Unread post by Tomaso » 2017-03-20, 10:41

Moonchild wrote:Even larger resolution will simply refuse to display
Why not pop up the save window instead, allowing such images to be downloaded?


Re: Handling of very large images

Unread post by Moonchild » 2017-03-20, 10:48

Tomaso wrote:
Moonchild wrote:Even larger resolution will simply refuse to display
Why not pop up the save window instead, allowing such images to be downloaded?
You can still right-click the placeholder and save as.
You don't want a modal pop-up triggered by an image. Simple as that.
In addition, it'd be needlessly complex to pass this message back from the image decoder all the way to the front-end.

As an aside, I don't understand why you're being so terribly persistent about this. For the last time, it's out of scope.
Anyone putting such large images up on the web should assume people would want to download them to view in dedicated applications, not in a web browser. That means a server-side decision to provide either a packed archive format or use a MIME type that triggers downloads.
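The server-side suggestion is standard HTTP: a Content-Disposition: attachment header (or a non-image MIME type) makes browsers download the file instead of rendering it inline. A sketch of how that might look in an nginx config (the location path is hypothetical):

```nginx
# Serve huge source images as downloads rather than inline documents
location /huge_images/ {
    default_type application/octet-stream;       # don't advertise an image/* type
    add_header Content-Disposition "attachment";
}
```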

JustOff

Re: Handling of very large images

Unread post by JustOff » 2017-03-20, 11:08

Moonchild wrote:This approach will avoid the OOM crashes while not requiring an immediate full rewrite of the image frame structures.
Thanks, I can confirm that my custom build with this update no longer crashes, and I'm sure that's more than enough.

rhinoduck

Re: Handling of very large images

Unread post by rhinoduck » 2017-03-20, 11:18

Moonchild wrote:Now, I can look into if it's possible to port bug #1146663 across, but that would require that all the image decoders we have (including the new jxr and WebP ones) support downscaling during decode, so that's really not trivial and will involve a lot of code changes to do.
Just a heads-up that I added the use of Downscaler to nsWEBPDecoder by following what nsJPEGDecoder does. The work on the decoder side was actually pretty straightforward, and I will have no problem doing the same for nsJXRDecoder; I could potentially do the same for the remaining decoders too. I'm not sure if there is more to this that needs to be done, as I haven't read through the linked bug yet.

I can create a pull request with the changes to the WebP decoder if you want it. I suggest creating a tracking issue on GitHub for the whole "downscale-during-decode" package for further discussion.

User avatar
Moonchild
Pale Moon guru
Pale Moon guru
Posts: 35633
Joined: 2011-08-28, 17:27
Location: Motala, SE
Contact:

Re: Handling of very large images

Unread post by Moonchild » 2017-03-20, 13:40

rhinoduck wrote:The work on the decoder side was actually pretty straightforward, and I will have no problem doing the same for nsJXRDecoder; I could potentially do the same for the remaining decoders too. I'm not sure if there is more to this that needs to be done, as I haven't read through the linked bug yet.

I can create a pull request with the changes to the WebP decoder if you want it. I suggest creating a tracking issue on GitHub for the whole "downscale-during-decode" package for further discussion.
If you want to work on that, that's great. I've created a tracking issue on GitHub to keep track of everything that needs doing: issue #969.


Re: Handling of very large images

Unread post by Tomaso » 2017-03-20, 15:00

Moonchild wrote:You can still right-click the placeholder and save as.
You don't want a modal pop-up triggered by an image. Simple as that.
Yes, of course, but if left-clicking doesn't show anything, it gives users the false impression of a broken link.
At least a message saying something like "Image too big. Try right-click, Save link as" would be a somewhat more elegant way of doing it.

--
Moonchild wrote:I don't understand why you're being so terribly persistent about this.
Actually, it doesn't matter much to me one way or another.
I just had a couple of ideas that I wanted to get your opinion on.
...but no worries, I won't comment any further on this issue.


Re: Handling of very large images

Unread post by Moonchild » 2017-03-20, 17:54

Tomaso wrote:At least a message saying something like "Image too big. Try right-click, Save link as" would be a somewhat more elegant way of doing it.
It's also not possible with the current setup. As said before, passing an error all the way up to the front-end isn't straightforward. A hard failure of the decoder because the image is too big for your available resources just comes back as the image having errors; that's just the way it works. Making specific changes to the decoder, imagelib, the back-end interface and the front end just to display a convenience message for a situation that is completely out of scope and shouldn't occur to begin with is totally disproportionate.
