Support <picture> element + some things relating to it
Posted: 2014-09-16, 05:32
by __NM64__
Pretty simple: add support for the <picture> element.
Specification:
http://www.w3.org/html/wg/drafts/html/m ... re-element
Walkthrough:
https://longhandpixels.net/blog/2014/02 ... re-element
Demos & info (including current browser support):
http://responsiveimages.org
Bugzilla listings (Status: RESOLVED FIXED):
https://bugzilla.mozilla.org/show_bug.cgi?id=870022
https://bugzilla.mozilla.org/show_bug.cgi?id=870021
https://bugzilla.mozilla.org/show_bug.cgi?id=1030606
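For reference, a minimal sketch of what the markup looks like per the spec draft (the filenames and breakpoints here are invented for illustration):

```html
<!-- Art-directed responsive image: the browser evaluates each <source>
     in order and uses the first one whose media query matches; the <img>
     is both the fallback for non-supporting browsers and the element
     that actually displays the chosen resource. -->
<picture>
  <source media="(min-width: 1024px)" srcset="photo-large.jpg 1x, photo-large@2x.jpg 2x">
  <source media="(min-width: 640px)" srcset="photo-medium.jpg">
  <img src="photo-small.jpg" alt="A photo">
</picture>
```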
Now it may be optimal that, when right-clicking an image that uses the <picture> element and selecting "Save As...", the browser would by default save the highest available resolution of that image (rather than whatever resolution is currently displayed). Alternatively, the "Save As..." menu option could become a cascaded context menu for images that use <picture>, allowing the user to save any one of the available versions. It would probably be similarly wise to make the right-click "View Image" option link to the highest-resolution image by default.
Another thing is that it may be useful to have the ability to flat out always load a specific size on web pages by default, whether it's the highest resolution, the lowest, or something in between. In particular, smaller resolutions could be useful on slow and/or bandwidth-metered connections, while the largest would suit big high-resolution displays on a fast, unmetered connection. Heck, maybe there could even be a setting to load the lowest-resolution images first and then, once the page has fully loaded, start fetching one of the higher resolutions (this would have a similar result to interlaced images).
One of the main things I'm concerned about is the fallback images shown to browsers without <picture> support being of a lower resolution than those made available via <picture>. This is demonstrated with the following demo:
http://responsiveimages.org/demos/on-a-grid
I'm also concerned about images that use <picture> having larger resolutions available than whatever is displayed, with the user then saving a medium- or low-resolution copy without ever being aware that a higher-resolution version exists.
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 06:01
by New Tobin Paradigm
Can you find a MozCo bug for it too?
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 06:04
by __NM64__
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 06:15
by New Tobin Paradigm
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 06:26
by __NM64__
Added those Bugzilla listings to the first post.
Matt A Tobin wrote:Fearless
Your choice of words is funny considering that I made this thread specifically because I was paranoid about saving lower-res images when higher-res versions were available without my even being aware of it.
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 06:28
by New Tobin Paradigm
The question now is: is the committed code anywhere near easy to implement in our codebase, or will Moonchild have a busy day ahead of him?
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 08:40
by Moonchild
Before I even look at this: can someone please explain to me why we need <picture> when we already have <img>, plus JS img.src=URL to change image sources, or even just CSS to do the same?
Give me a good, pressing reason why this is needed to begin with. (And no, "because my fav site wants to use it" is not a good reason.)
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 08:45
by __NM64__
This article covers the issue of "why not just use JavaScript?":
http://arstechnica.com/information-tech ... eb-faster/
When a server sends a page to your browser, the browser first downloads all the HTML on the page and then parses it. Or at least that's what used to happen. Modern Web browsers attempt to speed up page load times by downloading images before parsing the page's body. The browser starts downloading the image long before it knows where that image will be in the page layout or how big it will need to be.
This is simultaneously a very good thing—it means images load faster—and a very tricky thing. It means using JavaScript to manipulate images can actually slow down your page even when your JavaScript is trying to load smaller images (because you end up fighting the prefetcher and downloading two images).
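To illustrate the double-download problem the article describes (filenames and breakpoint invented for the example): the prefetcher starts fetching the small image the moment it sees the tag, and the script swaps the source afterwards, so large screens pay for both files.

```html
<!-- The parser/prefetcher starts fetching small.jpg as soon as it sees this tag. -->
<img id="hero" src="small.jpg" alt="Hero image">
<script>
  // By the time this runs, small.jpg is already downloading (or done);
  // swapping the source triggers a second request, so big screens
  // end up downloading both images.
  if (window.innerWidth > 1024) {
    document.getElementById('hero').src = 'large.jpg';
  }
</script>
```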
EDIT: Also relevant:
http://usecases.responsiveimages.org/
Limitations of current techniques
Reliance on semantically neutral elements and CSS backgrounds:
Large images incur unnecessary download and processing time, slowing the experience for users. To work around this problem, web developers specify multiple sources of the same image at different resolutions and then pick the image of the correct size based on the viewport size. As web developers lack the markup to achieve what they need, they end up relying on semantically neutral elements, CSS background images, and JavaScript libraries. In other words, developers are being forced to willfully violate the authoring requirements of HTML.
Bypass preload scanner
The reliance on semantically neutral elements (e.g., the div and span elements), instead of semantically meaningful elements such as img, prevents browsers from loading the image resources until after the DOM has (at least partially) loaded and scripts have run. This directly hinders the performance work browser engineers have done over the years to optimize resource loading (e.g., WebKit's HTMLPreloadScanner). Unnecessarily bypassing things like the preload scanner can have measurable performance impact when loading documents. See, for example, The WebKit PreloadScanner by Tony Gentilcore for a small study that demonstrates an up to 20% impact in load time when WebKit's PreloadScanner is disabled. More recent performance tests yield similar results. For more information, see How the Browser Pre-loader Makes Pages Load Faster by Andy Davies.
Reliance on scripts and server-side processing:
The techniques rely on either JavaScript or a server-side solution (or both), which adds complexity and redundant HTTP requests to the development process. Furthermore, the script-based solutions are unavailable to users who have turned off JavaScript.
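The "semantically neutral elements and CSS backgrounds" workaround the quoted text criticizes looks roughly like this (a sketch with invented filenames): the image URLs live in CSS, so the preload scanner cannot see them, and they are only fetched after the stylesheet has downloaded and style has been resolved.

```html
<!-- A semantically neutral <div> whose image is chosen via CSS
     background at each breakpoint. Unlike an <img> src, these URLs
     are invisible to the preload scanner. -->
<style>
  .hero { background-image: url(photo-small.jpg); }
  @media (min-width: 640px)  { .hero { background-image: url(photo-medium.jpg); } }
  @media (min-width: 1024px) { .hero { background-image: url(photo-large.jpg); } }
</style>
<div class="hero" role="img" aria-label="A photo"></div>
```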
Re: Support <picture> element + some things relating to it
Posted: 2014-09-16, 09:06
by Moonchild
Modern Web browsers attempt to speed up page load times by downloading images before parsing the page's body. The browser starts downloading the image long before it knows where that image will be in the page layout or how big it will need to be.
No, they don't. That's silly and not true.
How would a browser know what images are in a page before parsing the HTML? Clairvoyance?
The second article is also very opinionated and debatable. A properly designed website (especially a modern one) will decide in real time what to download by using dynamic HTML (either client-side or server-side). Considering reliance on JavaScript to be bad is strange - rendering modern web pages through scripted DOM structures is the main focus of most web browsers out there. Web developers do not lack the markup to achieve what they need for "responsive design", and saying otherwise is just not true either.
W3C - "Approving solutions for problems we don't have"
Posted: 2014-09-16, 09:10
by New Tobin Paradigm
Modern Web browsers attempt to speed up page load times by downloading images before parsing the page's body. The browser starts downloading the image long before it knows where that image will be in the page layout or how big it will need to be.
Code: Select all
[05:06:51.278] GET http://binaryoutcast.com/ [HTTP/1.1 200 OK 363ms]
[05:06:51.761] GET http://roboto-webfont.googlecode.com/svn/trunk/roboto.all.css [HTTP/1.1 200 OK 204ms]
[05:06:51.762] GET http://binaryoutcast.com/wp-content/themes/binoc-v24/style.css [HTTP/1.1 200 OK 28ms]
[05:06:51.763] GET http://code.jquery.com/jquery-1.7.1.js [HTTP/1.1 200 OK 141ms]
[05:06:51.763] GET http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js [HTTP/1.1 200 OK 285ms]
[05:06:51.764] GET http://platform.twitter.com/widgets.js [HTTP/1.1 200 OK 102ms]
[05:06:51.765] GET http://binaryoutcast.com/wp-content/plugins/multisite-global-search/style.css?ver=3.9.2 [HTTP/1.1 200 OK 29ms]
[05:06:51.765] GET http://binaryoutcast.com/wp-content/plugins/ckeditor-for-wordpress/ckeditor/ckeditor.js?t=CBDD&ver=3.9.2 [HTTP/1.1 200 OK 203ms]
[05:06:51.766] GET http://binaryoutcast.com/wp-includes/js/jquery/jquery.js?ver=1.11.0 [HTTP/1.1 200 OK 230ms]
[05:06:51.767] GET http://binaryoutcast.com/wp-includes/js/jquery/jquery-migrate.min.js?ver=1.2.1 [HTTP/1.1 200 OK 234ms]
[05:06:51.768] GET http://binaryoutcast.com/wp-content/plugins/ckeditor-for-wordpress/includes/ckeditor.utils.js?ver=3.9.2 [HTTP/1.1 200 OK 232ms]
[05:06:51.769] GET http://binaryoutcast.com/media/v24/img/division/ux/icons/icon-home.png [HTTP/1.1 200 OK 286ms]
[05:06:51.769] GET http://binaryoutcast.com/wp-content/uploads/2012/12/binoc-monitor.png [HTTP/1.1 200 OK 440ms]
[05:06:52.258] GET http://binaryoutcast.com/wp-content/themes/binoc-v24/media/obsolete/pre-clarity/img/design/v22/background.jpg [HTTP/1.1 200 OK 237ms]
[05:06:52.259] GET http://binaryoutcast.com/wp-content/themes/binoc-v24/media/v24/img/core/ux/header/logo.png [HTTP/1.1 200 OK 55ms]
[05:06:52.259] GET http://binaryoutcast.com/wp-content/themes/binoc-v24/media/v24/img/core/assets/xlucent-pixel-dark.png [HTTP/1.1 200 OK 55ms]
Seems to me images are the LAST thing that loads. Also, yes, some browsers like Chrome use prefetching, but that causes security and privacy issues and, as far as I know, doesn't cover images anyway...
Re: Support <picture> element + some things relating to it
Posted: 2014-09-17, 05:57
by zcorpan
The arstechnica article does a terrible job of describing what actually happens. It was never true that browsers first downloaded everything and then started parsing. It is also not really true that images are downloaded before the markup is parsed.
What used to happen is that images were downloaded as soon as the HTML parser created the <img> element (subject to download priorities, of course). That could well be before the browser had completely fetched the CSS and hence before it knew the layout.
What happens now is pretty much the same thing, except that if there is a <script src> that blocks the HTML parser from continuing (it has to wait in case the script uses document.write), browsers now continue parsing the rest of the page with a lightweight speculative tokenizer in order to find further URLs to start fetching while the blocking script is downloaded. This includes images, and it can happen before the browser knows the layout.
It was a requirement that responsive images not break this optimization. Hence <picture>. If it were not important, you could use JS instead.
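A concrete sketch of the scenario described above (the script and image URLs are placeholders):

```html
<!-- The main parser must stop here in case the script calls document.write. -->
<script src="app.js"></script>

<!-- While app.js downloads, the speculative tokenizer scans ahead and can
     already start fetching photo.jpg from this tag - before layout (or even
     the CSS) is known. A JS-driven image swap would forfeit this head start. -->
<img src="photo.jpg" alt="A photo">
```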
HTH
Re: Support <picture> element + some things relating to it
Posted: 2014-09-17, 11:34
by Moonchild
So, in other words, it's to counter bad web design. I don't see how that is going to be a priority.
Re: Support <picture> element + some things relating to it
Posted: 2014-09-18, 07:29
by zcorpan
Actually, even if you don't use any blocking <script src>s, including the critical resources in the markup is still a good idea, as it lets the browser properly prioritize its network resources.
Re: Support <picture> element + some things relating to it
Posted: 2014-09-18, 08:08
by Moonchild
If web designers would actually follow the few simple design rules there are, this wouldn't be a problem to begin with: e.g. put CSS in the <head>, don't use blocking JavaScript above the fold, load JS asynchronously when you can, and use a lazy-loading system on websites with very large amounts of content and lots of images. Simple rules, easy to follow, and no need for arbitrary elements. It's like this is an ongoing war between "camp scripting" and "camp markup", and of course the W3C is going to be solidly in "camp markup". We have advanced scripting, and modern web browsers that are exceedingly good at it, so why jump through hoops to avoid it?
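Those rules can be sketched in markup (filenames invented; async and defer are standard HTML5 script attributes):

```html
<head>
  <!-- CSS in the head, so layout rules are known before the body parses. -->
  <link rel="stylesheet" href="style.css">

  <!-- async: fetch in parallel, execute as soon as ready, never blocks parsing
       (good for independent scripts like analytics).
       defer: fetch in parallel, execute in document order after parsing finishes. -->
  <script async src="analytics.js"></script>
  <script defer src="app.js"></script>
</head>
```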