Talk about code development, features, specific bugs, enhancements, patches, and similar things.
-
ChrisCat
- Moongazer
- Posts: 8
- Joined: 2021-04-03, 05:01
by ChrisCat » 2024-03-07, 00:03
Massacre wrote: ↑2024-03-06, 23:14
ChrisCat wrote: ↑2024-03-06, 22:57
Not really. Since hitting the frequency ceiling and moving to multi-core processors, we've seen improvements with out of order execution, pipelining, speculative execution, etc.
This one ended with Spectre/Meltdown disaster.
It's still there. Yes, Spectre and Meltdown caused them to pull back, to be more careful and less aggressive with it, but speculative execution still happens. But this also goes to show you can't rely on the CPU making your code fast like you used to; continued improvement relies on the application putting in some legwork of its own to better utilize the processor's capabilities.
Massacre wrote: ↑2024-03-06, 23:14
The improvements are indeed present, but they are rather minor. Increasing number of cores (and software threads using them) gives much more performance.
And using more threads hurts performance on CPUs with fewer cores. Whether it's one thread or three per CPU core, it will do the same amount of work in a given amount of time, and those context switches don't come free.
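As a rough illustration of that sizing tradeoff, here is a small C++ sketch (not Pale Moon code; the helper name is invented for this example) that caps a worker-pool size at the hardware thread count so low-core machines aren't oversubscribed:

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <thread>

// Hypothetical helper (not from any real codebase): pick a worker count
// that never exceeds the hardware thread count or the number of work
// items, since oversubscribing cores only adds context-switch overhead.
unsigned pick_thread_count(std::size_t work_items) {
    unsigned hw = std::thread::hardware_concurrency(); // may report 0 if unknown
    if (hw == 0) hw = 1;
    if (work_items == 0) return 1;
    return static_cast<unsigned>(std::min<std::size_t>(hw, work_items));
}
```

On a dual-core box this hands out two workers no matter how much work is queued, which is the point being made above: three threads per core do the same work as one, minus the switching cost.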
Massacre wrote: ↑2024-03-06, 23:14
Most of them are distributed computing (it's not much different from mining) or video codecs.
And cryptography, JSON parsing, zlib compression, JPEG processing, and other algorithms. Video also isn't a niche part of the modern web.
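For illustration of how such hot loops can use AVX without dropping older hardware, here is a minimal C++ sketch of runtime feature detection with a scalar fallback (assuming GCC or Clang on x86; this is not Pale Moon's actual dispatch code, and the function names are made up):

```cpp
#include <cassert>
#include <cstddef>

// Scalar baseline that runs on any CPU.
static long sum_scalar(const int* v, std::size_t n) {
    long s = 0;
    for (std::size_t i = 0; i < n; ++i) s += v[i];
    return s;
}

// Hypothetical dispatcher: take the fast path only when the running CPU
// reports AVX support, so the same binary still works on pre-AVX hardware.
long sum_dispatch(const int* v, std::size_t n) {
#if defined(__x86_64__) || defined(__i386__)
    __builtin_cpu_init();
    if (__builtin_cpu_supports("avx")) {
        // A real build would call a version compiled with -mavx here;
        // we reuse the scalar loop to keep the sketch self-contained.
        return sum_scalar(v, n);
    }
#endif
    return sum_scalar(v, n);
}
```

This is the usual alternative to raising the compile-time baseline: one binary, with the AVX path gated behind a CPUID check.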
Massacre wrote: ↑2024-03-06, 23:14
I doubt there will be substantional performance improvement until that JS engine will be rewritten from scratch.
At least with what's been said so far, AVX does offer "considerable performance enhancement" and "greatly increase browsing speed" right now.
-
andyprough
- Keeps coming back
- Posts: 752
- Joined: 2020-05-31, 04:33
by andyprough » 2024-03-07, 00:05
ChrisCat wrote: ↑2024-03-06, 22:57
Chrome and Firefox may not require AVX, but they also use different tactics for their performance, such as loading up on threads and using more memory.
Interesting idea. I wonder what the performance of Pale Moon would look like if it were allowed to be a memory hog like Chrome or Firefox. Let it use 3X the memory and do all kinds of pre-fetching and so forth. You could have a Chrome killer, it would be fun to see.
I mean, we've entered a world where no one is really able to use a modern OS without a minimum of 16GB of memory anyway, and 32GB+ will be the next new normal. Why not let it rip and see what happens? Could be loads of fun.
-
Moonchild
- Pale Moon guru
- Posts: 35650
- Joined: 2011-08-28, 17:27
- Location: Motala, SE
by Moonchild » 2024-03-07, 00:12
Massacre wrote: ↑2024-03-06, 23:46
it's just not worth losing backward compatibility by gaining minor performance.
If it was truly minor, I wouldn't have started this discussion.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite
-
Massacre
- Moon lover
- Posts: 95
- Joined: 2020-05-01, 13:16
by Massacre » 2024-03-07, 00:17
ChrisCat wrote: ↑2024-03-07, 00:03
Massacre wrote: ↑2024-03-06, 23:14
I doubt there will be substantional performance improvement until that JS engine will be rewritten from scratch.
At least with what's been said so far, AVX does offer "considerable performance enhancement" and "greatly increase browsing speed" right now.
In a heavy software media decoding scenario, maybe. Will it speed up the JS of "modern web applications" enough to be on par with the Chromium they specifically target? I doubt it. Will there be much difference when rendering non-media, non-JS-heavy sites (the typical scenario for using Pale Moon right now, if you don't want to experience heavy interface lag)? I doubt that, too.
-
Massacre
- Moon lover
- Posts: 95
- Joined: 2020-05-01, 13:16
by Massacre » 2024-03-07, 00:21
Moonchild wrote: ↑2024-03-07, 00:12
Massacre wrote: ↑2024-03-06, 23:46
it's just not worth losing backward compatibility by gaining minor performance.
If it was truly minor, I wouldn't have started this discussion.
I mean, it's minor when comparing performance of problematic sites against Chromium.
-
athenian200
- Contributing developer
- Posts: 1537
- Joined: 2018-10-28, 19:56
- Location: Georgia
by athenian200 » 2024-03-07, 00:43
Massacre wrote: ↑2024-03-07, 00:21
I mean, it's minor when comparing performance of problematic sites against Chromium.
If the user expectation is that we have to keep up with Chromium, then we might as well stop work on Pale Moon now, because that's not realistic. Even Firefox can't do that. And they basically abandoned all their principles trying.
"The Athenians, however, represent the unity of these opposites; in them, mind or spirit has emerged from the Theban subjectivity without losing itself in the Spartan objectivity of ethical life. With the Athenians, the rights of the State and of the individual found as perfect a union as was possible at all at the level of the Greek spirit." -- Hegel's philosophy of Mind
-
suzyne
- Lunatic
- Posts: 364
- Joined: 2023-06-28, 22:43
- Location: Australia
by suzyne » 2024-03-07, 01:11
Massacre wrote: ↑2024-03-07, 00:17
Will be there much difference when rendering non-media / JS heavy sites
Conversely, it can be suggested that for users who don't use such sites, the memory space and speed of the 32-bit version of Pale Moon should be more than adequate. They don't need the 64-bit version anyway, so the direction it takes is of little consequence for them?
Laptop 1: Windows 10 64-bit, i7 @ 2.80GHz, 16GB, NVIDIA GeForce MX450.
Laptop 2: Windows 10 32-bit, Atom Z3735F @ 1.33GHz, 2GB, Intel HD Graphics.
-
Massacre
- Moon lover
- Posts: 95
- Joined: 2020-05-01, 13:16
by Massacre » 2024-03-07, 01:13
athenian200 wrote: ↑2024-03-07, 00:43
If the user expectation is that we have to keep up with Chromium, then we might as well stop work on Pale Moon now, because that's not realistic. Even Firefox can't do that. And they basically abandoned all their principles trying.
Well, you can keep your niche, a lightweight browser designed for multiple non-JS demanding tabs...
-
Massacre
- Moon lover
- Posts: 95
- Joined: 2020-05-01, 13:16
by Massacre » 2024-03-07, 01:15
suzyne wrote: ↑2024-03-07, 01:11
Conversely, it can be suggested that for users who don't use such sites, the memory space and speed of the 32-bit version of Pale Moon should be more than adequate. They don't need the 64-bit version anyway, so the direction it takes is of little consequence for them?
As I mentioned before, platforms they are planning to drop do actually support at least 16 GB RAM.
-
R3n_001
- Moonbather
- Posts: 66
- Joined: 2019-05-25, 20:39
by R3n_001 » 2024-03-07, 01:26
Massacre wrote: ↑2024-03-07, 01:15
As I mentioned before, platforms they are planning to drop do actually support at least 16 GB RAM.
LGA1366 can do 48GB, perhaps even 96GB. Pre-AVX.
If that doesn't count, I have two Phenoms that can do 32GB.
Some DDR3 Core 2 boards might be able to do 32GB too.
-
suzyne
- Lunatic
- Posts: 364
- Joined: 2023-06-28, 22:43
- Location: Australia
by suzyne » 2024-03-07, 01:39
Massacre wrote: ↑2024-03-07, 01:15
suzyne wrote: ↑2024-03-07, 01:11
Conversely, it can be suggested that for users who don't use such sites, the memory space and speed of the 32-bit version of Pale Moon should be more than adequate. They don't need the 64-bit version anyway, so the direction it takes is of little consequence for them?
As I mentioned before, platforms they are planning to drop do actually support at least 16 GB RAM.
Yes, and that memory can be put to good purposes by other apps.
Do Pale Moon users who don't use the sites that are packed with JavaScript, and whatnot, actually consume that sort of amount (16GB) of memory for browsing? In my experience 2~3GB is the max that Pale Moon generally requires. But then I don't relate to people who insist on keeping literally hundreds of tabs open and treat open tabs like bookmarks.
-
Massacre
- Moon lover
- Posts: 95
- Joined: 2020-05-01, 13:16
by Massacre » 2024-03-07, 01:55
suzyne wrote: ↑2024-03-07, 01:39
Yes, and that memory can be put to good purposes by other apps.
Do Pale Moon users who don't use the sites that are packed with JavaScript, and whatnot, actually consume that sort of amount (16GB) of memory for browsing? In my experience 2~3GB is the max that Pale Moon generally requires. But then I don't relate to people who insist on keeping literally hundreds of tabs open and treat open tabs like bookmarks.
No, but ~4GB is a realistic scenario, especially in the latest versions of Pale Moon, and the 32-bit version tends to lag horribly after it allocates 1.5GB or so.
Well, there is a "workaround": restart Pale Moon to actually reclaim the memory lost to leaks from already-closed pages... I believe it is related to the "JS runtime".
-
suzyne
- Lunatic
- Posts: 364
- Joined: 2023-06-28, 22:43
- Location: Australia
by suzyne » 2024-03-07, 02:18
Massacre wrote: ↑2024-03-07, 01:55
~4GB is a realistic scenario
I don't want to be one of the "What! There are people who use the internet in ways different from me?" type of forum user, but I would love to know the use case for going over 3GB, because I never have in Pale Moon.
Sure, in other mainstream browsers it might be easier (or when there is a memory leak), but I just haven't gone there in Pale Moon.
-
moonbat
- Knows the dark side
- Posts: 4984
- Joined: 2015-12-09, 15:45
by moonbat » 2024-03-07, 06:49
suzyne wrote: ↑2024-03-07, 02:18
I would love to know the use case for going over 3GB, because I never have in Pale Moon.
Haven't you heard of the new trend of having an open tab count in the triple or quadruple digits, because somehow one hasn't heard of bookmarks and needs to have all of them open all the time - and then comes crying after the inevitable browser crash?
-
Piotr Kostrzewski
- Lunatic
- Posts: 280
- Joined: 2018-08-14, 15:08
by Piotr Kostrzewski » 2024-03-07, 07:45
moonbat wrote: ↑2024-03-07, 06:49
suzyne wrote: ↑2024-03-07, 02:18
I would love to know the use case for going over 3GB, because I never have in Pale Moon.
Haven't you heard of the new trend of having an open tab count in the triple or quadruple digits, because somehow one hasn't heard of bookmarks and needs to have all of them open all the time - and then comes crying after the inevitable browser crash?
If you want to know, I exceeded 3 GB usage in Pale Moon. Not intentionally, and only with one page: https://to-do.office.com/tasks/ (after logging in). I was surprised by this. Does anyone else experience this kind of memory consumption on this site?
-
moonbat
- Knows the dark side
- Posts: 4984
- Joined: 2015-12-09, 15:45
by moonbat » 2024-03-07, 08:04
Off-topic:
Was it after leaving the page open for some time, or almost as soon as you loaded it? I'm guessing the former, and usually that's a memory leak on the page that grows memory usage over time.
-
Piotr Kostrzewski
- Lunatic
- Posts: 280
- Joined: 2018-08-14, 15:08
by Piotr Kostrzewski » 2024-03-07, 08:11
moonbat wrote: ↑2024-03-07, 08:04
Was it after leaving the page open for some time, or almost as soon as you loaded it? I'm guessing the former, and usually that's a memory leak on the page that grows memory usage over time.
Almost as soon as the page loads.
And there were an awful lot of errors in the console.
-
Massacre
- Moon lover
- Posts: 95
- Joined: 2020-05-01, 13:16
by Massacre » 2024-03-07, 09:33
suzyne wrote: ↑2024-03-07, 02:18
Massacre wrote: ↑2024-03-07, 01:55
~4GB is a realistic scenario
I don't want to be one of the "What! There are people who use the internet in ways different from me?" type of forum user, but I would love to know the use case for going over 3GB, because I never have in Pale Moon.
It depends on what and how many sites you are using during the same session. Unfortunately, if a site uses some JavaScript that allocates memory, it may never get completely freed.
-
R3n_001
- Moonbather
- Posts: 66
- Joined: 2019-05-25, 20:39
by R3n_001 » 2024-03-07, 10:37
suzyne wrote: ↑2024-03-07, 02:18
Sure, in other mainstream browsers it might be easier (or when there is a memory leak), but I just haven't gone there in Pale Moon.
I can say my YouTube browser can reach over 5GB just on YouTube. I'd assume a memory leak in the code, because I leave my computer on 24/7.
-
ptribble
- Moongazer
- Posts: 8
- Joined: 2021-03-14, 12:10
by ptribble » 2024-03-07, 18:40
I presume that keeping to an SSE2 baseline wouldn't invalidate status as an official build?
For Tribblix, I would do just that. I know that I have users on relatively old, but still fairly capable, hardware. And in general I would try and avoid being in the situation where the OS itself would work happily on a wide range of hardware (we require 64-bit, nothing more specific than that) but a key application wouldn't.
In terms of Linux distributions, my recollection is that Ubuntu is using x86-64-v1 as a baseline, RHEL and SLES are currently x86-64-v2, with RHEL 10 mooted as x86-64-v3. So adding an AVX requirement (which means x86-64-v3) means you're shipping something with more stringent requirements than the OS underneath. (And, in terms of microarchitecture levels, AVX and AVX2 come together.) That's my perspective, but as I'm not a Linux user I have no skin in the game at all.
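Since a microarchitecture level bundles several features together, a gate for one has to probe them as a group. A hedged C++ sketch (GCC/Clang builtins, x86 only; the function name is invented here, and only a subset of the required flags is tested) of an approximate x86-64-v3 check:

```cpp
#include <cassert>

// Rough x86-64-v3 probe (illustrative, not an official check): v3 adds
// AVX, AVX2, BMI1/BMI2, FMA (plus MOVBE, F16C, LZCNT) on top of v2.
// Only some of the required flags are tested in this sketch.
bool meets_x86_64_v3() {
#if defined(__x86_64__) || defined(__i386__)
    __builtin_cpu_init();
    return __builtin_cpu_supports("avx")
        && __builtin_cpu_supports("avx2")
        && __builtin_cpu_supports("bmi")
        && __builtin_cpu_supports("bmi2")
        && __builtin_cpu_supports("fma");
#else
    return false;  // non-x86 targets never meet an x86-64 level
#endif
}
```

On a glibc 2.33+ system the dynamic loader reports the same information; this sketch just shows what "requiring AVX effectively means requiring v3's whole feature group" looks like in code.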