How true/false are these allegations?

PaleExplorer29

How true/false are these allegations?

Post by PaleExplorer29 » 2019-06-09, 08:16

I found this Reddit comment, which to me at least looks like a very in-depth analysis of Pale Moon:

https://old.reddit.com/r/privacytoolsIO ... e/ed0hqwi/

How true/false are these allegations? :eh:


Moonchild
Pale Moon guru

Re: How true/false are these allegations?

Post by Moonchild » 2019-06-09, 13:06

For convenience I'm quoting the text here, posted by a since-deleted Reddit account.

Take this as "Rumor Control, part II" or something. I will only post this once; feel free to refer people to this post.

It basically boils down to the same old mantra that we're supposedly "old and insecure". It gets really tiresome to have to respond to that and its various levels of incorrect statements, conclusions and accusations, all trying to press the point that we're somehow not worthy of existing.
PM uses an out-dated code-base (38 ESR) that has been well retired for a while now. Their Goanna engine is a fork of an old version of Gecko with bug and security fixes taken from later versions of Gecko. Sooner or later they will start falling behind in features and security and then they fork Gecko again (they already did it 3 times so far!) leading they won't be able to take anything from the upstream. Also Firefox is moving further away from what Pale Moon developers wants so PM can't constantly keep up with a rapidly changing web. Because of this PM won’t have the new web features, doesn't offer real world performance changes and performance improvements of modern web browsers because of supporting a dying infrastructure and basing a browser on old code makes security patches harder; PM even doesn't have an sandbox feature (just like Basilisk browser) so it isn't very secure browser and shouldn't be used at all. And developers seems to be rather cocky about security by claiming to have fixed Meltdown / Spectre in 2016.

And there aren't really enough people behind the Pale Moon project, to keep up with Goanna engine.Therefore, Pale Moon has a compatibility issues with many websites, playback issues, weird bugs, missing many basic components etc. This also leads to decreased performance (without modern improvements) but developers disagrees with browser benchmarking, but it’s not surprising a browser based on four year old code might be slower than a modern one. And Pale Moon has a history of blocking multiple ads-ons like AdNauseam, ABP and NoScript. Excuses were ridiculous like AdNausean is malware etc. This also goes against PM's motto; "Your browser, Your way"'. It's clear that the developers can't professionally deal with this fork.
PM uses an out-dated code-base (38 ESR) that has been well retired for a while now.
False in many respects.
Pale Moon uses a hybrid code base that is its own. It does not use any mozilla-central code base verbatim, and different components have been forked at different points in time to be able to use the best source available for our purposes. Note: forked. Since the fork points, our own development has rapidly made the code in use unlike its state at the fork point, and you cannot compare a fork with a rebuild (which simply uses the code with no or minimal changes) that way. If you do, then you should also be saying that Firefox is an old Netscape Navigator, because at one point in time they forked their code from it, and many parts of it still bear the Netscape markers.
Their Goanna engine is a fork of an old version of Gecko with bug and security fixes taken from later versions of Gecko.
True, but oversimplified, and incomplete in suggesting that only minimal work has been done.
Goanna indeed forked off from Gecko as a web rendering and layout component, but once again, forked != used verbatim. We did take security fixes (either ported or rewritten) from upstream for Goanna, because why invent the wheel twice? But aside from that we have our own development, and Goanna most certainly does not behave the same way as Gecko because of it.
Sooner or later they will start falling behind in features and security and then they fork Gecko again (they already did it 3 times so far!)
False.
This is a forward-looking statement about situations the poster knows nothing about.
We've rebased many more times in the past, when we had not yet drifted so far from Mozilla that a hard fork was necessary, but the later re-forking has always been more about the way Mozilla has handled their code development (with unnecessary and incessant refactoring and rearchitecturing instead of continuous development), making parts of the browser development that DO need extremely specialized knowledge impossible to port across. Since we've not been building a new browser engine from scratch, we've been relying on this Mozilla development for a workable base at the fork point, and on further specialized development at least being somewhat portable. The last time we did this kind of rebasing (in 2017) was only necessitated by the SpiderMonkey development required for ECMAScript. The way SpiderMonkey is entwined with the DOM, and in turn with layout and rendering, made it impossible to disentangle the components as needed for a JS engine update without taking on the rest. I've discussed this with various Mozilla employees and they agreed it was impossible to do. This meant a last and final fork, which gave rise to UXP, since it would be the last bastion of XUL development before Mozilla would use the sledgehammer on it. Of note: Tycho (v27) was already extremely close to achieving this goal, and it was a difficult decision to take on double development work because of Mozilla's rearchitecturing and, of course, their refusal to assist what is seen as a competitor.
Because of this entanglement, stating that rebasing was done "because we fell behind in features" is incorrect, as is the resulting conclusion that "we can't keep up": we HAVE been keeping up with developments on the web, as Pale Moon's high level of web compatibility shows, and we've been responsive to any reports of web compatibility issues. As far as ECMAScript goes, we have very complete and ongoing support for what is cooked up in that corner: 100% ES6, ES2016 and ES2017 feature support, with more implementation slated for later JavaScript drafts and proposals, even if most of that is syntactic sugar, because JS has already been squeezed dry trying to do everything for everything with everything, using multiple programming paradigms.
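To make concrete what that ES2016/ES2017 feature support covers, here is a minimal JavaScript sketch (illustrative only, not code from Pale Moon or its test suite) that uses nothing beyond those additions and runs unchanged in any engine implementing them:

    // ES2016: exponentiation operator and Array.prototype.includes
    const squares = [1, 4, 9, 16].map(n => n ** 2);
    console.log(squares.includes(256));                // true

    // ES2017: async/await, Object.entries and string padding
    async function report(obj) {
      await Promise.resolve();                         // async/await syntax
      for (const [key, value] of Object.entries(obj)) {
        console.log(key.padEnd(10), value);
      }
    }
    report({ engine: "Goanna", es2017: "supported" });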
Because of this PM won’t have the new web features, doesn't offer real world performance changes and performance improvements of modern web browsers because of supporting a dying infrastructure
False.
What "dying infrastructure" is this person talking about? Are they saying the web is dying?
We have web features that are actually in use. If they are talking about "the latest draft proposals" then understand that it usually goes like this: Google Chrome comes up with some new 'feature' that web designers didn't ask for but that will be easy to implement in their code or look good on paper. Mozilla apes Chrome because they are a Google lapdog by now and have this phobia of becoming "insignificant on the web if not keeping browser parity". This makes for the two parties needed to have WhatWG adopt the (unused and unwanted) feature into whatever "living standard" it would be part of. Then we have a few experimental sites that use this draft/proposal, which is then used by (mostly Firefox) fanbois to paint us black for not supporting draft X or Y. All over something that the web at large doesn't even use.
We do offer real-world performance changes and improvements too -- more so than Firefox. Benchmarks are not real-world.
makes security patches harder
Security patches in general are not very difficult. Sure, it's harder than "patch -p1 -i {path/to/patchfile}" because it involves actually porting code... But I've personally been porting sec patches across for over 8 years for this browser and no, it generally doesn't take more than 1 day in a Mozilla rabid release cycle to audit, port and implement what is applicable.
PM even doesn't have an sandbox feature (just like Basilisk browser) so it isn't very secure browser and shouldn't be used at all.
False, False and False.
Pale Moon doesn't have a multi-process sandbox container because Pale Moon (and Basilisk) are not using a multi-process setup. Instead, our "sandboxing" is internal, strictly separating the context in which untrusted content is loaded and scripts are executed. In fact, using IPC and e10s has given rise to a hell of a lot more security vulnerabilities by explicitly relying on a fragile inter-process messaging system and on a separate process not being able to escape its context (and the sandboxing container has been shown time and again to be insufficient at containing untrusted code/scripting). There's a big fallacy in e10s "security" concepts: a separate process may, in itself, be running at a lower integrity level in the operating system, but if you entwine its functioning with a generally administrator-elevated process, then that link becomes the channel through which exploits get system-level access.
Complaining about security because we removed the e10s-specific sandbox container when we're not using e10s (dead code cleanup) is a terribly uninformed statement.
And developers seems to be rather cocky about security by claiming to have fixed Meltdown / Spectre in 2016.
But we did.
We pre-empted the side-channel exploits back in 2016 by making it impossible to use the high-precision timers that are not necessary for normal web operation. The risk of exploiting very high resolution timers was clear, and it was mitigated before the world at large found a practical way to exploit it. That doesn't make us "cocky"; it is stating a simple fact.
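For those wondering what that kind of mitigation amounts to in practice, here is a hedged JavaScript sketch for browser page script (purely illustrative: the real clamping happens inside the engine, not in page code, and the 1 ms granularity is an assumption for demonstration, not Pale Moon's actual value) showing why a coarse clock starves timing side-channels of their signal:

    // Illustrative only: simulate a clamped high-resolution timer in page script.
    function coarseNow(granularityMs = 1) {
      return Math.floor(performance.now() / granularityMs) * granularityMs;
    }

    // A Spectre/Meltdown-style probe must tell a cached memory access apart from
    // an uncached one -- a difference of nanoseconds. With a coarse clock, both
    // measurements collapse onto the same tick and the signal disappears.
    const t0 = coarseNow();
    for (let i = 0; i < 1000; i++) {}  // stand-in for the timed micro-operation
    const t1 = coarseNow();
    console.log(t1 - t0);              // almost always 0 with a 1 ms clock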

As for the rest of the rant: it's based on wrong conclusions stemming from the points addressed here, and on things that have already been extensively discussed (like blocking extensions for clear and good reasons -- we leverage our blocklist only when necessary, AND have gone to great lengths to give the user more control from the preferences over what is blocked, something Firefox doesn't offer). I won't repeat those discussions here; please do a forum search.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite
