Compilation time doubled after de-unifying the sources

Discussions about the development and maturation of the platform code (UXP).
Warning: may contain highly-technical topics.

Moderators: trava90, athenian200

JustOff

Compilation time doubled after de-unifying the sources

Unread post by JustOff » 2020-04-28, 17:24

Is it expected that Pale Moon compilation time has almost doubled after de-unifying the sources?

Moonchild
Pale Moon guru
Posts: 35602
Joined: 2011-08-28, 17:27
Location: Motala, SE

Re: Compilation time doubled after de-unifying the sources

Unread post by Moonchild » 2020-04-28, 18:52

That was only de-unifying /dom -- more will follow.

And yes, if you aren't on a particularly powerful machine with a fast drive, it can impact your compilation time significantly.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite

Isengrim
Board Warrior
Posts: 1325
Joined: 2015-09-08, 22:54
Location: 127.0.0.1

Re: Compilation time doubled after de-unifying the sources

Unread post by Isengrim » 2020-04-28, 19:23

Just for reference, I tried building Ambassador on my new machine. With the platform at the last stable release point it takes about 17 minutes; with trunk, about 24. Certainly a noticeable increase in time, but definitely not double.
a.k.a. Ascrod
Linux Mint 19.3 Cinnamon (64-bit), Debian Bullseye (64-bit), Windows 7 (64-bit)
"As long as there is someone who will appreciate the work involved in the creation, the effort is time well spent." ~ Tetsuzou Kamadani, Cave Story

Nightbird
Lunatic
Posts: 279
Joined: 2016-07-18, 21:12

Re: Compilation time doubled after de-unifying the sources

Unread post by Nightbird » 2020-04-28, 20:58

It depends on your hardware.

Things will probably be more "difficult" with an "old" processor.
I'm pretty sure that you will now need a "modern" machine with a multi-real-core CPU (a Ryzen?) and probably a fast disk (an SSD, maybe).
Diversity is key.

Those who forget the past are doomed to repeat it.

New Tobin Paradigm

Re: Compilation time doubled after de-unifying the sources

Unread post by New Tobin Paradigm » 2020-04-29, 00:23

The question comes down to this: do you want it to be harder to develop for but build faster, or easier to develop for but take longer to compile?

Dealing with, fighting, and resolving issues caused by the shifting concatenation and deprot of UNIFIED_SOURCES basically negates any time saved by compiling faster.

But you wouldn't know that if you haven't ventured into trouble spots like dom/ or layout/ yourself, or overseen those hitting them (I have done both, btw). We let it go for as long as we could, but at this critical point in development its so-called benefits are severely hindering continued development.

athenian200
Contributing developer
Posts: 1535
Joined: 2018-10-28, 19:56
Location: Georgia

Re: Compilation time doubled after de-unifying the sources

Unread post by athenian200 » 2020-04-29, 01:28

I understand your frustration. I was using a machine with a mechanical hard drive, and the Solaris linker seems to hit the disk really hard with multiple smaller object files. So for me, the time it took to compile UXP went up from about 30 minutes to two hours. That's a fourfold increase.

However, the fact of the matter is that this is 2020 and we should all be using SSDs for development anyway, and ideally have at least 8-core/16-thread CPUs. This de-unifying of sources is needed to make development easier because the unified building system can be problematic to fix when it fails in unexpected ways, and having a lot of sources in a single object file can make some aspects of debugging significantly harder. I am still completely in favor of it, and am trying to find ways to compensate until I can get a new PC this Christmas.

One of the big things that has helped me is a program called distcc. https://distcc.github.io/

It's not actually a C compiler itself; it's a frontend to GCC that allows you to distribute compilation tasks across multiple machines on a network, as a form of distributed computing. An example of how this can be useful: I have a Surface Book with a two-core Skylake CPU and an NVMe SSD. It has a fast drive and can link quickly, but has so few cores that it doesn't usually compile faster than my older 3770K and 2600K desktops. What distcc essentially lets me do is perform the final linking on the Skylake laptop with the NVMe SSD while taking advantage of the extra cores in the desktop machines to churn through all the files. This brings my compile times back down significantly.

Another thing worth noting if you're strapped for cash is that you don't actually need the latest and greatest hardware to build UXP. What you need is a reasonably fast SSD and several CPU cores. It's possible to pick up older server hardware that has several cores for a reasonable price. There are a lot of solutions that don't involve going out and spending $1000 on a new machine, so the expectation of finding a bit more processing power or upgrading an existing development machine to an SSD isn't all that unreasonable. Software development often requires fairly powerful machines.

And if nothing else, there's the old-fashioned method of leaving it to build overnight, planning your builds ahead of time, and simply being patient while saving up for a new machine. The fact of the matter is that as projects move forward and become more complex, build times go up, because projects naturally take advantage of new hardware like SSDs and 8-core CPUs. They start to do various things that make development easier at the expense of total build time on older hardware. That's just technology marching on.
"The Athenians, however, represent the unity of these opposites; in them, mind or spirit has emerged from the Theban subjectivity without losing itself in the Spartan objectivity of ethical life. With the Athenians, the rights of the State and of the individual found as perfect a union as was possible at all at the level of the Greek spirit." -- Hegel's philosophy of Mind

Isengrim
Board Warrior
Posts: 1325
Joined: 2015-09-08, 22:54
Location: 127.0.0.1

Re: Compilation time doubled after de-unifying the sources

Unread post by Isengrim » 2020-04-29, 02:59

New Tobin Paradigm wrote:
2020-04-29, 00:23
The question comes down to this: do you want it to be harder to develop for but build faster, or easier to develop for but take longer to compile?
I'm not against de-unifying sources by any means as long as the trade-off is worth it, which I assume it is. :)
athenian200 wrote:
2020-04-29, 01:28
However, the fact of the matter is that this is 2020 and we should all be using SSDs for development anyway, and ideally have at least 8-core/16-thread CPUs.
Off-topic:
I should tell you about the time I built Pale Moon and Basilisk on a dual-core Thinkpad from 2012...
a.k.a. Ascrod
Linux Mint 19.3 Cinnamon (64-bit), Debian Bullseye (64-bit), Windows 7 (64-bit)
"As long as there is someone who will appreciate the work involved in the creation, the effort is time well spent." ~ Tetsuzou Kamadani, Cave Story

Moonchild
Pale Moon guru
Posts: 35602
Joined: 2011-08-28, 17:27
Location: Motala, SE

Re: Compilation time doubled after de-unifying the sources

Unread post by Moonchild » 2020-04-29, 08:15

Isengrim wrote:
2020-04-29, 02:59
as long as the trade-off is worth it, which I assume it is. :)
You assume correctly.

An example of something you can run into when using unified sources that have deprot (dependency rotting, the more toxic brother of bitrotting): you make some large (or sometimes small!) additions or removals to a part of the source tree, and when you compile, compilation or linking suddenly fails in completely unrelated code you haven't touched. Scratch-your-head time, right?
What has happened? The addition or removal changed the number or ordering of source files in such a way that they are grouped differently into the chunks compiled in unified mode, and files that need to be grouped together because of deprot (because one file needs, e.g., a header included by another file in the same chunk, or a namespace declared as used there, or whatnot) end up split across different chunks. Result: build or link failure.
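
To make that concrete, here is a minimal sketch of the failure mode, with hypothetical file names (not actual UXP sources). A unified chunk is essentially a generated file that #includes consecutive sources in listing order:

Code: Select all

// Unified_cpp_example0.cpp -- the generated chunk
#include "FileA.cpp"
#include "FileB.cpp"

// --- FileA.cpp ---
#include <vector>              // FileA legitimately includes <vector>
namespace mozilla { struct Task {}; }
using namespace mozilla;       // opens the namespace for everything below

// --- FileB.cpp ---
// FileB has no #include <vector> and no using-directive of its own,
// yet it compiles -- but only because FileA precedes it in the chunk.
static std::vector<Task> sPendingTasks;

Add or remove one file earlier in the list and FileB can land in a different chunk without FileA; that last line then fails to compile in code nobody touched.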

Other reasons are actually outlined in the OP in Issue #80 (UXP).

How did deprot come to be? Years and years of neglect of the unified code by Mozilla, coding in a way that assumed "everything is available unless it complains it isn't". They themselves regularly run into deprot too now, but only fix it "on-the-go" and "as-needed" in the bugs that shift code around and cause build failures.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite

New Tobin Paradigm

Re: Compilation time doubled after de-unifying the sources

Unread post by New Tobin Paradigm » 2020-04-29, 09:13

The issues above are also multiplied by the number of features that can be enabled or disabled, by the number of possible target operating systems, and by the number of applications that can be built.

Mozilla's "everything is always everywhere, plus everything is always built" approach is NOT gonna fly for any of us.

This is not solely "the Pale Moon codebase" (as an analogue of just "the Firefox codebase"); the world does not revolve around JustOff, and his problems are not automatically our emergency.

And that is all I have to say about that.

adesh
Board Warrior
Posts: 1277
Joined: 2017-06-06, 07:38

Re: Compilation time doubled after de-unifying the sources

Unread post by adesh » 2020-04-29, 09:44

I didn't notice any difference in build times, but that's because I generally don't care about it; it always takes "long enough" for me.

However, I noticed that Basilisk's object directory now sits at 5.8 GB, while earlier it was around 3.7 GB.
athenian200 wrote:
2020-04-29, 01:28
multiple smaller object files
The size difference may be due to having many small object files now instead of bigger unified ones, considering that some of the new files may be smaller than the 4K block size on disk. As a result, small files take up more space than their contents: a 1 KB object file, for example, still occupies a full 4 KB block.

Just sharing my observations.

Moonchild
Pale Moon guru
Posts: 35602
Joined: 2011-08-28, 17:27
Location: Motala, SE

Re: Compilation time doubled after de-unifying the sources

Unread post by Moonchild » 2020-04-29, 10:06

Object files from non-unified building will be smaller, but there will be more of them, each containing not only the object code of the compiled file but also its dependent code; in total this takes more space. But seriously, if you don't even have a few GB free on your disk, what are you doing building on that drive anyway? :)
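
For illustration, a small hypothetical sketch (ordinary C++ translation-unit behaviour, not actual UXP code) of where that dependent code comes from:

Code: Select all

// helper.h -- a template used by many source files
#pragma once
#include <cstdio>
template <typename T>
void Dump(const T& aValue) {
  std::printf("%d\n", static_cast<int>(aValue));
}

// a.cpp
#include "helper.h"
void A() { Dump(1); }  // a.o carries its own copy of Dump<int>

// b.cpp
#include "helper.h"
void B() { Dump(2); }  // b.o carries another copy of Dump<int>

Compiled unified, a.cpp and b.cpp would share one translation unit and one emitted copy of Dump<int>; compiled separately, each object file carries a duplicate until the linker folds them, so the object directory grows even though the final binary doesn't.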
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite

JustOff

Re: Compilation time doubled after de-unifying the sources

Unread post by JustOff » 2020-04-29, 19:24

Moonchild wrote:
2020-04-28, 18:52
yes, if you aren't on a particularly powerful machine with a fast drive, it can impact your compilation time significantly.
Well, thanks. That's sad, but I'll try to get used to it.
New Tobin Paradigm wrote:
2020-04-29, 09:13
the world does not revolve around JustOff, and his problems are not automatically our emergency.
I had no doubt you wouldn't miss the chance to bite. I hope you feel better now.

Fedor2

Re: Compilation time doubled after de-unifying the sources

Unread post by Fedor2 » 2020-05-20, 05:45

Not so much here: my machine has a 2006-era CPU (a 4-core Xeon) and 16 GB of RAM, and I compile fully on a ramdrive. The object directory did get plumper, though; I had to extend the ramdrive from 6 to 8 GB.

Locked