HN Gopher Feed (2017-11-28) - page 1 of 10 ___________________________________________________________________
How Stylo Brought Rust and Servo to Firefox
309 points by mnemonik
http://bholley.net/blog/2017/stylo.html
___________________________________________________________________
VeejayRampay - 1 hour ago
It might sound very naive to say this, but I found it very cool
that it was someone from the States, an Aussie, and a Spaniard
working on this. Open source is something magical when you think
about it. Props to everyone involved; all those projects sound like
a lot of fun for a good cause.
nopit - 2 hours ago
Installed the new Firefox; had one tab running for a few days which
had allocated more than 10 GB of virtual memory. I had high hopes,
but I'm sticking with Chrome.
DigitalJack - 2 hours ago
Forgive my ignorance, but does having X amount of virtual memory
allocated necessarily correspond to physical memory (and storage,
for that matter)?
tjoff - 2 hours ago
It does not (but it is a big problem for 32-bit applications,
where you quickly run out of addressable memory).
deathanatos - 2 hours ago
No, it doesn't necessarily correspond. It could be an
indicator, though; RSS would be more useful, IMO. That said,
even if the poster is correct, it isn't necessarily the
browser's doing either. AFAICT, nothing stops JS on a page
from allocating that much memory and "leaking" it (e.g.,
holding on to a JS object, maybe accidentally in a giant
list, and never making use of it). It isn't the browser's
fault if JS is actually "using" that much RAM.
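As a side note on why the two numbers can diverge (a minimal sketch, not a claim about what Firefox's allocator actually does): reserving address space is cheap, and on typical 64-bit systems with lazy page allocation, reserved pages only start counting against physical memory once they are touched.

```rust
// A sketch of virtual vs. resident memory: reserving address space is
// distinct from using physical memory. Whether the reservation is
// lazily backed depends on the OS (e.g. Linux overcommit).
fn main() {
    // Ask for ~256 MiB of address space up front (virtual memory).
    let mut buf: Vec<u8> = Vec::with_capacity(1 << 28);
    assert!(buf.capacity() >= 1 << 28);

    // Resident memory typically stays small until pages are written:
    buf.resize(4096, 0); // touch roughly one page
    assert_eq!(buf.len(), 4096);
}
```

On such systems a process can show a huge virtual size while its resident set stays modest, which is why RSS is the more telling number here.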
[deleted]
thomastjeffery - 2 hours ago
> Stylo was the culmination of a near-decade of R&D, a multiple-
moonshot effort to build a better browser by building a better
language.

This is the most impressive and useful aspect of all the recent
work in Firefox. Rust is an amazing language; it really brings
something new to the table with its borrow checker. The fact that
Rust was created as part of a greater effort to build a web
browser is amazing.
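For readers who haven't met it, a tiny sketch of the core rule the borrow checker enforces (nothing Firefox-specific): any number of shared, read-only borrows may coexist, but mutation while a borrow is still live is rejected at compile time.

```rust
// Minimal borrow-checker illustration: shared borrows are fine,
// mutation while borrowed is a compile error, not a runtime crash.
fn main() {
    let mut scores = vec![1, 2, 3];

    let first = &scores[0]; // shared borrow
    println!("first = {first}, len = {}", scores.len());

    // The shared borrow is no longer used, so mutation is allowed:
    scores.push(4);
    assert_eq!(scores, vec![1, 2, 3, 4]);

    // This, however, would fail to compile, which is exactly the
    // class of dangling-reference bug it prevents:
    // let r = &scores[0];
    // scores.push(5);  // error[E0502]: cannot borrow `scores` as
    // println!("{r}"); // mutable because it is also borrowed
}
```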
stingraycharles - 2 hours ago
What I wonder, and I do not mean this in a negative way, is
whether this would have happened in a more commercially oriented
organisation. Mozilla remains a foundation, and I consider Rust a
fruit of their labour in itself. To put it another way, I find it
hard to justify developing Rust just for a web browser. But if
you consider it from the perspective of a foundation developing
tools for the developer community as a whole, it makes much more
sense.
kochthesecond - 2 hours ago
Well, we could look at Go and Erlang.
lucozade - 2 hours ago
Is there anything specific about Rust that you don't think a
commercial organisation would do? Obviously, commercial
organisations have a long track record of developing and
supporting programming languages.
wyager - 57 minutes ago
> Is there anything specific about Rust that you don't think
a commercial organisation would do?

A good job. Most commercial PL efforts kind of suck. There
are a small number of exceptions, and Rust is one of them.
drb91 - 2 hours ago
Well, it's been in development for at least 7 years, without
any product until the URL parser appeared. That's a far
greater investment with no return than I've seen at any
software enterprise outside of Google X, Microsoft Research,
and Intel's various funding efforts, and each of those has
arguably had more projects cut off before generating a
return than projects that succeeded in generating one.
Enterprises tend to invest in <5 year projects, I've noted,
and 5 years is a hell of a long time for an extended
investment.
kibwen - 1 hour ago
This is pretty much just the purview of R&D departments in
general, which includes Mozilla Research. It's just a happy
coincidence that, thanks to open source, software companies
are relatively incentivized to share their projects with
the public rather than keeping them proprietary, which is
the default tack for R&D units in other industries.
drb91 - 1 hour ago
Well, I'm not even necessarily trying to soapbox here
about open source or free software per se. I do think
commercial/proprietary research has value to society as a
whole, albeit less value. For instance, take Bigtable:
enormously influential and, I think, beneficial to society
in spite of being largely closed off to the public.
However, Rust is way better for everyone, and I find it
shocking it came from such a relatively small organization.
Manishearth - 2 hours ago
> Well, it's been in development for at least 7 years,
without any product until the url parser appeared.

This is inaccurate: the URL parser never was and still
isn't in a released Firefox. The first Rust in a released
Firefox was the MP4 metadata code. It's worth noting that
in those 7 years Servo advanced a lot, which meant that the
Stylo project didn't have to rewrite a style system, just
take an existing one and polish it. (This is still a lot of
work, but not as much.)
drb91 - 1 hour ago
Well, I'm certainly happy to admit my details are
incorrect, but I think my broad point still stands: they
look longer-term than the typical commercial offering.
stingraycharles - 2 hours ago
In addition to this, Mozilla is hardly the size of
Microsoft or Google. A more commercially focused version of
Mozilla would probably have dedicated its resources
elsewhere.
jopsen - 2 hours ago
> I find it hard to justify developing Rust just for a web
browser

"Just for"? Our browsers are already very complex. The
browser is responsible for sandboxing and, in effect, taming
the wild web... and the web is not looking to become less
wild :) In the future, browsers will have to prioritize CPU
time between tabs mining bitcoin, crazy ad schemes, and
battery power. In terms of security, browser bugs scare me a
lot more than some privilege escalation bug in the kernel,
because they can quickly be deployed widely.
wongarsu - 9 minutes ago
Effectively, browsers are growing up to gain most of the same
functionality as modern operating systems, with all the
complexity that brings.
pjmlp - 1 hour ago
The majority of programming languages used across the industry
have had commercially oriented organisations behind them,
including all the C family of programming languages.
thomastjeffery - 1 hour ago
It's certainly true that corporations put a lot of work into
languages and runtimes. Apple created LLVM and Clang; Microsoft
created .NET and the CLR with C#, F#, VB.NET, etc. These
projects were valuable to Apple and Microsoft for a variety of
reasons:

* Promoting their IDE: Xcode builds faster and has better error
messages; you can use any .NET language with Visual Studio in
the same project.
* Promoting their platform: Objective-C and Cocoa let you
create fast GUI apps in a standard way, and we don't need GCC
anymore; .NET provides a useful, feature-complete standard
library across a variety of languages.

To contrast, Rust was made with the intention of simply making
a better systems language. Rust doesn't have a standard library
or environment tied to a specific OS or proprietary
dependencies. Rust itself doesn't promote Windows, OS X,
ASP.NET, Cocoa, iOS, Android, etc. That is what makes it seem
much less likely that Rust would be created by a corporation.
AceJohnny2 - 1 hour ago
> Apple created LLVM and clang

To be specific, LLVM started as an academic project by Vikram
Adve and Chris Lattner in 2000. Apple hired Chris Lattner in
2005 to work on LLVM for Apple. Clang, though, does appear to
have been an Apple project, being introduced by Steve Naroff
of Apple in 2007 as an open-source project.
thomastjeffery - 51 minutes ago
Thanks for pointing that out. LLVM is one of the main
things Apple gets to claim credit for, yet they aren't the
only ones who deserve it. LLVM is one of the reasons Rust
is so great, and the world is better because of it.
bluejekyll - 20 minutes ago
You can still give Apple credit for having sponsored a
significant investment into LLVM. Without them, it might
not have taken off the way it has.
wongarsu - 2 hours ago
A classic example would be Erlang, which was developed at
Ericsson for use in telephone exchanges.
aidenn0 - 46 minutes ago
The environment in which Erlang was developed was very
different from the environment today. There were no third-
party 4GLs available targeting the niche that Ericsson
wanted, and there are good reasons to not want to use C in a
telephone exchange.
ticklemyelmo - 2 hours ago
Having trouble calculating the motion of bodies due to gravity,
better invent calculus.
linkregister - 1 hour ago
I love Firefox Quantum, and it has replaced Chrome as my browser
at home. Its memory consumption is far lower with the same number
of tabs open. That said, why does it perform slower than Chrome
on most benchmarks? Is it due to the Chrome team doing much more
grunt work regarding parallelism and asynchronous I/O? Or are
there still features in the current Firefox build that call the
original engine? Does Rust have a runtime penalty as Golang does?
steveklabnik - 1 hour ago
> most benchmarks

Which benchmarks are you talking about? It depends on what those
benchmarks measure. For example, a lot of the Quantum work was in
user-perceived UI latency; unless the benchmark is measuring
that, and I imagine that's a hard thing to measure, it's not
going to show up.

> Does Rust have a runtime penalty as Golang does?

Rust has the same amount of runtime as C does: very, very
little. https://github.com/rust-lang/rust/blob/master/src/libstd/rt....
ricardobeat - 41 minutes ago
Not sure if this is normal, but I have very noticeable lag in
the search/address bar autocomplete, which does make the whole
browser feel a bit slow (macOS Sierra, using Developer
Edition). And since we are here, the prompt/dialog windows in
FF still don't look native either. These are my two major
complaints :)
steveklabnik - 32 minutes ago
I don't work on Firefox, but you should file bugs. I've
generally had a pretty good time doing so.
linkregister - 22 minutes ago
Thanks, that's good to know about Rust's runtime. I'm
embarrassed to say that I just blindly trusted a couple of
websites' claims that they ran benchmarks, without verifying
they're even industry-standard. The articles were on Mashable
and AppleInsider. Mashable tested webpage load times, which is
only one dimension of many to optimize for. AppleInsider looked
at speed, CPU usage, and memory consumption.
steveklabnik - 15 minutes ago
No worries! Benchmarking is super hard, generally, and
browsers are so huge that there are tons of things you can
measure. I'm not trying to say that you shouldn't trust any
of them, just that that might be the source of the
discrepancy between what you've experienced and those
numbers. It's also true that sometimes it's slow! There's
always more to do.
Brakenshire - 2 hours ago
One thing I've been wondering is that Stylo and WebRender can
parallelize CSS and paint, respectively, but I haven't seen any
mention in Project Quantum (the project to integrate Servo
components into Firefox/Gecko) of any component to parallelize
layout, which is probably the biggest bottleneck on the web at
the moment. Is parallel layout something which can only be done
through a full rewrite, hence with Servo, and bringing Servo up
to full web compatibility, or can it be handled through the
Project Quantum process of hiving off components from Servo into
Firefox?
kibwen - 1 hour ago
The OP links a video from 2015 that implies that one of the
advantages of making Stylo the first Servo component in Gecko is
that the next phase in the pipeline, layout, will be able to
benefit from having a well-defined interface in place. I'm
curious about this as well!
Manishearth - 1 hour ago
Integrating layout is a lot more challenging. Now, once Stylo
and WebRender are in play, ideally layout can just fit in
between; all the interfacing already exists from Servo.
However, there are a lot more things in Firefox that talk to
layout. This would need work, more work than Stylo. But this
isn't the major issue. The major issue is that Servo's layout
has a lot of missing pieces, a lot more than was the case with
its style system. It's hard to incrementally replace layout the
way WebRender did with rendering (fall back when you can't
render, render to texture, include it). So it's doable, but a
lot more work.
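To make "parallelize CSS" a little more concrete: Stylo splits style computation across cores with a work-stealing scheduler (Rayon, in practice). A heavily simplified sketch of the data-parallel idea, using only the standard library and a made-up `compute_style`, might look like this; the borrow checker is what makes handing out disjoint mutable chunks safe without locks.

```rust
use std::thread;

// Hypothetical stand-in: real style computation walks the DOM tree
// and matches selectors; here each "element" gets an independent
// result so the parallelism is trivially safe.
fn compute_style(element_id: usize) -> String {
    format!("style-for-{element_id}")
}

fn main() {
    let elements: Vec<usize> = (0..16).collect();
    let n_threads = 4;
    let chunk = elements.len() / n_threads; // assume an even split
    let mut styles = vec![String::new(); elements.len()];

    // Scoped threads (Rust 1.63+): each worker mutably borrows a
    // disjoint chunk of the output, which the compiler can verify.
    thread::scope(|s| {
        for (ids, out) in
            elements.chunks(chunk).zip(styles.chunks_mut(chunk))
        {
            s.spawn(move || {
                for (id, slot) in ids.iter().zip(out.iter_mut()) {
                    *slot = compute_style(*id);
                }
            });
        }
    });

    assert_eq!(styles[5], "style-for-5");
}
```

Real style sharing and work stealing are far more involved, but the "split the tree, let the type system prove the splits don't alias" shape is the same.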
bdmarian - 2 hours ago
I really like the new Fox. I've tried switching over completely,
but I think it's causing some random BSODs on my Latitude E5570.
The laptop does have a second Nvidia graphics card, for which
there is no driver installed. (Don't ask :) I'm perfectly fine
with the onboard Intel, and I much prefer the extra hours of
battery life.)
haberman - 3 hours ago
I'm a huge, huge fan of Rust, Stylo, Servo, WebRender, etc. Hats
off to everyone involved.
pcwalton - 3 hours ago
Thanks for the kind words!
[deleted]
pitaj - 4 hours ago
One thing I've noticed about Firefox, especially on mobile, is
that transform animations are pretty janky. Does anyone know if
this is being worked on? Should I submit a bug report?
cpeterso - 2 hours ago
Stylo is now enabled in Firefox for Android's Nightly builds. You
can install Nightly from the Google Play Store to see if Stylo
makes a difference for the animation problems.
https://play.google.com/store/apps/details?id=org.mozilla.fe...
pitaj - 1 hour ago
Since when? I've been running Nightly for months.
johnp_ - 1 hour ago
Landed ~5 days ago:
https://hg.mozilla.org/mozilla-central/rev/db56323cd08f
muizelaar - 4 hours ago
There are no known issues about this. You should submit a bug
report.
pcwalton - 4 hours ago
Feel free to file a bug report with your hardware, Firefox
version, and test case, yes. These often boil down to simple
bugs (hardware-specific or page-specific) that can be quickly
fixed once isolated. The medium-term effort to revamp the
graphics stack is WebRender. Note that, like Stylo, WebRender is
not just meant to achieve parity with other browsers. It's a
different architecture entirely, one more similar to what games
do than to what current browsers do.
aidenn0 - 42 minutes ago
Kind of OT, but I've noticed both latency and battery-life
hits from compositing WMs in X (compared to traditional WMs).
Are those things being measured at all in FF? It may be that
the tradeoff is worth it (and I have no doubt it can be done
better than the median compositing WM on Linux), but it would
be good to have that data. On the other hand, it may be moot
if Wayland does end up taking over from X.
spiderfarmer - 3 hours ago
Firefox's CSS transform/animation performance is terrible on
macOS. I filed a bug report, but unfortunately there's no
interest in solving it.
https://bugzilla.mozilla.org/show_bug.cgi?id=1407536
pcwalton - 3 hours ago
Those are very general conclusions to draw from one test
case. The bouncing ball test runs at 60 FPS for me on macOS;
most of the time is spent in painting, as expected. Likewise,
Stripe scrolls at 60 FPS for me. I should note that the
bouncing ball test is the kind of thing that WebRender is
designed to improve (painting-intensive animations), so it's
obviously untrue that there's no interest in improving this
sort of workload. When running your bouncing ball test in
Servo with master WebRender, I observed CPU usage of no
higher than 15% (as well as 60 FPS).
kbenson - 3 hours ago
> With the page at stripe.com, I don't see any difference
between FF52ESR, FF56.0.2, FF57.0b14 with Servo enabled, or
with it disabled. On my 2012 MacBook Air with macOS 10.12, I
see about 98-108% CPU on each, according to the iStat Menus
CPU meter. With Safari it's about 35%. That's exactly what I
would expect based on past experience.

> I would probably close this as invalid, as it's not
something new or specific to Quantum or Servo, or as a
duplicate of one of the older bugs, though I'm not sure
which.

That likely points to why there's little movement on this
bug. Its title can be interpreted to indicate a Quantum
regression, but it's a general, longstanding issue, so it
may be that the people who would see it are not focused on
it (they're likely tracking down and fixing actual
regressions, not known problems). I know that doesn't help
your issue, but it may help you locate the relevant bug
report and lend your weight there, if you feel so inclined.
DonbunEf7 - 3 hours ago
"So it's pretty clear by now that 'don't make mistakes' is not a
viable strategy."

This is more generally known as Constant Flawless Vigilance:
https://capability.party/memes/2017/09/11/constant-flawless-...
xstartup - 1 hour ago
I use Firefox/Rust every day. Thanks for one of the most
interesting languages!
agentultra - 2 hours ago
This is a great story. For large, existing codebases,
incremental change is the only strategy I've seen work. Kudos to
the team behind it.
FlyingSnake - 2 hours ago
> They borrowed Apple's C++ compiler backend, which lets Rust
match C++ in speed without reimplementing decades of platform-
specific code generation optimizations.

This was a pretty smart move by the Rust team, and it gave them
a rock-solid platform to go cross-platform. In the words of
Newton, "If I have seen further it is by standing on the
shoulders of giants." Kudos, team Rust, and let's hope they eat
C++'s lunch soon.
sp332 - 2 hours ago
Does this just mean LLVM? Because it's weird to describe it as
"Apple's"; Apple just uses it.
cakoose - 1 hour ago
LLVM started out as a research project at the University of
Illinois, but Apple hired the lead developer in 2005 and put a
ton of work into making it good enough to replace GCC. Apple
also originally wrote Clang, LLVM's C/C++/Objective-C frontend,
though Rust doesn't directly rely on this part. Calling it
"Apple's" threw me off too, but it's not entirely misleading,
because without Apple it might not have become a production-
ready compiler. At the least, I would say Apple did more than
"just use it".
masklinn - 1 hour ago
> Apple just uses it.

Apple "just uses LLVM" in the same way Apple "just uses
WebKit". Apple hired Chris Lattner back in 2005, just as he
completed his PhD. At the time, LLVM could just barely compile
Qt4[0] (the result didn't quite actually work yet) and was
still hosted on CVS. Lattner had to make LLVM production-ready
(rather than a research project), add a bunch of major
features, start a C/C++/ObjC frontend (Clang), and create a
team of LLVM developers in the company. Apple shipped their
first bits of LLVM-based tooling in Leopard in 2007.

[0] https://web.archive.org/web/20111004073001/http://lists.trol...
Manishearth - 2 hours ago
Yes, this is LLVM.
Sammi - 2 hours ago
Apple doesn't "just use it". LLVM was originally developed at
a university, then made production-ready at Apple, and it is
still very much influenced by Apple, as Apple bases almost
everything they make on it.
fulafel - 2 hours ago
It's slightly oversimplified. Apple hired Chris Lattner from
academia to ramp up work on LLVM & create Clang. (Apple always
hated GNU stuff so the motive might not have been purely
technical.)
guelo - 1 hour ago
It's pretty much standard these days for any non-VM language to
target LLVM.
thomastjeffery - 2 hours ago
> the breadth of the web platform is staggering. It grew
organically over almost three decades, has no clear limits in
scope, and has lots of tricky observables that thwart attempts
to simplify.

It would be great to create the HTML/CSS/JavaScript stack from
scratch, or at least make a non-backwards-compatible version
that is simpler and can perform better. HTML5 showed us that
this can work.
dullgiulio - 2 hours ago
Yeah, but Firefox is already struggling while supporting all
the possible standards and more ("sorry, our site is better
viewed with Google IE4... ehm, Google Chrome"). The whole
Mozilla strategy of corroding Firefox piece by piece is
actually very professional. Big backwards-incompatible
transitions in technology almost always fail.
Manishearth - 2 hours ago
> sorry our site is better view with Google IE4... ehm Google
Chrome

FWIW this is usually due to folks doing performance work in
only one browser, or not really testing well and slapping
that label on after the fact. Or stuff like Hangouts and
Allo, where they use nonstandard features. The major things
Firefox doesn't support that Chrome does are U2F (it does
support it now, but flipped off; it will flip on soon, I
think) and web components (support should be available next
year, I guess; this kinda stalled because of lots of spec
churn and Google implementing an earlier incompatible spec
early, or something).
notriddle - 2 hours ago
Didn't HTML5 only remove features that didn't work consistently
across browsers anyway?
gjem97 - 2 hours ago
What parts of FF 57 are written in Rust? Just Stylo?

Edit: I don't intend for this to sound like I'm complaining;
just interested.
[deleted]
steveklabnik - 2 hours ago
Stylo is the biggest and most significant thing; there are some
smaller bits (a media parser, and something else?) included
before 57.
Manishearth - 2 hours ago
I'd say the change of the encoding stack to encoding-rs is
pretty significant; while it's not that much code it's stuff
that gets used throughout the codebase.
steveklabnik - 1 hour ago
That's fair, and you know the impact better than I!
cpeterso - 2 hours ago
Stylo is new in Firefox 57, but Mozilla has shipped other Rust
code in earlier Firefox versions:
https://wiki.mozilla.org/Oxidation#Rust_components_in_Firefo...

Completed:
* MP4 metadata parser (Firefox 48)
* Replace uconv with encoding-rs (Firefox 56)
* U2F HID backend (Firefox 57)

In progress:
* URL parser
* WebM demuxer
* WebRender (from Servo)
* Audio remoting for Linux
* SDP parsing in WebRTC (aiming for Firefox 59)
* Linebreaking with xi-unicode
* Optimizing WebVM compiler backend: cretonne
phkahler - 1 hour ago
Can anyone explain what a URL parser does and why it's so
complex? I feel like there's a whole interesting story lurking
there.
kolme - 40 minutes ago
URLs have been a security issue for browsers in the past,
and they can get pretty hairy: from UTF-8-coded domain names
to whatever you want to "urlencode". For example, you can
encode whole images into URLs, for embedding them in CSS
files. Old IE versions had a hard URL length limit and were
very picky about the characters in domain names, both
limitations included as "security fixes" (which broke the
standards).
steveklabnik - 46 minutes ago
A URL parser takes a string with a URL in it and returns
some sort of data structure that represents the URL. It's
complex because URLs are complex; I believe this is the
correct RFC: https://tools.ietf.org/html/rfc3986 It's 60
pages long. (That said, page length is only a proxy for
complexity, of course.)
noir_lord - 36 minutes ago
As someone who once tried to write code to do this to avoid
pulling in a dependency: never again. It's not just that the
spec is 60 pages long, but that actual behaviour out in the
real world is miles away from the spec; the web is a complex
place where standards are... rarely standard.
mbrubeck - 33 minutes ago
And, as a bonus, there's the other URL standard, which
describes what browsers actually do:
https://url.spec.whatwg.org/
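For a flavor of what a URL parser does, here is a deliberately naive sketch using only the standard library (a hypothetical `parse_url`; real parsers such as rust-url also handle percent-encoding, userinfo, ports, IPv6 literals, relative references, and the error-recovery rules browsers need):

```rust
// A toy URL splitter: scheme://host/path. The real-world cases
// (percent-encoding, internationalized domains, data: URLs...)
// are where the 60 pages of spec go.
fn parse_url(input: &str) -> Option<(&str, &str, &str)> {
    let (scheme, rest) = input.split_once("://")?;
    let (host, path) = match rest.find('/') {
        Some(i) => (&rest[..i], &rest[i..]),
        None => (rest, "/"),
    };
    Some((scheme, host, path))
}

fn main() {
    let (scheme, host, path) =
        parse_url("https://news.ycombinator.com/item?id=15793595")
            .unwrap();
    assert_eq!(scheme, "https");
    assert_eq!(host, "news.ycombinator.com");
    assert_eq!(path, "/item?id=15793595");
}
```

Even this toy version already gets real URLs wrong (the query should not be part of the path, a host may contain a port, "file:///" has an empty host), which is rather the point of the thread.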
wyldfire - 3 hours ago
Anecdote regarding this new FF: I would find frequent cases
where my system would stall for 10-20s (could not toggle caps
lock, pointer stopped moving). I almost always have just Chrome
and gnome-terminal open (Ubuntu 16.04). I had attributed it to
either a hardware or BIOS/firmware defect. Now, after switching
to Firefox, I have gone a week without seeing those stalls.
YMMV -- I never bothered to investigate; it could be something
as simple as slightly lower memory consumption from FF, or
still a hardware defect that FF doesn't happen to trigger.
plq - 3 hours ago
Typical behavior when the OS is writing to swap space to free
some RAM. If that ever happens again, you can run "free" in the
terminal to see whether this is the case.
throwaway613834 - 3 hours ago
This sounds vaguely like what I've been experiencing on recent
Chrome versions. On Windows I've had Chrome randomly hang...
initially on the network, then after a few seconds even the UI
freezes. When that happens, if I launch Firefox and try to
access the network, it hangs too. But programs that don't try
to access the network don't hang. Then after maybe (very
roughly) 30 seconds, it all goes back to normal. No idea what's
been going on, but it seems like you might be experiencing the
same thing, and it seems like a recent bug in Chrome versions,
not a firmware issue... I'm just confused at how it affects
other programs. It didn't use to be like this just a few weeks
ago.
tdb7893 - 3 hours ago
I notice Chrome having these issues if I'm running out of
memory or if another program is trying to read from the hard
drive at the same time.
throwaway613834 - 2 hours ago
I am most definitely not running out of memory or having
other programs active. I easily have > 10 GB of free RAM,
and it happens when nothing else is open. Like I was
suggesting earlier, my habits haven't changed. It started
doing this quite recently; it wasn't like this a few weeks
ago.
dahauns - 31 minutes ago
Just a shot in the dark, but do you have an Nvidia GPU? Some
drivers caused hangs with Chrome when GPU
acceleration/rasterization was enabled in the browser settings.
[deleted]
sctb - 2 hours ago
Please don't. https://news.ycombinator.com/newsguidelines.html
kbd - 2 hours ago
It's gratifying to see how successfully the same organization
has learned from the debacle that was the rewrite from Netscape
4 to Mozilla in the first place. That time, they didn't release
for years, losing market share and ceding the web to Internet
Explorer for the next decade. Joel Spolsky wrote multiple
articles[1][2] pointing out their folly. This time, their
"multiple-moonshot effort" is paying off big-time because
they're doing it incrementally. Kudos!

[1] https://www.joelonsoftware.com/2000/04/06/things-you-should-...
[2] https://www.joelonsoftware.com/2000/11/20/netscape-goes-bonk...
cwzwarich - 1 hour ago
The original plan for the Servo project was to develop a new
browser engine separate from Gecko, similar to the original
Mozilla transition. An alpha-quality browser with Servo as an
engine was originally a 2015 goal, and the active goal tracking
this was removed from the roadmap in 2016:
https://github.com/servo/servo/wiki/Roadmap/_compare/47f490b...

As someone who was involved with Servo during that time period,
I was disappointed at the time that it was quite obviously not
going to happen. However, looking at what has happened since
then, the change of focus towards Firefox integration was
definitely a good move.
majewsky - 3 minutes ago
When I looked at browser.html, it was what you could call an
"alpha-quality browser", unless we use different definitions of
the term "alpha-quality".
mjw1007 - 1 hour ago
Joel is making two separate claims there, though he doesn't
cleanly distinguish them. One is that rewriting from scratch is
going to give you a worse result than incremental change from a
technical point of view (the "absolutely no reason to believe
that you are going to do a better job than you did the first
time" bit). The second is that, independent of the technical
merits, rewriting from scratch will be a bad commercial
decision (the "throwing away your market leadership" bit). We
now know much more about how this turned out for his chosen
example, and I think it's clear he was entirely wrong about the
first claim (which he spends most of his time discussing).
Gecko was a great technical success, and the cross-platform XUL
stuff he complains about turned out to have many advantages
(supporting plenty of innovation in addons which I don't think
we'd have seen otherwise). It's less clear whether he's right
about the second: certainly Netscape did cede the field to IE
for several years, but maybe that would have happened anyway;
Netscape 4 wasn't much of a platform to build on. I think
mozilla.org considered as a business has done better than most
people would have expected in 2000.
aidenn0 - 49 minutes ago
In Joel's defense, we don't know that Gecko is better than X
years of incremental changes to Netscape 4 would have been.
One of the big issues with software engineering advice is
that it is really hard to find apples-to-apples comparisons
for outcomes.
mjw1007 - 44 minutes ago
True. I think we can say that Gecko ended up technically
better than incremental changes to Internet Explorer, which
I think was starting off from a more maintainable codebase
than Netscape 4. That's hardly conclusive, but it's some
evidence.
aidenn0 - 28 minutes ago
Indeed. My intuition, based on nothing other than my
subjective experience, is that there are times when
throwing away the code is the correct decision, but they
are a rounding error compared to the times when people want
to throw away the code, so to a first order of
approximation "never start over from scratch" is correct.
Simply shifting the burden of proof to the "rewrite" side
is usually sufficient. Where I currently work, a rewrite
request is not answered with "no" but with "show me the
data." 90% of the time the requester doesn't even try to
gather data; the other 10% of the time results in some
useful metrics about a component that everyone agrees needs
some TLC, whatever the final decision.
JupiterMoon - 3 hours ago
FF has crashed more times for me in the last week than in the
previous year, across multiple installs on different Linux
systems. The last crash was with a clean profile. And then
there's the disappearing dev tools - that's fun.

EDIT: I hope there is something weird with my systems. But I
fear that the rush to push this out might have been a little
hasty.

EDIT EDIT: Apart from the crashes, the new FF has been nice.
I've been able to largely stop using Chromium for dev work - so
not all is bad.
mbrubeck - 3 hours ago
You can go to "about:crashes" to get some more information about
reported crashes. If you open a crash report and click the
"Bugzilla" tab, you can find out if a bug is on file for that
specific stack trace.
JupiterMoon - 3 hours ago
Cool. I'll check this out next time. Any way to report the
disappearing dev tools?
dochtman - 3 hours ago
Just file it on https://bugzilla.mozilla.org/; you can log
in with your GitHub credentials.
thomastjeffery - 3 hours ago
I have been using Firefox for several months, on Windows 10
and several GNU/Linux distributions, different hardware, etc.,
and have never experienced a crash. It's definitely something
weird to do with your systems, meaning it's a real bug that
you are experiencing and I am not. So please share crash
reports and file bug reports. Different hardware/software
quirks may reveal bugs in Firefox/Linux/drivers/window
managers/anything. By submitting a bug report for Firefox, you
may be able to help find a video driver bug, etc.
noir_lord - 35 minutes ago
Had one on FF Android but none on Linux since 57 release or in
FFDE.
m0th87 - 2 hours ago
> For example, register allocation is a tedious process that
bedeviled assembly programmers, whereas higher-level languages
like C++ handle it automatically and get it right every single
time.

Ideal register allocation is NP-complete, so a compiler can't
get it right every single time. I'm not sure how good modern
compilers are at this in practice, but I would be curious to
know if there are some asm writers who can actually
consistently outperform them.
Manishearth - 2 hours ago
I think "get it right" here is "have it work at all", not "get it
fast".
m0th87 - 2 hours ago
Ah, that makes sense!
phkahler - 1 hour ago
Optimal register allocation has been polynomial-time for more
than 10 years - for some definition of optimal. IIRC it
started with programs in SSA form and has dropped that
requirement more recently. Modern GCC uses SSA form, and I
think LLVM might too.
kbsletten - 2 hours ago
They're not saying that C++ compilers do the best possible
register allocation, they're saying that C++ compilers generate a
register allocation that works and doesn't contain bugs.
Technically, spilling everything to memory and loading only what
you need to issue instructions is "getting it right" by this
definition. No compiler strives to get the "optimal" anything in
the general case, but we do expect them to strive to be "correct"
in all cases. The language we use determines which properties are
included in our idea of "correctness".
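One way to see the "correct but not optimal" distinction: a compiler that keeps every intermediate value in memory, like a naive stack machine, is trivially correct, just slow. A toy sketch of that spill-everything strategy (hypothetical `Op`/`eval` names, not anything a real compiler ships):

```rust
// Toy "spill everything" evaluation: every intermediate value lives
// on a stack in memory instead of in a register. Always correct,
// never fast - which is the bar the quoted passage is describing.
#[derive(Clone, Copy)]
enum Op {
    Push(i64),
    Add,
    Mul,
}

fn eval(program: &[Op]) -> i64 {
    let mut stack = Vec::new(); // the "memory" everything spills to
    for op in program {
        match *op {
            Op::Push(n) => stack.push(n),
            Op::Add => {
                let b = stack.pop().unwrap();
                let a = stack.pop().unwrap();
                stack.push(a + b);
            }
            Op::Mul => {
                let b = stack.pop().unwrap();
                let a = stack.pop().unwrap();
                stack.push(a * b);
            }
        }
    }
    stack.pop().unwrap()
}

fn main() {
    // (2 + 3) * (4 + 5) compiled to stack code:
    let program = [Op::Push(2), Op::Push(3), Op::Add,
                   Op::Push(4), Op::Push(5), Op::Add, Op::Mul];
    assert_eq!(eval(&program), 45);
}
```

A real register allocator improves on this by keeping hot values in registers, and "optimal" refers to minimizing those memory round-trips, not to correctness.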
[deleted]
fulafel - 3 hours ago
Congratulations, it's really an unparalleled performance of
parallel performance.
vanderZwan - 2 hours ago
Speaking of which: does anyone know if some new optimization
landed in the beta versions a couple of days ago? Or if some
bug that caused delays on Linux got fixed? I updated my
Developer Edition yesterday, and it was as if Firefox - already
ludicrously fast compared to before - turned on the turbo
booster. Obviously, I'm not complaining ;)
wldcordeiro - 1 hour ago
The next Servo component is WebRender, and I'm not certain
whether it has been flagged on for Developer Edition, but that
would certainly affect speed.
vanderZwan - 1 hour ago
I'd be very surprised if that wasn't announced somewhere.
Guess it must be a specific bug that affected my hardware
more than the average person's, then.
steveklabnik - 1 hour ago
I'd be shocked; WebRender is still not default in Nightly,
and I'd expect that before it was turned on for beta/DE.
moosingin3space - 1 hour ago
If I remember correctly, it's in Nightly behind a feature
flag, not Beta.