HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
The Birth and Death of JavaScript

www.destroyallsoftware.com · 712 HN points · 255 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention www.destroyallsoftware.com's video "The Birth and Death of JavaScript".

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Nov 30, 2022 · j-pb on WasmEdge
Yeah, but we made some awesome sandboxing friends along the way.

We could actually get a nice performance boost out of running WASM with VM enforced security instead of Rings and the MMU.[0]

[0]: https://www.destroyallsoftware.com/talks/the-birth-and-death...

Nov 23, 2022 · samtheprogram on Wasmer 3.0
Running WASM outside of a web browser? Obligatory Gary Bernhardt, “The Birth and Death of JavaScript”: https://www.destroyallsoftware.com/talks/the-birth-and-death...
walrus_pen
Javascript and WASM are unrelated
kaba0
Are they? Wasm is just asm.js turned into an independent thing.
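kaba0's point about asm.js can be made concrete. asm.js was a statically typed subset of JavaScript that engines could compile ahead of time; a minimal sketch of the module pattern (illustrative only, not a fully validated asm.js module):

```javascript
// Minimal asm.js-style module: the "use asm" pragma plus |0
// annotations let an engine treat this as statically typed i32 code.
// Crucially, it is also plain JavaScript, so it runs unchanged in any
// engine even without asm.js support.
function AsmAdd(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0;          // parameter type annotation: i32
    b = b | 0;
    return (a + b) | 0; // return type annotation: i32
  }
  return { add: add };
}

const mod = AsmAdd();
console.log(mod.add(2, 3)); // 5
```

WASM kept the "typed, AOT-compilable subset" idea but moved it into a dedicated binary format instead of piggybacking on JS syntax.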
jamil7
The parent comment didn't say they are. It's a very well known talk in which the speaker describes a hypothetical future technology that looks a lot like WASM.
Because it wouldn't be HN if someone didn't: https://www.destroyallsoftware.com/talks/the-birth-and-death...
>> Nobody wants a Nuclear War and 40 million refugees off the coast of Australia.

For those wondering about this reference:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Obligatory link to Gary Bernhardt's "The Birth and Death of Javascript": https://www.destroyallsoftware.com/talks/the-birth-and-death...
phoe-krk
Possibly the most insightful and accurate talk I've seen regarding the web stack, and it's from 2014 - that's eight years ago! I keep on coming back to it year after year. Even the names used, "asm.js" and "wasm", are mostly the same.
Gary Bernhardt's talk [1] is becoming more and more of a premonition.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

Talking about inner platforms (and in case there's one person left who hasn't seen it): https://www.destroyallsoftware.com/talks/the-birth-and-death...
dekhn
https://en.wikipedia.org/wiki/Inner-platform_effect

"""The inner-platform effect is the tendency of software architects to create a system so customizable as to become a replica, and often a poor replica, of the software development platform they are using. This is generally inefficient and such systems are often considered to be examples of an anti-pattern."""

Like I said elsewhere, I'm not completely opposed to the idea of the browser as a complete and fully functional application container for an inner platform. And I want to encourage creative people to try new technologies, especially WASM to explore the idea of "how much can we move to the browser". However, I see the container as the mediator of the network, not the application.

Of course it is.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Jokes aside, I wouldn't be surprised to see WASM in the kernel one day, perhaps to replace eBPF.

Teknoman117
I highly doubt that.

The whole point of eBPF is that it's a limited subset of an execution environment that not only provides a memory safe environment, but just as crucially all eBPF programs are guaranteed to terminate.

It's not a general purpose computation environment in a strict sense.

voxic11
AFAIK the eBPF VM used by the Linux kernel can run programs that never complete; those programs are representable in eBPF bytecode just like they are in WASM bytecode. To prevent loading eBPF programs that never terminate, the Linux kernel runs static analysis over the eBPF bytecode when it's loaded. I would think similar analysis could be run over WASM bytecode to ensure it terminates.
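As a toy illustration of the kind of static analysis voxic11 describes (a sketch of the idea, not the real kernel verifier — the original eBPF verifier simply rejected backward jumps, with bounded-loop support arriving later): if every jump must move forward, every path through the program terminates.

```javascript
// Toy bytecode: each instruction is { op, target? }. A verifier in the
// spirit of the early eBPF one: reject any jump to an earlier or
// current instruction, so control flow can only move forward and the
// program is guaranteed to terminate.
function verifyTerminates(program) {
  for (let pc = 0; pc < program.length; pc++) {
    const insn = program[pc];
    if (insn.op === "jmp" && insn.target <= pc) {
      return false; // backward jump => possible loop => reject
    }
  }
  return true;
}

console.log(verifyTerminates([{ op: "add" }, { op: "jmp", target: 0 }])); // false
console.log(verifyTerminates([{ op: "jmp", target: 2 }, { op: "add" }, { op: "exit" }])); // true
```

The same check would apply to any bytecode with explicit control flow; WASM's structured loops would need an equivalent bound on loop iterations.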
api
There is at least one WASM runtime for the kernel but not to run WASM in the kernel. It's to have a platform-independent WASM user space.
lioeters
> WASM in the kernel

This may not be too crazy of an idea, it would enable fine-grained secure sandboxing like what Firefox is doing.

Securing Firefox with WebAssembly - https://hacks.mozilla.org/2020/02/securing-firefox-with-weba...

Practical third-party library sandboxing with RLBox - https://rlbox.dev/

mmis1000
Yep. WASM doesn't even have a primitive for accessing its own executable code, let alone modifying it to cause RCE. Bounds checking definitely has overhead, so you wouldn't expect it to suit all workloads, but for most workloads the trade-off would probably be acceptable. And it could enable a universal Linux driver that runs independent of CPU architecture.
This prescient talk "The Birth & Death of JavaScript" by Gary Bernhardt in 2014 is slowly but surely coming true:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

heavyset_go
I'd love to see this future, but I wonder if a WASM VM can match the performance of something like V8 that is fine-tuned to run a single language well.
corysama
That’s pretty much the point of WASM, isn’t it? V8 is a billion dollar investment in running as best as possible a language that is very, very difficult to run performantly. WASM instead opens up options to run well languages that are much easier to run performantly.
heavyset_go
Definitely it is, and it does open doors for this, I'm just curious if running say, a JavaScript or Python interpreter on top of WASM would be more performant than V8, given that there could be two layers of VMs, ie the WASM VM and the JS/Python VM running on it.

I'd hope that they'd be comparable in performance, especially when it comes to doing client-side web app things in languages other than JavaScript.

corysama
That wouldn’t work out. The CPython interpreter is slower than V8, and a WASM Python interpreter would be at most as fast as the native C implementation.
afiori
Currently, JIT in wasm is incomplete. That is, a wasm program cannot modify itself (it would need to create new modules and ask the host to load them with a shared memory).

So compiling JS to wasm would probably be limited to the performance of interpreting JS natively.
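The host-mediated loading afiori describes can be sketched in a few lines: the guest can only hand new code to the embedder, which compiles and instantiates it. The hand-encoded bytes below are the standard minimal "add two i32s" module; treat the specifics as illustrative.

```javascript
// A wasm program can't patch its own code; new code must go through the
// host. Here the "host" (the JS embedder) receives raw module bytes,
// validates them, and instantiates a fresh module.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

if (!WebAssembly.validate(bytes)) throw new Error("rejected by host");
WebAssembly.instantiate(bytes).then(({ instance }) => {
  console.log(instance.exports.add(2, 3)); // 5
});
```

A JIT compiling JS to wasm would have to round-trip every freshly generated code fragment through this host API, which is part of why self-modifying-code tricks don't (yet) carry over.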

Time to watch "The Birth & Death of JavaScript" again.

I don't remember exactly when he says it's time to get rid of containers and VMs, but it's in there...

https://www.destroyallsoftware.com/talks/the-birth-and-death...

This is pretty cool! It immediately reminded me of Gary Bernhardt's 2014 talk, 'The Birth and Death of Javascript'[1]. Truly life imitating art.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

Sep 26, 2022 · nisegami on WordPress WASM
Is it my turn to post this prophetic talk? https://www.destroyallsoftware.com/talks/the-birth-and-death...
For the uninitiated: https://www.destroyallsoftware.com/talks/the-birth-and-death...
uavals
link to a video
aaaaaaaaata
It's there.
zeristor
Or for the initiated the HNews post of it back in 2014:

https://news.ycombinator.com/item?id=7605687

Gary Bernhardt wasn't all that wrong when he mentioned anything that can be written in JavaScript will be written in JavaScript. This post indeed reminded me of his talk https://www.destroyallsoftware.com/talks/the-birth-and-death...
capableweb
Setting aside the fact that this thing runs via WASM rather than JavaScript, he was definitely right :)
robocat
Use wasm2js to cross-compile WASM to pure JavaScript: https://www.google.co.nz/search?q=%22wasm2js%22 and there are other ways to not require native WASM!
kretaceous
I think it was by Jeff Atwood and hence is called Atwood's Law.
mishraprince
Aah, yes. My bad.
jillesvangurp
The only thing he missed is that javascript as a compilation target makes less sense than a byte code format designed for such a thing, which is what WASM is. WASM is more similar to JVM bytecode than it is to Javascript. It seems we have a few implementations of WASM that for historical reasons share a code base with a javascript interpreter. But technically that interpreter is not really needed.

You can actually use WASM to run your own javascript interpreter and people are already doing that to not be dependent on the interpreter that comes with the browser or wasm runtime (outside the browser). If you are going to run node.js in a wasm runtime, that's what you might want to do. Likewise, if you want to offer a browser IDE for a node.js project, you might want to run node.js in a browser and this is probably what you'd be doing rather than passing through the javascript to the browser javascript interpreter, which lacks most of the node.js API. Just easier that way.

Likewise, if you want to run some old Internet Explorer 10 JavaScript, packaging up an old version of that engine as wasm might allow you to do that. The hard part, of course, would be getting your hands on the source code. But MS might help us out here, or somebody might implement something compatible, much like what is being done with Flash.

The Birth and Death of JavaScript is a perfect primer for this article, in case anyone hasn't seen it yet (https://www.destroyallsoftware.com/talks/the-birth-and-death...)
Gary's predictions are also still on track https://www.destroyallsoftware.com/talks/the-birth-and-death...
I'd prefer not to get into a debate about the efficiencies of x vs y, but suffice it to say: computers have only gotten negligibly faster over the last 10 years, and our software has gotten considerably slower overall.

A runtime overhead that slows everything down to 1.5x, even with low hanging fruits is only going to accelerate this issue.

From a consumer perspective: We all act as if everyone has 8-16G of ram, because that's what we're used to, but the reality is that the majority of people have 2-4G of ram, even these days. That's not counting the anemic CPUs that are often inside awful cooling solutions.

From a server perspective: we outsource our Ops to cloud providers and pay a significant premium for computational speed, which means things like runtime overheads have direct costs.

The reason I called it inefficient is because it's not adding anything we don't have, it's just "another layer" with a large runtime overhead.

And, anyway, I'm mainly referring to this talk: https://www.destroyallsoftware.com/talks/the-birth-and-death...

This is by far my favorite tech talk of all time. It goes into why WASM can run faster than native code in many contexts, the reason being that it gets around the overhead of OS security rings.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

georgyo
The base argument here doesn't make sense.

WASM requires an interpreter which must be native.

The argument is that this interpreter can be smarter about what crosses OS security rings. But those same improvements could be done in the native compiler or interpreter.

The next argument could be that many things using the WASM target would focus more effort on improving it so all WASM targets benefit outpacing their individual optimizations.

This one is harder to dismiss outright, but instead of optimizing for machine code you are now optimizing your WASM output.

Also this intermediate byte code representation already exists for both LLVM and JVM, which many languages target.

It is difficult to see WASM magically improving performance at all and especially not dramatically enough to encourage people to switch to it for that reasoning.

dluan
https://twitter.com/garybernhardt/status/1341534427227123713

the fact that i am more frequently having to dig up this tweet to show people lately is astonishing

eatonphil
That talk is from 2014 and Wikipedia says wasm was announced in 2017?
time_to_smile
While the parent does seem to be treading into Poe's law territory, it's not entirely correct to dismiss that talk's relationship to wasm based on the dates you're quoting.

Bernhardt explicitly mentions asm.js in the talk, which is the precursor to wasm (it's even mentioned in the Wikipedia article you skimmed a bit too quickly). asm.js was released in February 2013.

I'm surprised HN has such a short memory, but the impetus for that talk was a clearly disturbing trend at the time implying that everything should be done in javascript. Node.js was gaining rapid popularity, people were discussing javascript as the new C for using as the language to write example code in, and while things like asm.js were exciting, they seemed to point towards the hilariously nightmarish future Bernhardt is discussing there.

capableweb
asm.js was first mentioned in 2013. asm.js was eventually superseded by wasm and is pretty much the beginning of wasm as we know it. Didn't watch the talk, but could asm.js be the thing the presenter was talking about?
AprilArcus
it post-dated and took inspiration from Mozilla's asm.js, which was highly influential on wasm.
dnsco
This talk is about asm.js, which is a precursor technology to wasm; the parent's logic seems to be "wasm is an improvement on asm.js". I have no idea if the kernel isolation benefits the Gary Bernhardt talk is about apply.
ninkendo
That’s not the right conclusion to take from that video.

WASM isn’t faster than native code. It’s that an operating system written from the ground up to use a language VM (for instance, wasm) to implement all memory protection, on the system — and most importantly, no other memory protection, including page tables or processor-level isolation that normally separates kernel code from user space — may end up being more performant than what we have now.

Running wasm on a standard OS kernel like Linux/windows/darwin is not going to give you the benefits. The benefits come from eliminating system call overhead associated with switching in and out of the kernel protection context, which is something you need if you’re executing raw machine code which can load/store any memory address. If you simply eliminate the ability to run arbitrary machine code, you can just let everything run in kernel space and use the language VM (like wasm) to protect memory. The result may or may not be faster overall, but it’s purely hypothetical today because essentially no operating system works this way. (Microsoft wrote a research OS back in the ‘00s to try this out but it ultimately didn’t turn into a shipping product. There may be other research OS’s out there that toy with this idea, but nothing in production… maybe the old symbolics lisp machines work on this principal but I’m not sure. There may have been some similar machines in the smalltalk days as well.)

ativzzz
SPIN, a research OS in the 90s also explored a similar concept by allowing the kernel to be extended via controlled interfaces.

> For instance, the SPIN Web server executes entirely in the kernel address space.

http://www-spin.cs.washington.edu/

But like you said, not in production.

zamalek
> Microsoft wrote a research OS back in the ‘00s to try this out but it ultimately didn’t turn into a shipping product.

They called it SIPs (software isolated processes). You can see the benefit of WASM for this problem, though. It has gained significant traction, and it is significantly simpler than CLR. I really hope something comes of it.

It came true: https://www.destroyallsoftware.com/talks/the-birth-and-death... The presenter gave a talk about JavaScript at PyCon in which this was predicted.
That is actually a joke made in this talk[1], which overall is a fantastic talk for anyone who hasn't seen it.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

DustinBrett
That talk has been inspiration for me.
whizzter
It needs to become reality :D. On a more serious note, maybe it'd be possible to use this inception-style with a slightly older but still capable Firefox, from before they implemented process separation?
I'd be curious about the overhead if the browser was running in kernel space. This would be surprisingly secure since browsers provide some of the most hardened sandboxing out there.

Obligatory: https://www.destroyallsoftware.com/talks/the-birth-and-death...

loa_in_
It would be much slower because of so many context switches
ygra
(In)famously, Windows NT put GDI into the kernel for performance reasons. Any application wanting to draw something would also have to pay for that context switch, which apparently was deemed not a problem back then. Have context switches become that much more expensive since then?
10000truths
You've got it the wrong way around - the browsers' sandboxing is good because they make use of resource isolation features provided by your kernel.
klabb3
Is that so? I thought even different contexts within the same browser context were well isolated. For instance, isn't the reason arbitrary web pages can't access my Lastpass credentials is due to this sandboxing?
ale42
You can't enforce real isolation in user mode, you need the kernel to do it.
kevingadd
The browser security model relies heavily on protection from things like process isolation, ASLR, etc though at this point browsers also implement their own defense in depth mechanisms on top. for example, jsc maintains separate heaps for each data type - referred to as cages, i think - so that it's hard to perform type confusion attacks. but there are entire classes of attacks that would be viable against browsers if not for the tools they get from the kernel like ACLs, page protections, ASLR, control flow guard, etc.

When one part of the stack of protections fails, you can see things like persistent root on chromebooks where that one hole is used as a springboard to attack other vulnerabilities: https://news.ycombinator.com/item?id=15713103

valgaze
See @ 13min12s on klabb3's link

Note that presentation was from 2014, I wonder what that'd look like in 2022

Another approach is to never context switch by running all programs in kernel mode and vetting them with an interpreter/JIT compiler: https://www.destroyallsoftware.com/talks/the-birth-and-death... (only half joking)
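A hedged sketch of that "vet instead of context switch" idea: if untrusted code can only touch memory through a checked interface, the VM itself enforces isolation and no hardware protection boundary is needed. A toy version (real systems would JIT-compile the checks rather than interpret them):

```javascript
// Toy software fault isolation: the "program" never gets raw pointers;
// every load/store goes through bounds-checked accessors, so the host
// needs no MMU or protection rings to contain it.
function makeSandbox(size) {
  const heap = new Uint8Array(size);
  return {
    load(addr) {
      if (addr >>> 0 >= size) throw new RangeError("out of bounds");
      return heap[addr];
    },
    store(addr, value) {
      if (addr >>> 0 >= size) throw new RangeError("out of bounds");
      heap[addr] = value & 0xff;
    },
  };
}

const sandbox = makeSandbox(64);
sandbox.store(5, 42);
console.log(sandbox.load(5)); // 42
try {
  sandbox.load(9999);         // trapped by the VM, not the MMU
} catch (e) {
  console.log(e.message);     // "out of bounds"
}
```

This is essentially what a wasm linear memory bounds check does, minus the clever tricks (guard pages, masking) that production runtimes use to make the check nearly free.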
throwaway81523
That is how the Singularity research OS works, except it's done by static verification in a compiler.
A hundred plus comments and not a single mention of GIMP?

I think we are close. After COVID, we should be releasing a lot of the emotional attachment to the C programming tools. And by 2025, we should be able to have the Windows version of GIMP running inside Mac versions of Chrome, on top of Wine, X Windows, CC and WASM in Chrome. And since Chrome can be compiled to WASM, we could have GIMP running on Chrome and Chrome running inside Firefox.......

And the story ( or history ) continues in the link below.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Javascript is like Ice 9, it turns everything it touches into Ice 9.

LiveView was invented not to have to deal with Javascript-powered SPA architectures and client-side coding in general, because JS is ass, and here's LiveViewJS to convert to Javascript a technology invented not to deal with Javascript.

Gary Bernhardt warned us, yet no one listened. shakes fist at cloud

https://www.destroyallsoftware.com/talks/the-birth-and-death...

newsbinator
The difference is that Ice-Nine turns all water into Ice-Nine, whereas more JavaScript projects don't mean fewer non-JavaScript projects (except inasmuch as people who like JavaScript are busy doing JavaScript things instead of other things).
pier25
What a great video.

I used to agree with him, but in 2022 we're still using yavascript and it's only getting better.

I have no doubt at some point we will get optional static types, and even TypeScript will become obsolete like CoffeeScript did.

floodfx
I can see you are angry and don't like JS. Perhaps you might be less angry if you look at the project and see that it is not an SPA architecture nor does it require or encourage client-side coding in general?

Maybe instead of shaking your fist, you might just celebrate the fact that we live in amazing times where even JS programmers can enjoy the LiveView programming paradigm! :)

spion
I'm a mostly-JS engineer for 12+ years. JS sucks and could learn a lot from ecosystems like Elixir (and Rust). We should fix the suck first.
fouc
I'm not sure that comparing JS to a potentially catastrophic substance with the capability to destroy all life on Earth implies that OP is angry about JS. It could be a tongue-in-cheek comparison.
floodfx
Well, they did say "JS is ass". Happy to give them the benefit of the doubt, but I appreciate more obviously supportive commentary.
1_player
I am not angry, it was tongue in cheek and an opportunity to share more Gary Bernhardt with the world, but JS is indeed ass.
spondyl
For anyone curious about Ice 9: https://en.wikipedia.org/wiki/Cat%27s_Cradle

> Ice-nine is an alternative structure of water that is solid at room temperature and acts as a seed crystal upon contact with ordinary liquid water, causing that liquid water to instantly transform into more ice-nine.

A substance called ice-9, with basically the same properties, makes a passing appearance in the visual novel Zero Escape: Nine Hours, Nine Persons, Nine Doors if you haven't heard of the book but remember the term from somewhere else.

Admittedly, that's where I first learned about it but that said, there is a real substance called Ice IX but it isn't as interesting

https://en.wikipedia.org/wiki/Ice_IX

Jan 25, 2022 · cercatrova on Nova by Panic
I've heard a hypothesis that JS endures because it's bad: otherwise it would've been subsumed by other languages, and we never would've gotten fed up enough to create WebAssembly, which is more optimized for its use case than the other languages we might've used instead of JS.

In other words, feeling the pain of JS over 20 years led to the creation of WASM that might not have been as needed had we had good languages on the web present.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

https://youtube.com/watch?v=pBYqen3B2gc

andrekandre

  > In other words, feeling the pain of JS over 20 years led to the creation of WASM that might not have been as needed had we had good languages on the web present.
honestly, that sounds like retroactive justification... bytecodes were a thing since way before JS showed up, and AFAIK there was no technical reason it couldn't have been like that from the beginning
This would be the last step predicted by this 2014 (!) talk: https://www.destroyallsoftware.com/talks/the-birth-and-death...

At the time it seemed unrealistic; asm.js had only just been invented.

ksec
The Nuke was COVID, so we need another 5 years or so to release our emotional energy from those C infrastructure.

At least I hope the nuke was COVID.....

stavros
Yeah, that talk proved extremely prescient.
paavohtl
It's an excellent talk, but IIRC the point was that hypothetical Metal language was still a subset of JavaScript. From the technology point of view WebAssembly has basically nothing to do with JavaScript, except that both are part of the web platform and commonly implemented by JavaScript engines. It's just a low-level VM which happens to be included in web browsers.
hajile
Wasm was definitely inspired by asm.js (though I think asm.js is still faster). The MVP for wasm was basically "do what asm.js does."
Mandatory Gary Bernhardt. https://www.destroyallsoftware.com/talks/the-birth-and-death...
kown7
Good rewatch - pandemic or war is close enough I guess
I don't disagree. From a technology POV, there is zero reason why anything can't be done on WASM in the future [1], and WebGPU with extensions could be used as a replacement for Metal or Vulkan. Or working on DOM access from WASM (or is that a thing already?). Others, such as Web App and Web API capabilities, could be developed. But none of these work at this point in time, and not in the near future, especially when, as you said, the platform vendors don't have any incentive to make them work.

The single-click Home Screen is definitely a platform limitation and not a technology limitation.

There are dozens if not hundreds of real-world examples where companies moved to web-based technology (IKEA, for example) and moved back to native apps simply because the experience and user retention rate are far higher in apps. React Native is bridging the gap, but many would argue that is native as well, not web.

I am not in support of everything being App Store. I think there needs to be a balance of things, and there needs to be an option for users to install (or sideload, a word I hate). What we don't have an answer for is how Apple should monetise whatever they think is needed for their R&D costs in a way that is also acceptable to users and developers.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

foothebar
I personally can't judge many things, like the performance, which is why I'm asking about everything in so (too) much detail. Do you have a link to the IKEA case at hand, maybe? Was it more of a "we want everything to move all the time, so phones are too slow" problem, probably including "we load way too much unnecessary stuff" on their web page, or what drove them to move back?
> having a trusted compiler could (eventually) massively increase performance by removing processes entirely (no more virtual memory! no more TLB flushes and misses! less task switch overhead!) and eliminating the kernel/user mode separation

I saw a talk a while ago that was advocating for the same thing, except this was about JS and not webassembly. I can't find it tho - I remember it being related to the WAT js talk; It also mentioned that it would eliminate rings on the cpu (and simplify cpus) and context switches which would make execution faster; they were citing some MS research on the matter - damn I really wanna find the talk now...

Edit: https://www.destroyallsoftware.com/talks/the-birth-and-death...

thanks BoppreH

MS research: "Hardware-based isolation incurs nontrivial performance costs (up to 25-33%) and complicates system implementations" (virtual memory and protection rings); I think MS knows what they're talking about here

throw10920
Thanks for the link. I would argue that a true trusted compiler needs to accept an unmanaged language and emit code without a runtime, though. A runtime is cheating, because you can always make one that implements an iron-clad sandbox that doesn't require processes...by implementing a (very slow) VM.

To put it another way - I don't think that security or performance are that hard to achieve on their own - the hard part is getting both at once. And then adding expressiveness on top is even more difficult, as Rust has aptly demonstrated.

kaba0
Rust is not secure at all in the sense used here — untrusted, arbitrary user code written in rust is a security threat.
kmeisthax
More specifically, unsafe blocks may violate the compiler's security guarantees and procedural macros actually run inside the compiler process at build time. Declarative macros do this too, but they're far too restricted to allow shenanigans. Procmacros can disable Rust's stability guarantees[0].

[0] https://github.com/m-ou-se/nightly-crimes

kaba0
Nah, that’s not what I mean. It is a Turing complete language — if it is used to interpret some other language inside itself, it can’t add anything to that languages’ guarantees automatically. You can write a javascript interpreter in rust that is trivial to exploit and access e.g. the file system or whatever.
hinkley
I think I heard of this in the early to mid 00’s and it was in the context of Java. This set of ideas has been cooking for a while. Might be about time to taste the proverbial soup.
SigmundA
Singularity was an experimental OS written in a variant of C# and .NET managed code by MS Research that ran using software-isolated processes rather than hardware isolation; this is probably what they were referencing:

https://en.wikipedia.org/wiki/Singularity_(operating_system)

kaba0
http://joeduffyblog.com/2015/11/03/blogging-about-midori/

There is also a really great blog about Singularity’s “rebirth” experimental OS, Midori, that continued in its footsteps.

Getting strong vibes of The Birth and Death of JavaScript (2014) [1], one of the numerous great talks by Gary Bernhardt.

My engineer side is happy seeing how strong tooling enables such creative features with high assurances.

My futurist side is dreading the day Intel launches their first Javascript/WebAssembly-only processor.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

MangoCoffee
I don't think JavaScript is going to die, but it's time we had another option for the web. JavaScript has its warts; some people love it and some don't. It's not fair for JavaScript to be the only option. I see WebAssembly as a way for people who don't like JavaScript's warts to use their favorite language to develop for the web.
Omnius
I feel like the only people that "like" JavaScript are those that had it as their first language. It's needed, and it's better than it was, but compared to just about any other language it's a total mess.
k__
My career was C, Java, PHP, JavaScript.

I like JS the most.

It's flexible, lightweight, and omnipresent.

The only other mainstream language that gives me that feeling is Rust.

allisfalafel
Rust is hardly omnipresent. I understand it has trouble with lesser-used architectures and operating systems. While yes, you probably don't use them, they do still exist.
chaostheory
The ECMA improvements are nice but imo it's too hard to manage large Javascript programs, or Typescript wouldn't have such a large following.
k__
Good point.

I'm using TypeScript and JavaScript interchangably.

I know JavaScript like the back of my hand, so TypeScript isn't telling me much new, but the static type checking allows me to keep less of that knowledge in working memory. Pretty awesome language!

mmastrac
ARM already has Java and JavaScript extensions in their CPUs, so that day isn't completely off the horizon yet.

I'm not even sure it would be a terrible idea, as we'd have a very interesting JS/WASM-like set of opcodes that we could target with _any_ compiler.

hajile
The "Javascript instruction" is a bit of a misnomer.

JS accidentally got part of the x86 execution model for float conversion baked into the spec. ARM added an instruction to mimic the old x86 one. It's potentially useful in some other contexts too.
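For the curious, the conversion in question is JavaScript's ToInt32 (truncate toward zero, then wrap modulo 2^32), which FJCVTZS performs in a single instruction. The semantics are easy to see in JS itself:

```javascript
// JavaScript's |0 applies ToInt32: truncate toward zero, then wrap
// modulo 2^32 into the signed 32-bit range. FJCVTZS gives ARM this
// exact double -> i32 conversion in one instruction instead of
// several, which is why it is nicknamed the "JavaScript instruction".
console.log(-1.9 | 0);        // -1  (truncation toward zero)
console.log(2 ** 32 + 5 | 0); // 5   (wraps modulo 2^32)
console.log(2 ** 31 | 0);     // -2147483648 (wraps into signed range)
```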

mmastrac
Regardless, FJCVTZS is still literally a "Javascript" instruction: "Floating-point Javascript Convert to Signed fixed-point, rounding toward Zero".
lelandfe
Brilliant, hilarious talk, thanks for linking.
I believe you are looking for Gary Bernhardt's talk:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

PeterBarrett
Fantastic! Thanks for that, I was searching for a good 20 mins, much appreciated.
If you haven't seen "The Birth And Death of Javascript," you should watch it.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Oct 19, 2021 · pchanda on Nim 1.6
Also this talk by Gary Bernhard: https://www.destroyallsoftware.com/talks/the-birth-and-death...
The intro has some insider talk (ex, bindle). If you read on a little it describes how Hippo is a PaaS and uses WASM under the hood to make it happen.

This has some great potential. Is the node running the code on Linux, FreeBSD, Mac, Windows, or something else? It doesn't actually matter, because WebAssembly is being executed.

It reminded me of the Birth & Death of JavaScript talk from a bunch of years ago (it's really about wasm)... https://www.destroyallsoftware.com/talks/the-birth-and-death...

M.E.T.A.L (https://www.destroyallsoftware.com/talks/the-birth-and-death...) - hopefully "pandemic" won't take the place of the 5-year war between 2020 and 2025
pjmlp
That depends pretty much on the anti-vaccine folks.
I feel like a fool this somehow put a smile on my face :)

And my memory may be hazy, but in those days Windows always crash every few days and we normally "shut down" our computer after usage.

But the most interesting thing is seeing a running Windows 95, in the English version. Maybe screenshots [1] weren't hard to come by. And we are lucky the nuke didn't happen in 2020.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

foepys
How far we have come.

Nowadays I usually let my notebook hibernate (not sleep) for the night and when I return the next morning, I often discover that I forgot to stop my debugger and it is still attached to my running program. Not even a hiccup when resuming.

progman32
Heh. In the meantime, here I am having to software-reset my network card after almost every resume. Thankfully, like beer, Linux is the answer to, _and the cause of,_ all the world's problems.

# echo 1 > /sys/bus/pci/devices/0000\:0b\:00.0/remove
# echo 1 > /sys/bus/pci/rescan

It's all coming true...

https://www.destroyallsoftware.com/talks/the-birth-and-death...

ilammy
Especially the war thing, uh-huh.
Jun 17, 2021 · 73 points, 43 comments · submitted by NilsIRL
thunderbong
Submitted multiple times earlier. The thread with the most comments (227 comments) -

https://news.ycombinator.com/item?id=7605687

base698
Wat?
tedk-42
Javascript will never die, unless there's something to come along and either replace browsers or replace the scripting language used by browsers.

WASM is not a replacement for JavaScript and never will be. It's not even a damn language.

mysterydip
Javascript is the C of the web. We had a chance to learn from hindsight, and instead we made a different behemoth without realizing it.
kortex
Or is it? One thing JS and C have in common is a minimal core, kinda loosely defined, relatively easy to implement, that nonetheless lets you create powerful abstractions. I don't think it's a coincidence. I think anything more sophisticated and "well-built" ironically would have seen less initial traction, and lost first-mover advantage.

That JS wasn't a Lisp is kind of a pity, though. But again, Lisps have been around longer than almost any other language yet are still comparatively obscure, which tells me something inhibits their broader adoption.

brutal_chaos_
I beg to differ. WASM, if it doesn't go off the rails with future revisions, will supplant JavaScript because it is a compilation target. This, in effect, opens up any language to run on the web. JavaScript may not fully disappear, but its usage will most likely greatly diminish.
dmitriid
> will supplant JavaScript

Not until either of the two things happen:

- WASM gets GC and can work with DOM directly

- DOM is supplanted by canvas- and webgl-based libraries and frameworks

habibur
Those things will be built over time. It's a large field out there waiting to be explored.
Fergusonb
Javascript is still popular today even if you ignore the browser. Node is all over the place.
tedk-42
A small JavaScript text file which can currently run natively in any browser will always be far superior to a compiled WASM binary blob.

WASM fills in a use case where you need to run highly performant code on a browser. It just so happens that you can write it in whatever language you choose.

moron4hire
> WASM fills in a use case where you need to run highly performant code on a browser.

That's not exactly correct. It's possible to write equally performant code in plain JavaScript by being careful to avoid certain features of the language (e.g. garbage collection, expando objects, dynamic types for variables, etc.). Writing code in a stricter language like Rust or C++ might make hitting those targets a little easier, but using TypeScript can make it easier, too.

The exceptions are loading, parsing, and some JIT profiling: WASM can potentially be smaller, and it can skip the majority of parsing and early profiling.

So WASM shouldn't be thought of as a tool for achieving speed. It should more be thought of as a means for running cross-platform code, for languages other than Javascript.
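A hedged illustration of that point, using nothing beyond plain JavaScript: the same computation written naively, and then written the way a JIT likes it, with one preallocated typed array and monomorphic number-only code so there is no garbage-collection pressure to begin with. All names here are made up for the example.

```javascript
// Naive version: allocates one small object per element, so the GC
// has work to do and the element type is a heap object.
function sumNaive(n) {
  let values = [];
  for (let i = 0; i < n; i++) values.push({ v: i * 0.5 });
  return values.reduce((acc, x) => acc + x.v, 0);
}

// JIT-friendly version: one allocation up front, fixed element type,
// monomorphic number-only loop, no GC pressure.
function sumTight(n) {
  const values = new Float64Array(n);
  for (let i = 0; i < n; i++) values[i] = i * 0.5;
  let acc = 0;
  for (let i = 0; i < n; i++) acc += values[i];
  return acc;
}

console.log(sumNaive(1000) === sumTight(1000)); // true: same result, different cost
```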

qbasic_forever
There's no difference--that small JS text file gets compiled on the fly into platform-specific assembly language with today's JIT compilers in browsers. WASM is just skipping the text source step and giving browsers something they can compile directly.

I do agree it is a shame to lose direct insight to the text source code, but let's be honest the production JS shipped to browsers today is far, far from being human readable. It's minified and shrunk to the most small and incomprehensible degree to save bandwidth. View source and try to read and understand the JS on any big site like facebook.com, etc. and you won't get very far.

tedk-42
There's a big difference in how you develop, ship and share the code.

I would say a page loading a 1KB JS file is better off than one loading a WASM binary blob that's 2MB.

Edit: to be clear, I'm not focusing on performance

bryanrasmussen
sad news for some people, but now that it has been truly made independent of the browser, JavaScript will never die, in the same way that every other programming language that has ever been used to build applications is still puttering around.

There are too many people with worthwhile JavaScript skills to service, too many companies who have things built in it and employees with those skills who will keep building things in it.

Maybe in 2050 there will be Cobol style posts on HN about JavaScript.

on edit: changed removed from to something more understandable

tbrownaw
I think it's not too unreasonable to say that this includes a successful prediction of webassembly.
m1kal
I tend to disagree. Asm.js is dying. Webassembly is not "everything on top of JavaScript". While higher layers can work the same as they could with asm.js (still in the web browser), it's not what GB predicted.
AprilArcus
asm.js already existed when this talk was given, with AOT compilation support in Firefox
maven29
Wasn't the contemporaneous asm.js already in commercial use in 2014?
trixie_
I was thinking of a new 'open source website' concept, where you can call your website 'open source' if all the javascript behind it is unobfuscated/unminified, comments still there.

I would even like to take it a step further, an 'open source' OS where every binary has symbols available. The system can be stopped anywhere and the full stack trace is understandable.

junon
Symbols being stripped isn't usually for "open source" reasons. Neither is minification. It's done to reduce sizes.

A binary with symbols has tons of extra cruft that is largely unnecessary. Even if you have them, what good does a stack trace do if you don't also have the source to fix it?

DaiPlusPlus
> Even if you have them, what good does a stack trace do if you don't also have the source to fix it?

It makes patching the binary yourself a heck of a lot easier. As is often the case with legacy enterprise software from a vendor now long-gone…

junon
Do you want full open source? Or allow enterprise software? pick one.
DaiPlusPlus
How are those mutually exclusive?

Enterprise software ends up as something invariably so specific to the company using it that it doesn't matter if it's open-source or not - absolutely no-one could extract any value from it other than the original company and whatever kafkaesque internal processes led to its development.

Anything more general-purpose is already commodity - ERP being the prime example.

trixie_
The point is complete transparency into everything happening on your computer through installed application, operating system, or loaded website. Absolutely nothing would be running as an obfuscated binary.

The source to build it is not just 'open'. All running binaries would be able to be mapped to their corresponding source.

junon
Again, we don't "obfuscate" binaries to make them harder to decipher. We strip them because debug symbols are massive and cause a lot of bloat and sometimes even performance hits.

If you want a system like that, just compile Linux and all of your tools in debug mode. But again, why do this when you can just recompile from source?

This is a gross misunderstanding of how computers work, I feel.

qbasic_forever
I get more and more excited the closer we get to an eventual reality that a kid learning about programming, operating systems, Linux, etc. is just a click on their phone away from starting it all. No gatekeepers--appstores, sideloading, hacks, jailbreaks, locked down BIOS, etc.--to hold them back. Your browser, a shell, and the entire world in front of you.
DaiPlusPlus
You’re still stuck in the browser sandbox (I prefer to think of it as a Browser-Plato’s-Cave in the context of whole nested systems). And you can’t escape the rectangular box imposed on you by the browser. …especially on mobile OSes: there is essentially zero system-integration by going that route because Apple and Google both need incentives to entice people into forking over that lovely 15-30%.

Things like Push notifications. Background activity. Guarantees about data persistence. First-class Home Screen app icon. Ability to directly share data with other native applications even if they don't want to (to the extent it's enabled by the platform's native Share Activity). And to a lesser extent: the ability to use native widgets for the best user-experience on that platform. Too many SPAs fall into uncanny-valley when they start to look too similar to native widgets - and it's off-putting.

And the fact that Fortnite still isn’t available as a PWA on iOS is telling… I was expecting them to launch an OnLive-like service rendered to a <canvas> over WebRTC by now… and worryingly this gives Apple a strong incentive to immediately halt any work on improving WebRTC in PWAs…

outofpaper
> I was expecting them to launch an OnLive-like service rendered to a <canvas> over WebRTC by now… and worryingly this gives Apple a strong incentive to immediately halt any work on improving WebRTC in PWAs

Like Stadia? (https://youtu.be/3_RAyxpFurU?t=113)

DaiPlusPlus
Exactly the same, yeah.
croes
Are we getting closer or is it the opposite?
qbasic_forever
Very close, almost all of POSIX is implemented in browsers now. Look at for example JSLinux https://bellard.org/jslinux/ for the classic example, or more recently stuff like pyodide that compile real desktop Python to run fully in the browser with WASM https://github.com/pyodide/pyodide

AFAIK there are a few loose ends still TBD with WASM & WASI to support sockets and networking, but once that's in place we'll likely have a full WASM POSIX environment in your browser. Get some of the core tools like gcc, etc. prebuilt and you're good to go to just start building the world in your browser. No app store reviewer to hold you back, no megacorp to decide homebrew apps aren't allowed anymore... the world is your oyster to create and share anything.

d_tr
On the other hand, having a 70M sloc OS act as a bootloader for a similarly sized pile of hacks which is treated like an OS does not sound like a very exciting platform...
dmitriid
> almost all of POSIX is implemented in browsers now.

POSIX is not implemented anywhere. There are degrees to which it's implemented in various systems. It also doesn't mean that having POSIX implemented makes things accessible to anyone, or that this 40-year-old standard is even relevant anymore.

> Look at for example JSLinux https://bellard.org/jslinux/ for the classic example

It doesn't mean that POSIX is implemented in the browser:

- it's basically an emulator running on top of some browser tech that runs linux.

- Linux is mostly, but not entirely POSIX-compliant

> we'll likely have a full WASM POSIX environment in your browser. Get some of the core tools like gcc, etc. prebuilt and you're good to go to just start building the world in your browser

This will literally never happen outside of some geek circles. If only for the simple reason: you'll have to download the entirety of Linux and its tools into the browser for every user.

qbasic_forever
I don't know where your rage is coming from, other than to be a pedant.

Downloading the entirety of Linux, coreutils, etc. _is_ the point. Right now a kid with an iPhone is limited to programming it with whatever toy apps Apple has decided to allow on their app store. Or on Android you're lucky to be allowed to use Termux to run a little proot environment to mostly use core linux tools. There's _no_ way for that kid to learn and use 'real' programming languages and tools--want to learn rust? Sorry, you're SOL. Want to program some Go? Not going to happen. Etc.

But give that kid a browser window that's a full POSIX shell with gcc and all the coreutils built. Perhaps some CDN hosting precompiled packages just like apt/rpm/etc. repos... and now we're talking. They can do _anything_ and no app store limit or whatever will stop them.

This is precisely for geek circles. The kind of geek that opens up that weird qbasic.exe they found rooting around their parents' DOS machine and then had their mind blown at the possibilities. We don't have anything like that for kids today and it's a real shame, but the birth and death of JS demo here made into reality can change that. I look forward to the day some kid clicks a link, sees a blinking bash prompt and a pointer to read some man pages or help files and has their mind blown too.

layoutIfNeeded
Lol. Sure, kids will be learning Rust (!) on a phone (!!) browser (!!!). Now that I think about it, turning web browsers into these bloated pseudo-OSs was well worth it, if only for those nerd kids learning Rust on their phones!
dmitriid
> I don't know where your rage is coming from

I don't know why you assume I have any rage. I'm just pointing out facts.

> There's _no_ way for that kid to learn and use 'real' programming languages and tools--want to learn rust?

Ah, yes. "Real programming languages", not "toys provided by Apple". "Kids wanting to learn Rust on a phone".

> But give that kid a browser window that's a full POSIX shell with gcc and all the coreutils built.

And?

> They can do _anything_

No. They still can't do "anything". Since Linux is already running in the browser, go ahead, run it on a phone, install rust into it, and go do "anything". Then come back and tell me if any kid will want to do that.

> This is precisely for geek circles. The kind of geek that opens up that weird qbasic.exe

Ah yes. So now you're limiting all kids to just geeks who are willing to tinker with a rust compiler running in a shell in a linux running on a wasm on a browser on a phone. God forbid it would be other kids with "apple toys" and "non-real programming languages" that don't require "a full POSIX shell with gcc and all the coreutils built".

> I look forward to the day some kid clicks a link, sees a blinking bash prompt and a pointer to read some man pages or help files and has their mind blown too.

More likely than not that kid will see a blinking prompt, and will go away saying "what in the fresh hell is this".

Modern "toys" open significantly more opportunities for kids to learn than any POSIX shell. Apple's Swift Playgrounds will teach and interest a few orders of magnitude more kids in programming than a "bash shell with a prompt to read man pages".

mechEpleb
I get the feeling that the world was significantly more open to kids growing up with access to computers in the 80s and 90s, because as someone whose formative years were the 00s and early 10s, the interesting bits of technology were already buried under a thousand layers of abstraction and indirection. Kids nowadays won't ever learn what a shell is unless they go out of their way to learn it for some reason.
dimal
Sorry, you would be wrong. There was no internet. You were lucky if there was one other person around you who knew BASIC and could answer questions. Everything you needed to learn came in books that cost $30 each (at a time when $30 was a lot of money), and most of the books at the computer store were for using spreadsheets and word processors, not actually coding. Great, you had a DOS shell right in front of you, but learning what to do with it was a struggle. And writing .bat files isn’t very exciting. Maybe if you were in a place like SV, there would be tons of resources to learn, but in most other places you would be in an information desert. I was interested in computers but gave up because I hit the limit of what I could learn pretty quickly and couldn’t get any further. Kids have it MUCH better today.
sensanaty
That's really not true though, lots of dev-focused tools explicitly require hopping into a terminal and typing away commands. Plus at a certain point, anyone with any real curiosity is going to think "I wonder what's hidden behind these abstractions I see all the time?" and dig deeper anyways
bicolao
It's still a lot harder (abstraction layers add complexity) to get close to metal as opposed to say, DOS.
mechEpleb
You'd think that, wouldn't you? Yet here I am, someone who grew up thinking computers were obtuse and boring because getting the computer to do anything interesting seemed like it would require knowing a thousand things not related to the issue at hand. I was always a mechanically minded person, so while the inner workings of things seemed interesting, making toy websites (the entry level computer thing to do in that time period) seemed about as interesting as watching paint dry.

But here I am, working as a software engineer and half way through my MSc in computer science. It took a couple of low level microcontroller classes in my mechanical engineering undergrad for me to see the light.

todd8
My daughter finished her CS degree about a year ago. I was glad to see that one required class had her build a very simple computer from gates as her final project. I don’t know the details, but I think the computer had perhaps 10 different instructions. She didn’t like that project very much, but I thought it was valuable; someone has got to understand the principles of operation at that level.

I started programming in the 60’s and at times had to load instructions into machines using binary on front panel switches. That’s the reasons that machines of that time had the lights and rows of switches on the front so that one could debug programs by looking at the lights to see the program counter, data value, or instruction. See [1] for a photo of a large front panel on an iconic machine of the time.

I even recall pulling plug panels out of card processing equipment to reprogram the sorting and selecting of input data being run through machine as huge stacks of punch cards, see [2] for a picture of a mid-20th century plug panel for data-processing.

The layers of abstraction are important. They enable us to construct some of the most useful, complex, and intricate artifacts ever made on our planet. Today I program in high level languages, and I get to use powerful frameworks, database systems, and amazing hardware right on my desk. Yet, I do miss some of the fun of invention and hacking on systems that I really understood in depth.

[1] https://en.m.wikipedia.org/wiki/Front_panel#/media/File%3A36...

[2] https://www.ebay.com/itm/133017817600?hash=item1ef87ad200:g:...

ksec
>My daughter finished her CS degree about a year ago.

Interesting I thought CS was all software where computer with gate and low level programming were something of EE / Computer Engineering.

todd8
I believe that it was a two semester sequence that covered gate level logic, microcontrollers and assembly language, and a hardware lab with oscilloscopes etc. I would have picked slightly different classes than the ones she ended up taking for her degree, but having worked in industry myself, I thought the exposure to hardware she got could be useful for her in the future.

My own educational background (Math, EE, CS) has given me a lot of flexibility over the last 50 years.

Obligatory: https://www.destroyallsoftware.com/talks/the-birth-and-death...
guscost
This is “inspired in part” by the same (excellent and a little scary) talk. Which makes you think, when would something like this have appeared if the talk had never happened?
pjmlp
Some other bytecode among the dozens in use since the early 1960's would have been chosen instead.
eximius
Asm.js was already in the wild and referenced in the talk, so while it is humorous, I think the path was already set.
teeray
This was my first thought when I saw the headline. What I once thought was satire was, in fact, prophecy.
ksec
We are lucky the nuke didn't happen in 2020, but COVID did. And there are people fleeing San Francisco, for different reasons.
May 16, 2021 · teddyh on Making Your Own Tools
The Birth & Death of JavaScript

https://www.destroyallsoftware.com/talks/the-birth-and-death...

amelius
> (...) Death of JavaScript

Don't worry, people will run V8 inside WASM.

Obligatory reference to The Birth and Death of Javascript: https://www.destroyallsoftware.com/talks/the-birth-and-death...
I certainly would not say ‘more open’. It is newer, but that's a mark against it, not for it.

It lacks most of java's ‘batteries’—there are projects like WASI that try to resolve this at a low level, but they miss the point somewhat. On the other hand, wasm is saddled with fewer assumptions about the type of code that will run on it. Wasm has generally poorer performance, which is mostly (though not entirely) an artifact of its design, and hence not easily rectifiable[0]. It's not clear to me the extent to which the jvm suffers similarly[1].

Socially, I think wasm is not as interesting as java. Write-once-run-anywhere was solved in practice not by java but by open source; by compatibility libraries like sdl; by standards, like posix and opengl; and by static linking. (Yes, it's not quite as convenient as just plopping in a jar, but it's close enough.)

I don't think we're ever going to see a desktop full of wasm apps. Despite stated goals, the point of wasm was always to be an evolution of the web as a platform for applets. The Birth and Death of Yavascript[2] has many insights thataways (even though it failed to predict wasm as such).

0. https://www.usenix.org/system/files/atc19-jangda.pdf

1. It's also difficult to directly compare the performance of the two systems in their current state. A number of proposals for both (gc for wasm, value types for java, simd for both) are yet pending and are likely to change the runtimes' respective performance profiles.

2. https://www.destroyallsoftware.com/talks/the-birth-and-death...

tachyonbeam
Why do you say that WASI misses the point? (genuinely curious)
I feel the need to repost this link:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

tyingq
I'm somewhat curious at what pace WASM will gain "market share" while it's only practical to target it with C, C++, Rust, etc.

Supposedly, there are plans to expand it to where it looks more like a virtual machine and less like ASM. Adding things like garbage collection, direct DOM manipulation, polymorphic inline cache, etc. Things that would make it possible to run a decent scripting language without pulling in some huge runtime.

Or forgetting the scripting languages, just the direct DOM access might make it less tedious to use with compiled languages.

It's sort of a second run at getting "applets" right, but in a cross-language way.

MaxBarraclough
> Things that would make it possible to run a decent scripting language without pulling in some huge runtime.

Seems to me the web already has an answer for that: transpile to JavaScript, and leverage the existing high-performance JavaScript engines.

Dart does this, for instance.

cwalv
Graalvm native image has the ability to target llvm with a goal of making it easier "to target a new architecture without having to implement a complete new backend for Native Image."

https://www.graalvm.org/reference-manual/native-image/LLVMBa...

nicoburns
> while it's only practical to target it with C, C++, Rust, etc.

You might not want to bundle Rust in with C and C++ in this context. Rust is pretty accessible to developers coming from scripting languages (JavaScript, Python, etc).

Direct DOM manipulation (and the ability to send reference types across the JS-wasm boundary) is a pretty big deal though I think. Currently it doesn't make an awful lot of sense to build code that interacts with the DOM in WASM (although in Rust there are automatically generated bindings that go through a JavaScript shim).
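The "JavaScript shim" pattern mentioned above can be sketched in a few lines. This is a hand-written illustration, not wasm-bindgen's actual output: wasm can only pass numbers across the boundary, so DOM objects live in a JS-side handle table, and the imported functions translate integer handles back into real calls. `heap`, `createElement`, and `setText` are invented names, and a plain object stands in for a DOM element so the sketch runs outside a browser.

```javascript
// Handle table: wasm refers to JS objects by integer index.
const heap = [];
function addToHeap(obj) { heap.push(obj); return heap.length - 1; }

// Functions a wasm module would import; they turn integer handles
// back into object operations (here, on a stand-in for a DOM element).
const importObject = {
  dom: {
    createElement(tagHandle) {
      const el = { tag: heap[tagHandle], text: "" }; // stand-in for document.createElement
      return addToHeap(el);
    },
    setText(elHandle, strHandle) {
      heap[elHandle].text = heap[strHandle];
    },
  },
};

// Simulating the calls compiled wasm code would make through the shim:
const tag = addToHeap("div");
const el = importObject.dom.createElement(tag);
importObject.dom.setText(el, addToHeap("hello"));
```

A real shim also has to manage handle lifetimes (freeing table slots when the wasm side drops a reference), which is part of the overhead being discussed.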

saagarjha
> Rust is pretty accessible to developers coming from scripting languages (JavaScript, Python, etc).

Not in a way that is unique to Rust…

enos_feedler
I think WASM success won't be measured by market share gain on the web, but by number of new non-browser things that are programmed with WASM.
dmt0
EOS blockchain contracts are compiled to WASM.
zlsa
Agreed. An interesting example is the new Microsoft Flight Simulator, which uses WASM[0] as a sandboxed language to create airplane gauges and custom flight models (HN discussion[1]).

[0]: https://forums.flightsimulator.com/t/getting-started-with-wa... [1]: https://news.ycombinator.com/item?id=24281400

enos_feedler
Very Cool!
Stevvo
In that specific situation I find it very uncool. It's targeted only by C++, a language that is far too complex to have any business being used as a scripting language to make needles move on a virtual altimeter. You get all of the drawbacks with none of the benefits of C++.
enos_feedler
I don’t find it uncool at all. Well we have to start somewhere. Don’t you think this usage of WebAssembly is at least providing some feedback into it’s ecosystem to make improvements for next time? Id rather have developers choosing webassembly even if the implementation doesnt meet our expectations yet
jedimastert
It seems more like it's trying to be LLVM, but for the web?
nightowl_games
> Adding things like garbage collection, direct DOM manipulation, polymorphic inline cache, etc. Things that would make it possible to run a decent scripting language without pulling in some huge runtime.

Doesn't this just imply building a 'huge runtime' into the language?

ghayes
FWIW, AssemblyScript[0] adds its own garbage collection runtime into Wasm. The team had a good post recently on the performance implications[1]

[0] https://www.assemblyscript.org/

[1] https://surma.dev/things/js-to-asc/index.html

tyingq
Not sure I understand. If parts of the runtime are already in the browser, then there's less to download.
zozbot234
Not necessarily. GC can be implemented in a way that's fairly compatible with low-level programming, by leveraging support for multiple address spaces and/or memory segmentation. And WASM needs that kind of support anyway for its planned module-based security.
The prophecy has come true... https://www.destroyallsoftware.com/talks/the-birth-and-death...
josephg
Does it run Firefox? I’d love to see a browser in a browser, just to experience mediocrity in all its awful glory.

Networking would be tricky, but you could encapsulate TCP over a custom websocket tunnel.

marvin8
v86 has Firefox: https://copy.sh/v86/?profile=archlinux&c=./networking.sh;ech...
exikyut
No, but it does run NetSurf: http://www.boxedwine.org/app/netsurf/

...with exactly the caveat you described, because the wasm port doesn't support websockets yet.

Alternatively you can run *cough* IE6 in JSLinux.

anthk
You can run w2k and probably XP under TinyEmu JS.
ant6n
I think I actually saw this talk live. What are the chances. Definitely worth checking out, it’s pretty fun.
brian_herman
Wow I am so jealous!
re
Is there a relevant timestamp for those of us not wanting to invest 30 minutes in the full talk?
phoe-krk
Yes, 0:00-29:23. The point of the talk is not laid out in a single moment; it's more like a story that unfolds over time, allowing the viewer to understand its implications bit by bit.
ToFab123
No, not really. It's a fascinating talk to watch, so you're not going to totally waste those 30 minutes.
EvilEy3
What prophecy? It is written in C++.
pjmlp
WebAssembly like all bytecode formats, doesn't care about the source language.
EvilEy3
WebAssembly doesn't make it a JavaScript application either. It is merely a runtime.
pjmlp
Shipped as part of most JavaScript runtimes, originally.
mywacaday
If we are living in a simulation it's slightly more troubling that it might be running in a browser.
stjohnswarts
I always hoped we were running in a VM written in Rust rather than Python or JavaScript. But it's hard to explain the platypus with a Rust-based one...
mywacaday
If we are in a simulation how could we force a buffer overflow and what would it look like to us
eliasbagley
you'd probably need a particle accelerator or something
ineedasername
It would explain why time seems to go by so slowly when a person is bored: not many people want to watch that simulation, so a lack of CPU cycles slows everything down.
Mar 29, 2021 · ksec on The Deno Company
It is time to post this again. The Birth & Death of JavaScript ( 2014 )

https://www.destroyallsoftware.com/talks/the-birth-and-death...

I can't help to remind everyone to watch The Birth and Death of Javascript:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Cousins of Electron and their RAM requirements are a steep reminder that the DOM is cancerous and will have to evolve and suffer from deprecation.

Or maybe introduce metrics and good performance practices?

This is just the next step in the evolution of browsers into full fledged operating systems. The Web APIs would become the next POSIX. And we might end up with uni-kernels that would just be running browsers on bare-metal. Want to be stalked, here is your Chrome OS. Want privacy, here is your Firefox OS. (Where did I last hear of these? Hmm...)

Who knows, when all this is over someone might make a browser for this new browser OS.

Gary Bernhardt saw this coming long back. https://www.destroyallsoftware.com/talks/the-birth-and-death...

Bundling is dead. You just need to compile your favorite editor into asm.js and run it in your browser.

Or why not your terminal :) https://www.destroyallsoftware.com/talks/the-birth-and-death...

Gary Bernhardt's presentation on the "history" of Javascript from 1995 to 2035 is hilarious and seems like something you'd enjoy: https://www.destroyallsoftware.com/talks/the-birth-and-death...

It takes things way beyond simply "emulating Javascript in Javascript", yet is presented so well that you barely notice the transition from current (2014) reality to a comically absurd future.

My own experience is that startup employers really do have unrealistic/over-optimistic expectations of what's actually required to achieve a certain result. Anything worth doing in the startup world hasn't been done before, so it's usually not possible to invent it on a schedule.

People who will throw something together are heroes, but very often the result is complete crap that has enormous costs if it gets adopted as a standard. I'm being only slightly facetious asking, how much has humanity been set back due to the fact that less than two weeks was spent inventing Javascript? Maybe they should have spent an extra week or two on it and saved millions of person-years down the road [ h/t the recent repost of https://www.destroyallsoftware.com/talks/the-birth-and-death... ]

At the same time, my own psychological makeup made me vulnerable to trying to meet those unrealistic expectations, and that's on me. It took decades to really figure out what's going on, and ultimately it's a matter of acceptance rather than actionable change, although change is possible. With so much of our capabilities and challenges being unconscious, I continue to marvel at the miracle that anything gets done at all.

Dec 31, 2020 · tomasreimers on WebAssembly Studio
Jokes about the [Birth and Death of JavaScript](https://www.destroyallsoftware.com/talks/the-birth-and-death...) aside, I love the concept of WASM and that we're making the web a more robust compile target.

One of the big limitations in my mind is that I still don't know of that many people using it in production at scale. Is there a list, or well known set of examples (other than Figma), who are using WASM in prod?

arconis987
Figma is written in C++, compiled to WebAssembly. Probably the most advanced web-based design tool in the world.

I think they have like 4-5M users, and a $2B valuation.

https://madewithwebassembly.com/showcase/figma/

stopyellingatme
Figma is an actual tool, written for the web. Lots of great engineering happening over there.
jayflux
I know the Birth and death of JS gets mentioned quite a lot in WASM context (even as a joke) but I really don't see that being the case. Most web applications/sites are simple, and won't need to be compiled at a lower level in order to work. On average JS developers will be cheaper than C/C++/Rust developers to hire, so I don't see shops changing their personnel or stack anytime soon, especially if the job is to knock up a website that's a bit interactive.

It simply won't be viable to have low-level engineers do things like "build a dropdown nav" or "make an interactive carousel", and these sorts of tasks will always be around.

WASM is there to augment JS in places JS isn't suitable for, rather than outright replace it. JS will definitely still exist.

To answer your Q there's also a comprehensive list of projects using Web Assembly on this site: https://madewithwebassembly.com/all-projects

brundolf
The problem is that it's hard to use as a primary language because of a lack of direct Web API access. So basically right now it's not a casual decision to say "I prefer X over JS, so I'll just use that instead because it targets WASM!" Because there's this enormous cost of crossing that API barrier (which means you'll be using some JS regardless, directly or indirectly).

So for now the value-proposition only really makes sense for a fairly narrow subset of projects: mostly ones where client-side pure-compute is a bottleneck. Theoretically API access is being worked on, but I think it's still a ways out.

postalrat
Direct access seems improbable and even undesirable. Libraries can be written and shared to expose what you want to the wasm you need to run.
brundolf
Lots of people would like the web to become a language-agnostic platform. I don't hate JS the way some do, but even I'd be excited about that prospect. That can never really happen until WASM has direct API access.

That said, I tried just now to find where I read that this was on the long-term roadmap and I'm having trouble finding it. So maybe you're right that it isn't currently planned.

dboreham
I'm also intrigued by the possibilities of WebAssembly. However I also have a nagging feeling "didn't we already do this with Java, 20 years ago?". Wondering what's different this time, or is this just a MySpace vs Facebook second-times-the-charm thing.
codeflo
I’m not sure either. There are some technical differences. WASM is more low-level than JVM bytecode, which is important for some applications and makes it a more normal compiler target. It’s more integrated with the rest of the browser: it doesn’t come with its own rectangle like Flash and Java did, but instead manipulates the DOM just like (or at the moment, only through) JavaScript. And it’s a true open standard, not a corporate-controlled platform.

Do these differences really matter, or is it mostly just the timing?

brabel
WASM is definitely more low-level than JVM bytecode right now... but I'm not sure it'll remain that way given the many proposals that are very likely to be implemented in the future (even though it's taking a long time for even simple ones, like multi-value returns, to get into the standard). For example, with WASI and GC, which will finally enable direct access to the DOM, how much lower-level will WASM be compared to the JVM? Not much, I would say.

The big difference is that WASM, from the beginning, is supported by all the browsers natively (a consequence of it not being a proprietary technology, but an open standard), not as a plugin... If Java had started that way, the story would've turned out quite differently (but we know that at the time, the browser everyone was using was made by Microsoft, and Sun was a competitor so this would've never happened).

novok
It starts immediately, while any embedded Java applet took forever and looked really ugly. We've also done this with Flash apps.
tadfisher
The difference is that wasm is designed to be sandboxed, instead of the JVM which was designed to normalize disparate computing environments. In practice, this entails a lot of work to create runtimes to do anything interesting in wasm, but the behavior is secure by default (ignoring side-channel attacks, of course). Java applets run in a sandbox by default too, but the JRE itself has unmitigated access to the host system, so the security boundary isn't as well defined. Users can also be tricked into trusting malicious applets, which is a built-in mechanism to escape the sandbox; browsers hopefully will not provide this feature for their wasm runtimes.
WalterGR
> the JRE itself has unmitigated access to the host system

As does the browser in which WebAssembly executes.

> Users can also be tricked to trust malicious applets

But the ability of applets to be trusted could have been eliminated entirely. To rewrite OP's question, then:

"If we had entirely gotten rid of trusted applets, couldn't we already do this with Java, 20 years ago?"

Of course, we didn't get rid of them, but that's still a valid question vs. inventing another technology.

Closi
Can't we just do this in sandboxed regular assembly running inside a virtual machine?!

/s

asiachick
which assembly?
codeflo
Interesting thought experiment. However, you have to realize that Sun wasn’t a charity. At the time, platform vendors were hoping to target intranet applications for the enterprise. Microsoft had a similar concept for trusted applications with IE’s “Active Scripting”, with very similar results in terms of (non)security.
TheRealPomax
For one, no one needs to install WASM, nor are they going to run into versioning issues because the OS has a hard dependency on it. Every single person with a computer, from a smartphone to a render farm workstation, will have WASM support because those devices will have a browser. The only browser that doesn't support it is the legacy IE line of browsers, of which only 11 is still supported, and which only runs on systems that will happily run Edge, instead.

Devs can rely on the fact that users won't need to install a single thing, and users can rely on the fact that they're not going to have to go through the insanity that is "trying to install the right version of Boobletech(tm) Meep(r)" or, hell: "trying to get their OS to even acknowledge that the preinstalled version of java is over a decade old and it needs to stop using it instead of the new version you installed".

If your compiler can target WebAssembly, and your users are on computers with operating systems that are still supported, your WASM application will work for them, because everyone has a browser.

worik
Unless your user is using a iPhone.
austincummings
Safari has support for WebAssembly.
TheRealPomax
Not unless your iPhone is a hand-me-down from someone who themselves got it as a hand-me-down?

https://caniuse.com/?search=wasm

maxgraey
Ruffle - A Flash Player emulator over WebAssembly: https://github.com/ruffle-rs/ruffle
colesantiago
Shopify is using WASM in production [0]

[0] https://shopify.engineering/shopify-webassembly

kowsheek
We released a web-based ray tracer built with WASM that's integrated into our existing 3D/AR/VR platform: https://twitter.com/ksqio/status/1334962197324320768?s=20
konaraddi
Additional examples of WebAssembly being used at scale that I haven't seen others mention yet:

Microsoft - https://www.microsoft.com/en-us/garage/wall-of-fame/calc-ts-...

Adobe - https://medium.com/adobetech/acrobat-on-the-web-powered-by-w...

Fastly - https://www.fastly.com/blog/announcing-lucet-fastly-native-w...

echeese
Google Earth: https://earth.google.com/web/
zo1
Side note: Does anyone know why that video isn't on Youtube? I've tried to find it, but alas I can't. Copyright? Moral reasons? What?

https://www.youtube.com/results?search_query=gary+bernhardt+...

austincummings
Twitch also appears to be using it, although I don't know what for.
texodus
Perspective is developed at JPMorgan https://github.com/finos/perspective/
maxgraey
There's a site which aggregates prod use cases: https://madewithwebassembly.com/
TazeTSchnitzel
Unity, the very popular 3D game engine, can “export” games to run in the browser. WebAssembly is undoubtedly involved.
suprfsat
Google Duo's web client has some wasm code cached under the name 'wasm-clips'.
zappo2938
1password uses it in their chrome extension to encrypt signing keys.
pjmlp
Autodesk.

https://www.infoq.com/presentations/autocad-webassembly/

https://blogs.autodesk.com/autocad/autocad-web-app-google-io...

https://forge.autodesk.com/blog/load-encrypted-model-data-we...

BabylonJS plugins

https://babylonjs.medium.com/marker-tracking-in-babylon-js-c...

Blazor

https://dotnet.microsoft.com/apps/aspnet/web-apps/blazor

Uno

https://platform.uno/

Yuioup
The 90s called. They want their RAD tools back.
paulgb
Let’s get them on the phone and negotiate a joint-custody deal, then, because I feel like RAD has only gone downhill since then.

Thinking back to how easy it is to open up VB5, drag-and-drop a UI, and ship an .exe that anyone could run (yes, because of MS’s OS monopoly, but still), the current mess required to build a similarly complex app for the web feels like a serious regression.

tomphoolery
what about https://www.animaapp.com/ ?
sbarre
Hi 90s, this is 2020... OP can't take your call right now because they are busy building low-code environments to help companies keep their dozens of bootcamp hires productively building in-house line-of-business tools that will be a nightmare of maintenance and technical debt in 3 years.

Did you want to leave a message? Perhaps some kind of cautionary tale?

Worse. You could use most websites with Flash disabled and get by fine. That's a lot more difficult with Javascript today.

Flash had a much more arms-length relationship to the browser and wasn't able to be used as a surveillance tool as effectively as Javascript can be.

The constrained nature of Flash made it less of a threat to an open and standards-based web than Javascript (and more broadly WASM).

Edit: On the last point - Flash wasn't able to boot a virtual x86 in a browser. Obligatory reference: https://www.destroyallsoftware.com/talks/the-birth-and-death...

sn_master
"wasn't able to be used as a surveillance tool as effectively as Javascript can be."

Flash cookies, anyone?

These were the only mainstream way to keep tabs on you across different browsers.

Tuna-Fish
You can use most websites with javascript disabled just fine. In fact, it generally greatly improves your experience.

Thanks to the ADA, American sites (or sites that do a lot of business with America) must work with assistive technologies, mainly web browsers designed for the blind. Most of those do not run js. So, you cannot design your website so that it doesn't work with js disabled, unless you are willing to expose yourself to massive legal liability. People used to ignore this a lot, but since Domino’s Pizza v. Guillermo Robles, most sites have been fixed so that they are usable without js.

I strongly recommend getting ublock, blocking all js by default, and then whitelisting sites where it is required/useful. It's hard to overstate just how much better it makes browsing in general.

dumpsterdiver
Since it's relevant, here's a very lightweight javascript toggle I made for Chrome a while back. I still use it every day. No data is sent anywhere, and no unnecessary permissions are required. The code is very simple and easy to validate, in case you're one of those paranoid people like me who insists on looking.

https://chrome.google.com/webstore/detail/js-toggle/bnhjfamo...

sterlind
Domino's vs. Robles doesn't say you can't use JS. No court ever ordered Domino's to do anything, afaict. All they did was affirm that Robles was allowed to sue Domino's under the ADA (without deciding whether their site was in fact accessible or not.) See http://georgemauer.net/2019/11/04/dominos-v-robles.html for a good breakdown of the rulings.

In particular, Robles wanted the court to order Domino's to make their site WCAG-2.0 compliant. WCAG is completely silent on JavaScript. It's possible that some screen readers don't support JavaScript, but if Domino's site used ARIA tags and complied with WCAG the court would almost certainly consider that accessible (there's no official DoJ guidelines on what constitutes accessibility, so the court and businesses have latitude here.)

The ADA is all about reasonable accommodations. Supporting major screen readers is reasonable; supporting all of them clearly isn't. 98% of screen reader users have JS enabled (https://webaim.org/projects/screenreadersurvey4/#javascript), so it's clearly reasonable to provide an accessible webapp.

throwaway201103
As an aside, I thought that the Domino’s Pizza v. Guillermo Robles decision was strange. Domino's Pizza stores are local brick-and-mortar operations. They all have telephones and take orders by telephone. That to me would seem to be a "reasonable accommodation" for a vision-impaired person who could not use the website to order a pizza.

In the more general sense, a strictly online service that has no alternate means of access probably does have at least some responsibility under the ADA.

colejohnson66
If it was simply a case of ordering over the phone, Dominos would’ve been fine. The problem was that they had “online only” coupons and their website and app weren’t “ADA friendly”. It’s more complicated than it seemed on the surface (like the McDonald’s coffee lawsuit)
EvanAnderson
You don't have to sell me. I've had Javascript disabled by default for a couple of years now. I love it. I'm a technical person. I can handle (and have the patience for) fidgeting with settings to get sites to work.

My non-technical friends and family can't do that. Trying to impose that upon them would just frustrate them (and, for my family, increase tech support calls to me).

All my banking and insurance sites are useless w/o Javascript. Squarespace sites, Twitter, Imgur, don't work worth a damn without Javascript. I just gave up and mostly gave Google properties a pass to get Youtube to work. (I don't use Gmail so I have no idea how bad it would be.)

A site better be pretty damned compelling before it rises to the level of getting opened in an unconstrained browser setting. I just end up not opening a lot of stuff (or just closing it again when it lights-up "NoScript").

https://www.destroyallsoftware.com/talks/the-birth-and-death...
jiggawatts
That's one of my all-time favourite talks, but I lost the reference, and googling "JavaScript talk" is futile.

Thanks for the link, I'll be sure to bookmark it this time...

Sep 22, 2020 · Fej on DOS Subsystem for Linux
You're thinking of this:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

at about 13:10.

Gary Bernhardt was prophetic https://www.destroyallsoftware.com/talks/the-birth-and-death...
kevingadd
WASM will never replace JavaScript. Something else might, and it might have support for compiling to WASM, but we'll see.
esperent
Wasm is amazing, but it's not intended to replace JavaScript. If you just want to add a form to a website, animate a drop-down menu, or the like, JS (or TS) will probably always be preferable to writing in another language and compiling to wasm. The web platform moves fast and maybe in a couple of years I'll eat these words. But nothing I've seen so far points towards the demise of JS.
pjmlp
A form like these?

https://www.qt.io/web-assembly-example-pizza-shop?hsCtaTrack...

https://gallery.flutter.dev/#/

https://platform.uno/code-samples/

ronjouch
Gary's talk is not exactly about the "demise of JS". Watch the talk, it's a great one :)
If you haven't seen The Birth & Death of JavaScript, you're in for a treat.

Gimp running in Chrome running inside Firefox

https://www.destroyallsoftware.com/talks/the-birth-and-death...

You can't really make this statement without linking to https://www.destroyallsoftware.com/talks/the-birth-and-death...
I now want to use the Javascript-based Gigatron emulator[1] in a browser on a Windows 2000 VM under the jslinux emulator[2]. (I wonder how jslinux would handle a few-year-old version of Firefox...)

Then I can run the Gigatron-based 6502 emulator in that browser to run the 8080 simulator you referenced to run CP/M. Under CP/M I should be able to find a COBOL program to run. I would be achieving an immense coefficient-of-"Inception" and re-enacting "The Birth and Death of all Software" [3] simultaneously.

Doing all of this in Windows NT 4.0 or Linux on my DEC Multia w/ an Alpha CPU would just be icing on the cake.

[1] https://github.com/kervinck/gigatron-rom/tree/master/Contrib...

[2] https://bellard.org/jslinux/

[3] https://www.destroyallsoftware.com/talks/the-birth-and-death...

Apr 23, 2020 · p1necone on Wgpu-rs on the web
https://www.destroyallsoftware.com/talks/the-birth-and-death...

This is becoming (somewhat) more true every day.

It's been envisioned already:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

The talk is hilarious, and very much on point.

https://www.destroyallsoftware.com/talks/the-birth-and-death... I guess Gary was right. In other news, we have five years of war ahead of us.
It's absolutely going to happen. WASM isn't required for this future-- it just helps optimize it. There is a ton of money out there for the company who makes a performant and compatible browser-in-a-browser w/ proper accessibility. Somebody will eventually take the "deal with the devil" to develop it.

An obligatory link to an important talk: https://www.destroyallsoftware.com/talks/the-birth-and-death...

WebAssembly in the kernel faster-than-native reminds me of the comedic talk by Gary Bernhardt, "The Birth & Death of JavaScript" [1]. Great foresight

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

You should check this talk about JS and ASM :) https://www.destroyallsoftware.com/talks/the-birth-and-death...
Feb 03, 2020 · pdkl95 on WebUSB is dead
Viewing a document on the web needs to be decidable. The original design of the web was HTML documents with forms. This IBM 3270 style design used the browser as the user interface to server-side programs. The browser's job of presenting a document was decidable and the form submission, page load process allowed the user to understand and control the data sent to a server. The server learns what was in the form and URL when the user decided to click the submit button.

Moving the software into the browser improved latency, but questions like "Is this webpage doing something dangerous/annoying?" became undecidable. If we provably cannot determine whether a webpage will halt without running it, we obviously cannot answer any more complicated questions about the webpage's behavior. As long as webpages have access to a Turing-complete language, the browser will be a de facto OS. Unfortunately, returning to a web of documents isn't going to happen anytime soon; too many people profit from this ability to run programs on other people's computers.

(Gary Bernhardt's amazing (and terrifying) talk "The Birth & Death of JavaScript"[1] was absurdist comedy, not a guide to future browser designs)

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

bawolff
Decidability is absolutely not the problem. Subjectivity is a problem: there is no objective definition of an obnoxious ad.

I mean, the behaviour of a website may be undecidable, but its behaviour over the 5 minutes you're viewing it is decidable. I don't think notions from computability theory are particularly enlightening here.

nitrogen
There is no way to know for sure that the minified JS on a website isn't sending every keystroke you enter into a password field before you click submit, until you run the JS.
colejohnson66
Why wouldn’t “deminifying” work? Then you just read the code.
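One reason deminifying alone may not settle it: JavaScript can assemble its call targets at runtime, so the interesting name never appears literally in the source a human reads. A toy sketch (the `api` object and its `sendBeacon` method here are made up for illustration; a real page would resolve a property on `navigator` or `window` the same way):

```javascript
// A pretty-printer can reformat this, but the call target is still just
// api[<computed string>] until the code actually runs.
const api = {
  sendBeacon(payload) {      // stand-in for a real exfiltration sink
    return payload.length;   // pretend this ships `payload` somewhere
  },
};

const pieces = ["send", "Bea", "con"];
const method = pieces.join("");   // "sendBeacon" exists only at runtime
console.log(api[method]("keystrokes")); // 10
```

Add `eval`, `Function`, or string decryption on top, and the gap between what is readable and what actually runs only widens, which is the nub of the decidability point upthread.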
omnimus
Funnily enough, Gary Bernhardt seems to only write and talk about JavaScript nowadays.
JohnFen
> Unfortunately, returning to a web of documents isn't going to happen anytime soon; too many people profit from this ability to run programs on other people's computers.

Yes, I've resigned myself to this truth long ago. Since then, I've been watching the web get smaller and smaller as more and more websites break unless they're allowed to run code client side.

I fully expect that the majority of the web will become inaccessible to me within my lifetime. Alas, the future does not always bring wonderful things!

Yeah, I’m wondering the same. I thought of that talk too when I saw this post.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

floatingatoll
Previously on HN (227 comments):

https://news.ycombinator.com/item?id=7605687

Sorry, I disagree with the reference to academia: the median quality of academic conference talks is abysmal in my experience. Sure, they are more technical, but they are also that much less engaging, and target a much narrower audience. No experimenting with styles and flows, just cookie-cutter formats with lots of text and plenty of citations.

Programmer conferences may have a more open format and obviously that invites some low quality talks, but it also leaves the door open to really amazing, totally experimental formats and topics. I'm thinking of stuff like Gary Bernhardt's "The Birth and Death of Javascript" [1], which would never fly at an academic conference in my experience (or at least would not be appreciated), but was immensely influential in programming circles.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

It's really uncanny how Gary Bernhardt predicted it all [0] a few years ago.

[0] https://www.destroyallsoftware.com/talks/the-birth-and-death...

airstrike
Seems like the obvious next step is using WebAssembly outside the browser so we can really go full-circle on this one

EDIT: some quick googling shows it's already being done

https://hacks.mozilla.org/2019/03/standardizing-wasi-a-webas...

JoshTriplett
Yup. In addition to JavaScript runtimes that have added WebAssembly support, such as Node, there are dedicated WebAssembly runtimes, like wasmtime: https://github.com/bytecodealliance/wasmtime

If you want to provide a plugin or extension interface, and want to give those plugins a limited interface rather than making them all-powerful, embedding a WebAssembly runtime gives you all of that plus the ability for people to easily write plugins in any language.

Also see https://hacks.mozilla.org/2019/08/webassembly-interface-type... for an illustrated description of how that'll work smoothly across languages.

zzzcpan
> embedding a WebAssembly runtime gives you all of that plus the ability for people to easily write plugins in any language

No, you would still have to provide an application programming interface for every target language you want to support.

JoshTriplett
No, you don't.

If you export functions to WebAssembly, any language that can run in WebAssembly can call those functions.

And if you define WebAssembly Interface Types for your exported functions (note: still in development), any language that handles interface types can automatically handle things like "how does this language represent a string safely".

Either way, you don't need to define a new API for every language.
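The mechanics behind this are visible even in the plain `WebAssembly` JavaScript API available in browsers and Node: the host hands functions to the module through an import object, and the module's binary is the same no matter which language compiled it. A minimal sketch, with the module hand-assembled from its WAT text (any wasm-targeting compiler would emit an equivalent binary):

```javascript
// Hand-assembled WebAssembly binary for:
//   (module
//     (import "env" "add" (func $add (param i32 i32) (result i32)))
//     (func (export "run") (result i32)
//       i32.const 2  i32.const 3  call $add))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,             // "\0asm" magic + version 1
  0x01, 0x0b, 0x02,                                           // type section, 2 types:
  0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,                         //   (i32, i32) -> i32
  0x60, 0x00, 0x01, 0x7f,                                     //   () -> i32
  0x02, 0x0b, 0x01,                                           // import section, 1 import:
  0x03, 0x65, 0x6e, 0x76, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, //   "env" "add", func of type 0
  0x03, 0x02, 0x01, 0x01,                                     // function section: 1 func, type 1
  0x07, 0x07, 0x01, 0x03, 0x72, 0x75, 0x6e, 0x00, 0x01,       // export section: "run" = func 1
  0x0a, 0x0a, 0x01, 0x08, 0x00,                               // code section, 1 body, no locals:
  0x41, 0x02, 0x41, 0x03, 0x10, 0x00, 0x0b,                   //   i32.const 2; i32.const 3; call 0; end
]);

// The import object is the module's entire view of the outside world:
// whatever the host leaves out, the module simply cannot reach.
const imports = { env: { add: (a, b) => a + b } };
const { exports } = new WebAssembly.Instance(new WebAssembly.Module(bytes), imports);
console.log(exports.run()); // 5
```

Interface Types would sit on top of exactly this boundary, describing how richer values like strings cross it without each side hand-writing glue for every language pair.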

zzzcpan
What you are describing is more like __asm__("") in C, not an interface for application developers. Those are still required for every single target language because of the mismatch in abstraction levels between those languages, WebAssembly, and the actual logic being exported.
JoshTriplett
That's not accurate, and people are not expected to write an API adapter for every target language. Please read https://hacks.mozilla.org/2019/08/webassembly-interface-type... .

Interface Types are the mechanism for handling the different abstractions in different languages without having to write language-specific interfaces.

WebAssembly prior to Interface Types is more akin to an exported C function than to inline assembly. That already gives you enough for many kinds of interfaces, most notably the common pattern of obtaining an opaque handle and calling functions on that handle.

WebAssembly with Interface Types gives you everything needed for high-level interfaces in any language. That lets you define strings, buffers, handles, arbitrary structured types, and pretty much anything you'd expect of a high-level interface.

zzzcpan
I have read the article (when it came out, actually); it's merely about type-mapping rules to an intermediate common representation. And it absolutely doesn't imply that Interface Types give you everything for high-level interfaces in any language. Not that it's even possible: a high level <-> wasm <-> another high level transformation cannot realistically be automated for an arbitrary high-level language. What can be automated is high level <-> wasm <-> wasm-level-looking code in another high-level language. It's like decompiling: you can't produce high-level code automatically, only low-level-looking code in a high-level language.
reggieband
It isn't just done, it is productized. Both Cloudflare [1] and Fastly [2] have been marketing that they support WASM in their edge networks. Both companies seem to suggest that this is a competitive advantage they could have over other cloud offerings.

1. https://blog.cloudflare.com/webassembly-on-cloudflare-worker...

2. https://www.fastly.com/blog/announcing-lucet-fastly-native-w...

syrusakbary
Completely! WebAssembly has great potential outside of the browser. Mozilla has been doing great work in their posts showcasing this new possibility.

Check out Wasmer! https://wasmer.io/ : along with other runtimes we are enabling the use case of WebAssembly programs as standalone applications that can run in any platform, or as libraries to be usable in any programming languages (disclaimer: I work at Wasmer!)

cjbprime
I think we're still waiting for a WASM-only OS, though :)
Izkata
Like this? https://browsix.org/
ksec
I wonder if Cloudflare will do something like this given they want to run WASM in their Edge Servers.
tcoff91
I think I remember reading something on HN about some kind of tool for running WASM in some kind of kernel module to speed up app performance.

EDIT: here it is: https://medium.com/wasmer/running-webassembly-on-the-kernel-...

saghm
I remember something like that being posted too; I think this is it? https://github.com/nebulet/nebulet
cjbprime
So cool!
syrusakbary
Nebulet is a great piece of engineering! <3
chubot
Repeating what I wrote here [1], Fabrice Bellard wrote JSLinux in 2011, which is a CPU emulator written in JavaScript that runs the Linux kernel (using typed arrays and relying on fast JITs).

That's just a way of saying that "the best way to predict the future is to invent it" (and do so before people who were "predicting" it).

If you knew about asm.js and Google's (now abandoned) NaCl and PNaCl, there's nothing surprising about the development of wasm. It's been 10+ years in the making.

And 20+ years ago Microsoft's browser had ActiveX plugins (which didn't use a VM and were really unsafe and unportable). Making a portable, sandboxed bytecode solves an obvious problem with that.

Also the JVM ran in the browser, etc.

[1] https://news.ycombinator.com/item?id=21618323

vanderZwan
> If you knew about [x], there's nothing surprising about [y]

You are technically correct, but at the same time I do think this way of framing it is selling Gary Bernhardt a little bit short. There is still an uncanny part in knowing all the right things at the right time to predict the future.

thaumasiotes
"The future will be just like the past, but with a different name".

I guess it's just as accurate when Gary Bernhardt says it as when everyone else says it. It's not exactly a theme that's gone overlooked before. But still... executing bytecode in the browser goes way, way, way, way back.

jchw
It’s not just executing code in the browser. It’s been a while since I’ve watched the video, but it’s more about JavaScript becoming the universal assembly language. I think at least that aspect is wrong because Wasm came into existence. But ignoring that detail and subbing in Wasm for JS, it’s uncanny.
dwohnitmok
It's even more uncanny than that. The talk hypothesized that JavaScript will not be the universal assembly language but rather a lower level language (or more accurately an OS-like system I suppose as the talk presents it), "Metal," would be. In that respect it predicted WebAssembly on the nose.
sp332
In December that same year, the Internet Archive started letting people boot and run MS-DOS emulators in the browser. And this was well after binfmt_misc had been (ab)used with JS engines to execute JS like a native program.

A couple years earlier, NetBSD device drivers were running in the browser, and JS-engine-as-hypervisor was an explicit, if distant, goal. https://news.ycombinator.com/item?id=4757581

Ajedi32
I personally see WASM as a natural descendant of ASM.js, so in that sense I'd say its not really wrong, just missing a step.

I don't know that it's necessarily uncanny though either; ASM.js was already a thing back when that presentation was given, so the existence of WASM isn't really surprising. The real central "prediction" of that talk wasn't WASM, it was METAL, which hasn't quite taken off yet. (The idea is out there[0], but so far mostly just as a self-fulfilling prophecy).

[0]: https://github.com/lastmjs/wasm-metal

saagarjha
JSLinux was largely a toy, AFAIK. I don’t think it was meant as an exhortation to write all software that way.
dwohnitmok
I think that's underselling the talk. Apart from the very enjoyable presentation, it makes valuable insights.

The talk isn't trying to sell itself as 100% original. It makes reference to asm.js and a game demo that already existed at the time of the talk as well as repl.it.

Despite that, I do think it makes a unique insight: that even though JavaScript is ubiquitous, it will NOT be the language that future languages compile to, but rather a bytecode, perhaps inspired by JavaScript, will be the language of the future. Also, importantly, this bytecode will win; that is, most languages will either compile directly to it or have a VM implemented in it.

Moreover this bytecode has the potential to entirely supplant native code and can do so with equal or better performance.

At least to me, neither of those were obvious insights even though I knew of these plugins and JSLinux.

First off, those plugins died. Silverlight, ActiveX, Java on the web, Flash, all of these died out and were replaced by JavaScript before wasm really took off. It might've looked like the end state would be a version of JavaScript "winning."

Second, things like PNacl, Emscripten, etc. still seemed like curiosities (as the talk refers to when showing Repl.it). It wasn't clear that they or the ideas they championed would get widespread adoption.

These days it is looking more and more likely that wasm is going to become a target for all sorts of different compilers. The fact that it's a major compilation target of Rust, a language that's about as far away from what I would've thought of as a language for the web as possible, is striking.

And though we're still a long ways away from running everything on WebAssembly, it no longer seems as exotic an idea as it once did to me.

And because of that, as well as the fantastic presentation, I still return to this talk every so often awed at how much closer we are to realizing Metal.

There's still a lot of room for the talk to go very wrong, but it's not as far-fetched as when I first watched it.

EDIT: Put another way: the talk is interesting to me because it emphasizes the birth and death of JavaScript. It talks about a world where the same forces that propelled JavaScript to towering heights of popularity ultimately cast it aside and create a world not possible without JavaScript, but in which JavaScript itself essentially no longer exists.

chubot
OK, yeah, the "metal" part is fair. He showed asm.js and then posited that there would be something called "metal" that causes JavaScript to die and enables applications written in more languages. And major apps could be ported to it.

Originally my conception of the video was more like this commenter below: It’s been a while since I’ve watched the video, but it’s more about JavaScript becoming the universal assembly language.

I guess a lot of people are saying "he predicted this" without referring to what specifically he is predicting.

FWIW it's not clear to me that WASM is going to do that. Everything I've heard from the team says that WASM and JS are complementary. Not that WASM will cause JavaScript to die.

I think there's some possibility of that happening in the distant future, but it's far from obvious. I think JS VMs will always be better at running JS than WASM VMs running JS engines, and all the JS out there will exist for a long time.

Also, JS is at a pretty good level of abstraction to manipulate the DOM, whereas C, C++ and Rust aren't. And it has some good syntactic shortcuts. Despite being a Python person, I would probably argue that JS is better for manipulating the DOM, even though JS and Python are otherwise very similar. Function literals might be one reason.

So when people say "he predicted this", it would be nice to be specific about what the prediction was. WASM is a step in that direction but I would argue it's also fairly clear given that asm.js existed and he showed it. The real question is if WASM can handle all these use cases. Working on a language has made me appreciate many reasons that it's hard to make a polyglot VM. Tiny changes can bias your VM towards one compilation source vs. another.

dwohnitmok
The video creator is on HN so if the gods of internet attention shine upon us he may be able to comment here.

In lieu of that, I'll offer my one-line take of the prediction of the video. JavaScript will fade from popularity, but its (original) popularity will inspire a low-level assembly-like language (looking more and more like WebAssembly these days) that will provide a new substrate for most application development, web-based or otherwise, replacing traditional binaries.

As you point out it's not at all clear, even five years on from this talk, that this prediction will be correct. WebAssembly is currently complementary to JS and cannot fully replace it. The vast majority of websites these days use JS but not WebAssembly. Use of WebAssembly for applications that traditionally have not been run inside a web browser (e.g. GIMP, LibreOffice, etc.) is still nascent and it's nowhere near a sure bet that it'll take off there.

But maybe, just maybe, it'll happen.

int_19h
The sheer number of big corporate backers, and standardization, is what will make it happen. That's really what is different here versus Java applets, Silverlight, NaCl etc. Those all failed because nobody was big enough to single-handedly push something like that onto the ecosystem. Now that they're acting in concert, things are very different.

Everything else is "just engineering". E.g. as far as being complementary to JS, and not being able to replace it - they are already working on access to DOM.

Lx1oG-AWb6h_ZG0
> Also, JS is at a pretty good level of abstraction to manipulate the DOM

I agree with your point in general, but surely the fact that there are 10 million js frameworks invented every week is proof that the native DOM APIs are not a good abstraction? As a mostly front end dev, most of my UI logic these days target _React_, not the dom APIs. To the extent that I write JavaScript, it’s pure data manipulation, which can be written in any language.

chubot
That's true, although for another example, React is also commonly used with JSX to express DOM fragments. And JS has that syntax but Python, C, Rust, etc. don't.

In other words, the particulars of the language matters. I wouldn't underestimate 20+ years of JS evolution toward expressing the problem better.

I'm working on my own language and all those details are hard. When they work, they're invisible to users. You only notice when it's not there or doesn't work! I would agree that Python is a better language than JS in most respects, but it's not clear to me that it's a better language for writing web front ends.

e.g. the async abstractions and promises are different and I believe that matters.

cbhl
> Moreover this bytecode has the potential to entirely supplant native code and can do so with equal or better performance.

I interpreted that part of the talk as hyperbole and sarcasm. It was saying that programmers will be so far removed from how computers work that they'll happily program against a model that has five layers of abstraction that simply serve to provide the original interface of the bottom layer.

Bytecode, by its nature, has to be translated into native code -- the way for it to be 'faster' than native code is to be native code. In software, you can do this with static or JIT compilation. The hardware people do this by changing their CPUs to make the things that people do in the bytecode faster. (Apple introduced new floating point CPU instructions in their iPhones just to make JavaScript faster.)
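
As a concrete illustration of "bytecode translated into native code" (a sketch added for illustration, not from the original comment): a hand-assembled WebAssembly module can be handed to the engine, whose JIT turns the bytes below into native machine code before the exported function runs.

```javascript
// A minimal hand-assembled wasm module exporting add(a, b) = a + b.
// The engine's JIT translates these bytes into native machine code.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: one func of type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b                    // local.get 0; local.get 1; i32.add; end
]);

const { exports } = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(exports.add(2, 3)); // 5
```

Whether the result runs as fast as, or faster than, an ahead-of-time compiled binary is exactly the question the thread is debating.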

xscott
Around 1999, HP had a project called Dynamo where they implemented a JIT PA-RISC virtual machine on actual PA-RISC hardware. In some cases, they got better performance than native because the JIT would recognize hot paths at run time. I only bring it up to show that it's not 100% certain VMs can't win over conventional native. When I compile with GCC or Clang, I don't think my executable is tracing the hot paths and rewriting itself as it runs.

https://www.hpl.hp.com/techreports/1999/HPL-1999-78.html

marcosdumay
I don't think this was entirely sarcasm; it's mostly hyperbole. I am sure there will be some generally used configuration that will do basically that, and for a good reason. I just have no idea what the reason may be.

About speed: JIT-optimized bytecode can be faster than what is possible for static binaries. The JIT has more information than the compiler to optimize your code. Currently I don't know of any that is that fast, but there is no inherent limitation here.

traverseda
https://www.destroyallsoftware.com/talks/the-birth-and-death...

My understanding is that using interpreted bytecode you remove the need for hardware-based process isolation, which incurs some pretty significant performance penalties. Basically, if your software does a lot of IPC or syscalls, it's very possible for an interpreter-based solution to work better, if it's integrated at the kernel level.

Certhas
I didn't get that understanding at all. The insight is that by going bytecode only you can rethink your security model. You can do things that you can't do safely if you allow native code to run.

It's swapping out one abstract model for another, just that we are so used to the current abstract model that we don't perceive it as one.

mntmoss
There's also a lot of sticky points at the bottom of the stack that lead towards native solutions, starting with memory management and basic I/O functionality. Nobody wants to code directly against the hardware for very long, so you end up with a driver, and then driver and resource management, and then an operating system of some kind. Even on the early microcomputers it was the case that you have a boot ROM of some sort and would code against that for most tasks.

With WASM you have the same kind of thing but the added wrinkle of the browser-based I/O being a different set of "basic abstractions" from what you get in libc, and every solution that bridges the gap being a bit of a hack. Being bytecode doesn't really change the fact that you still have to deal with the resulting dependencies at some level.

Qt is already experimenting with rendering to Canvas using WASM in the browser; I've tried to call them out on it as bad practice a couple of times in the past.

Rust, on the other hand, is doing some genuinely exciting, powerful stuff with allowing WASM to talk to the DOM and allowing native developers to target HTML directly within their apps. Rust's approach is to treat the language like a minimal, drop-in replacement for Javascript that doesn't require you to ship an entire rendering engine alongside it.

It is yet to be seen which approach to web portability is going to win. Obviously I'm rooting for Rust, and I personally think apps that are written using Rust's strategy will nearly always be higher quality than apps written using Qt's strategy. But that doesn't necessarily mean that Rust will win; there are a lot of factors at play here. It'll be interesting to see.

But agreed, native apps are definitely coming to the web in some form or another. Funnily, the opposite is also true, since there's been a lot of buzz about using WASM for native sandboxing. I like to think that Gary Bernhardt[0] is pleased about that.

[0]: https://www.destroyallsoftware.com/talks/the-birth-and-death...

WASM machines--the next (hopefully) better version of Lisp machines (https://en.wikipedia.org/wiki/Lisp_machine)!

It looks like Gary Bernhardt was pretty spot on in his talk "The Birth and Death of JavaScript": (https://www.destroyallsoftware.com/talks/the-birth-and-death...)

monocasa
Lisp machines didn't go away because of some conspiracy, they just stopped making sense. The vast majority of the benefit was that since they ran on (and were compared to other) 1980s minicomputer hardware without instruction caches, pulling the interpreter into microcode meant that the interpreter's overhead wasn't competing with data fetches on the von Neumann memory bus.

Instruction caches (and JITs to a degree) solve the same problem in much more general ways. That's why Azul went out of their way to create an appliance to run Java code with custom CPUs, and ended up with a pretty standard RISC for the most part.

All of that applies to WASM machines too.

Nov 28, 2019 · 1 point, 0 comments · submitted by traverseda
Nov 24, 2019 · m_sahaf on Jslinux (2018)
Gary Bernhardt prophecy from his The Birth and Death of JavaScript[0] is coming true.

[0] https://www.destroyallsoftware.com/talks/the-birth-and-death...

chubot
JSLinux predates that talk by 3 years. Also the “prophecy” is a special case of Atwood’s law from 2007, where the “anything” is an operating system.
> Your browser is going to act as a VM to run a browser that will display the content.

Gary Bernhardt's talk "The Birth & Death of JavaScript"[1] was an ominous portent of a terrifying future. Unfortunately, some people apparently saw it as a development roadmap.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

>Running a javascript interpreter, written in C and cross compiled to WASM, in a browser, does feel like a joke.

Every day I see more clearly how prophetic "The Birth & Death of Javascript"[1] (2014) was. I'd love to pluck 1996 Brendan Eich into the future and show him how far his little programming language would go.

[1]: https://www.destroyallsoftware.com/talks/the-birth-and-death...

BrendanEich
I was closely involved as CTO and then SVP of Engineering at Mozilla from the inceptions of both WebGL (originally Canvas 3D) and asm.js (see http://asmjs.org/ for docs), which led to the 4-day port, via Emscripten, of Unreal Engine 3 and Unreal Tournament from native to the web, running in Firefox at 60fps. This prefigured WebAssembly, which came in 2015 after it was clear from MS and the V8 team that one VM (not two, as for Dart or PNaCl) would win.

Gary added the insight that system call overhead is higher than VM safety in one process (he may have exaggerated just a little) to predict migration of almost all native code. The general idea of a safe language VM+compiler being smaller and easier to verify than a whole OS+browser-native-codebase, as well as having lower security check overhead, I first heard articulated by Michael Franz of UCI, and it inspired my agenda at Mozilla that led to the current portable/multiply implemented JS+WebAssembly VM standard.

Aug 07, 2019 · MrRadar on Wine on Windows 10
Have you seen The Birth and Death of Javascript? https://www.destroyallsoftware.com/talks/the-birth-and-death...
I like to think of it more as JavaScript/WASM finally accomplishing what Java spent decades trying to do: be the completely ubiquitous hardware-independent code platform.

Javascript has truly become the "Write Once, Run Anywhere" language.

https://www.destroyallsoftware.com/talks/the-birth-and-death... (2014)

AnIdiotOnTheNet
How do you figure, when certain features of javascript are supported on some browsers and not others? You've just swapped OS dependence for runtime dependence. JS's solution to this problem? Another layer of abstraction to make the JS cross-browser.

WASM already has this problem, what with 5 or 6 different incompatible runtimes already in existence.

erikpukinskis
You just use the lowest common denominator, depending on your definition of “everywhere”. When in need, use shims.

It’s not literally “run anywhere”; it’s “for all intents and purposes run anywhere”.
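
In JS terms, "lowest common denominator plus shims" usually means feature-detecting before use. A minimal sketch (added for illustration; the names are not from the comment):

```javascript
// Feature-detect a capability before relying on it.
const hasWasm =
  typeof WebAssembly === "object" &&
  typeof WebAssembly.instantiate === "function";

// Illustrative shim: Array.prototype.flat appeared in ES2019;
// older runtimes fall back to a one-level concat-based version.
const flat1 = Array.prototype.flat
  ? (xs) => xs.flat()
  : (xs) => [].concat(...xs);

console.log(hasWasm, flat1([[1], [2, 3]])); // in Node: true [ 1, 2, 3 ]
```

The program behaves the same either way; only the underlying implementation differs per runtime, which is what makes "for all intents and purposes run anywhere" workable.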

xena
I basically want to do this _without_ javascript though. My implementation is in Go: https://github.com/Xe/olin
bsagdiyev
Please, no.
xena
Bad news, that talk has been a constant source of inspiration for my entire endeavors :)
bsagdiyev
That's fine, I'll continue to disagree and understand that we'll never work together since we move fast and don't break things where I work.
dang
"Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something."

https://news.ycombinator.com/newsguidelines.html

Negitivefrags
Spectre came along and ruined the awesome conclusion of that talk.

The idea was that the cost of using WASM would be entirely offset by the speedup of removing the cost of hardware memory protection. We could do that if everything ran in one big VM because the VM would enforce the memory protection.

Unfortunately, now we can't rely on a VM for memory protection anymore. We have to assume that any code running in the same address space can access any memory in the same address space.

So we need the hardware memory protection after all. You can say goodbye to your WASM-only future.

0815test
Well, Spectre is a largely theoretical class of vulnerabilities that doesn't even apply to chips that don't do speculation in hardware, and that is purely about information disclosure via side-channel mechanisms. It might be a bit of a concern for some users, but it's not the end of the world - for instance, the designers of the Mill architecture have a whole talk discussing how Spectre as such doesn't really apply given the architectural choices they make. And if running stuff in different address spaces is enough to mitigate it effectively, that still provides quite a bit of efficiency compared to an entirely conventional OS.
fulafel
Nitpick re "chips that don't do speculation in hardware" - load forwarding and speculative cache prefetching and branch prediction are done even by lots of current (and past) processors that don't do speculative execution and hence are considered "in-order" microarchitectures.
tntn
> doesn't even apply to chips that don't do speculation in hardware

This is an interesting way to put it. I would have said "applies to pretty much every CPU manufactured in the last decades." Your statement would make sense if speculation in hardware was some niche thing, but I think you would be hard-pressed to find an invulnerable CPU that is used in situations where people care about both performance and security.

That's great for the mill, but isn't relevant to the world outside of mill computing.

xena
This is part of why I want to make a custom OS where each WebAssembly process can be in its own hardware protected memory space. I'm looking at building on top of seL4.
colordrops
I assume that new chips will address this vulnerability, correct? Couldn't the VM detect whether the hardware is secure and decide whether to use hardware memory protection or not?
tntn
> new chips will address [these vulnerabilities]

It doesn't seem likely. The chipmakers will fix the vulnerabilities that break isolation between processes and between user-kernel, but the within-process issues will probably stick around.

Negitivefrags
At this point it seems practically impossible to deal with completely.

V8, at least, has given up on the concept of trying to protect memory within the same address space.

https://v8.dev/blog/spectre

https://www.destroyallsoftware.com/talks/the-birth-and-death...

We're well on our way.

m_fayer
One of my favorite examples of life imitating art.
Not GP, but Gary Bernhardt is the guy who gave the classic "Wat" [0] and "Birth and Death of JavaScript" [1] talks, and some searching turns up "pretzel colon" as the "&:" operator in Ruby [2]. I assume he's mentioned it in a screencast or something, but I wasn't able to find it.

[0] https://www.destroyallsoftware.com/talks/wat

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

[2] https://technology.customink.com/blog/2015/06/08/ruby-pretze...

I love the talk on "The Birth and Death of JavaScript" by Gary Bernhardt: https://www.destroyallsoftware.com/talks/the-birth-and-death...

Highly encourage everybody to watch it and see how people may interact with JavaScript more and more through things like WASM. Very funny talk too :)

adamnemecek
There's little relationship between wasm and js.
diegoperini
The talk predicts that JS will be dead (for user space) the moment it conquers the OS. Dead in this context means it will be invisible to the app developer, just like C.
bobajeff
At the time, asm.js looked like it was going to possibly be the universal bytecode that runs everything. WebAssembly hadn't gone public at that time.
_bxg1
The talk is more about the web as an application platform, and its values, coming down to OS userland. Whether that means JS or Wasm or both doesn't really matter.
mythz
The age is nigh; it's already become the easiest way to install and run Safari's JavaScriptCore on every platform:

    $ wapm install -g jsc
Can then use `jsc` to execute .js, as a REPL or inline expressions:

    $ jsc -e "console.log([1,2,3].map(x => x*2).reduce((x,acc)=>acc+x))"
syspec
You can also symlink the JavaScript core framework which contains the executable there without installing anything as an alternative
Every time someone submits wasm related content, I feel obliged to link this classic talk:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Bernhardt, Gary – The Birth & Death of JavaScript (PyCon 2014)

Jyaif
This talk is (thankfully) obsolete now. We don't need to write (or even compile to) JS to be multi-platform; we just need to write/generate wasm :-)
afiori
Honest question: why is this talk so relevant? I agree it showed good foresight, but I watched it and am a bit surprised by how much it is mentioned.
oblio
The presentation style is quite funny. It was even funnier at the time, when such a thing was considered almost inconceivable. Now, it seems prescient.
shurcooL
My guess as to why it’s mentioned so much is that the general expectation is that many people haven’t seen it yet. When people do watch it, they’re surprised by how much good foresight there was, and so they repeat the cycle.
wasm and wasi will eventually take over, enabling higher-level languages such as C#, Java and Python to be used on the frontend (the Blazor project is an example).

As Mr. Bernhardt says: JavaScript had to be bad for the evolution to happen. (https://www.destroyallsoftware.com/talks/the-birth-and-death...)

Apr 05, 2019 · streblo on I'm Joining CloudFlare
All of this reminds me of a talk by Gary Bernhardt called The Birth and Death of JavaScript (https://www.destroyallsoftware.com/talks/the-birth-and-death...), which although farcical is actually a really compelling vision of the future of infrastructure.

Congrats Steve! Excited to see how this turns out.

steveklabnik
I was lucky enough to be present for one of the times Gary gave this talk live. I’ve been joking that for the past few years, his nightmare is my dream. It, like a lot of Gary’s stuff, has been very influential on me.

Thanks!

davepeck
Just make sure you stay clear of the exclusion zone and you'll be fine.
This reminded me of the talk "The Birth and Death of JavaScript" (2014) by Gary Bernhardt, where he goes into some of the more absurd possible implications of what happens when applications cross-compiled to JS approach or surpass traditional desktop performance: https://www.destroyallsoftware.com/talks/the-birth-and-death...
Can't help being reminded of that talk by Gary Bernhardt: “The Birth & Death of JavaScript”[0] — exploring a hypothetical future where JS takes over everything without (most) anyone using it of their own volition.

0: https://www.destroyallsoftware.com/talks/the-birth-and-death...

tokyodude
Except entirely irrelevant as WASM is not Javascript.
_jn
Sure, though it’s still a web technology taking over an otherwise unrelated space ¯\_(ツ)_/¯
int_19h
I don't think it's unrelated. Despite the name, WASM isn't really a "web technology" - it's a sandbox technology and a compile-once-run-everywhere technology, and there has always been demand for that outside the web, even before the web existed. It might be that the web is what created enough demand for it to happen in the end, but what do we care?

The problem with JS was never that it's a web technology. It's that it's a bad technology that happened to be in the wrong place at the wrong time to get a first mover advantage.

techntoke
Can you show me an example of a WebAssembly app that runs in the browser with JavaScript enabled?
grungleshnorts
There are probably newer examples, but from when WebAssembly was coming out:

https://s3.amazonaws.com/mozilla-games/ZenGarden/EpicZenGard... (from https://www.webassemblygames.com/ )

https://alpha.iodide.io/

https://www.figma.com/blog/webassembly-cut-figmas-load-time-...

https://github.com/mdn/webassembly-examples/

https://github.com/emscripten-core/emscripten/wiki/Porting-E...

techntoke
Meant to say with JavaScript disabled.
This talk was really prophetic https://www.destroyallsoftware.com/talks/the-birth-and-death...
ForHackernews
> the text format defined by the WebAssembly reference interpreter (.wat)

https://www.destroyallsoftware.com/talks/wat

maliker
I like to share this talk with new junior devs instead of ranting about strangely defined behavior in javascript. Saves time, and it's more fun.
IshKebab
Maybe prophetic in the sense of "people really wanted this for years and it has finally been implemented". People have been talking about it since at least when NaCl debuted in 2011.
dijit
“If I had asked people what they wanted, they would have said faster horses.” - Henry Ford

JavaScript by itself is not a “nice” language, I would ask what the root of your request is: an easy garbage collected language that is ubiquitous?

derefr
This "long-held fervor" that culminated in WASM started back before Node.js existed, so back then, "Javascript" was just a thing browsers did (except for, say, Windows Scripting Host's support of it.) The fervor back then wasn't really driven by a desire for a ubiquitous anything; it was driven specifically by people thinking about browsers, and what is required to program web-apps in browsers.

What people have wished for, since... oh, 2001 or so, is the ability to write web-app frontends without needing to grok and deal with the awful runtime semantics of Javascript—the way you do when you write Javascript, yes, but also the way you do when you write in a language that directly transpiles to Javascript, like TypeScript or ClojureScript.

Languages like TypeScript may add semantics on top of the JS runtime's semantics, but they can't get away from the fact that the JS runtime is the "abstract machine" they program, any more than e.g. Clojure can get away from the fact that the JVM is the abstract machine it programs. That's why none of these languages were ever seen as a "saving grace" from the "problem of Javascript", the way WASM is.

WASM has its own abstract machine (which runs efficiently in browsers), which finally frees people from the tyranny of the Javascript-runtime-as-abstract-machine.

It's great that it's also now replacing the Javascript-runtime-abstract-machine in other contexts (e.g. Node-like server-side usage, plain-V8-like embedded usage) but that was never really "the thing" that anyone cared about.

---

Mind you, the NaCl fervor was for a ubiquitous VM—but the NaCl fervor wasn't nearly as large, and isn't really what's propelling WASM to prominence right now. Even then, it wasn't about "an easy garbage collected language that is ubiquitous", no. The goal of it was to be able to take code that you already have—native code, written to run fast, like a AAA game—and put it in a browser-strength sandbox, such that it can be zero-installed just by visiting a URL, with full performance. You know, like ActiveX was supposed to be. But better.

NaCl didn't really get us there, because it happened right as the architectural split between x86 and ARM really started heating up, and NaCl's solution to that split—PNaCl, a.k.a. sandboxed LLVM IR—was both too late and not really efficient-enough at the time to fully supplant the "Native" NaCl in NaCl messaging. (LLVM IR works well-enough now for Apple to rely on it for being a "unified intermediate" of both x86 and ARM target object code, but that shift only began with ~8 additional years of LLVM development after the version of LLVM that PNaCl's IR came from.)

WASM seems to get us there. But do we care any more? Everyone already has other solutions to this problem. ChromeOS can run Android apps; Ubuntu Snappy packages can expose GUIs to Wayland; Windows has a Linux ABI to run Docker containers. Ubiquity is a lot easier now than it was back then, for any particular use-case you might want.

On the embedded-scripting side of things, everyone has seemingly settled on embedding LuaJIT or V8. Do people even need embedded scripts to be fast, in a way that "WASM as compilation target" would help with? Maybe for the more esoteric use-cases like CouchDB's "design documents" or the Ethereum VM (https://medium.com/coinmonks/ewasm-web-assembly-for-ethereum...) But I doubt WASM will hit ubiquity here. Why would OpenResty switch? Why would Redis? Etc. You're not writing-and-compiling native code for any of these in the first place, so adding WASM here would only break existing workflows.

int_19h
> What people have wished for, since... oh, 2001 or so, is the ability to write web-app frontends without needing to grok and deal with the awful runtime semantics of Javascript

It was so even before 2001. Consider why the <script> element has the "type" attribute to begin with (and why it had the "language" attribute originally, before "type"). As I recall, W3C specs from that era gave examples such as Tcl! On Windows, you could use anything that implemented Active Scripting APIs - e.g. Perl. JS just happened to be the one that everybody had, because it was the first, and so it became the common denominator, to the detriment of the industry.

AsyncAwait
> I would ask what the root of your request is: an easy garbage collected language that is ubiquitous?

Honestly, I just want to be able to write the stuff I have to in JS today in my favorite language.

aaaaaaaaaaab
If by "people" you mean "a vocal minority", then sure.
ahupp
A lot longer than that. There was a standard called ANDF in 1989 (Architecture Neutral Distribution Format) trying to solve the same problem. I'm curious why it didn't work out, but I imagine part of the problem was that it had to support many different flavors of Unix as well as many CPU architectures.

https://en.wikipedia.org/wiki/Architecture_Neutral_Distribut...

rogerbinns
> I'm curious why it didn't work out

I worked at a multi-platform UNIX software vendor back then, and did the project of compiling our products to ANDF. At one point I counted 20 distinct UNIX variants that we had in our office (all workstations).

It was believed that UNIX systems complied with government procurement standards (eg this is where POSIX is relevant), but there were always bugs and differences in behaviour. In C code this is handled by #if pre-processor directives to adapt as needed. ANDF required turning that into runtime if statements instead. (An example would be if there were two different network interface calls depending on platform.) That would have been a herculean task and was in some places impossible, such as the same API taking different numbers of parameters on different platforms. The ANDF compiler would have to pick one for its header files.

Even something like endianness is usually handled by #if and not an if statement, so you can see just how much effort would be needed just to have it compile. You still had to do all the multi-platform installation instructions and testing, so no one had an incentive to make things more complicated.

Java killed ANDF stone dead. While Java was most prominent for applets in the browser, it also competed with C++ on the backend. C++ compilers cost a lot of money, had licensing daemons that locked you to one platform, and implemented different subsets of the C++ standard and library. Java was more forgiving and the JVM provided standard features like multi-threading and networking. You will also note that Sun spent a lot of effort to keep Java standard.

chubot
It looks like that talk is from 2014. Ideas like that had been talked about for many years -- that is, moving all of computing to the browser.

July 2007:

Atwood's Law: any application that can be written in JavaScript, will eventually be written in JavaScript.

https://blog.codinghorror.com/the-principle-of-least-power/

JSLinux by Fabrice Bellard in 2011:

https://bellard.org/jslinux/

As far as I remember, the addition of Typed arrays to JavaScript made this feasible.

In typical Bellard fashion, he didn't really talk about it -- he just demonstrated it!
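
As a rough illustration of why typed arrays mattered (a sketch added for illustration, not Bellard's actual code): they give JavaScript a flat, byte-addressable buffer that can stand in for emulated guest RAM, which plain JS arrays of numbers could not do efficiently.

```javascript
// Typed arrays give JavaScript flat, byte-addressable memory --
// the building block an emulator like JSLinux needs for guest RAM.
const ram = new Uint8Array(16);         // 16 bytes of "guest memory" (toy size)
const view = new DataView(ram.buffer);  // typed, endian-aware access to the same bytes

view.setUint32(0, 0xdeadbeef, true);    // store a 32-bit word, little-endian
console.log(ram[0].toString(16));       // "ef" -- low byte first
console.log(view.getUint32(0, true).toString(16)); // "deadbeef"
```

The same buffer can be read as bytes, 16-bit, or 32-bit values, which is exactly what emulating a CPU's loads and stores requires.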

Quite possibly! Or maybe at install or link time. Or maybe we're looking at a future where almost all code goes through a JIT engine and you've only got a few normal cores that run the OS that manages everything. Or possibly everything will be JavaScript[1].

[1]https://www.destroyallsoftware.com/talks/the-birth-and-death....

Jan 25, 2019 · teddyh on Compilers for Free
So in practice, nobody uses actual x86 assembly anymore, but everybody uses it as a target for compilation. It’s just like The Birth & Death of JavaScript¹, but non-fictional, and on a lower level – a Birth & Death of x86 assembly, if you will.

1. https://www.destroyallsoftware.com/talks/the-birth-and-death...

MrBuddyCasino
Or JVM byte code.
gameswithgo
>nobody uses actual x86 assembly anymore,

well almost.

https://www.destroyallsoftware.com/talks/the-birth-and-death...
bsaul
I saw that video in the past, but watching it again now, it took me 15 minutes to realize this was supposed to be a fun video about an absurd future.
Good grief we've reached the point of a JS interpreter running in ring 0? Gary Bernhardt was a prophet.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

warent
Well, strictly speaking, WASM and JS are two different things, but basically yeah.
Dec 08, 2018 · earenndil on Nginx on Wasmjit
This talk is relevant: https://www.destroyallsoftware.com/talks/the-birth-and-death.... Tl;dw: an in-kernel JIT has the potential to be 4% faster than direct execution. I am still dubious, however, as a JIT requires far more resources than running binaries directly.
Dec 07, 2018 · lwb on Nginx on Wasmjit
This is satire but still very interesting: https://www.destroyallsoftware.com/talks/the-birth-and-death...
Dec 07, 2018 · Ajedi32 on Nginx on Wasmjit
One step closer to METAL[1]

[1]: https://www.destroyallsoftware.com/talks/the-birth-and-death... (at 18:46)

ksec
Everything said in the talk came true. Which means we are very close to a Nuclear War. (And it certainly looks like a possibility at the way things are going)
Dec 07, 2018 · traverseda on Nginx on Wasmjit
Obligatory birth-and-death-of-javascript

https://www.destroyallsoftware.com/talks/the-birth-and-death...

I suppose calling it METAL would have been too on-the-nose.

molf
Not sure why you're downvoted, this is a very interesting talk.
traverseda
It's pretty relevant to the discussion of "why would you want to run wasm in the kernel", but I'm not too worried about the votes.
reitzensteinm
It's not, though... WebAssembly doesn't really have all that much to do with js, any more than Flash or Java plugins would if they ended up being standardized instead. Every time there's a wasm thread it gets posted, but it misses the point of the talk to suggest that wasm is the prediction bearing fruit.

The talk is great, but I'd suggest that's the reason for the downvotes.

earenndil
> it misses the point of the talk to suggest that wasm is the prediction bearing fruit

Does it? It seems that the talk has two main points:

1. Javascript succeeded because it was (at least initially) just good enough not to be completely unbearable, but bad enough that people ended up using it primarily as a target for other languages.

2. Ring 0 JIT can be 4% faster than normal binaries.

WASM is primarily a target for other languages, and qualifies as a language that can theoretically be JITted 4% faster than native code can be run.

reitzensteinm
Point number one isn't applicable to wasm.

The execution inside the kernel is related, but nobody replies to a lua in kernel post with a link to the talk.

pjmlp
Because Lua isn't related to Web development, while JavaScript and WASM are.
reitzensteinm
Right, but that's my point. Wasm is tenuously connected with JavaScript because both are web technologies, so people link the talk.

But they couldn't be more different technically, and if wasm does indeed become the lingua franca of future computing it will be much more boring than the craziness of js doing the same.

The talk was great because it was about an insane yet plausible future. We now have a boring and probable future.

pjmlp
I don't believe in that, too much experience to believe in a miracle bytecode format that will magically succeed where others failed.

It will just be yet another VM platform.

reitzensteinm
Even less reason to link to the talk then :-)
Electron is becoming the new Win32. Might as well own it and make it faster, more native.

I think it’s a better alternative to Apple's Marzipan.

It’s a tragedy for the Web but makes sense for Windows.

Anyway, as others have noted, this seems more and more prophetical every year:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

https://twitter.com/garybernhardt/status/1069787375247622144

A much better video about the same topic:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Well according to "The Birth & Death of JavaScript" prophecy, we will soon have war in 2020 all the way to 2025. So far everything in the prophecy has come true, so I am not sure if I will still be alive in 10 years.

[] https://www.destroyallsoftware.com/talks/the-birth-and-death...

Gary Bernhardt kind-of predicted this. https://www.destroyallsoftware.com/talks/the-birth-and-death...
aaimnr
Indeed! Amazing talk and amazing mind, thanks for sharing. Reminds me of Rich Hickey (whom he mentions in the talk), similarly independent and deep thinking from the first principles.
imhoguy
This will happen sooner than in 2030s: https://github.com/piranna/wasmachine (WebAssembly on FPGA)
MrRadar
The point about this approach being "closer to the metal" than other cloud providers definitely brought this talk to my mind. I just hope the nuclear war he also predicted for 2020 doesn't come to pass :/
Relevant: https://www.destroyallsoftware.com/talks/the-birth-and-death...
ToFab123
So everything is eventually going to WebAssembly. Eventually.
The parallels with the Destroy All Software talk "The Birth and Death of Javascript" [0] are crazy. Seeing the section where they address the possibility of Node modules and system access from WASM is like seeing a flying car advertisement in real life.

[0]:https://www.destroyallsoftware.com/talks/the-birth-and-death...

tabtab
Re: like seeing a flying car advertisement

Ditto the feeling. I don't "get" WASM from a typical in-house CRUD development perspective. Game makers, maps, movie editors; sure, they need the speed. Some say it's gonna revolutionize most browser-based application dev, but I can't get specific examples that are relevant to us. And even for those domains, relying on the inconsistent and buggy DOM found across browser variations is a problem it probably won't solve. DOM will still suck as a general purpose UI engine.

WASM just makes DOM bugs run faster.

int_19h
Between WASM and modern graphics APIs, we might be able to actually kill DOM altogether. Something like this:

http://blog.qt.io/blog/2018/05/22/qt-for-webassembly/

mwcampbell
Let's not do that until we have a way to make non-DOM-based web applications accessible to screen readers and other assistive technologies.
tabtab
Meta-data can be embedded to describe and categorize content. But accessibility is usually not a goal for many "office productivity" applications (per my domain-specific standards suggestion). Usage of DOM alone does not guarantee accessibility either.

As far as Qt, while it may be a good starting point, I don't think direct C++ calls is practical. Some intermediate markup or declarative language should be formed around it: a declarative wrapper around Qt.

int_19h
> Some intermediate markup or declarative language should be formed around it: a declarative wrapper around Qt.

QML is exactly that.

mwcampbell
> accessibility is usually not a goal for many "office productivity" applications

I think I might be misunderstanding you. Are you saying accessibility is usually not a goal for the kind of applications that people need to be able to use to do their jobs?

tabtab
I believe there's a reasonable limit to how adaptable workplace software has to be to those with poor vision, etc.
pcwalton
The DOM isn't particularly "buggy", relative to, say, Win32 or Cocoa. It may or may not be a bad API, but implementations are pretty solid.

(I have to confess I've never understood the objections to the DOM. I have literally never had an instance in which I had to use the raw Win32 API that didn't turn into a miserable experience.)

tabtab
With Win32 and Cocoa you pretty much have one vendor with roughly 3 or so major releases. But with browsers you have roughly 8 vendors above 1% market-share with 3 or so (notable) major releases each. Therefore, you have to target roughly 8x more variations of the UI engine.

Look how hard Wine's job is to be sufficiently compatible.

I believe we need to either simplify the front-end standards (pushing as much as possible to the server), or fork browsers into specialities: games, media, CRUD, documents, etc. What we have now sucks bigly. Try something different, please!

pcwalton
Interoperability, and the standards process, is how we get specs that are sensible. Whenever I have to program using Win32, Cocoa, etc. I inevitably spend a ton of time reverse engineering how the proprietary APIs work. For DOM APIs, things generally work how they are supposed to work, because they were actually designed in the first place (well, the more recent APIs were).

Wine isn't comparable, because the Web APIs are designed by an open vendor-neutral standards committee and have multiple interoperable open-source implementations.

Your proposals break Web compatibility and so are non-starters. Coming up with fixes for problems in the DOM and CSS is the easy part. Figuring out how to deploy them is what's difficult.

tabtab
Re: "the standards process, is how we get specs that are sensible." - They are not sensible: different vendors interpret the grey areas differently. A written standard thorough enough to remove those grey areas would be longer and harder to read than the code that implements it.

Re: "Your proposals break Web compatibility" -- Web compatibility is whatever we make it. A one-size-fits-all UI/render standard has proven a mess. What's the harm in at least trying domain-specific standards? We all have theories, but only the real world can really test such theories.

Also https://www.destroyallsoftware.com/talks/the-birth-and-death...
thsowers
The screenshot of GIMP running in Chrome running in Firefox is one of my favorites!
"The Birth and Death of Javascript" becomes more and more real

https://www.destroyallsoftware.com/talks/the-birth-and-death...

enraged_camel
I like the part about the Bay Area being a radioactive wasteland. :)
RightMillennial
Fallout: New Francisco
ndesaulniers
We're firmly within NCR territory here.
IgorPartola
The Hobologists send their regards.
Analemma_
Technically, he only said it was an "Exclusion Zone", so it could refer to real estate prices in the 2035 Bay Area for non-quadrillionaires.
earenndil
I still don't buy it. Javascript might buy a 4% performance improvement, but the increased resource usage makes that impractical for most scenarios. For server use, the increased wear makes it cheaper to buy more computers and have them last longer. For application use, performance is not relevant enough to make it interesting. So really the only possible application is video games.

EDIT: before people accuse me of making a false dichotomy: I acknowledge that there are other uses for computers, but am unable to think of any others where the increased resource consumption would be worth it. Another thing: cell phones, the battery would drain much more quickly if everything were a webapp. Perhaps video game consoles will switch to such a JIT, though...

IgorPartola
I mean Bitcoin is pretty much built on “increased resource consumption is worth it”. But I get your point.
keymone
It’s not. It’s built on “here’s a way to convert energy into financial security and have a transaction platform on top of it”.
empthought
Any person in the situation where “financial security” based on government-issued currency is tenuous enough to make cryptocurrency an attractive option is either 1. doing something illegal, or 2. would be better off with access to the energy used.

“There are no financial instruments that will protect you from a world where we no longer trust each other.” https://www.mrmoneymustache.com/2018/01/02/why-bitcoin-is-st...

keymone
there are 3 assertions here and they are all dumb.

first two aren't even about cryptocurrencies - every currency is more attractive than bolivar in Venezuela these days. i guess all those people are doing something illegal (surviving?) or will be better off with electricity (spoiler alert - they already have electricity).

the last one is even dumber and coming from supposedly somebody who should know their way around finances and yet quotes like the one you posted or this one:

> These are preposterous numbers. The imaginary value of these valueless bits of computer data represents enough money to change the course of the entire human race, for example eliminating all poverty or replacing the entire world’s 800 gigawatts of coal power plants with solar generation. Why? WHY???

just show how clueless he is.

there is a gigantic difference between not trusting and not having to trust.

and of course market cap numbers are preposterous. it's because they are meaningless and don't represent anything that exists in reality. not the amount of money that was spent acquiring those currencies, not the amount of money that can be made selling those currencies. it's meaningless numbers.

empthought
> There is a gigantic difference between not trusting and not having to trust.

This is the point, though. Everyone expecting financial security has to trust. There is no alternative.

keymone
the alternative is math. when you sign a transaction and it gets included in the blockchain - i don't need to trust anyone that i got the money, i don't need to fear the transaction will be reversed due to some banking policies, i don't need to fear my account will be closed, etc.

so you're right, i have to trust math and i am ok with that.

empthought
You still have to trust your counterparties, though, and they will always be less trustworthy in aggregate than the functioning of a financial system.

You're worried about the wrong player in this game.

keymone
no, that's not how it works. counterparty risk exists irrelevant of which financial system you operate in. you may be paying insurance to reduce the damages, you may sue them in court of law for breaking contract - all of that is irrelevant to financial system.

cryptocurrencies is simply different financial system where you don't need to trust middlemen.

it's really astonishing how complacent people have become towards trusting middlemen in financial systems. if you ignore the banking services that you're paying for either directly or indirectly via taxes - what is the bank doing for you? why should you pay for having a record in the database? why should bank be involved in facilitating or even censoring your transactions?

i'm worried about exactly the right player in this game.

empthought
“...[Bitcoin] also has some ideology built in – the assumption that giving national governments the ability to monitor flows of money in the financial system and use it as a form of law enforcement is wrong.”

Seems like the author I cited has it exactly right, then.

keymone
he has some things right, but not nearly enough.

is this the way you concede your opinion expressed above was wrong? because you just jumped from "financial system is not a risk, counterparty is a risk" to "but what about transparency and law enforcement"..

empthought
They are the same problem, and not a jump at all. You even used the phrase “court of law” yourself.

Financial security depends on the ability to show who bad actors are and undo their transactions. I don't see how this is possible without some sort of middleman involving civil government. Yes, the current banking system has flaws, but they are not technological flaws and cannot be solved by technological means.

keymone
> You even used the phrase “court of law” yourself

yes, in context of resolving dispute with counterparty

> Financial security depends on the ability to show who bad actors are and undo their transactions

no, that's in my opinion the opposite of financial security. i feel financially secure when i know no government, bank or corporation fuck up can affect the state of my account.

> Yes, the current banking system has flaws, but they are not technological flaws and cannot be solved by technological means.

the flaw of current banking system is humans. humans are often corrupt, incompetent, unreliable and with malicious intent.

basing financial security on assumption that only honest, competent, reliable and well intentioned humans end up in positions of power is obviously wrong.

i don't hold that assumption and think that taking humans out of the loop is the best way to address flaws of existing system.

the fact that flaws are not technological in no way means technology can't solve them.

Fnoord
> the flaw of current banking system is humans. humans are often corrupt, incompetent, unreliable and with malicious intent.

Software, including a blockchain or Bitcoin, is designed by humans.

keymone
A clock is designed by humans. It doesn't rely on a human to tell time. You can figure out the rest I hope.
Fnoord
Something designed by humans can have design flaws.
keymone
yeah, i guess we should throw all those watches away.
kevincox
With projects such as https://github.com/rianhunter/wasmjit it looks like this talk is basically coming true.

Not exactly Javascript, but running untrusted code in a safe language in the kernel can already give performance improvements for some workloads due to avoiding system call overhead. It will be interesting to see where this goes in the future.

pjmlp
Catching up with 1961.
earenndil
Oh, to be sure, the technology will be there (and not unlikely make it into game console). I just struggle to see any other inlet for it.

Not to mention that that benchmark is very synthetic and doesn't really reflect the kinds of speedups that will generally be found.

Another step towards Metal[0] becoming a reality.

[0] A (half) joke from Gary Bernhardt's excellent The Birth and Death of JavaScript (pronounced "yavascript"): https://www.destroyallsoftware.com/talks/the-birth-and-death...

Prescient https://www.destroyallsoftware.com/talks/the-birth-and-death...
IshKebab
Not really, it was what everyone was hoping would happen for ages. Remember PNaCl?
wcrichton
Except, WebAssembly isn't even close to Javascript. They're completely different languages. WebAssembly is closer to C than to Javascript.
gtremper
WebAssembly is basically just a binary encoding of asm.js, which is the subset of javascript discussed in the talk.
panic
WebAssembly is a statically-typed language which passes values around using a stack. It’s very different from asm.js.
shawnz
Yeah, but it doesn't have "Javascript" in the name, which makes it automatically better by way of bypassing everyone's irrational hatred of JS
callahad
While asm.js was basically just a textual encoding of C in JavaScript... round and round we go! :)
gtremper
I'd say it's more a textual encoding of LLVM IR. Which makes the s-expression text format of WebAssembly a text encoding of a binary encoding of a javascript encoding of a compiler intermediate representation of your program. Round and round indeed.
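The layering described above is easiest to see with a concrete snippet. Here is a tiny hand-written module in the asm.js idiom (an illustrative sketch, not any compiler's real output): the "use asm" pragma plus the `|0` coercions are what let an engine treat the code as statically typed int32 operations, while it still runs as ordinary JavaScript.

```javascript
// A minimal asm.js-style module. The |0 coercions double as type
// annotations: every value is declared int32, so an engine that
// recognizes "use asm" can compile it ahead of time -- and an engine
// that doesn't just runs it as plain JavaScript.
function AsmAdder(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0;          // parameter annotation: int32
    b = b | 0;
    return (a + b) | 0; // return annotation: int32
  }
  return { add: add };
}

const mod = AsmAdder(globalThis, {}, new ArrayBuffer(0x10000));
console.log(mod.add(2, 40)); // 42
```

WebAssembly's s-expression text format expresses the same function with the types written out explicitly, e.g. `(func (param i32) (param i32) (result i32) ...)`, instead of encoding them in coercions.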
jchw
On the surface level, sure. However, it's mostly just a lower abstraction way of accessing largely the same JIT. I'm pretty sure browsers supporting WebAssembly are doing so by reusing most of what they already have. And if you dig deeper, this was almost certainly inspired by tools like Emscripten and the Asm.js concept. After all, Asm.js accomplished a similar goal to WebAssembly, at the end of the day; Wasm is a cleaner, higher performance, less backwards compatible way of doing largely the same thing.

JS already unhinged from the browser pretty thoroughly. I think when it comes to Wasm it's almost as much about what it doesn't have as what it does have. Lack of DOM bindings and a GC make it much more suitable for hosting in more environments like the kernel.

As predicted by gary bernhardt[1].

FWIW, I don't think doing that will be a net win. I/O-bound applications might run slightly faster (I think gary bernhardt's number was 5%), but in exchange they'll take quite a bit more system resources, which mean increased power use and reduced lifespan.

1: https://www.destroyallsoftware.com/talks/the-birth-and-death...

lachlan-sneff
Could you explain what you mean by saying that they'll take more system resources?
earenndil
More ram, cpu usage, etc.

So, imagine you have a given task. One option is it takes 60ms to execute and during that time it takes 50% cpu. Or, it takes 50ms to execute but takes 80% cpu. The second one takes less wallclock time but more cpu time.
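Working through the numbers above: CPU time is wallclock time times average utilization, so the option with less wallclock time can still burn more CPU (and power). A trivial sketch:

```javascript
// CPU time consumed = wallclock time x average CPU utilization, so the
// "faster" option (by the clock) can still cost more compute.
function cpuTimeMs(wallclockMs, cpuFraction) {
  return wallclockMs * cpuFraction;
}

const optionA = cpuTimeMs(60, 0.5); // 60 ms at 50% CPU
const optionB = cpuTimeMs(50, 0.8); // 50 ms at 80% CPU

console.log(optionA); // 30 ms of CPU time
console.log(optionB); // 40 ms of CPU time: less wallclock, more CPU
```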

And his visionary talk on JavaScript, asm.js and others:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Link for the uninitiated: https://www.destroyallsoftware.com/talks/the-birth-and-death...

The reference in the above comment is to the future “metal architecture” Gary starts detailing about halfway through.

Gary Bernhardt’s presentations are always masterfully done. This one is particularly funny.

jwilk
On HN: https://news.ycombinator.com/item?id=7605687
Obligatory mention of this tongue-in-cheek yet visionary presentation "The Birth & Death of JavaScript": https://www.destroyallsoftware.com/talks/the-birth-and-death...

This is basically what is happening with wasm, and it's happening much faster than Gary Bernhardt anticipated in that presentation.

IMHO wasm finally ends javascript's status as the only practical language to run stuff in a browser. Frontend stuff happens in javascript primarily because browsers cannot run anything else now that plugins have been killed off. Wasm changes that.

At the same time a lot of desktop apps are in fact javascript apps running on top of electron/chrome. Anything like that can also run wasm.

Finally people have been porting vms to javascript for pretty much as long as wasm and its predecessors have been around. So booting a vm that runs windows 95 or linux in a browser is not very practical but was shown to work years ago. This is what probably inspired the presentation above.

I've actually been pondering doing some frontend stuff in kotlin or rust. I'm a backend guy and I just never had any patience for javascript. I find working in it to be an exercise in frustration. But I actually did some UI work in Java back in the day (swing and applets). Also there's a huge potential for stuff like webgl, webvr, etc. to be driven using languages more suitable to building performance critical software if they can target wasm. I think wasm is going to be a key enabler for that.

Most of the stuff relevant for this has been rolling out in the last few years. E.g. webgl is now supported in most browsers. Webvr is getting there as well. Wasm support has been rolled out as well. Unity has been targeting html5 + wasm for a while now. And one of the first things that Mozilla demoed years ago with wasm was Unreal running in a browser.

I would not be surprised to see some of this stuff showing up in OSes like fuchsia (if and when that ships) or chrome os.

steveklabnik
I was lucky enough to see that talk live, and it's stuck with me. It's certainly where a big part of my enthusiasm for WebAssembly comes from.
Jun 19, 2018 · datalus on Qt for WebAssembly
Obligatory Gary Bernhardt video reference: https://www.destroyallsoftware.com/talks/the-birth-and-death...
We can't really get rid of speculative execution, but we will effectively need to get rid of the idea that you can run untrusted code in the same process as data you want to keep secure.

It's interesting that the future predicted by excellent (and entertaining) talk The Birth & Death of Javascript [1] will now never come to pass.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

wnoise
> We can't really get rid of speculative execution

Why not?

> but we will effectively need to get rid of the idea that you can run untrusted code in the same process as data you want to keep secure.

If we accept that, then we also need to get rid of the idea that we can branch differently in trusted code based on details of "untrusted data".

nhaehnle
Your last point is basically what Spectre v1 mitigations are all about, at least if you throw in the word "speculation" somehow. The rule is: don't speculate past branches that depend on untrusted data (though there are certain additional considerations about what the speculated code would actually do).

It's just that there are a lot of branches that don't depend on untrusted data. Speculatively executing past them is perfectly fine and extremely valuable for performance. That's why nobody wants to get rid of speculative execution.
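The rule above ("don't speculate past branches that depend on untrusted data") is usually illustrated with the Spectre v1 gadget. Here is a hedged JavaScript sketch of the vulnerable pattern and of one common software mitigation, unconditional index masking; it is illustrative only, since JS itself bounds-checks array access and real engine mitigations are more involved.

```javascript
// Spectre v1 shape: the branch guards an array access, but the branch
// condition depends on untrusted input `i`. Hardware may speculate past
// the check and transiently load out of bounds, leaking via the cache.
const table = new Uint8Array(16); // length is a power of two

function vulnerableRead(i) {
  if (i < table.length) {   // CPU may mispredict this as taken...
    return table[i];        // ...and transiently read out of bounds
  }
  return 0;
}

// Mitigation sketch: mask the index unconditionally, so even under
// misspeculation the computed index stays inside the array. (Assumes a
// power-of-two length; not a complete real-world mitigation.)
function maskedRead(i) {
  return table[i & (table.length - 1)]; // always in [0, 15]
}
```

The mitigation works because the mask is pure data flow, not control flow: there is no branch for the CPU to mispredict, so no transient out-of-bounds address can form.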

int_19h
It sounds like what we really need is a memory model that reflects this notion of trusted and untrusted data. The "evil bit", basically, but for real.
Obligatory:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

carry_bit
If it's actually prophecy coming true, people might want to leave SF before it becomes part of the exclusion zone.
microcolonel
It's already happened, it's just that the SF/MV overmind migrated to another host and has resumed normal operation.
sp332
It's a really funny talk, but this is nothing we haven't seen before with Java, Python, .Net... heck one of the most popular unikernel implementations is in OCaml! We'll survive having more languages in/as kernels.
Gary Bernhardt explains it better in this talk: https://www.destroyallsoftware.com/talks/the-birth-and-death...

But to summarize, jumps between kernel-space and user-space are expensive. Instead of doing that, we can run a well-vetted interpreter in kernel-space, and run "userspace" programs in kernel-space, in the interpreter.

This actually isn't slower (or so it is claimed), because a JITed interpreter can reach native speed on hot code paths, and the inefficiencies for most workloads are more than made up for by not having expensive syscalls.

So what you end up with is something that is about as fast as normal compiled code for cpu-intensive workloads (maybe faster sometimes), much faster for workloads involving a lot of syscalls, and interpreted languages like python/javascript end up much faster as well, presuming they can take advantage of the efficient JIT implementation.

Personally, what most excites me about this technology path is that it should reduce the cost of interprocess communication to near zero. Combined with a shared object model and a capabilities system, it could be pretty awesome.

The next step of evolution predicted by Gary Bernhardt. https://www.destroyallsoftware.com/talks/the-birth-and-death...
no_identd
A few related matters which might seem unrelated until one starts seeing the bigger picture:

http://lampwww.epfl.ch/~amin/pub/collapsing-towers.pdf Amin, Nada; Rompf, Tiark - Collapsing Towers of Interpreters [January 2018]

http://bootstrappable.org/ (see also http://langsec.org )

https://docs.racket-lang.org/medic/index.html - paper explaining it: https://www.cs.utah.edu/plt/publications/fpw15-lf.pdf

sequel to that paper: https://dl.acm.org/citation.cfm?id=3136019

https://www.reddit.com/r/nosyntax

https://www.reddit.com/r/programming/comments/2gw9u8/program...

https://grothoff.org/christian/habil.pdf The GNUnet System

https://wiki.debian.org/SameKernel

http://drops.dagstuhl.de/opus/volltexte/2017/7276/pdf/LIPIcs... Wang, Fei; Rompf, Tiark - Towards Strong Normalization for Dependent Object Types (DOT)

https://www.reddit.com/r/MachineLearning/comments/7s9etv/r_b...

https://news.ycombinator.com/item?id=16343020 Symbolic Assembly: Using Clojure to Meta-program Bytecode - Ramsey Nasser

http://conal.net/papers/compiling-to-categories/

https://icfp17.sigplan.org/event/icfp-2017-papers-kami-a-pla...

https://www.reddit.com/r/asm/comments/7af7a4/def_con_25_xlog...

Each of these, and this, paves another brick into a road to a very very different computational paradigm... I post this here without much explanation (and I've left a lot of other very relevant stuff out) of how these parts fit together, and I apologize for that, but I simply lack the time to give this the proper writeup it would deserve.

brian_herman
The end is neigh and it is the death of javascript.
zaarn
I think it's the start of something wonderful...
emmelaich
Don't horse around. The word is `nigh`
giancarlostoro
I've seen this talk linked so many times and now that I've finally watched it, definitely recommend it it's pretty good.
And somewhere, Gary Bernhardt is shaking his head in disbelief.

https://www.destroyallsoftware.com/talks/the-birth-and-death... was supposed to be satire... but "asm everything" might actually kind of happen.

anildash
Yeah, I definitely thought of Gary when I was writing the piece. The first time as tragedy, the second time as farce!
https://www.destroyallsoftware.com/talks/the-birth-and-death... The end is neigh
pier25
Is it common for people to call it YavaScript?
pier25
Apparently it's a joke: https://www.quora.com/JavaScript-programming-language-Why-do...
See also https://www.destroyallsoftware.com/talks/the-birth-and-death...

You jest, but this sort of thing probably is the death of almost all native software, sadly.

laythea
As long as performance is important, native software will always have an edge. Depends on the application.
weberc2
I’m a big fan of native UI, but I doubt this particular claim. The web stack is almost certainly more performant than GTK, for example.
laythea
Apologies, when I say native software, I don't just mean the UI.
weberc2
I don't agree on this count either. You can have performance and maintainability by writing your business logic in a daemon. Only very few applications will find this IPC cost unacceptable, and WASM promises to raise the performance ceiling even further. There are lots of good arguments for native software, but I don't find the performance argument to be particularly compelling.
oblio
There are some things we mourn. Nobody mourns cross platform distribution of native apps. Anyone who does hasn’t had to manage the insanity of installer apps and of papering over a million different OS versions and app versions.

It’s 2018 and we still don’t have a common, widespread, OSS framework for self-updating native apps that runs on all the major desktop operating systems. And let’s not even go into app store territory...

pjmlp
Well it would be great if there weren't hundred of FOSS desktop variants....
kuschku
There are only a handful you'd ever care about, and Qt apps work fine on all of them.
weberc2
But Qt itself is a dumpster fire. It just happens to be one of the least bad toolkits for Linux.

Disclaimer: My last job was Qt developer.

michaelmrose
This seems to be a common complaint as if we were all standing on a life raft pushing it down with the collective weight of our respective keisters.

If only some people would stop contributing to creative work you don't see the value of and distributing it for free on the internet!

At present you have

Debian and a bazillion ubuntu derivatives using a debian package. ubuntu/debian are going to have different versions of some libraries but you can package deps with your app.

Arch and derivatives have a pkgbuild. This is quite simple if anyone cares about your app your arch users will probably upload one for you to the arch user repo.

Fedora and suse have rpms. These will be similar but not identical.

3 packages and you can cover most of your potential users.

In the future you reasonably may expect to be able to distribute a flatpak and be done with it.

pjmlp
It is not only about package formats, since even when the format is the same, the expected directory layouts or installed libraries will be completely different.

No one really took FHS seriously, each installation is a special snowflake with its own GUI and dynamic libraries story across Linux variants is even worse than it was on other OSes.

michaelmrose
The cross-platform part is called source code. If all the chunks you use to build your app are cross-platform, packaging won't be horrible.
oblio
Quick example: to deliver my app safely as a web app, I need to add HTTPS and that's basically it. These days I can add Letsencrypt trivially.

With modern Windows, macOS, Linux, iOS, Android, I'm going to have to run around for ages figuring out how to sign my packages. Despite what you might say, it's never easy or pleasant.

drb91
The best native apps aren't cross platform. The worst native apps often are (at least from a mac perspective). We are simply moving to the lowest common denominator for all platforms, to the profit of business and the loss of the end-user. We can now develop software that is equally shitty for everyone much cheaper than we could before.

Is that the end of the world? No, of course not. But we aren't building better tools over time--for instance, google docs is distinctly worse than the word I used on macintosh back in the 90s for all the use cases I care about. So is pages! All this software is more complex, and generally for little benefit.

ghettoimp
I don't disagree that the best native apps are amazing. But it seems so wrong to regard a desire to build cross-platform software as some kind of scheme by developers against their users. In so many cases, being able to use a program/service on many platforms is a huge part of what makes it valuable to the end-user. You can get your gmail from your computer, or your phone, or from a web cafe, and have it all work the same. You can share google docs with anyone and know they'll be able to access them. You can decide you're fed up with windows/macos/android/ios and you want to switch to something else, and most of your software will still be there waiting for you on the other side...
drb91
> In so many cases, being able to use a program/service on many platforms is a huge part of what makes it valuable to the end-user.

Sure, some users, in some cases, may happen to use some features that are new. The pitch isn't "this is a good tool", it's "you have to use this tool to interact with others or retain data portability." Seems pretty user hostile to me.

Should we actually start calling it "Yavascript"?

ref: https://www.destroyallsoftware.com/talks/the-birth-and-death...

djsumdog
I was wondering why he kept pronouncing it that way. I thought he was just doing a funny German-pronunciation thing.
nkozyra
It's very common to hear Scandinavian speakers pronounce it that way.
rvalue
Did he call it "Yavascript" to avoid something like this?
eropple
Phonetic drift and "information lost to time" is a pretty obvious reference, I think.
swlkr
He was from the future in the presentation, so he knew that after this Oracle debacle, everyone from 2018 onwards called it "Yavascript".
wombatpm
I'm still holding out for JawaScript
dmoy
In that case everybody in the exclusion zone better start selling their real estate. Help the short term housing crunch while they're at it.
voxadam
Another comment pointed out that Netscape originally planned to call it LiceScript but was encouraged by Sun to change it to JavaScript, so I say if we're going to change the name, we go with the original one. Either that, or argue that the original holder of the Java trademark essentially blessed the use of its mark and continue calling it JavaScript.
kiwijamo
LiceScript? Pretty apt name for something that has infested the web!
vatueil
> LiceScript

Ahem, "LiveScript": https://en.m.wikipedia.org/wiki/JavaScript#History

voxadam
Of course you're right. I'll chalk that one up to poor proofreading on my part and not even try to blame autocorrect.
_bxg1
I was gonna say, dodged a bullet by not calling it _LiceScript_
rootlocus
I don't know... LiceScript has a nice ring to it.
dotancohen
Fitting. The direction the JavaScript community has been going in for the past half decade has left me scratching my head often.
The Birth & Death of JavaScript https://www.destroyallsoftware.com/talks/the-birth-and-death...

A talk from the 'future' about how everything became 'YavaScript'.

littlestymaar
This talk is awesome. It's funny but I learned a lot. And the prediction of the talk is happening! [1]

[1]: https://github.com/nebulet/nebulet

dnautics
let's hope the exclusion zone doesn't happen, k?
littlestymaar
You need to talk about this to Kim and Donald then.
you're reminding me of this 30 minute talk from 2014 "the birth and death of javascript", in which he supposes a future where CPUs basically run webassembly directly https://www.destroyallsoftware.com/talks/the-birth-and-death...
icebraining
Reminds me of Jazelle[1], which allowed some ARM CPUs to run Java bytecode directly. AFAIK it never really caught on.

[1] https://en.wikipedia.org/wiki/Jazelle

ZenPsycho
Not exactly. The idea is to have a normal CPU like ARM or x86, but the kernel never deals with native binaries, only WebAssembly. The performance penalty of running everything in a VM is offset by disabling hardware memory protection (and its own performance penalty), since memory safety is already guaranteed by the VM.
baobrien
It also probably didn't help that Jazelle was locked behind an NDA.
And here I thought this entire time that Gary was joking when he predicted this, though I'm glad we didn't have to suffer a war to make this happen.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

You've basically just described "The Birth & Death of JavaScript" https://www.destroyallsoftware.com/talks/the-birth-and-death...
Hmm, HyperCard in the same list as Zope and node? Interesting. :-)

The idea that JavaScript "won" is a little controversial to me. I think it's huge and important, but the world is still changing. Embedded Python goes places that Node still can't. I absolutely see the value you describe in sticking to one ecosystem, but I don't think JavaScript/TypeScript/Node is the only way to get those benefits. (See also: Transcrypt) I really enjoyed the PyCon 2014 talk on the general subject: https://www.destroyallsoftware.com/talks/the-birth-and-death...

The most recent conversation I had with Ted was after someone had just demonstrated the HoloLens for him and a few others. Ted had some feedback for the UI developer, and it didn't have anything to do with JavaScript or that level of implementation detail at all. It was all about the user experience. I don't want to put words into his mouth, but like he says in this recent interview, this is all hard to talk about because it really has changed so quickly.

I do think you're right that a lot of what Ted wanted to see could be implemented today in JavaScript and Git. But I think about the technical meat of that vision to be about data-driven interfaces. I am simply not old enough to really understand how notions of "scripting" changed between the 60s and the 80s. But the fact that Xanadu was started in SmallTalk suggests to me that scripting was part of the vision, even if a notion like "browser extensions" might not have been in mind.

Completely agree that there are other voices to learn from, and other important mistakes that have been made since Xanadu! (I think Ted would agree, too.)

It's more than that even - given that a high portion of the code being run now is using virtual machines, a lot of that protection is redundant. If all code is run inside VMs and zero 'native' code is allowed, then you could run without needing protection rings, system call overhead, memory mapping, etc - which in theory could more than make up for the virtual machine overhead.

This was being explored with Microsoft's Singularity and Midori projects but seems to be a dormant idea.

A fun talk on this idea with JavaScript: https://www.destroyallsoftware.com/talks/the-birth-and-death...

tedunangst
And that, of course, is how you get Spectre.
jordanthoms
It should be possible to apply retpoline in the JIT to mitigate that.
contrarian_
Emphasis on "mitigate".
yjftsjthsd-h
Isn't that a unikernel?
SolarNet
No. Each copy of the VM is still its own process. The kernel just trusts the VM to be safe.
bestham
I don’t think it works like that. You still need protection between the rings inside the VM, and the CPU provides that protection. The VM is not only a user process on the host; it also executes its kernel code in the virtualized ring 0 on the CPU (where it's still the CPU that provides protection).
jordanthoms
As I understand it, with this approach you wouldn't have a separation between kernel code and the virtual machine: everything runs in a single address space and you rely on the virtual machine verifying the code as it JITs it.
All this seems like a pretty compelling reason to move to using VMs and type safety to provide process isolation instead of using the hardware, e.g. https://www.destroyallsoftware.com/talks/the-birth-and-death... or Microsoft's Singularity and Midori projects.

It's a shame there's so much inertia behind the current setup of hardware memory management etc that it seems it'll be a long time before anything actually happens here.

runeks
How would this approach have helped in this case?

As I understand it, the flaw in question allows reading kernel memory by executing user space code. How can a layer of software, on top of this type of buggy hardware, fix this issue?

In one sense, the issue has already been fixed by a layer of software on top — which restricts a bunch of stuff, and reduces performance — but I assume this isn’t what you’re looking for.

jordanthoms
Those systems don't use the buggy aspect of the hardware (memory protection) at all. Instead, all code is run inside a VM which provides memory protection and process isolation - there is no 'native' code at all.

Not using the hardware memory protection provides a ~20% performance boost, which makes up for the ~20% overhead of running everything through VMs.
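The software-enforced isolation described above can be sketched in a few lines. This is purely a toy illustration (names like `SandboxMemory` are invented here, not from any real VM); a real runtime compiles such bounds checks directly into JIT'd code instead of making method calls:

```javascript
// Toy sketch of software fault isolation: the "process" can only touch
// its own linear memory, and every access is bounds-checked by the VM
// instead of by the MMU's hardware page tables.
class SandboxMemory {
  constructor(size) {
    this.mem = new Uint8Array(size); // the sandbox's entire address space
  }
  load(addr) {
    if (addr < 0 || addr >= this.mem.length) {
      throw new RangeError(`trap: out-of-bounds load at ${addr}`);
    }
    return this.mem[addr];
  }
  store(addr, value) {
    if (addr < 0 || addr >= this.mem.length) {
      throw new RangeError(`trap: out-of-bounds store at ${addr}`);
    }
    this.mem[addr] = value & 0xff; // keep byte semantics
  }
}

const sandbox = new SandboxMemory(1024);
sandbox.store(0, 42);
console.log(sandbox.load(0)); // 42
// sandbox.load(4096) would trap instead of reading another process's memory
```

Because the check is part of the (verified) generated code, no syscall or ring transition is needed to enforce it, which is the overhead trade-off discussed above.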

Jan 02, 2018 · 1 points, 0 comments · submitted by xwvvvvwx
Every time I see such news, I tell myself maybe it is the time we port a browser into the Linux kernel and build METAL, like Gary Bernhardt foretold.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

no_gravity
Or directly into the cpu. Since there already is a web server in the cpu [1], adding a browser might enable us to run and use web applications without even installing an operating system.

[1] https://www.networkworld.com/article/3236064

ttflee
The METAL (proposed by Gary Bernhardt) is not about running in CPU, but running all software in byte code and render native code obsolete, by implementing software process isolation and save the overhead of syscalls, memory mapping and protection rings.
beagle3
IBM’s AS/400 (now eServer) has basically been implemented this way since 1988 (some of it going back to 1979). Everything old is new again.

(It has 128-bit pointers, btw, so it’s future-proof for at least 20 more years.)

fasquoika
The original bytecode OS is AFAIK the UCSD p-System (from 1978) https://en.wikipedia.org/wiki/UCSD_Pascal
I guess this just means we’re one step closer to Gary Bernhardt’s vision:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

nitemice
Every time something like this is posted, I always think of that video.

As ridiculous and played-for-laughs as it is, it's looking more and more accurate (in one form or another) every day.

AnIdiotOnTheNet
If we have learned anything in the past 2 years, it is that the difference between parody and reality is vanishingly small.
z1mm32m4n
What happens when the stuff like the “exclusion zone” and the war from 2020–2025 come true...? ¯\_(ツ)_/¯
Now you just need to compile a JavaScript interpreter from C++ to WebAssembly... Makes me think of “The Birth & Death of JavaScript” talk [1].

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

proc0
We need brainfuck in this pipeline somewhere
flavio81
I'd prefer C-INTERCAL or LOLCODE
lazyman75
Upvote for INTERCAL. Keywords like "please" and "maybe" should be part of every language :)
philplckthun
I have bad news for you ;) https://github.com/mbbill/JSC.js

Or good depending on the perspective of course.

To me, this is the most exciting prospect of WebAssembly by far: an IR that works natively across all major platforms, including the web. It's funny because it makes Gary Bernhardt's "The Birth & Death of JavaScript"[1] seem a lot less like a joke.

[1]: https://www.destroyallsoftware.com/talks/the-birth-and-death...

A good occasion to watch Gary Bernhardt's talk "The Birth & Death of JavaScript" [0] again, where he talks about the precursor of WebAssembly: asm.js and the future implication it "could" have in the future in a really humorous way. A few years old but still relevant.

You want Gimp for Windows running in Firefox for Linux running in Chrome for Mac? Yeah sure.

[0] https://www.destroyallsoftware.com/talks/the-birth-and-death...

krapp
>You want Gimp for Windows running in Firefox for Linux running in Chrome for Mac? Yeah sure.

I actually do. I want all code ever written and every environment it was ever written for to have a URL that will let me run it in the browser. Everyone else seems to want the web to go back to being whitepapers but I want actual cyberspace already!

greenhouse_gas
There's another reason why I want JavaScript in the browser to die:

We haven't had a new browser engine written from scratch since KHTML. Firefox is a descendant of Netscape; Chrome and Safari are descendants of WebKit, which is itself a descendant of KHTML; Edge is closed source, but I'm almost sure there's some old IE code in there.

Why?

It's simply too expensive to create a fast (and compatible) JS engine.

If WebAssembly takes off, I hope that one day we'll have more than three browser engines around.

detaro
What do you base the assumption on that Javascript is the critical piece of complexity here? (it might very well be, but it's not obvious to me)

At least some of the JS engines are used in non-browser projects (V8 and Microsoft's), which at least superficially suggests you could write a new browser engine and tie it to one of those existing JS interpreters. WebAssembly will gain interfaces to the DOM as well, so the complexity of that interaction will remain.

gsnedders
> Edge is closed source, but I'm almost sure there's some old IE code in there

EdgeHTML is a fork of Trident, so yes. That said, I'm led to believe there's about as much commonality there as there is between KHTML and Blink: they've moved quite a long way away from where they were.

> It's simply too expensive to create a fast (and compatible) JS engine.

I don't think that's so clear cut: Carakan, albeit now years out of date, was ultimately done by a relatively small team (~6 people) in 18 months. Writing a new JS VM from scratch is doable, and I don't think that the bar has gone up that significantly in the past seven years.

It's the rest of the browser that's the hard part. We can point at Servo and say it's possible for a comparatively small team (v. other browser projects) to write most of this stuff (and break a lot of new ground doing so), but they still aren't there with feature-parity to major browsers.

That said, browsers have rewritten major components multiple times: Netscape/Mozilla most obviously with NGLayout; Blink having their layout rewrite underway, (confusingly, accidentally) called LayoutNG; IE having had major layout rewrites in multiple releases (IE8, IE9, the original Edge release, IIRC).

Notably, though, nobody's tried to rewrite their DOM implementation wholesale, partly because the pay off is much smaller and partly because there's a lot of fairly boring uninteresting code there.

WorldMaker
> Notably, though, nobody's tried to rewrite their DOM implementation wholesale

Depending on your definition of "wholesale", the Edge team claims it took them 3 years to do exactly that:

https://blogs.windows.com/msedgedev/2017/04/19/modernizing-d...

gsnedders
Oh, yeah, that definitely counts. I was forgetting they'd done that. (But man, they had so much more technical debt around their DOM implementation than others!)
Klathmon
I completely disagree that the issue is JavaScript here.

In my opinion, the issue is the DOM. Its API is massive, there are decades of cruft and backwards compatibility to worry about, and its implementation is a significantly larger codebase than the JS engine in all the major open-source browsers out there.

ravenstine
I'm not sure I agree that the DOM is that bad (more that people are using it improperly and for the wrong things), but yeah, modern JavaScript is hardly to blame for anything. The closest thing to an argument I've heard is "mah static typing".

Asking WebASM to be everything, including a rendering engine, is asking for problems at such an atrocious level.

xtian
> I'm not sure I agree that the DOM is that bad (more that people are using it improperly and for the wrong things)

If an API makes it easy to make mistakes it's a bad API. Blaming "people" is a cop-out.

fasquoika
If an API is meant for documents and you're using it for applications, that's not the API's fault
ravenstine
So what you're saying is that people are stupid?

If people are misusing the DOM API at a fundamental level (to do non-DOM-related things), that's not a fault of the API. It's as if everyone has forgotten that DOM means Document Object Model. The vast majority of websites and web apps are not very complicated on the client side, so I'd say that the DOM API generally does its job well. Trying to use a document-building API for anything that isn't about constructing a document, or that does heavy amounts of animation or node replacement, is asking for a bad time. It's quite implicitly attempting to break fundamentals of computer science.

Making the API one uses to render pages in a browser a free-for-all doesn't solve the problem, and you end up losing many of the advantages of having actual standards. What would be better is for the browser to provide another set of APIs for things beyond the "expertise" of the DOM. This kind of the case right now in some regards, but there's a reason why React and Glimmer use things like virtual DOMs and compiled VMs. I'd argue that a standardization of such approaches could be a different API that probably shouldn't even be referred to as a DOM because they are meant to take a lot of shortcuts that aren't required when you are simply building a document. In a sense, WASM is intended to fulfill this purpose without replacing the DOM or JavaScript.

Object-oriented programming is quite often misused and misunderstood. Does that mean it's a "bad" concept? I wouldn't say so. Education tends to suck, and people are generally lazy and thus demand too much from the tools they use.

I'm not copping out because I'm not putting the DOM on a pedestal. Calling it a bad API because it doesn't work well for a small minority of cases is a total mischaracterization. If it were an objectively bad API, it wouldn't have seen the astounding success it has.

EDIT: I'm not saying that programmers are stupid... but that their expectations are sometimes not congruent with reality.

Klathmon
I'm not implying that the DOM is bad (IMO it's one of the most powerful UI systems I've ever used, both in what it's capable of and in the speed at which I can develop and iterate with it), just that it's BIG.

There's a lot of "legacy" there, a lot of stuff that probably should be removed, or at least relegated to a "deprecated" status.

austincheney
Naw, the DOM is fairly small. Go here for a summary http://prettydiff.com/guide/unrelated_dom.xhtml

HTML is far larger than the DOM. Like comparing an ant to Jupiter.

austincheney
WebAssembly has nothing to do with JavaScript. When people make this association it is clear they are painfully unaware of what each (or both) technologies are.

WebAssembly is a replacement for Flash, Silverlight, and Java Applets.

tyingq
At the moment it is because the only performant visual output is the canvas.

They are adding native DOM access, which changes things.

45h34jh53k4j
and js. eventually!
austincheney
JS is a language and not a bytecode media. Perhaps chocolate will replace cars and airplanes. I love me some chocolate.
stcredzero
JS is a language and not a bytecode media.

That's an arbitrary distinction that's driven by developer group politics, not a meaningful technical distinction. (Much like the old Rubyist, "It's an interpreter, not a VM.")

Machine languages were originally intended to be used by human beings, as were punch cards and assembly language. There's no reason why a person couldn't program in bytecode. In fact, to implement certain kinds of language runtime, you basically have to do something close to this. Also, writing Forth is kinda close to directly writing in Smalltalk bytecode. History also shows us that what the language was intended for is also pretty meaningless. x86, like a lot of CISC ISAs, was originally designed to be used by human beings. SGML/XML was intended to be human readable, and many would debate that it succeeded.

austincheney
> That's an arbitrary distinction that's driven by developer group politics

Not at all. JavaScript is a textual language defined by a specification. Modern JavaScript does have a bytecode, but it is completely proprietary to the respective JIT compiler interpreting the code and utterly unrelated to the language's specification.

> There's no reason why a person couldn't program in bytecode.

True, but that isn't this discussion.

tree_of_item
The point is that a "textual language defined by a specification" can serve the exact same purpose that a bytecode does. And JavaScript is very much on this path.

That is this discussion, because the fact that people program directly in JavaScript does not prevent it from being in the same class of things as a bytecode.

fasquoika
>JS is a language and not a bytecode media

Does it even matter when most people are using it as a compiler target (even just from newer versions of the language)?

mastax
Yes but the point is JavaScript as the "one true way to do client side scripting" can be replaced by webassembly in that capacity.
austincheney
It cannot. WebAssembly bytecode does not have an API to web objects or the DOM. It is literally a sandbox for running bytecode only.
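The "sandbox for running bytecode only" point is easy to demonstrate. Below is a minimal hand-assembled WebAssembly module (an editorial illustration, not part of any proposal): it exports a single `add` function and imports nothing, so the host decides exactly what, if anything, the module can touch. DOM access, if it ever arrives, has to come the same way — as capabilities the host explicitly passes in:

```javascript
// A minimal hand-assembled WASM module: (func (export "add") (param i32 i32) (result i32))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm" magic + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0; local.get 1; i32.add; end
]);

// No import object is supplied, so the module gets no capabilities at all:
// it can compute over its arguments, and nothing else.
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // 5
```

Everything crosses that exports/imports boundary explicitly, which is why the sandbox argument cuts both ways: restrictive today, but also a clean place to attach DOM bindings later.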
sdfh238
I love it when computer nerds get so convinced ephemeral things are exactly that which they claim
sli
I am extremely unclear on whatever point you're trying to make, here, because it really does seem to come from a place of ignorance on WASM and JS. It makes no sense.

It seems like you're claiming, in a really roundabout way, that WASM will never have DOM access, even though it's planned[1]. There are even VDOMs[2] for WASM already. Future WASM implementations that include DOM access can absolutely, and for many folks will, replace Javascript.

[1]: https://github.com/WebAssembly/design/blob/master/FAQ.md

[2]: https://github.com/mbasso/asm-dom

mr_toad
It will take more than just DOM access to replace Javascript. Just off the top of my head you'd also need access to the event loop, XHR, websockets, audio & video, webRTC, Canvas, 3D, Filesystem, cookies & storage, encryption, Web Workers and more.
austincheney
> It seems like you're claiming, in a really roundabout way, that WASM will never have DOM access

I am not going to say never. It does not now and will not for the foreseeable future though. I know DOM interop is a popular request, but nobody has started working on it and it isn't a priority.

Part of the problem in implementing DOM access for an unrestricted bytecode format is security. Nobody wants to relax security so that people who are JavaScript-challenged can feel less insecure.

steveklabnik
> I know DOM interop is a popular request, but nobody has started working on it and it isn't a priority.

I linked to the latest proposal downthread; people are absolutely working on this.

wootness
It's not about being "javascript challenged", it's about wanting to use a non-crappy language riddled with problems.
tptacek
Which of the security concerns browser Javascript deals with do you think are intrinsic to the language, as opposed to the bindings the browser provides the language? If the security issues are in the bindings (ie: when and how I'll allow you to originate a request with credentials in it), those concerns are "portable" between languages.
austincheney
It isn't due to the language but to context of known APIs provided to the language that can only be executed a certain way by the language.

How would a browser know to restrict a web server compiled into bytecode specifically to violate the same-origin policy? The browser only knows to restrict this in JavaScript because such capabilities are available only through APIs the browser provides to JavaScript.

tptacek
I really don't understand your example. Are you proposing a web server running inside the browser as a WebAssembly program, and the browser attempting to enforce same origin policy against that server? That doesn't make much sense.
austincheney
Yep, it doesn't make sense and that is the problem. There is no reason why you couldn't write a web server in WASM that runs in an island inside the browser to bypass the browser's security model.
tptacek
This does not make any sense, sorry.
tedunangst
Not sure if this is directly relevant, but there have been all sorts of type confusion bugs when resizing arrays, etc. Stuff in the base language. They exist independent of API, but merely because the language is exposed.
hackcasual
Flash, Silverlight and Java Applets all provided APIs and functionality above and beyond what JavaScript or the browser DOM provides. WebAssembly is the opposite, as it is much more restricted than JavaScript. WASM is a sandbox inside a sandbox.
skocznymroczny
How about Servo?
flanbiscuit
yeah, wasn't this (or related to it) at the top of HN just yesterday?

https://news.ycombinator.com/item?id=15686653

fasquoika
IIRC Servo uses quite a bit of Firefox code

Edit: Looking at the project it seems like it uses SpiderMonkey, but is otherwise new code

mastax
Servo doesn't render major websites properly (last I checked). Its UI is a placeholder. Its network/caching layer is a placeholder. There are no updates, configuration, add-ons, or internationalization.

Servo is not meant to be a real browser. That's not a bad thing, but I don't think you can use it as an example of a browser built quickly by a small team.

scott_karana
Chrome's V8 engine was actually written from scratch, unlike WebKit's JavaScriptCore (which descended from Konqueror's KJS, as you say). Google made a big deal of marketing this fact at the time. (1)

And while yes, Mozilla's SpiderMonkey comes from the Netscape days, and Chakra in Edge descends from JScript in IE, plus the aforementioned JavaScriptCore, each of those engines has still evolved massively: most went from interpreters to optimizing JITs over the years. I suspect that no more than the interface bindings remain unchanged from their origins, if even that. ;-)

(1) I can't currently find the primary sources from when Chrome released on my phone, but here's a contemporary secondary one: https://www.ft.com/content/03775904-177c-11de-8c9d-0000779fd...)

lern_too_spel
If the issue is JavaScript, what explains the explosion of JavaScript engines? I agree that JavaScript is a cancer whose growth should be addressed, but implementation complexity isn't a reason.
wiredearp
If these proposed browsers don’t ship with a JS engine [1], do you also hope to have more than one internet around?

[1] Such as V8, Chakra, JavaScriptCore, SpiderMonkey, Rhino, Nashorn; there is a variety to choose from, as well as experimental ones such as Tamarin. They are almost certainly not the critical blocker for developing a browser.

ams6110
IE/Edge heritage goes back to Spyglass Mosaic.
pavlov
The real problem is CSS. Implementing a compliant render engine is nearly impossible, as the spec keeps ballooning and the combinations of inconsistencies and incompatibilities between properties explode.

Check out the size of the latest edition of the book "CSS: The Definitive Guide":

https://twitter.com/meyerweb/status/929097712754098181

Until CSS is replaced by a sane layout system, there's not going to be another web browser engine created by an independent party.

irrational
Isn't Grid and Flexbox supposed to be that sane layout system? At least that's what I've heard from those who have used them.
woah
React Native has what seems like a pretty sane css-like layout system. Maybe this could become the basis for a "css-light" standard that could gradually replace the existing css, and offer much faster performance for website authors who opt in.
gsnedders
I presume parent's point is about how they then interact with other layout modes (what if you absolutely position a flex item, for example), along with the complexity of things that rely on fragmentation (like multicol).
infogulch
Even if Grid and Flexbox are awesome and perfect and the solution to all our problems, they don't make everything else in CSS suddenly disappear; a new layout/render engine still has to implement every bit of it, quirks included.
gsnedders
I think Blink's LayoutNG project and Servo both show that you can rewrite your layout implementation (and Servo also having a new style implementation, now in Firefox as Stylo). I think both of those serve as an existence proof that it's doable.
pavlov
It's doable if you already have a large team of experienced web-engine developers, a multi-million budget, and years to spend on planning alone.

Implementing an open standard shouldn't be like this. Even proprietary formats like PDF are much simpler to implement than CSS.

tptacek
A minimal, 80% PDF, maybe. A complete PDF? No.
madeofpalk
It's not doable, apart from when it is.

"Open standard" says nothing about being simple and straightforward. What CSS is trying to do is complicated for a whole bunch of pragmatic reasons.

Last time a browser tried to 'move things forward', ngate aptly summed it up as

    Google breaks shit instead of doing anything right, as usual.
    Hackernews is instantly pissed off that it is even possible to
    question their heroes, and calls for the public execution of
    anyone who second-guesses the Chrome team
pavlov
I never claimed that the existing browser vendors can’t do it incrementally — they certainly can. What I wrote was: “... there's not going to be another web browser engine created by an independent party.”
robocat
TCP/IP is an open standard, yet I suspect you would have the same problems implementing it (Microsoft famously copied the BSD stack at first).

You could probably say the same thing about any complex open standard, like FTP etc.

steveklabnik
Also relevant: callahad a few weeks ago: https://twitter.com/nybblr/status/923569208935493632

Netscape navigator on DOS in Firefox via WebAssembly.

indescions_2017
Or how about a live-coding environment for an Atari VCS (1200) emulator ;)

http://8bitworkshop.com/?platform=vcs&file=examples%2Fhello

md224
My CS background is a bit weak... is the hypothetical Metal architecture he describes supposed to be satire or actually a good idea?
__s
Implement a WASM JIT in kernel space and you don't need a userspace at all, while still having hot code (hopefully) optimized to remove bounds checking. Now all your programs are WASM modules, and we can replace your CPU with some random architecture that doesn't have to care about supporting anything beyond ring 0. Why not implement a nearly-WASM CPU? Probably just change branches to GOTOs. Now the only program people care about, their browser, can have a dead-simple JIT for this architecture, with WASM-in-the-browser being nearly as fast as any other program.
ninkendo
There’s prior art for this too, Microsoft started a research project called Singularity that was essentially a kernel that only executed .NET bytecode, and had similar advantages (everything in ring0, no syscall overhead, etc.)

It died pretty unceremoniously though.

nimish
It died because it couldn't become an actual product and had a lot of very smart engineers spending a lot of time on something that had no future. Some of the core tech was reused and turned into other products.
steveklabnik
We had Joe Duffy talk about it at RustConf this year! https://www.youtube.com/watch?v=CuD7SCqHB7k
jayd16
Mostly satire because the math doesn't really work out in such a way.
Ajedi32
It doesn't? How so? I was under the impression that the performance-savings calculations he used were at least plausible. (Though obviously just a back-of-the-napkin estimate.)
steveklabnik
Some say that joking is a socially acceptable way to say socially unacceptable ideas.

I think it's a great idea, though many disagree. It's basically ChromeOS but to the next level.

fny
I'm more excited about the prospects of running V8 inside ChakraCore inside Quantum.
tomxor
I'm more excited about the prospect of running all of Firefox or Chromium inside of Edge so I can cut my workload down by 50%
sp332
And for a somewhat more practical but at the same time more exotic example, the Internet Archive has a ton of old minicomputers and arcade games running in MESS/MAME, each compiled to webasm. One click and you can boot anything and play it in your browser. https://archive.org/details/softwarelibrary

https://archive.org/donate/

ricw
This is amazing. If only it would also work on my phone. Probably for the best, to stop me "wasting" time ;).
Ajedi32
Are you sure that's actually using WASM? It sounds to me like it's currently using ASM.js compiled via Emscripten. (Though in theory there's no reason why it _couldn't_ be WASM, since Emscripten supports WASM as a compiler target.)
sp332
I thought they switched over back in July. https://twitter.com/textfiles/status/884084207688892416
vesinisa
That is gorgeous. I just booted Win 3.1 to Minesweeper in less than a minute on my phone's browser.

Too bad makers of the original Minesweeper did not think to build touch input support.

There's the Gary Bernhardt classic "The Birth & Death of Javascript"

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Obligatory "The Birth & Death of JavaScript" reference: https://www.destroyallsoftware.com/talks/the-birth-and-death...

METAL is coming.

Oddly, no one here linked to Gary Bernhardt's talk: The Birth & Death of JavaScript (YavaScript)

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

As always, the fantastic Gary Bernhardt takes this through to its logical conclusion https://www.destroyallsoftware.com/talks/the-birth-and-death...
jerf
First, while presented humorously, I take it somewhat seriously as well. And one place where I disagree with it is that unless you consider WebAssembly as Javascript, it isn't true. It isn't Javascript destined to take over the world, it's WebAssembly.

You will know WebAssembly is here and in charge when the browsers compile Javascript itself into WebAssembly and take at most a 10% performance hit, thus making it the default. Call it 2022 or so.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Your comment immediately made me think of this. Highly recommended talk for anyone that hasn't seen it. It goes through a "fictional" (maybe not so much anymore) history of javascript until 2035. We are getting pretty close to javascript all the way down.

Reminds me of a 'future' talk about how JavaScript took over the world as a language even though no one actually programmed in it. This was because as long as you could transpile to asm.js, you could get near-native performance via JS.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

(Worth watching no matter what your background is, it's funny and informative)

Aug 10, 2017 · thousande on WebAssembly: A New Hope
Like running the Windows version of Gimp in Wine in X Window in Chrome inside Firefox on a Mac? http://imgur.com/a/wRals

From: https://www.destroyallsoftware.com/talks/the-birth-and-death...

https://www.destroyallsoftware.com/talks/the-birth-and-death...

...this is extremely relevant to what you're saying. A talk worth watching.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Unfortunately some people seem to have missed that this talk was labeled "comedy" and "absurdist", and are actually trying to implement a JavaScript kernel. The talk just got the name wrong - it was "Electron", not "Metal".

I know it's an experimental project, but that's exactly the time to learn Ruby+Tk, or {anything}+Qt, or any of the other cross-platform toolkits. If you have to bundle an entire GUI server ("the browser") with your app to shoehorn HTML into use cases it wasn't designed for, you're doing it wrong.

roryisok
> you're doing it wrong

"wrong" is subjective here. You could also argue that Ruby+Tk is "wrong" because Ruby is not as efficient as coding in C++, or C, or Assembly.

What we're talking about here, (and every time this comes up, again, and again, and again, over and over) is what shortcuts you're willing to take to get to MVP. There are a lot of Electron success stories out there, and when you're a solo dev starting out, and want to make a cross platform app, Electron is approachable and proven.

Apr 18, 2017 · psiclops on Google Earth Redesigned
Last part of your comment reminds me of the birth & death of javascript [0]

[0] https://www.destroyallsoftware.com/talks/the-birth-and-death...

Apr 07, 2017 · 1 points, 0 comments · submitted by mromnia
I found that the killer application for Javascript would be to write github repos for Javascript wrappers for typescript. Seriously, why is CSS as Javascript objects a thing? Are we so bored with the stack we've been using that we can only make things interesting by cross compiling everything? It makes me think of this video: https://www.destroyallsoftware.com/talks/the-birth-and-death...
mxstbr
That's exactly why we made styled-components![0]

I really dislike writing styles as JavaScript objects, so styled-components lets you write actual CSS in JavaScript. We have also added support for a bunch of editors, so you don't miss out on syntax highlighting just because of that.

I'd encourage you to check it out!

[0]: https://www.styled-components.com

pygy_
I just discovered http://typestyle.io/ which should make it much easier to write CSS in JS... Auto-completion everywhere...

I'll probably steal the approach for my own CSS-in-JS lib (j2c).

roboguy12
+1 for typestyle. My team and I have been using it for the past few weeks in trying to clean up a rather monolithic css project, and it's been really cool to work with.
We are hurtling faster and faster towards this talk every day https://www.destroyallsoftware.com/talks/the-birth-and-death...
Ajedi32
I kinda see the future in that talk as that as a worthwhile goal. If we could get all the security and interoperability advantages of the web but with native-level performance, support for multiple languages, and backwards-compatibility with legacy software that'd be huge.
So let me get this straight: WASM is basically asm.js, but without having to go through javascript?

The future has changed! https://www.destroyallsoftware.com/talks/the-birth-and-death...
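For a concrete sense of what "without having to go through JavaScript" means, here is a minimal hand-assembled WASM module instantiated from JavaScript. The bytes follow the WebAssembly binary format (header, then type, function, export, and code sections); the `add` export name is just for illustration:

```javascript
// A hand-assembled WebAssembly module exporting add(i32, i32) -> i32.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm", version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type 0: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add" = function 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b                    // local.get 0; local.get 1; i32.add; end
]);

// Compile and instantiate synchronously; no JS source is involved at all.
const wasm = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(wasm.exports.add(2, 3)); // → 5
```

With asm.js, the equivalent module would have been shipped as (a strict subset of) JavaScript source text and pattern-matched by the engine; here the engine consumes the binary format directly.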

adamnemecek
I know it's easy to dismiss it, but wasm will be the biggest deal in the world.
Zikes
There are already talks about compiling Go to wasm: https://github.com/golang/go/issues/18892
stymaar
I'm not sure I see any interest in that tbh: what's the point of porting a memory-safe language (with GC and runtime) to a VM especially dedicated to removing the costly memory-safety part of JavaScript?

Plus, the concurrent model of Go doesn't really shine if you run it in a single threaded configuration (which is most likely to be the case in wasm).

I might be missing something but to me it sounds like pure hype to put go and wasm together.

themihai
>> I'm not sure I see any interest in that tbh:

Running other languages(than JS) in the browser is interesting for sure.

rubber_duck
>what's the point of porting a memory-safe (with GC and runtime) on a VM especially dedicated to remove the costly memory safety part of JavaScript.

a) binary portability - compile and ship WASM, link/run on any machine with a WASM VM - eg. package up to NPM and run through any V8 target

b) extra sandboxing layer for security (it's designed for browsers where you are supposed to run non-trusted code)

c) portable APIs (assuming WASM targets expose the same underlying platform like node or w/e)

d) WASM is single threaded right now but from what I've seen it's a top priority for next release to spec out shared memory threads

stymaar
I agree with d), but I'm quite skeptical about the other 3.

a) and c) means it's portable on everything a JS VM runs on, which is not that much more than what Go runs on.

b) is legit, but that really sounds like an overkill.

rubber_duck
Well a) and c) also means you can run in the browser - client/server code sharing and such can be a really big win.
bradfitz
You're confusing concurrency and parallelism.

Even with GOMAXPROCS=1 (1 CPU running Go code), it's very liberating to be able to write blocking Go code and not worry about callbacks or async/await and let Go's runtime deal with it all while you write concurrent code.

stymaar
> You're confusing concurrency and parallelism.

I'm not, it's just that the most appealing feature of Go is its ability to use parallelism for concurrency with a decent overhead, which makes it straightforward to scale vertically.

> Even with GOMAXPROCS=1 (1 CPU running Go code), it's very liberating to be able to write blocking Go code and not worry about callbacks or async/await and let Go's runtime deal with it all while you write concurrent code.

IMO, async/await is a much cooler pattern than goroutines + channels to do concurrent stuff on one thread. I find it way easier to use, and less error prone. The drawback is that you need a different paradigm when you want to take advantage of parallelism.

Of course it's a matter of personal preferences, but the prevalence of async/await in different programming languages indicates that at least I'm not the only one thinking this way :).

emn13
async/await is possibly simpler to implement. At least: in go effectively every method is async (at least the api doesn't show what is and is not), and that means you can't afford the inefficiencies that typical async/await implementations have because you'd be paying them all over the place.
ojr
javascript shows no sign of dying since that talk, more people are doing desktop software (Electron) and more people using js to make native apps (React Native). Node.js is also popular on the server. Babel also took off, which lessened the need for compile-to-js languages and makes the future of javascript into more javascript. Javascript is more ubiquitous than ever; wasm is best suited for hardware-intensive applications and library authors, and is overkill for normal client side javascript.
acjohnson55
Those techs you mentioned are successful largely because they use Javascript as a runtime. Supposing those runtimes also eventually support WASM (with a good story of linking binaries), I see no reason why other languages won't begin to be used. Perhaps not for a few years though.
Ajedi32
> WASM is basically asm.js, but without having to go through javascript?

Sort of, but not exactly. See section 2 of the linked article.

coldtea
>WASM is basically asm.js, but without having to go through javascript?

No. It also has non-JS accessible language features, and the article lists several of them.

I really suggest watching these talks for anyone who hasn't. The first especially gets at what kinds of issues javascript has, and what might happen to it.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

https://www.destroyallsoftware.com/talks/wat

And thus the era of METAL[1] begins...

[1]: https://www.destroyallsoftware.com/talks/the-birth-and-death...

spraak
This is super enjoyable thank you ^_^
andrewflnr
And compelling. Can you really make that work securely in the kernel?
amelius
It's a nice idea, but it goes against layered security [1]. Perhaps computer-assisted proofs can make that a non-issue.

In any case, I would very much like to see an implementation of this.

[1] https://en.wikipedia.org/wiki/Layered_security

Mar 04, 2017 · 1 points, 0 comments · submitted by mromnia
The Birth & Death of JavaScript.

A talk by Gary Bernhardt.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

kmi187
The man is funny :)
steinuil
His other talks are great too! My favourite is "A Whole New World". https://www.destroyallsoftware.com/talks
Dec 12, 2016 · oevi on Show HN: Web Bluetooth
This talk revolves around exactly this question: https://www.destroyallsoftware.com/talks/the-birth-and-death...

You might enjoy watching it.

Reminds me of this talk by Gary Bernhardt:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

fidz
I don't want his prediction to happen, unless we have WebAssembly.
akerro
There it is, shoot yourself. https://i.imgur.com/pC6EV0v.png
cyberpunk
I've not watched that in a while! Enjoyed it again, so thanks!

Have we not almost arrived at this dystopian javascript hella-future though?

I mean, with unikernels which run node in ring 0:

http://runtimejs.org/

And that we can run an emulated linux in browser:

http://bellard.org/jslinux/

I'm doing my best to ignore it all....

Nov 24, 2016 · nitemice on The Lua VM, on the Web
The future is coming!

https://www.destroyallsoftware.com/talks/the-birth-and-death...

omginternets
This really was a fantastic talk.

Where can I read more about this kernel-level sandboxing that's discussed towards the end?

Additionally: as I understand things, the methodology behind asm.js could (in principle) be applied to any JIT-ed language. Is the choice of doing this in JS only because JS is the language of browsers (i.e.: kernel-level sandboxing can easily use existing browser sandboxing code) or are there other technical considerations behind this decision?

bogomipz
I have watched this talk on two occasions now. Gary Bernhardt is really a great speaker. I am curious has anyone subscribed to the screencasts at:

https://www.destroyallsoftware.com/screencasts

It's a bit pricey at $29 a month, but if all of the content is as interesting, entertaining and thought-provoking as this talk I could see it being worth the price. I would be curious to hear anyone's feedback.

Oct 24, 2016 · kenOfYugen on Operating Systems
I believe something along the lines of runtime.js [1] is implied.

The timeline suggests a reference to Gary Bernhardt's talk, "The Birth & Death of JavaScript" [2].

1. http://runtimejs.org/

2. https://www.destroyallsoftware.com/talks/the-birth-and-death...

"The Birth and Death of Javascript" by Gary Bernhardt (probably the most talented speaker on tech) at https://www.destroyallsoftware.com/talks/the-birth-and-death...

I'd mention Bret Victor's work before (maybe Drawing Dynamic Visualizations?), but Bret cheats by writing a lot of amazing code for each of his talks, and most of the awesome comes from the code, not his (great nonetheless) ability as a speaker.

Then you have John Carmack's QuakeCon keynotes, which are just hours and hours of him talking about things that interest him in random order, and it still beats most well prepared talks because of how good he is at what he does. HN will probably like best the one where he talks about his experiments in VR, a bit before he joined Oculus (stuff like when he tried shining a laser into his eyes to project an image, against the recommendations of... well, everyone): https://www.youtube.com/watch?v=wt-iVFxgFWk

Agreed.

Was gonna post this if it wasn't up already.

After this, The Birth and Death of Javascript: https://www.destroyallsoftware.com/talks/the-birth-and-death...

He could've taken the concept further tho. I think there are real hardware simplifications you could do if the OS is a jitting VM - no memory mapping unit and take out the expensive fully-associative TLBs.

josh_carterPDX
This is great. I love that he goes into the future. :)
caub
this is quite outdated
elliotec
but it's from 2035...
SonOfLilit
I always have trouble when telling people in person to go watch this - how should I pronounce the "J" in Javascript?

* SPOILER ALERT, and seriously go watch it first *

If I pronounce "J" I do him an injustice, and if I pronounce "Y" I ruin a great surprise that comes quite a few minutes into the talk.

natdempk
I always go with the "J" pronunciation. It maintains the expectation that the talk makes a joke out of by breaking. I would rather give everyone that first time experience of hearing the "Y" pronunciation than do Gary an injustice.
Gary Bernhardt mentions an OS written in JS in his video: https://www.destroyallsoftware.com/talks/the-birth-and-death...
I'm not an expert, but the memory required to render the seemingly simplest of interfaces in html/js/css in a browser today seems excessive. Some web sites bring a reasonably powered desktop to its knees. I'm not sure I want this problem on my desktop too. Though I guess this is just one step closer to having METAL. https://www.destroyallsoftware.com/talks/the-birth-and-death...
wpietri
I feel your pain. My first computer had 4K of RAM; my second had 48K; my third, 640K. I learned to work small. It pains me to see the equivalent of Hello World taking up untold MB.

But when I think about software as a business rather than an art, I have to concede that it's a very rare circumstance where RAM efficiency matters as much as I'd like. Note the way the cost of memory has declined:

http://www.jcmit.com/mem2015.htm

Watches now have 100,000 times the RAM that I started with. Costs are dropping by 1-2 orders of magnitude per decade. Something that is absurdly wasteful now could well be economically reasonable very soon.

haberman
The weight still hurts the user experience. All that RAM still has to get written to disk for sleep/hibernate, and loading it back from disk is in the critical path of wake from that sleep. It still fills up CPU caches, which keeps them from running at maximal efficiency (both speed-wise and power-wise).

Extra weight will always matter for people who want to deliver first-rate user experiences.

vidarh
Memory isn't the issue to me. Latency is.

Most of the applications I work with today do a lot, that's true, but their UI latency is often worse than it was on my 7.16 MHz M68000 Amiga 500, and other things as well are just slow.

I've mentioned here several times in the past that on my laptop I can "boot" Linux-hosted AROS (so the problem is not the Linux kernel, nor X) with a custom startup script to boot it straight into a full featured, scriptable text editor in less time than it takes to start Emacs.

I'm sure it's possible to tune my Emacs setup (for example, I found out by a fluke, that the default Emacs setup on debian will wait for a DNS request to complete or time out before it starts - break your DNS setup and Emacs will hang for ages) or pick another editor (many of the other ones I've tried are either just as slow or feature-limited compared to the Amiga editor in question - FrexxEd, co-written by the same guy that started curl), but the point is that we've come to accept the kind of slow startup and UI latency that was unacceptable back then.

E.g. people spent weeks tuning and trimming AmigaOS commands to make them the smallest possible so we could make as many of them as possible RAM resident to avoid the tiny fractions of a second it'd take the load-time linker to load them.

I'm happy we don't need to think that much about the RAM any more. But we do need to think about the latency.

There's the attitude that we should just throw servers at this instead of developer time. That's fine when you can compensate by e.g. throwing more RAM in and/or a program is run relatively rarely or where paying for a beefier server in some data centre can achieve the same performance. But it's not true when latency grows into user noticeable levels because you can't get high enough single-core performance, and that program is run a lot.

wpietri
I don't disagree. I am certainly frustrated every time my phone feel sluggish, which is several times a day.

On the other hand, I've been frustrated with the slowness of computers for a long time. CPU speed, RAM, disk, everything has gotten way better. But I'm still just about as irritated, and I suspect that things are just about as sluggish.

Again, I think it's an economic equilibrium. Things are fast enough that most people buy them; those of us who want things faster aren't numerous enough to outvote those who want fancier features or cooler UI bling instead.

I hope this changes, but I'm not holding my breath.

onion2k
Poorly written websites can bring a desktop PC to its knees, but so can a poorly written native application. That isn't an argument against leveraging web technology so much as an argument in favour of well written applications. And at least with a chromeless-browser-pretending-to-be-application you have the protection of the browser process sandbox so an app that goes awry isn't going to take your computer down that badly.
aikah
> That isn't an argument against leveraging web technology so much as an argument in favour of well written applications

The difference is there is very little room for optimisation in most javascript runtimes. They don't support multi-threading and the memory is impossible to manage. "Lower"-level languages always allow better performance tweaking when necessary. Javascript allows next to none. You can't tell javascript: "Give me an array of 10 elements", or "give me an integer of that length". So no "headroom" for performance with Javascript.

mmatants
Most performance problems with websites/webapps out there are just due to sheer nastiness of the shovelware that they embed. No need to have fine-tuned debugging and profiling when the main fix is "don't accidentally run this jQuery selector 1000 times on every click". Very low hanging fruit.
BinaryIdiot
> the memory is impossible to manage.

Nonsense. There are plenty of strategies for managing memory efficiently in JavaScript. Yes, you can't do a C++ level of allocation, deallocation, etc., but you most certainly can manage the amount of memory your code uses.

fenomas
This is a non-argument. Firstly, JS does indeed have typed arrays and other tools for managing memory. More importantly, unless you're doing 3D or similar it's vanishingly rare that the memory used by objects you've allocated in JS will be measurable compared to the memory used for DOM objects and rendering generally. Replacing JS with some other language wouldn't affect the memory usage of typical web pages.
jaquers
I think the spirit of the argument is that if you have requirements for heavy computation such that you need multithreading + low level memory management, then you probably shouldn't use this. Use the right tool for the job. This is just one of them.

Plus there are alternatives to threads. Look at the state of Atom or VSCode. Much progress has been made in terms of perf and these are not trivial applications.

dilann
> Use the right tool for the job

That's a funny argument in a thread about js. Js started as a small language to do scripting on pages; now they're sticking it everywhere.

goatlover
Because people don't actually use the right tool for the job when it comes to software. Instead, they take the tool they know, ignore the better tools they would rather not have to learn, and make the known tool do things it was never designed for.

Excel is a great example of that, but it's done with programming languages as well.

mrec
Slightly nitpicky, but while lack of support for multithreading might make a particular app slow or unresponsive, I'd have thought it makes it less likely for that app to be able to bring the entire system to its knees.
pcwalton
> They don't support multi-threading

Yes, they do, via Web Workers. They just don't support multithreading at the level of concurrent access to the DOM, and they don't support shared memory. Very few native libraries support concurrent access to UI widgets, and not many native applications make heavy use of shared memory for compute either. (Most native applications don't have heavy compute needs in the first place…)

> Give me an array of 10 elements

new Array(10)?

> give me a integer of that length

Uint8Array, Uint16Array, Uint32Array?

aikah
Please :

> Yes, they do, via Web Workers.

Which are not part of the javascript spec; they're DOM-related. And web workers were never meant to increase performance. In fact, in practice they don't; they often make code slower. They just guarantee that the UI thread will not block.

> new Array(10)?

Which doesn't allow any specific runtime optimization, as the array can be resized at any time.

> Uint8Array, Uint16Array, Uint32Array?

Which doesn't give me an integer of a specific size, but an array of integers.
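For reference, a minimal sketch of what typed arrays do and don't give you: fixed-size, fixed-width integer storage, though per-element rather than as scalar integer types:

```javascript
// Unlike `new Array(10)`, a typed array cannot be resized and every
// element has a fixed machine width, so engines can back it with a
// flat C-style buffer.
const counts = new Uint32Array(10);   // 10 * 4 bytes, zero-initialized
counts[0] = 42;
counts[1] = 4294967295;               // max u32
counts[1] += 1;                       // wraps to 0, like a C uint32_t

console.log(counts.length, counts.byteLength); // → 10 40
console.log(counts[0], counts[1]);             // → 42 0
```

There is still no scalar `uint32` variable type in the language; the fixed width exists only inside the buffer.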

etnos
>That isn't an argument against leveraging web technology so much as an argument in favour of well written applications

On point, as someone who has written both native and web apps. it really comes down to execution. Elitism aside, Web based apps can work for some scenarios

catnaroek
> Poorly written websites can bring a desktop PC to its knees, but so can a poorly written native application.

Even well-written Web applications tend to use more system resources than their well-written desktop counterparts. A few days ago, I was surprised to find that Chromium was using 6 GB of RAM, while all other processes combined (including three Emacs instances, running fancy modes) were using just 1GB. And, no, I wasn't playing browser games or doing anything fancy in Chromium: just viewing text and images.

> And at least with a chromeless-browser-pretending-to-be-application you have the protection of the browser process sandbox so an app that goes awry isn't going to take your computer down that badly.

It would be much better to use programs that don't need to run in a sandbox in the first place. (To be fair, OS-enforced memory protection can be considered a kind of sandboxing too.)

jaquers
No disagreement about memory usage, except to say that [some web sites] are usually either poorly or unethically coded. I was on theverge.com yesterday, wondering why the network indicator was going crazy on a blog post. Pull up dev tools and it's ad code downloading megs worth of data, endlessly. Whether that's on theverge, or bad actor for ad code; I don't know or care - but when people talk about bad websites that's usually the primary example in my experience.
segmondy
See the Spotify client, Popcorn Time, Visual Studio Code. These are javascript. They run great.
_prototype_
Also the Slack web client
rbanffy
Have you checked how much memory they use?

I know we have 8-core laptops with 16 gigs of RAM, but, still, it's excessive.

freyr
Any idea why the memory usage is so high? Is it related to the Electron/Chromium platform, or bad application design?
rbanffy
Would need to check, but my bet is caching of partially rendered HTML as bitmaps, as well as a full JavaScript JIT environment.

It's just a high-footprint environment.

segmondy
Don't have an 8-core, or 16 gigs of ram. I'm talking about my desktop. I have a 4-core i5 from 2013. Spotify has 173M in res. Visual Studio Code with 10 files open has 99M in res. I have Firefox open with about 125 tabs open, 6 terminal windows, 2 pdf files open, VLC open but not playing, IRC in about 15 channels. Load average is 0.60. Load across all CPUs is about 10%. 8 gigs used out of 12 gigs of ram, with Firefox using 5.1 gigs.
dismantlethesun
It is pretty excessive, but maybe web assembly would find a real home in making applications that do well on both native and web.
n00b101
WebAssembly will hopefully end JavaScript's stranglehold on the web.

The justification for using JavaScript for server/desktop/mobile applications seems to essentially be that a certain large group of programmers only know JavaScript.

Udik
I guess you tend to reuse the technologies you're familiar with, especially when they are very well proven for solving a very complicated problem that requires a lot of expertise (I'm talking about fancy UIs, of course).

Which is also why I think the emphasis on JS is wrong: the main selling point of these technologies is NOT javascript, it's HTML5/CSS. Javascript is just a (very handy indeed) scripting language like any other: frankly, the logic and requirements of most applications around are not complex enough to justify the usage of anything more solid or complicated than that. Just today I was contacted by a former colleague, a Java developer, asking advice on using Electron to develop a quick desktop application.

On the other hand, there is a long list of languages that transpile to javascript. So where are the masses of serious C#/ Java/ <pick-your-language> developers using transpilers to write the logic of their web applications?

n00b101
> So where are the masses of serious C#/ Java/ <pick-your-language> developers using transpilers to write the logic of their web applications?

I think it is catching on with C++ game developers, who want to be able to run the same codebase natively or in the web browser. The Unity 3D game engine has built-in support for this using WebGL, for example.

I don't know if CLR or JVM languages have been successfully transpiled to the browser using asm.js / WebAssembly (it wouldn't be very efficient at all, so I doubt that will ever be popular).

gjolund
Agreed, I'm so tired of being forced to write js.
contextfree
I'm not sure. All the browser APIs will still use the JS object model. If we look at the situation on the JVM or CLR, it seems like languages specifically designed for those object models (even if adapted from other languages) have been more successful than attempts to directly port existing languages. e.g., Scala, Clojure, F# or Powershell vs. Jython, JRuby, C++/CLI or IronPython.

My bet would be that languages specifically designed for the JS ecosystem (whether JS itself or something like CoffeeScript or TypeScript) will continue to be the most popular for writing most of an app, with some apps dropping down to C or C++ or Rust compiled to wasm for a subset of performance-critical code.

vidarh
I'm sure there'll be a mix. But consider things like Opal (Ruby => javascript), and it seems like there's at least some people who very much want to be able to work in a single language, but for that language to not be javascript...

I'm sure once webassembly is ready, that'll at least get more popular.

catnaroek
Too much wishful thinking during the second half of the talk, though.

Also, while fancy runtime systems can improve the performance of dynamic[0] languages, it doesn't come for free: the price to be paid is the loss of elegance. For instance, a JIT compiler could inline a virtual method that seems not to be overridden anywhere, but if later on it turns out that the virtual method was overridden somewhere, the “optimization” has to be undone. How can anyone in their right mind trust a language that requires such dirty implementation tricks to achieve decent performance?

[0] By which I mean “less amenable to static analysis”, regardless of whether the language has a static type system. For instance, Java and C# are dynamic languages in this sense.
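The speculative inlining being described can be sketched in JavaScript (illustrative only: the class and method names are made up, and the comments describe what an engine like V8 may do, not what this code forces it to do):

```javascript
// A call site that a JIT could speculatively inline, and the event
// that invalidates the speculation.
class Shape {
  area() { return 0; }
}

function totalArea(shapes) {
  // While only Shape instances flow through here, a JIT may inline
  // Shape.prototype.area directly into this loop.
  let sum = 0;
  for (const s of shapes) sum += s.area();
  return sum;
}

totalArea([new Shape(), new Shape()]); // monomorphic so far: inlining is safe

// A subclass overrides area(); any speculative inline of Shape's area
// is now invalid and the engine must deoptimize back to a real
// virtual dispatch before this call returns the right answer.
class Square extends Shape {
  constructor(side) { super(); this.side = side; }
  area() { return this.side * this.side; }
}

console.log(totalArea([new Shape(), new Square(3)])); // → 9
```

The semantics are unaffected either way; the "dirty trick" under discussion is purely the engine's undo machinery for the optimization.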

smallnamespace
> How can anyone in their right mind trust a language that requires such dirty implementation tricks to achieve decent performance?

Isn't this a rather broad brush with which to paint all JIT language implementations, including Java, C#, and, say, PyPy?

contextfree
it's interesting that C# has recently moved away from JIT and reflection for client apps (.NET Native), though.
catnaroek
Could you name a JIT compiler that doesn't do this kind of thing, yet offers performance comparable to AOT-compiled languages?

(FWIW, I'm not saying it's impossible. It's perfectly possible, but you'd need a source language that offers much better static guarantees than the typical language that a JIT compiler is written for.)

---

Sorry, can't reply to you directly, because “I'm submitting too fast”. So my reply goes here:

> for the simple reason that type systems cannot capture all relevant runtime context.

Type checking isn't the only kind of static analysis out there. And there's no need to use statistics to optimize anything at runtime when your ahead-of-time compilation step already emits optimal target machine code.

> Java is a good example here, since it's strongly statically typed.

Java is as dynamically typed as it gets: `instanceof`, downcasts and reflection, all conspire to reduce the usefulness of static type information to zero.

> By your reckoning, all greedy optimizations that CPUs do like branch prediction and prefetching are also similarly 'inelegant', because they can be wrong and require rolling back.

Yes, indeed. It's more elegant to know beforehand what exactly you have to do, and then do just that and nothing else.

smallnamespace
> that _doesn't_ do this kind of thing

Why does that matter to anyone?

> you'd need a source language that offers much better static guarantees than the typical language that a JIT compiler is written for

Java and C# are both statically, strongly typed languages where JITs are the dominant implementation.

Almost by definition, it's impossible to write a JIT compiler that outperforms AOT compilation without looking at runtime data, because AOT compilers have a lot more time to look for difficult static optimizations. The reason JITs can keep up is because they have access to information that an AOT compiler does not.

> And there's no need to use statistics to optimize anything at runtime when your ahead-of-time compilation step already emits optimal target machine code.

This is simply untrue. For example, it's not possible to statically determine whether a function should be inlined or not. However, a JIT can see that it's used in a hot loop and dynamically inline.

For any language, no matter the type system, runtime information will always be a superset of compile-time information. There will always exist optimizations in a JIT that aren't possible in an AOT compiler.

> Java is as dynamically typed as it gets: `instanceof`, downcasts and reflection, all conspire to reduce the usefulness of static type information to zero.

Idiomatic Java code doesn't use these features heavily. Just because it's possible to wipe out type information doesn't mean that the vast majority of code that an AOT or JIT compiler sees won't be strongly typed.

catnaroek
> For example, it's not possible to statically determine whether a function should be inlined or not.

MLton has absolutely no problems inlining functions, even higher-order functions, at compile time. This is only difficult in languages with virtual methods, because they can be overridden anywhere. If anything, that's an indictment of virtual methods, not AOT compilers.

> However, a JIT can see that it's used in a hot loop and dynamically inline.

What if it's a virtual method call that's known to be overridden in several places? You can't inline it, even if it's in the middle of a hot loop.

> For any language, no matter the type system, runtime information will always be a superset of compile-time information.

Runtime information is always anecdotal, specific to one particular run of a program, so...

> There will always exist optimizations in a JIT that aren't possible in an AOT compiler.

... for every “optimization” a JIT can perform, there will always exist a program for which the “optimization” will have to be rolled back after it has already been performed, because it turned out to be unsound.

> Idiomatic Java code doesn't use these features heavily.

Language implementations must work correctly whether you write idiomatic or unidiomatic code.

> Just because it's possible to wipe out type information doesn't mean that the vast majority of code that an AOT or JIT compiler sees won't be strongly typed.

Most code I write in Python could be given static types too. That doesn't make Python a statically typed language.

And “strongly typed” doesn't really mean anything.

smallnamespace
> MLton has absolutely no problems inlining functions, even higher-order functions, at compile time

Of course, but how does it know which functions to inline? If you inline everything, then you will blow through your cache.

> What if it's a virtual method call that's known to be overridden in several places? You can't inline it, even if it's in the middle of a hot loop.

That's not true--a JIT could optimistically replace it with a concrete realization.

> Runtime information is always anecdotal, specific to one particular run of a program, so...

That's a benefit. No matter what AOT compiled code you have, it is possible to speed up execution if you know what code paths you will take.

> ... for every “optimization” a JIT can perform, there will always exist a program for which the “optimization” will have to be rolled back after it has already been performed, because it turned out to be unsound.

Yes, but so what? As long as it improves performance in the average case, and the worst case is bounded, then that is a net win. You can equally well write deliberately obfuscated code that an AOT compiler has trouble with.

> Language implementations must work correctly whether you write idiomatic or unidiomatic code.

The implementation is correct. Only reflection is slow. If you don't want that, don't write reflection.

catnaroek
> Of course, but how does it know which functions to inline?

Small functions and higher-order functions are the most natural candidates. (The two categories greatly overlap in most cases.)

> That's not true--a JIT could optimistically replace with a concrete realization.

You'd have to roll back an unsound optimization in the middle of a hot loop. I'm pretty sure that's not what you want.

> it is possible to speed up execution if you know what code paths you will take.

That knowledge can be encoded statically in many cases, if only you used the right languages.

> As long as it improves performance in the average case, and the worst case is bounded, then that is a net win.

This is only the case when your program wasn't close to optimal to begin with.

> You can equally well write deliberately obfuscated code that an AOT compiler has trouble with.

Yes, but languages amenable to static analysis will actively get in your way if you try to write such obfuscated code. The static analysis either tells you that your program is gibberish, or outputs nonsensical gibberish of its own. So the path of least resistance is to write code that the static analysis knows how to optimize. Which is not the case in dynamic languages (including pseudo-static ones like Java).

> Only reflection is slow. If you don't want that, don't write reflection.

So, basically, you're telling me to ditch Java's entire library and framework ecosystem?

smallnamespace
> Small functions and higher-order functions are the most natural candidates. (The two categories greatly overlap in most cases.)

But then you're just guessing. Isn't that also inelegant?

Let's play devil's advocate: how do you decide the cutoff on function size for inlining? Well, you would profile a bunch of programs with various cutoffs... now all you have is a heuristic, and MLton will inline some functions that it shouldn't, and it will fail to inline other functions that it should.

It will do worse than a JIT at this, because the JIT has more information.

> That knowledge can be encoded statically in many cases, if only you used the right languages.

Sure, but you won't ever succeed in encoding all of it, which is why runtime techniques can have a place.

> This is only the case when your program wasn't close to optimal to begin with.

That's simply untrue, and you can prove that formally -- given a machine M and a program P that produces outputs on a set of inputs I, it is always possible to come up with a program P' that produces those outputs with fewer steps on some subset of I, in return for taking more steps on the rest of I (except in the trivial case where the running time is completely independent of input).

You can view a JIT as iteratively replacing P with P' after it sees which inputs I are most common, and this is true no matter what M, P, or I are. In particular, there exists a version of P' that is faster than the statically optimized version of P on your program's input.
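A toy instance of the P vs. P' argument (the "profile" here is assumed, not measured): after observing that one input dominates, P' adds a fast path for it, trading one extra comparison on every other input while producing the same outputs everywhere:

```python
# P answers membership queries by scanning a list; P' specializes for
# the hot input a profile revealed. P' takes fewer steps on the common
# input and one extra comparison on the rest - same answers on all of I.

HAYSTACK = list(range(1000))

def p(x):
    return x in HAYSTACK             # generic: linear scan

COMMON = 999                         # hypothetical value learned from a runtime profile

def p_prime(x):
    if x == COMMON:                  # fast path for the observed hot input
        return True
    return x in HAYSTACK             # otherwise fall back to P's behavior
```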

> Yes, but languages amenable to static analysis will actively get in your way if you try to write such obfuscated code.

I don't see how that isn't equally applicable to writing code to fool your JIT.

> So. basically, you're telling me to ditch Java's entire library and framework ecosystem?

Framework code doesn't generally run inside your inner loops, so I don't see how that should affect either your AOT or JIT compiler much.

catnaroek
> In particular, there exists a version of P' that is faster than the statically optimized version of P on your program's input.

Sure, but I'm interested in what the program does on all meaningful inputs, not a specific one. Otherwise, I'd just precompute the answer and hardcode it.

> I don't see how that isn't equally applicable to writing code to fool your JIT.

AOT compilers are supposed to provide feedback to the programmer about what the program means (e.g., inferred types, type errors). JIT compilers are not.

> I don't really understand why optimistic heuristics bother you so much.

Um, because they can be wrong, and then you need to fix errors, which makes the system more complex?

> Seems like you just have an aesthetic preference.

Yes, for simplicity, and for thinking before writing code.

smallnamespace
> Um, because they can be wrong, and then you need to fix errors, which makes the system more complex?

Yes, but I don't understand why compiler complexity concerns you, so long as the whole thing works. AOT compilers are also extremely complex, and are also full of heuristics.

You seem really hung up on the fact that an optimization can be rolled back at some point, but why should you care? JIT optimizations can be 'wrong' in the same way that caches can miss. Is it the right engineering solution to eliminate caching, over some belief that one should never be 'wrong' anywhere in a program, even though the eventual answer is always correct?

This might matter in system with real-time performance demands, but then you should be equally concerned about the garbage collector, for example.

catnaroek
> Yes, but I don't understand why compiler complexity concerns you, so long as the whole thing works.

Because I find it easier to trust simpler systems than complex ones.

> AOT compilers are also extremely complex, and are also full of heuristics.

Yep, those heuristics are annoying too. (But less so than the ones JIT compilers use, because at least they don't involve temporarily breaking my program.)

> unless you've profiled it and see this having a real adverse effect on overall performance?

How many times do I have to repeat that what annoys me is the excessive complexity?

gpderetta
>Let's play devil's advocate: how do you decide the cutoff on function size for inlining

AOT compiler writers have been tuning inline heuristics for more than 40 years. Sure, sometimes you have to help the compiler with annotations or PGO, but in the large majority of cases things just work.

In fact AOT can deal much better with the massive code explosion due to aggressive inlining than JIT compilers which have a very tight time budget for optimisations.
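Such a tuned heuristic can be sketched in a few lines (the thresholds here are invented): a static size cutoff, loosened when profile data marks the call site hot.

```python
# A toy inline-cost heuristic of the kind AOT compilers have tuned for
# decades: inline small callees, and raise the size limit when PGO says
# the call site is hot. The numbers are illustrative, not from any
# real compiler.

def should_inline(callee_size, call_site_hot, size_limit=20, hot_bonus=3):
    limit = size_limit * hot_bonus if call_site_hot else size_limit
    return callee_size <= limit
```

Real compilers weigh many more signals (call depth, register pressure, code-size budget), but the shape is the same: a static default, adjusted by profile data when available.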

mafribe

   trust a language that ...
Where's the problem?

JIT compilers work. Most high-level languages require compiler tricks for achieving performance, from Scala to Haskell to Prolog.

It's useful to distinguish between (1) having a clean, easy to understand semantics and (2) having a fast implementation. Use all the hackery in the world to get your language fast, as long as its abstract semantics is easy and canonical.

catnaroek
> Most high-level languages require compiler tricks for achieving performance from Scala to Haskell to Prolog.

To make things perfectly clear: I'm not against optimizations being performed automatically by compilers or runtime systems. What I'm against is unclean designs: deliberately performing an unsound optimization and then rolling it back is an unclean design.

> Use all the hackery in the world to get your language fast,

The language implementation is a program itself, and I don't have any good reasons to trust a hackish language implementation any more than I trust other hackish programs - that is, not at all.

> as long as its abstract semantics is easy and canonical.

Hah! This thread is about JavaScript.

mafribe

   is an unclean design.
It's not unsound. Otherwise you'd get incorrect results. You could say the design is wasteful, because you optimise and then throw away the optimisation. But it's hard to do better for some kinds of languages.

   trust a hackish language implementation 
I agree. And indeed JIT compilers are hard to get right. But in practice even JIT compilers are much higher quality than applications: ask yourself, how many of the bugs in your code turned out to be compiler bugs, vs how many were ultimately your mistakes?
catnaroek
> It's not unsound. Otherwise you'd get incorrect results. You could say it's wasteful, because you optimise and then throw away the optimisation.

The optimization is unsound. If it weren't, it wouldn't have to be rolled back occasionally.

If you're talking about the combination of the optimization and the rollback mechanism, it's not unsound, but it's inelegant. A runtime system designed this way only understands your program in a statistical sense (based on concrete execution profiles, which may vary from one run to another), never with the full certainty that static analyses (type checking, abstract interpretation) can give you.

> how many of the bugs in your code turned out to be compiler bugs, vs how many were ultimately your mistakes?

Of course, most were my mistakes. But the very reason why those bugs made it into the final executable is the lack of powerful static analyses in the first place. Curiously enough, when I use languages that make static analyses possible, I write programs with fewer bugs and they perform better without relying on fancy runtime system tricks.

---

Sorry, can't reply to you guys, because “I'm submitting too fast”. So my replies go here:

@mafribe:

> That's an orthogonal issue.

It's not. Static analyses gather valuable information that can be used to emit efficient code.

> More powerful static analysis is also more time-consuming.

So perform it ahead of time!

> One of the design goals of Javascript JITs is to make web-pages as responsive as possible. That rules out complicated static analysis.

Of course, a browser can't spend much time statically analyzing JavaScript programs, but programs can be statically analyzed (gasp!) before they're deployed.

---

@smallnamespace:

> Why does that matter to anyone?

Because this implementation technique is unnecessarily complex, and a far simpler alternative exists: Know beforehand what your program has to do. Think before you write code.

> Java and C# are both statically, strongly typed languages where JITs are the dominant implementation.

Their type systems can be easily subverted, so they're not “strongly typed” in my book.

---

@smallnamespace: Oops, sorry, I accidentally swapped my two replies to you: this one and https://news.ycombinator.com/item?id=12481956 .

mafribe

   The optimization is unsound. 
The JIT compiler is sound w.r.t. the source language's semantics. That's the only thing that matters for the programmer.

   but it's inelegant.
Elegance is in the eye of the beholder. I was blown away when I first encountered JIT compilers.

  lack of powerful static analyses
That's an orthogonal issue. More powerful static analysis is also more time-consuming. One of the design goals of Javascript JITs is to make web-pages as responsive as possible. That rules out complicated static analysis.
smallnamespace
> it's not unsound, but it's inelegant. A runtime system designed this way only understands your program in a statistical sense (based on concrete execution profiles, which may vary from one run to another), never with the full certainty that static analyses (type checking, abstract interpretation) can give you.

This is a false dichotomy. You can always build a JIT that uses runtime statistics to speed things up, even in languages that are quite amenable to static analysis, for the simple reason that type systems cannot capture all relevant runtime context. Java is a good example here, since it's strongly statically typed.

By your reckoning, all optimistic heuristics that CPUs do like branch prediction and prefetching are also similarly 'inelegant', because they can be wrong and require rolling back.

Optimistic heuristics have a long history in computer science, and IMO it seems strange to single one particular use case as being particularly evil.

raquo
No one likes that, but

1) HTML/CSS/JS is still the only truly cross-platform (mac/windows/linux/web/ios/android/etc.) UI platform/ecosystem in 2016, and it will probably stay that way in the foreseeable future because OS makers love their walled gardens.

2) An app's memory efficiency is not a top 10 priority of an average solo-or-small-team developer's concerns, because that's not what most users pay for.

Electron/NW.js apps will only become more common, so I hope things will improve with asm.js/webassembly/etc.

Florin_Andrei
It would be interesting to see the evolution of memory usage for the popular platform du jour from the earliest decades until today. My uneducated guess says it will be an exponential.
gr3yh47
Python would like a word with you about point 1)
woah
I use a network simulation software inside of lubuntu inside of a VirtualBox inside my Mac. VirtualBox is a stream of headaches, and you suffer a big performance hit. What I wouldn't give for the software to serve an html/css/js interface so that I could run it with docker or natively instead.
vetinari
Make sure that the virtual machine has the "paravirtualization interface: kvm" setting for Linux guests, that the guest tools are installed and, if you really use the network, adapter type virtio-net doesn't hurt. And also, that you have enough RAM. Then, virtual machines in VirtualBox are fine.
woah
Thanks! Good samaritan :)
raquo
Honest question – how do you build cross-platform application UI in python?
mixedCase
Kivy.
eyko
I'm not familiar with the Python ecosystem but I imagine there are bindings for GTK, Qt, or other UI toolkits. The same applies to most (popular) languages. The claim that HTML/CSS/JS is the "only" truly cross platform stack is simply not true.

The main hurdle that other languages faced was not having a native UI. For example, GTK on OSX or Windows did not feel native. Key bindings were often not native. Similar story with Java (was it Swing?).

The HTML/JS/CSS combo has exactly the same issue (it's not comparable to a native GUI - Cocoa or whatever it is).

falcolas
> The main hurdle that other languages faced was not having a native UI.

Electron faces this same hurdle, but with a twist: it just doesn't care. It picks a rendering target (the web) and simply uses it everywhere. Perhaps that's the approach Qt and others need to take as well: stop trying to match Apple's UI on Apple, and Windows' UI on Windows.

mgkimsal
> stop trying to match Apple's UI on Apple, and Windows' UI on Windows.

Great idea, but make sure it at least looks/feels good.

I remember Swing stuff from the late 90s and... IMO the primary issue wasn't so much that "it doesn't look like Windows" but that ... it was a very poor experience.

Copy/paste/keys - yeah, that's an annoyance, but if the UI is clean, friendly, easy to understand and be productive on, people can look past the differences of the native host OS. Swing (and GTK and others) really don't provide a 'better' UX (imo).

corv
Python on iOS?
nodja
It's not pretty but: https://kivy.org/docs/guide/packaging-ios.html
corv
Interesting
st3v3r
Python can work on desktop, but its solutions for mobile are iffy at best.
reitanqild
Java as well for basically anything but iOS.

C# might even cover that now.

__derek__
Xamarin, I reckon?[1]

[1]: https://www.xamarin.com/

baldfat
Well I have yet to see Python make a decent cross platform application on all of those platforms especially iOS and Android.

I love Python but people overstate its benefits and effectiveness.

Loic
Python is wonderful for business logic and scientific computations. HTML/JavaScript is wonderful to make beautiful interactive user interfaces.

So at the moment, I am using PyQt/PySide and load my HTML/JavaScript/ReactJS GUI in a webview with a simple bridge object between my Python code and the JavaScript code.

As soon as I need native interactions with the file system, etc., I am using Python, which is robust and proven (and I am used to it). For example, if I need a dialog to save a file, I just use QFileDialog.

At the end, because of a clear separation between the GUI and the computation/system interactions with the bridge object, I just need a different bridge object to have everything running fully online.

One point is of course that I am not developing for phones and tablets, I package everything with pyinstaller for the good old desktop users.
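That bridge-object separation can be sketched as follows. All class and method names are invented; the real desktop bridge would wrap QFileDialog and the file system rather than a dict, and the web bridge would POST to a backend:

```python
# The GUI layer only ever talks to one Bridge interface; swapping the
# bridge moves the same HTML/JS GUI between a desktop build and a
# fully online deployment.

class Bridge:
    def save_file(self, name, data):
        raise NotImplementedError

class DesktopBridge(Bridge):
    """Desktop build: stands in for QFileDialog + local disk writes."""
    def __init__(self, fs):
        self.fs = fs                  # dict standing in for the file system
    def save_file(self, name, data):
        self.fs[name] = data
        return f"saved {name} locally"

class WebBridge(Bridge):
    """Online build: stands in for an HTTP call to the backend."""
    def __init__(self, api):
        self.api = api                # dict standing in for the server
    def save_file(self, name, data):
        self.api[name] = data
        return f"uploaded {name}"

def gui_save(bridge, name, data):
    # GUI code never knows which deployment it is running in.
    return bridge.save_file(name, data)
```

Because the GUI only sees the Bridge interface, moving from the pyinstaller desktop build to an online version is a matter of constructing a different bridge object, as described above.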

adamnemecek
There's also Qt but they have been moving in the web direction too.
lostInTheWoods3
And don't forget JavaFX
adamnemecek
Do people use that?
fdgdasfadsf
On point (1) you've kind of loaded the comparison by including web in that list...
raquo
It's a common business scenario: start out as a web app, then as business picks up let's start providing native apps.

Or the other way: start out with a native app on the platform we most care about + a web app for "access from anywhere".

Sep 01, 2016 · rybit on Hosting My Static Site
+1 on screw the absolutely necessary caching. :)

I couldn't agree more that too much tooling can abstract what is usually simple - just serve some HTML. I think that the whole API economy and saas/paas stuff really has to be evaluated carefully. You have business considerations around lock-in, time to integrate vs time to build your own, etc. I think that they work really well when you're building something simple, but there is a range of site sizes where they are more of a hindrance. The decision to use a service should be about what it gives you, not because it is cool.

Aside: I have totally been that engineer that has made something "clever". I am sure there are other engineers that curse me for what I thought was a great tool b/c I looked at the site for 0.1 seconds (sorry!).

I really wanted to address the talk about static.

Let's take the instance of a blog (like any of the heroku/rails tutorials out there). Yes, you must have a canonical place for the copy to live. Be it in a db or flat files on disk or in your git repo. But you don't need to have the actual request go to the origin for that info and then jam it through some jinja/unicorn/etc template. Just to render a silly article to the end user. When you write that article, you know what that page is going to look like, _why dynamically generate it_? This is the way that static can work, generate all the versions of the content and rely on JS to do magic on the frontend (https://www.destroyallsoftware.com/talks/the-birth-and-death...). Removing the whole call back to the origin db for what is essentially static content. This obviously is going to be faster than a DB query + template render + network traffic, as well as more secure. It is an http GET ~ hard exploit vector.
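That publish-time rendering can be sketched in a few lines (the template and articles here are invented): every article is rendered to an HTML file once, at build time, so serving it later is a plain file GET with no database query or per-request template render.

```python
# Build-time static site generation: render each article through the
# template once and write the result to disk.

import pathlib

TEMPLATE = "<html><body><h1>{title}</h1><p>{body}</p></body></html>"

ARTICLES = [
    {"slug": "hello", "title": "Hello", "body": "First post."},
    {"slug": "static", "title": "Static", "body": "No server needed."},
]

def build_site(out_dir):
    out = pathlib.Path(out_dir)
    for a in ARTICLES:
        page = TEMPLATE.format(title=a["title"], body=a["body"])
        (out / f"{a['slug']}.html").write_text(page)
    return sorted(p.name for p in out.iterdir())
```

Point a web server (or CDN) at the output directory and the "render" work is already done for every request.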

Now does this extend into the arena of apps (React, Angular, the newest fanciest JS framework)? The actual assets are also static, no? They should be served exactly the same as the HTML we have. Then it is up to the JS to query whatever service/API you want and automagically generate some HTML.

The big thing is that services like wordpress/drupal/rails have made it very easy for people to build sites in a classic LAMP stack, but that is kinda flawed in a lot of ways. Wordpress's plugin system essentially lets you remotely run your code on their servers. That is a dangerous game to play. All to do something that doesn't even need a server in the first place. Why risk it when you don't need to? And you'd get some nice improvements if you don't. People shouldn't even know what a LAMP stack is to make their business site.

Now is this approach right for every site? Nopezzzz. I don't believe in silver bullets, but there are a lot of sites that fit this mold. And it is a different approach to building your site out.

Either way - sorry to hear about Capistrano. Shell scripts ftw (though I have some that are terrible out there too).

This (funny and accessible, don't worry) talk by Gary Bernhardt is what helped me understand asm.js for the first time: https://www.destroyallsoftware.com/talks/the-birth-and-death...
Aug 04, 2016 · Analemma_ on Visual Studio Code 1.4
The future as foretold by "The Birth and Death of JavaScript" (https://www.destroyallsoftware.com/talks/the-birth-and-death...) gets closer every day :)
sdegutis
I honestly wouldn't be surprised if someone like MS announces a new open source "JS to efficient native machine code" compiler one of these days.
mythz
MS Chakra JS VM is already open source: https://github.com/Microsoft/ChakraCore

Like all high-performance JS VMs, their JITs already emit efficient machine code. If Google, Apple or MS could get JS running any faster, they would've.

Ever watched "The Birth & Death of JavaScript" ?

https://www.destroyallsoftware.com/talks/the-birth-and-death...

It's a future prediction of how "javascript lost but programmers won".

Just wait until after the nuclear world war in the early 2020s.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Jul 23, 2016 · nhaliday on Rust: The New LLVM
Speaking of which, https://www.destroyallsoftware.com/talks/the-birth-and-death...
programminggeek
Great talk, Gary is a cool dude.
May 04, 2016 · tambourine_man on Node OS
THE BIRTH & DEATH OF JAVASCRIPT

https://www.destroyallsoftware.com/talks/the-birth-and-death...

weitzj
Great Talk to watch
We should just let V8 run in the kernel and do away with system calls. https://www.destroyallsoftware.com/talks/the-birth-and-death...
Someone else linked to a talk that mentioned removing all the layers in some theoretical architecture called METAL (this is an old talk), basically running asm.js (again, the talk is old) directly through the kernel and even removing overhead that kernels need for making native code safe (such as the Memory Management Unit), and as a result it would run faster than normal native code.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

The major thing to be gained from all this then is software that can run fast but not have to be recompiled for all the different systems and hardware.

Your comment reminded me of this highly entertaining talk - https://www.destroyallsoftware.com/talks/the-birth-and-death...

Although tongue in cheek, I think it gives some food for thought. I feel like WebAssembly is to asm.js what the modern JS profession is to old follow-your-cursor effect on webpages - it becomes something to take seriously and use, and having done a bunch of porting things with Emscripten the idea of a browser within a browser doesn't sound as crazy as it used to!

jerf
To be honest, WebAssembly isn't really javascript anymore. asm.js was, albeit only sorta-kinda-just-barely (but in an important way), but WebAssembly isn't. There's a reasonable case to be made that in 20 years "everything" will be WebAssembly, but we won't be calling it Javascript, thinking of it like Javascript, or using it like Javascript.

In the long term, this is the death knell for Javascript-as-the-only-choice. Javascript will live on, but when left to fend for itself on its own merits, it's just another 1990s-style dynamic scripting language with little to particularly recommend it over all the other 1990s-style dynamic scripting languages.

But Javascript programmers need not fear this... it will be a very long, gradual transition. You'll have abundant time to make adjustments if you need to, and should you not want to, there will still be Javascript jobs for a very long time.

Klathmon
You act like JavaScript's only upside is the fact that it's required in the browser.

IME the opposite is true. I'm seeing companies flock to it outside of browser contexts in areas where "code reuse" or "isomorphic/universal" style programs aren't even possible.

jerf
It's not that it's the only upside. It's that once out of the browser, it isn't really a standout language. For instance, if you're going to have a "fair fight" out there, one can't help but notice that Python already has everything that we're standing around waiting for in ES6, plus the next couple of iterations.

And that's just Python. You should also check into Perl, Ruby, Lua, and PHP, and that's without straying even a little outside of the "1990s dynamic scripting language" field, to say nothing of what you can find if you leave that behind.

It just isn't that impressive of a language once you remove the browser prop. It isn't necessarily bad, or at least no worse than some other popular languages as well, but there's nothing uniquely good about it in the greater language landscape.

To be honest, anyone who thinks that Javascript does have some sort of unique advantage needs to get out more and learn a few more languages. Even Python, which you'll find goes very quickly if you already know JS. Javascript is very, very not special. Again, since people seem to confuse these things, that does not make it bad, but it's very not special. Very boring, middle-of-the-road scripting language that is, if anything, well behind the other ones in features because of its multi-headed, no-leader-in-practice development model.

Klathmon
But JS isn't just a "worse python" either. I've gotten around when it comes to languages, from business-basic to c++ to go to python, php, ruby, js, lua, and lisp. Having spent a non-trivial amount of time in each, JS has by far one of the best ecosystems i've ever seen.

See, I hate talking about languages because it's hard to define what a language even is.

Is it purely the syntax? Is it syntax+standard library? Or is it the whole set of syntax+libraries+ecosystem+idioms?

From a purely syntax point of view, js is lacking some things and, while they are getting fixed, it's taking longer than most would like. And I agree that in this aspect js is currently "mediocre" at best.

From a "whole ecosystem" point of view, js is wonderful. It's fast, secure enough to run arbitrary code from anyone in a browser, has a stupidly huge set of libraries, an "idiomatic" style which works very well for some problems, is almost literally everywhere and on everything, and has multiple competing implementations that help drive performance and reduce bugs.

Yeah, it's got its quirks (and in JS's case, a lot of them), but every language its age and older does.

Now if there was some way to magically take all of the "other" parts from js and apply them to another language, you'd have an overnight success, but the fact is that the language syntax is such a small part of what a language truly is.

You should check out runtime.js [1] if you haven't. Instead of duktape it uses v8. I'd like to see an equivalent project using spider monkey so that C/C++ code could be compiled into efficient JS. Check out this speech if you haven't already: The Birth & Death of JavaScript [2]

[1] https://github.com/runtimejs/runtime [2] https://www.destroyallsoftware.com/talks/the-birth-and-death...

Cool. "We took a dynamic language and made it fast(er again)".

And you're gonna apply that to the darkest corners of ES6? Whee, language theory nerd paradise!

I can't wait for the compilers targeting that using all sorts of really odd idioms that in effect work like fuzz testing.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Jan 13, 2016 · lumpypua on Elm in the Real World
Any commentary on "The Birth and Death of Javascript"?

https://www.destroyallsoftware.com/talks/the-birth-and-death...

As far as I can tell, Javascript the runtime is a juggernaut that there's no stopping. If a platform gets popular, tooling fixes itself.

skybrian
It does not fix itself. It gets fixed through a lot of hard work by many people who shouldn't be hidden behind the curtain.

But you are right that if a platform is popular enough, it will usually attract people that will do this work.

pjmlp
Personally, even though I also do have quite some webdev experience, I am wishing that mobile native development wins.

But on HN that is a sure way to be downvoted.

KingMob
I'm personally hoping the future holds some way to blend the openness/interoperability of the web with the performance of native apps. Maybe React Native, maybe web assembly, etc.

I hope that pure mobile development doesn't win, because the mobile app ecosystems are way more controlled than desktops/servers, and if everything went Android/iOS, it would be a major loss for freedom.

This whole conversation just makes me smile at how much truth and more is coming out of Gary Bernhardt's The Birth and Death of JavaScript[1].

https://www.destroyallsoftware.com/talks/the-birth-and-death...

This library reminds me of some of the "future" introduced in this (satirical, yet insightful) talk: https://www.destroyallsoftware.com/talks/the-birth-and-death... Discussion: https://news.ycombinator.com/item?id=7605687

This is extremely cool, by the way.

Conceptually, this is a reasonable line to draw in the sand. Minified Javascript should be thought of as equivalent to assembly language or Java bytecode[1], and if the policy is to not ship compiled binaries without source available, minified JS should fall into that category.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death... is funny, but the honest truth is that it's likely the future of web technologies, as Javascript is so deeply entrenched in the fiber of the web standards and the practical implementation of the browsers.
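To make the "minified JS is compiled output" analogy concrete, here is a hypothetical before/after sketch: the function is real and runnable, while the minified line is illustrative of what a minifier might emit, not actual tool output.

```javascript
// Readable source: names and structure carry the meaning for a reader.
function averageOfPositive(values) {
  const positive = values.filter(v => v > 0);
  return positive.reduce((sum, v) => sum + v, 0) / positive.length;
}

// A minifier might emit something like (hypothetical output):
// function a(b){const c=b.filter(d=>d>0);return c.reduce((e,f)=>e+f,0)/c.length}

console.log(averageOfPositive([2, -1, 4])); // 3
```

Both forms run identically, but reading the second is closer to reading disassembly, which is the policy line being drawn above.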

Jun 07, 2015 · wz1000 on HTML is done
Isn't this problem solved by having HTML/JS/CSS as a compile target, which is what we are heading toward now, with languages and technologies like CoffeeScript, PureScript, Elm, ClojureScript, ghcjs, emscripten, etc.?

Even on the backend, in the end you have to run native machine instructions. HTML/JS/CSS substitute for that on the front end. Of course, x86_64 may be a better compile target, but the advantage of the existing languages over any new system is that there has been a lot of work that has already gone into making them fast, backwards compatibility, all browsers already implement them, and there are a lot of things that can target them now.

See also: https://www.destroyallsoftware.com/talks/the-birth-and-death...

afarrell
> HTML/JS/CSS as a compile target

That only works if your abstractions don't leak and your libraries are stable enough so that you can avoid having to debug on the lower level anyway.

crimsonalucard
Personally to me that's "ugh." HTML and CSS as a compile target? What's the lowest level primitive in html? A div? A span? ew. It should be compiled to something that has access to primitives such as lines, polygons or pixels.

Compiling to js is ok as that language is as close to the metal as you can get on a vm anyway. So long as performance isn't affected then it's really no different to the developer. Still the platform would be more elegant if it didn't compile to some intermediary high level language.

vonkow
Access to primitives such as lines, polygons or pixels, you say? Canvas, SVG, and WebGL fit that bill.
crimsonalucard
Sure. And I know about these. The original poster was talking about languages with HTML+CSS+JAVASCRIPT as compile targets, not canvas, svg or webgl. I'm simply addressing his comment.

Either way, WebGL/canvas operate as child elements in an html page, you still need to use javascript, and SVG isn't GPU accelerated. It's still, overall, an ugly mess. But definitely, with improvements, canvas, svg and webgl are all candidates for my aforementioned "next steps."

Houshalter
How is it that much different than using OS APIs to do common tasks like text, windows, buttons, scroll bars, etc? If you really want to do pixel level stuff, there is canvases and images and SVGs.
crimsonalucard
How would I replace html in the browser? Create a new rendering api in canvas? Can it be done? Sure. Will it be pretty? Not so sure.

Imagine your operating system can only compile one single language: Perl. And with perl the only way you can render anything on the screen is with a QT api and QT UI primitives.

Programmers can still do anything within this ecosystem. Technically you can have perl as a compile target for any other language. Let's also pretend that QT has this little UI element called canvas with an API allowing you pixel-level control.

While technically you could do anything in a platform like the one I described above, I'm sure you can easily see why it's still bad.

pron
You need HTML for search.
_RPM
> Compiling to js is ok as that language is as close to the metal as you can get on a vm anyway

JavaScript is as low as the metal you can get in a Virtual Machine? hmm...

reagency
asm.JS carries you rather close to bare metal.
crimsonalucard
Of course you can go lower, but there will be sacrifices. Brendan Eich actually addressed this same topic in response to some other thread I started a while back. I quote him below:

"Apart from syntax wins, you can't get much lower-level semantically and keep both safety and linear-time verifiability. Java bytecode with unrestricted goto and type confusion at join points makes for O(n^4) verification complexity. asm.js type checking is linear.

New and more concise syntax may come, but it's not a priority (gzip helps a lot), and doing it early makes two problem-kids to feed (JS as source language; new syntax for asm.js), which not only costs more but can make for divergence and can overconstrain either child. (This bit Java, pretty badly.)"
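Eich's linear-verifiability point rests on asm.js's explicit type coercions (`x|0` for int32, `+x` for double). A minimal sketch of the style, runnable as ordinary JS; a real asm.js module would also accept `foreign` and `heap` parameters:

```javascript
// asm.js-style module: the "use asm" pragma plus per-expression type
// coercions are what make validation a single linear pass.
function AsmModule(stdlib) {
  "use asm";
  var imul = stdlib.Math.imul;
  function square(x) {
    x = x | 0;             // coerce parameter to int32
    return imul(x, x) | 0; // int32 multiply, coerced result
  }
  return { square: square };
}

var mod = AsmModule({ Math: Math });
console.log(mod.square(7)); // 49
```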

Actually we should abandon the ring concept all together for more promising security models.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

frik
The suggested solution is to have process isolation implemented in software - namely asm.js-enabled JavaScript virtual machines embedded in a Linux kernel - which saves you from needing hardware isolation, reducing overhead. Gary calls this idea "METAL".

I found few resources about the project, but there is a discussion on Reddit: http://www.reddit.com/r/compsci/comments/25w7vt/javascript_b...

And we had the discussion on HN too: https://news.ycombinator.com/item?id=7605687

Nevertheless, an interesting topic that doesn't deserve the downvote my parent got.

Have you seen Gary Bernhardt's THE BIRTH & DEATH OF JAVASCRIPT?

https://www.destroyallsoftware.com/talks/the-birth-and-death...

d2xdy2
I saw this a few months ago, and it never gets any less insane (in a good way) to watch. Just following Atwood's Law, I suppose.
Apr 21, 2015 · derefr on Going “Write-Only”
Gary Bernhardt's "The Birth and Death of Javascript"[1] is a rebuttal to this, I think.

Effectively, imagine that a compiler, put in static-compilation mode, would link everything required to run a piece of code (the relevant windowing toolkit, all the system libraries, copies of any OS binaries the code spawns, copies of any OS binaries the code sends messages to over an IPC bus, etc.) into something resembling a Docker container, or a Unikernel VM image. Imagine, additionally, that all this then gets translated into some standard bytecode for a high-level abstract platform target that has things like graphics and networking primitives—say, asm.js. Now everything can live as long as something still executes that bytecode.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

steveklabnik
This will be really cool in the future, someday. But it's a sci-fi like talk for a reason.

(and funny enough, when I saw him give this talk, the above quoted situation is exactly what I thought of...)

derefr
Maybe it's unrealizable at the moment for x86 code. On the other hand, for random digraphs of arcade-game microcontrollers, JSMESS is doing pretty well at achieving this. :)
Probably not. While it wasn't intended to be so, JavaScript has become the lingua franca of web browsers and is therefore heading towards a future of being a compilation target. If your program can't compile to JavaScript, it doesn't really run on browsers (in much the same sense as "If your program can't be compiled to x86 assembly, it doesn't really run on computers").

https://www.destroyallsoftware.com/talks/the-birth-and-death...

libria
> heading towards a future of being a compilation target

Having to trace/debug compiled javascript from another language is a horrific future. In your analogy, C programmers would have to debug assembly.

I agree it's the most probable future, but let no one call it innovation when it's simply resignation from lack of options.

bad_user
You haven't heard of source maps then.
TazeTSchnitzel
"Having to trace/debug compiled x86 machine code from another language is a horrific future."
wtetzner
There's no reason Javascript can't get better at being the assembly language of the web, including making debugging of other languages easier.
ajkjk
It seems to me that we should really be looking for browsers to compile (or at least appear to compile, even if they skip this as an optimization) Javascript into a lower-level 'assembly' language, so that Dart can be made to compile to the same form. A LLVM for browsers, basically.

JS is a terrible assembly language. But the work that's been done on making it passable for that purpose is remarkable, so we're getting by with it.

Touche
You have to ask yourself if it's easier to get all browser vendors to agree on an intermediate language or to get them to agree on new apis that makes javascript better for that purpose.
WorldWideWayne
Alternatively, make a new type of browser that completely replaces Javascript, make it popular and leave the old world behind.
ajkjk
You'll never get adoption for a new browser that can't parse existing websites, unfortunately. There's no 'clean break' option.
WorldWideWayne
People adopt new things all the time. Look at what happened when smart phones came long - everybody started making apps for them. I think people on smart phones spend more time in apps than they do on the web.

Anyway, the new browser could totally embed another browser engine to show legacy web pages.

fixermark
New browser that fully supports existing technologies is probably the best option.

Similarly, there's nothing technically stopping someone from building a new scripting core in the browser itself and then implementing the JavaScript engine atop that (other than the need to support a brand-new untested framework in an environment of high-performance already-existing JavaScript engines, of course).

marcosdumay
It's easier to make them agree on an intermediate language. Mostly because one can make such VM as a BSD licensed plugin.

Just make the DOM available as some virtualized "system calls" and you'll avoid the fate of Java and Flash.

fixermark
What's a plugin?

Is that one of those things that "nobody" willingly installs in their browser unless they absolutely have to because much of the market doesn't even understand what they are and the ones that do still want to avoid them because they decrease browser stability?

marcosdumay
Whatever it is, somehow most people still have Flash installed on their computers, and a lot of them have Java.
fixermark
Those are two of the exceptions to the rule.

Quite literally; there is a section of the Mozilla browser codebase that actively seeks out those plugins (along with QuickTime, Windows Media Player, and Acrobat Reader) because users who lack them experience a "broken web" (http://lxr.mozilla.org/mozilla-central/source/dom/plugins/ba...).

I'm not saying becoming one of those plugins is impossible---clearly, it's been done more than zero times. I'm saying that I really wouldn't base any business decisions around the assumption that it will happen for your plugin.

Mar 25, 2015 · Buge on Fear of Apple
> a web-based web browser

Reminds me of this: https://www.destroyallsoftware.com/talks/the-birth-and-death...

Also, https://www.destroyallsoftware.com/talks/the-birth-and-death...
jerf
I actually acknowledged that in the version of this I posted 4 days ago: https://news.ycombinator.com/item?id=9071064

However, he has JS hanging on for longer than I bet on... once asm.js gets a good DOM binding I expect the explosion of language diversity to take about two years, tops, and for it to rapidly become clear that JS is now just another way of accessing the DOM. I think there's more pressure built up there than people realize, because right now there's no point in thinking about it, but once it's possible, kablooie. Node's value proposition, IMHO, is in some sense correct, but backwards; it's not that we want to write in Javascript on the server, it's that we want "client language = server language"... and once there's no longer a technical handcuff pinning the client side of that equation to Javascript, it will not take that long for it to no longer be Javascript. It is not an impressive language, even within its own 1990s-style dynamic language niche.

(I think this is not because it's "bad", but because it has been developed in this really terrible multiple-vendors-that-actively-don't-want-to-cooperate way for most of its lifetime. It's gotten past that, I think, but during those decades all the other scripting languages were marching right along. None of the other languages could have survived such a process and gotten to where they are today, either.)

jamii
I wonder how much React-style UI libraries can make up for the poor DOM access. Do all the calculation in asm.js and just dump out the diff for the plain js dom updater to deal with.
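A toy sketch of the split jamii describes: the diff is pure computation that could run anywhere (asm.js, a worker), and only the resulting patch list is handed to a thin JS layer that touches the real DOM. All names here are hypothetical.

```javascript
// Compare two virtual nodes and emit a flat list of patch operations.
// A real implementation would recurse over children; this sketch only
// handles tag replacement and text changes.
function diff(oldNode, newNode, path = "root", patches = []) {
  if (oldNode.tag !== newNode.tag) {
    patches.push({ op: "replace", path, node: newNode });
  } else if (oldNode.text !== newNode.text) {
    patches.push({ op: "setText", path, text: newNode.text });
  }
  return patches;
}

const patches = diff(
  { tag: "p", text: "hello" },
  { tag: "p", text: "world" }
);
console.log(patches); // [{ op: "setText", path: "root", text: "world" }]
```

The DOM updater then just replays `patches`, so the expensive part never needs DOM access at all.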
Touche
GC-ed languages are going to have to include the GC which I doubt can compete with JavaScript. No one wants to write CRUD apps and manually manage memory. I don't see asm.js being used outside of games.
wmf
Unless someone invents a language where non-GC memory management is easy and that language can compile to both the client and server...
iopq
Rust makes it not quite "easy" but at least it makes it automatic and not error-prone.
jerf
"GC-ed languages are going to have to include the GC which I doubt can compete with JavaScript."

There's no particular reason why not. It's all just bits and bytes in the end, and asm.js gives a pretty low-level view of the world. And if you're starting from a baseline of a language that can easily be 5-10x faster than browser-based JS you can afford a bit extra on the GC side.

Javascript isn't magic. It's just a language. It isn't even a particularly special one, once you ignore its browser support, and it certainly isn't one focused on performance (I stopped buying the "languages don't have performance characteristics" line a while ago). It gets to run the same assembly instructions everybody else does. It isn't as fast as a lot of people here suppose, and it isn't that hard to beat out its performance even now.
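A language compiled to asm.js manages its own memory inside a flat typed-array heap, which is what makes shipping a GC feasible in the first place. A toy bump allocator illustrates the "just bits and bytes" point; this is entirely hypothetical, not how any particular compiler does it.

```javascript
// A flat heap of 32-bit words, the kind of memory model asm.js exposes.
const HEAP = new Int32Array(1024);
let heapTop = 0;

// Bump allocation: hand out the next free offset and advance the top.
// A real GC would add headers, tracing, and reclamation on top of this.
function alloc(words) {
  const ptr = heapTop;
  heapTop += words;
  if (heapTop > HEAP.length) throw new Error("out of heap");
  return ptr;
}

const p = alloc(2); // a two-word "object"
HEAP[p] = 42;
HEAP[p + 1] = 7;
console.log(HEAP[p] + HEAP[p + 1]); // 49
```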

Touche
But why, what is the upside vs. just transpiling like the dozens of languages that already do?
jerf
Why are you asking as if it's some sort of theoretical question when asm.js is in hand, right now, and it performs wildly better than raw Javascript? Of course we'd rather compile to something that's faster than Javascript than compile to Javascript.

(Sorry, I can't condone the word "transpile". Usage of it just reveals someone who doesn't understand compilation technology and thinks there's somehow something "special" about compiling to one intermediate language ("javascript") vs. another ("assembler").)

I can't believe how many people seem to believe that Javascript is a C-speed level language, and downmod anyone who observes it's not. Well, it's still not. It's easy to see that it's not. It's not even close. If it were asm.js wouldn't exist. (I mean, if you're having trouble with my claim here, stop and think about that for a moment... if Javascript is so fast, why does asm.js even exist?)

teamonkey
Bear in mind that Unreal and Unity both have some form of internal garbage collection systems that are compiled to asm.js. In the case of Unity, C# is transpiled into asm.js code. You could write in potentially any GC language, it's just that the GC needs to be included.
"For example, will asm.js eventually take over traditional web development? Theoretically, you can compile any compiled language to asm.js, so you'll have a lot more choice for the language you want to use to create your webapps."

I've outlined this progression before, which seems obvious to me, but I haven't seen anyone else discuss it.

    1. Get asm.js into every browser.
    2 or 3. Observe that asm.js is very verbose, define a simple
            binary bytecode for it.
    3 or 2. Figure out how to get asm.js decent DOM access.
The last two can come in either order.

And the end result is the language-independent bytecode that so many people have asked for over the years. We just won't get there in one leap, it'll come in phases. We in fact won't be using Javascript for everything in 20 years [1], but those of you still around will be explaining to the young bucks why certain stupid quirks of their web browser's bytecode execution environment can be traced back to "Javascript", even when they're not using the increasingly-deprecated Javascript programming language.

[1]: https://www.destroyallsoftware.com/talks/the-birth-and-death...

ndesaulniers
Hey, you're good at this. ;)

It's funny to see that people so opposed to JS are too short sighted to see that asm.js could enable what they've wanted all along. Look at that!

cpeterso
asm.js is a cross-platform bytecode; it just happens to be ASCII and look like JS. :) Arguments against asm.js as a bytecode are mostly about aesthetics and "elegance".

http://mozakai.blogspot.com/2013/05/the-elusive-universal-we...

adamnemecek
I think that eventually we might ditch DOM and use WebGL or canvas or something instead of it, like on the desktop.
munificent
And throw accessibility out the window. Sorry visually-impaired people. The web of the future isn't for you. :(
tree_of_item
You can have accessibility without the DOM, and really the DOM is not such a great way to do this anyway. Just do things like write explicit audio-only interfaces.
tree_of_item
I can't reply to bzbarsky for some reason, but:

I assumed we were talking about vision impairment, because that's what the comment I replied to mentioned. Of course you can implement whatever else you want as well.

I question this "semantic DOM" idea: the trend has been towards filling the DOM with tons of crap in order to make applications, not documents. Do accessibility agents even work well on JavaScript heavy sites today?

Accessibility can and will be had without the DOM; while it is a concern, it shouldn't prevent things like WebGL + asm.js apps on the web.

SixSigma
> I can't reply to bzbarsky for some reason

there's a rate limit to stop threads exploding

bzbarsky
No idea why you couldn't reply to me, but....

My point is that visual impairment is not mutually exclusive with other impairment, even though people often assume it is, consciously or not. This is an extremely common failure mode, not just in this discussion.

And while of course you _can_ implement whatever else you want, in practice somehow almost no one ever does. Doubly so for the cases they don't think about or demographics they decide are too small to bother with.

How well accessibility agents work on JS heavy sites today really depends on the site. At one end of the spectrum, there are JS heavy sites that still use built-in form controls instead of inventing their own, have their text be text, and have their content in a reasonable order in the DOM tree. At the other end there are the people who are presenting their text as images (including canvas and webgl), building their own <select> equivalents, absolutely positioning things all over the place, etc. Those work a lot worse.

You are of course right that accessibility can be had without the DOM, but "webgl" is not going to be it either. Accessibility for desktop apps typically comes from using OS-framework provided controls that have accessibility built in as an OS service; desktop apps that work in low-level GL calls typically end up just as not-accessible as your typical webgl page is today. So whatever you want use instead of the DOM for accessibility purposes really will need to contain more high-level human-understandable information than which pixels are which colors or what the audio waveform is. At least until we develop good enough AI that it can translate between modalities on the fly.

amelius
Speaking of AI, is it really that hard to OCR the images? I'm no expert, but I was under the impression that this was a solved problem.
bzbarsky
That's a pretty narrow view of "accessibility". For example, you just assumed that your user can't see (or can't see very well?) but can hear.

Users who are deaf and blind? Out of luck. Users who are deaf and not blind but need your thing zoomed to a larger size? Maybe out of luck maybe not (depends on whether the WebGL app detects browser zoom and actively works to defeat it like some do). Users who are deaf and not particularly blind but happen to not be able to tell apart the colors you chose to use in your WebGL? Also out of luck.

What the DOM gives you is a semantic representation that the user can then have their user agent present to them in a way that works best for them. Reproducing that on top of WebGL or canvas really is quite a bit of effort if you really want to target all users and not just a favored few groups.

RussianCow
You use WebGL to create standard GUI applications on the desktop? WebGL and canvas are in no way replacements for the DOM.
runeks
Perhaps it's to make it backward compatible, so that, for example, the traditional DOM is implemented as a Javascript library that parses HTML and renders it onto a WebGL surface?
coldtea
>You use WebGL to create standard GUI applications on the desktop?

Increasingly, high-end apps do.

RussianCow
Graphically intensive apps do it for performance. But the tradeoff is you lose all the native UI controls that the OS gives you--form fields, text selection, animation, etc. So I don't think replacing the DOM with OpenGL or similar is a good solution for general-purpose apps.
adamnemecek
All of those would be implemented in a framework.
bzbarsky
Yes, they already are. That framework is called the DOM. People keep complaining about it and trying to come up with replacement frameworks that end up slower and less capable...

The DOM definitely has its problems, mind you. Investing some time in designing a proper replacement for it is worth it, as long as people understand going in that the project might well fail.

adamnemecek
Which framework has tried to replace the DOM?
comex
Flipboard's from the recent story? The resulting site breaks accessibility and the layout obviously has less capability, but it's certainly not slower.
adamnemecek
Huh, that's actually seeming not that dissimilar from what I had in mind.
bzbarsky
None have tried to replace all of it that I know of.

People have tried to replace things like text editing (with canvas-based editors), CSS layout of the DOM (with various JS solutions involving absolute positioning), native MathML support (MathJax; this one has of necessity done better than most, because of so many browsers not having native MathML support). There are a bunch of things out there that attempt to replace or augment the built-in form controls, with varying degrees of success.

adamnemecek
That's my point. Currently, you cannot really replace the DOM since that's kind of the extent of the exposed APIs.

None of the projects that you mention are really relevant to the discussion. I agree that they didn't change shit but it's precisely because they are still built on the DOM and cannot really go below that in the abstraction layer.

coldtea
>I agree that they didn't change shit but it's precisely because they are still built on the DOM and cannot really go below that in the abstraction layer.

There exist UIs done in Canvas and WebGL that are arguably below the DOM in the "abstraction layer" and don't need to use much more DOM nodes besides opening a canvas/gl panel...

(Most full screen 2D/3D web games fall into this, for starters, and those are in the tens of thousands. But there are also lotsa apps).

xxgreg
> None have tried to replace all of it that I know of.

https://github.com/domokit/mojo/tree/master/sky https://github.com/domokit/mojo/tree/master/sky/specs

coldtea
>Yes, they already are. That framework is called the DOM. People keep complaining about it and trying to come up with replacement frameworks that end up slower and less capable...

The ones I've seen are actually faster -- Flipboard for one gets to 60fps scrolling on mobile, IIRC. And of course all WebGL based interfaces on native mobile apps that re-implement parts of Cocoa Touch et al, are not that shabby either.

Sure, it doesn't have as much accessibility, but that's something that can be fixed in the future (and of course people needing more accessibility can always use more conservative alternatives).

adamnemecek
E.g. Mac OS X uses OpenGL to render GUI, I guess I should have made myself more clear.

> WebGL and canvas are in no way replacements for the DOM.

That's kind of debatable. If you have access to a fast graphics layer from the browser, you can build a DOM replacement of sorts. I think that famo.us works kind of like that.

RussianCow
To some degree, yes. You just have to be able to re-use system UI controls like fields. So you wouldn't be able to just use WebGL/canvas/whatever in place of the DOM, you'd need to come up with a new API.
adamnemecek
I know. I was thinking that there'd be something like Qt that would render the widgets using WebGL.
RussianCow
Until the web becomes the dominant operating system, I don't think that's reasonable because you'd have to implement an entire UI kit (with all UI components, behaviors, animations, etc) but can't guarantee that it will behave at all like the underlying OS. There's only so much you can re-create in the browser.
pavlov
It's true that OS X uses OpenGL for GUI compositing, but that's only the lowest level. Above, there's a very important piece of the GUI stack called Core Animation which provides layer compositing.

Core Animation is used by both the native GUI as well as the browser DOM. When you use layer-backed compositing on a web page (e.g. CSS 3D transforms), WebKit implements it with a Core Animation layer. So DOM-based rendering enjoys the same benefits of GPU-accelerated compositing as native apps -- although obviously with rather different semantics since HTML+CSS doesn't map directly to Core Animation.

If you implement your own GUI framework on top of WebGL or Canvas, you're not getting Core Animation compositing for free, so you need to replicate that functionality in your custom framework. (This applies equally to native apps: a WebGL app is equivalent to a Cocoa app that renders everything into a single OpenGL view, and a HTML Canvas app is equivalent to using a single CoreGraphics view.)

I don't think the WebGL/Canvas route makes sense for most apps other than games and highly visual 3D apps. You'll just spend a huge amount of time building your own implementations of all high-level functionality that is already provided by the OS and/or the browser: layer compositing, text layout, view autosizing, and so on. If you're doing standard GUIs, why go to all that trouble?

adamnemecek
I agree that it would be a lot of effort to pull off since you'd have to duplicate a lot of the standard OS features in the browser but if eventually the DOM becomes an even bigger bottleneck, it might be a viable solution.
RussianCow
> You'll just spend a huge amount of time building your own implementations of all high-level functionality that is already provided by the OS and/or the browser

Not only that, but you can't make a 100% guarantee that your implementation will look and work exactly the same as the native one on the underlying OS. For instance, I can re-create all the native Windows UI controls and re-implement all their behavior in exactly the same way, but what if the user has a custom theme installed? Everything breaks. (WPF has a similar problem.)

frik
You can use HTML-DOM and WebGL together (overlays, or render as texture).

The WebGL support could be improved in Internet Explorer.

Please vote: https://wpdev.uservoice.com/forums/257854-internet-explorer-...

amelius
Is it possible to mix them, while showing, for example, videos, which are clipped by paths or partly obscured by overlaying elements?
frik
I would say yes e.g. with CSS Regions and WebGL. The other way around you have to render the HTML and the video on textures.
ndesaulniers
Maybe, as the DOM becomes more loaded with more abstractions, people will start re-implementing abstractions the DOM already provides; just the subset of abstractions they want. Whether their implementation can beat the native code of the DOM, and the bandwidth concerns of reshipping the same logic is another story.
spyder
Yea, it's already happening with React and other frameworks which are using virtual DOM.
seanmcdirmid
Shared memory multi threading will still be a big barrier to porting over many native applications, like games. Unless asm.js fixes that?
AgentME
There are plans for that: https://bugzilla.mozilla.org/show_bug.cgi?id=933001
seanmcdirmid
Nice! I'll finally be able to implement Glitch for the web (glitch is like react but uses replay to shake out data races).
Fiahil
I share your vision. However, I think getting asm.js good DOM access is definitely coming as 2), because it's an easy way to get visual feedback from anything running in a browser.

Oh, and thanks for the very enjoyable link :)

woah
JS seems to actually be picking up a lot of speed outside the browser?
jerf
First, that's not particularly relevant to the question of what happens to the browser itself. Second, the field of "things you can run that are not Javascript" when not in the browser is already incredibly rich, so we already live in a flexible world. Third, frankly I'm not particularly overwhelmed by the prospect of Javascript's longevity in the server space being a long-term phenomenon... an awful lot of what gets linked on HN is less "cool things to do with JS" than "how to deal with the problems that come up when trying to use JS on the server".

And fourthly, and why this reply is worth making, bear in mind that if the browser becomes feasibly able to run any language, rather than having Javascript occupy a privileged position by virtue of being the only language, the biggest putative advantage that Javascript-on-the-server has goes poof in a puff of smoke. If Javascript has to compete on equal footing, it really doesn't have a heck of a lot to offer; every other 1990s-style dynamically typed scripting language (Perl, Python, Ruby, etc) is far more polished by virtue of being able to be moved forward without getting two or three actively fractious browser vendors to agree on whatever the change is (just look at how slow ES6 has been to roll out, which I'd rate as containing roughly as much change as your choice of any two 2.x Python releases). And it has no answer to the ever-growing crop of next-gen languages like Clojure or Rust. Without its impregnable foothold in the browser, Javascript's future is pretty dim. (In fact I consider the entire language category of "1990s-style dynamic scripting language" to be cresting right about now in general, so Javascript's going to be fighting for a slowly-but-surely ever-shrinking portion of the pie.)

reissbaker
Depends on how JS evolves. It got a pretty serious setback when ES4 blew up and everyone went back to the drawing board on ES5 and ES6; the ES6 launch makes it (I think) better for most use cases than Python/Ruby/et al, because the VM is an order-of-magnitude faster than most of the popular choices and ES6 is a reasonably usable language even for someone unused to Javascript's current quirks: it has a real module system, the class syntax is sane and similar to how every other lang does it, the confusing `this` binding is fixed with arrow-functions, generators and Promises get rid of deeply-nested callback chains, `let` and `const` get rid of confusing variable hoisting, etc.
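The ES6 features listed can be seen together in a few lines (a hypothetical example, runnable in any ES6-capable runtime):

```javascript
// Class syntax replaces prototype boilerplate.
class Counter {
  constructor() {
    this.count = 0; // const/let below give block scoping, no var hoisting
  }
  incrementLater() {
    // Arrow function: `this` stays bound to the Counter instance,
    // and the Promise replaces a nested-callback chain.
    return Promise.resolve().then(() => ++this.count);
  }
}

// Generators produce lazy sequences without callback plumbing.
function* naturals() {
  let n = 0;
  while (true) yield n++;
}

const gen = naturals();
console.log(gen.next().value, gen.next().value); // 0 1

new Counter().incrementLater().then(n => console.log(n)); // 1
```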

Google and Microsoft are both very seriously experimenting with typed variants of JS (TypeScript from Microsoft and SoundScript from the V8 team), and Mozilla had in fact already proposed adding static typing back in the ES4 days, so I wouldn't be surprised if the next couple of versions of the ES spec include static types. The future for JS is brighter than you think — although it's brighter only because it looks like JS will become a better language, not because of JS in its current, mostly-still-ES5 state.
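For readers who haven't touched ES6 yet, here is a tiny illustrative sketch of a few of the features mentioned above (class syntax, arrow functions capturing the lexical `this`, and block-scoped `const`); the `Counter` class is a made-up example, not from any of the comments:

```javascript
// class syntax: familiar shape, prototype-based under the hood.
class Counter {
  constructor() {
    this.count = 0;
  }
  makeIncrementer() {
    // Arrow functions close over the lexical `this`, avoiding the
    // old `var self = this` workaround for callback code.
    return () => ++this.count;
  }
}

const counter = new Counter(); // `const`: block-scoped, no hoisting surprises
const inc = counter.makeIncrementer();
inc();
inc();
console.log(counter.count); // → 2
```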

None
None
beagle3
> Observe that asm.js is very verbose, define a simple binary bytecode for it.

I suspect that is never going to happen, for two reasons:

1) Verbosity, on its own, does not make a difference on the web - HTML and Javascript are both generally super verbose, and neither has had an accepted "simple binary encoding" designed for it in those 20 years. What does get implemented is minifiers and compressors (gzip encoding or otherwise), both of which provide benefits to asm.js comparable to what a bytecode would, and require no buy-in from browser makers (the same attribute that has made asm.js successful and PNaCl unsuccessful so far).

2) Historically, anything that is not backwards compatible and does not degrade gracefully is NOT easily adopted by browser makers or by websites, unless it provides something that cannot be achieved without it (e.g. WebGL gets some adoption because there is no alternative; but ES6 will get little to none in the next 3 years, except as a source language translated to ES5).

comex
Well, if such a bytecode were being standardized, someone would surely write a shim JS library to convert it to JavaScript on browsers that didn't have native support yet. And I think the idea of making a binary format would be more popular for a sublanguage which is essentially guaranteed to be machine-generated and inscrutable, especially given that I've seen a lot of commenters here and elsewhere have a knee-jerk reaction against the idea of using JavaScript syntax, than for HTML/CSS/JavaScript, which have a long history of being written and read manually without any (de)compilation steps, even if most big webapps are minified.
beagle3
But it works the other way around: it will not be standardized before there's an implementation. If it ever happens (which I think is unlikely), the standardization will follow the shim.

And the knee jerk reactions are meaningless. The people who ship stuff don't seem to mind, and they are the ones who make things matter.

Jan 23, 2015 · alexvoda on The Emularity
This prediction is becoming true day by day:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Jan 07, 2015 · sarciszewski on JavaScript in 2015
> All you have done is used a function without understanding what it was doing, or reading the documentation

These aren't my examples. I haven't done anything. I credited the person who provided them: Gary Bernhardt.

https://www.destroyallsoftware.com/talks/wat

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Next time before you make an accusation, reread the post before pressing the reply button.

Nov 13, 2014 · jakozaur on AWS Lambda
Since Intel already ships CPUs for them, maybe they should ask for a CPU with asm.js as its assembly language :-P.

The jokes people make are starting to get a bit more real: https://www.destroyallsoftware.com/talks/the-birth-and-death...

How to solve the context switch overhead issue: https://www.destroyallsoftware.com/talks/the-birth-and-death...
JoeAltmaier
How about: a CPU that has scores of hyperthreads? They don't block in the kernel; they stall on a semaphore register bitmask. That mask can include: a timer register matching another register; interrupt completion; an event being signaled.

Now I can do almost all of my I/O, timer, and inter-process synchronization without ever entering a kernel or swapping out thread context. I've been waiting for this chip since the Z80.

rbanffy
While not exactly a chip (it never reached board stage) I designed a processor in college where the register file was keyed to a task-id register. This way, context switches could take no longer than an unconditional jump.

I dropped this feature when I switched to a single-task stack-based machine (inspired by my adventures with GraFORTH - thank you, Paul Lutus). This ended up being my graduation project.

> But it frustrates me how many people complain about how much it sucks when there are no projects (with any support) attempting to really change things. If my opinion of javascript is wrong and it really is that bad - I'd think there would be more of a movement to move away from it.

You should watch this https://www.destroyallsoftware.com/talks/the-birth-and-death...

Programmers can't always just start something "new"; they have to work on existing platforms to be able to deploy their software quickly and efficiently. This video explains that even though JavaScript was never intended to run a 3D engine, it now can. As he explains in the video, it's a hack, and it doesn't work that well for most binaries.

If I could, I would start a new browser: without HTML, with more dynamic languages, with a clang VM, with protocol buffers, etc. But in this age of patents and market shares in IT, I don't expect such a future browser would get enough exposure for users to install it. If a software company can't deploy its app on the dominant systems, it's screwed. But that doesn't mean JS fits every possible job.

This talk just keeps getting more and more relevant: https://www.destroyallsoftware.com/talks/the-birth-and-death...
mateuszf
It may become a self-fulfilling prophecy.
I think the best answer to your question is this video :

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Title : THE BIRTH & DEATH OF JAVASCRIPT By : Gary Bernhardt From : PyCon 2014

"The Birth & Death of JavaScript"[0] talked about this. From my memory, in "the future" described in the talk, everything is ported to JS so that the VM does the isolation automatically and all the overhead can be thrown away.

[0] https://www.destroyallsoftware.com/talks/the-birth-and-death...

And another by the same guy, in a similar vein: https://www.destroyallsoftware.com/talks/the-birth-and-death...
This seems oddly relevant: https://www.destroyallsoftware.com/talks/the-birth-and-death...
spolu
haha :) Yes... influenced the conception of Breach for sure!
This is eerily close to some of the things discussed in The Birth and Death of JavaScript[1].

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...

TazeTSchnitzel
Though that imagines a future where we're using asm.js and probably OdinMonkey, not generic runtimes like V8.
thegeomaster
My thoughts exactly. I recently posted the link to that talk here on HN somewhere, and now seeing this is really creepy. Especially:

>All kernel components, device drivers and user applications execute in a single address space and unrestricted kernel mode (CPU ring 0). Protection and isolation are provided by software. Kernel uses builtin V8 engine to compile JavaScript into trusted native code. This guarantees only safe JavaScript code is actually executable.

>Every program runs in it's own sandboxed context and uses limited set of resources.

angersock
How's that creepy?

It's freakin awesome!

The number of uses we put Javascript to is indeed frightening, given its "fragile" nature and the heavy criticism it attracts every now and then.

There is a great, amusing, borderline sci-fi talk by Gary Bernhardt about the future of Javascript and traditional languages compiled to Javascript. My recommendation: https://www.destroyallsoftware.com/talks/the-birth-and-death...

Jun 20, 2014 · wildpeaks on Webkit.js
That reminds me of the hilarious talk "The Birth & Death of Javascript" where everything gets converted to asm.js, even operating systems.

https://www.destroyallsoftware.com/talks/the-birth-and-death...

shayief
Check out this kernel built on V8 engine, designed to run JavaScript code https://github.com/runtimejs/runtime
Jun 20, 2014 · RussianCow on Webkit.js
Reminds me of Gary Bernhardt's talk "The Birth and Death of JavaScript": https://www.destroyallsoftware.com/talks/the-birth-and-death...
Holy hell, Gary Bernhardt was right all along and the future will be METAL... https://www.destroyallsoftware.com/talks/the-birth-and-death...
transpile it all into js and run in node! (I know this may be limiting for some features/libraries, but only a matter of time.)

See https://www.destroyallsoftware.com/talks/the-birth-and-death... on taking this to an "extreme" -- linux, Gimp on X windows, even Chrome -- transpiled and running in a firefox tab.

May 21, 2014 · phillmv on Arrakis
Heh, this sounds analogous to what Gary Bernhardt finished "The Birth & Death of Javascript" with: https://www.destroyallsoftware.com/talks/the-birth-and-death...
thirsteh
Or what a bunch of projects have been doing for years -- Erlang on Xen, HalVM, etc. etc.
Makes me think of this video, "The Birth and Death of JavaScript", which imagines what the future could be like with asm.js: https://www.destroyallsoftware.com/talks/the-birth-and-death...
May 08, 2014 · loup-vaillant on How fast is PDF.js?
> Web standards (HTML/CSS) and language (Javascript) were not designed to be used as a compilation target for complex programs.

http://asmjs.org/

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Now they are.

> I don't have any solutions to this, it's too late, we are already committed to browsers being full operating systems.

When we do get to that point, and ditch the underlying MacWinuX, there's a good chance they won't be much more complex, nor much less secure, than what they replaced. A typical MacWinuX desktop setup is already over 200 million lines of code. I'd be happy to drop that to a dozen million lines instead (even though 20K are probably closer to the mark: http://vpri.org/html/work/ifnct.htm). It also shouldn't be much slower than current native applications.

Heck, it may even be significantly faster. Without native code, hardware doesn't have to care about backward compatibility any more! Just patch the suitable GCC or LLVM back end, and recompile the brO-Ser. New processors will be able to have better instruction sets, be tuned for JIT compilation… The Mill CPU architecture, for instance, with its low costs for branch mispredictions, already looks like a nice target for interpreters.

---

> I do think it's worth considering the security price we are paying to make things like PDF.js possible.

Remember the 200 million lines I mentioned above? We're already paying that security price. For a long time, actually.

---

That said, I agree with your main point: the whole thing sucks big time, and it would be real nice if we could just start over, and have a decent full-featured system that fits in, say, 50,000 lines or so. Of course, that means forgoing backward compatibility, planning for many cores right away… Basically going back to the 60s, with hindsight.

Alas, as Richard P. Gabriel taught us, it'll never happen.

anon1385
>Heck, it may even be significantly faster. Without native code, hardware doesn't have to care about backward compatibility any more! Just patch the suitable GCC or LLVM back end, and recompile the brO-Ser. New processors will be able to have better instruction sets, be tuned for JIT compilation… The Mill CPU architecture for instance, with its low costs for branch mispredictions, already looks like nice target for interpreters.

Heh, I hope you appreciate the irony in that one. On the one hand we have people arguing that we have to stick with the existing web platform for backwards compatibility reasons, but on the other you are suggesting it would be easy to switch the entire world to new totally incompatible processor architectures to make aforementioned web platforms performant.

loup-vaillant
It's a matter of how many people you piss off. Ditch the browser, you have to change the whole web. Ditch the processor, and you have only a couple browsers to change.

Apple did it, you know? Switching from PowerPC to x86. And they had native applications to contend with. I believe they got away with an emulation mode of some kind, I'm not sure.

I for one wouldn't like to see the web take over the way it currently does. It's a mess, and it encourages more centralization than ever. But if it does, that will be the end of x86. (Actually, x86 would die if any virtual machine took over.)

martindale
Ah, thanks for posting Gary's talk. Great stuff.
None
None
peterashford
I don't think you have to forgo backwards compatibility. Implement a standard VM and library set that everyone can compile to. Implement HTML/JS as a module in the new system. Problem solved.
loup-vaillant
Well, it's not just HTML/JS. It's Word/OpenDocument, SMTP/POP/IMAP… Those modules are going to make for the vast majority of the code. We could easily go from 50K lines to several millions.
May 02, 2014 · jarrett on Thinking in Types
I hear that, and I hope that project works out. But I'm still interested in developing native apps. There's a reason so many professional applications (games, intensive apps like Photoshop and Blender, etc) are still native.

You may have seen The Birth and Death of JavaScript:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

I don't know whether that prediction will come true. But obviously it hasn't thus far. For now, if I need native performance, I need native code. Sadly, that means C, C++, or Java until such time as Haskell libraries compile reliably.

shudder. Node.js-powered linux kernel builds? One small step towards https://www.destroyallsoftware.com/talks/the-birth-and-death... coming true ;)
If you have not seen Gary Bernhardt's talk, you should:

https://www.destroyallsoftware.com/talks/the-birth-and-death...

Apr 17, 2014 · 635 points, 227 comments · submitted by gary_bernhardt
lelandbatey
First, I very much love the material of the talk, and the idea of Metal. It's fascinating, really makes me think about the future.

However, I also want to rave a bit about his presentation in general! It was very nicely delivered, for many reasons. His commitment to the story, of programming from the perspective of 2035, was excellent and in many cases subtle. His deadpan delivery really added to the humor; the fact that he didn't even smile during any of the moments when the audience was laughing just made it all the more engaging.

Fantastic talk, I totally loved it!

None
None
j_horvat
I was lucky enough to hear Gary give this talk in January at CUSEC and it was even better in person. Everyone in the room was clearly hanging on his every word, the actual technical content was pretty insightful, and his humour was spot on.
madeofpalk
Also, Java-YavaScript
peedy
Many people from Europe do that. It does sound cooler.
chinpokomon
Since JavaScript and Java have almost nothing in common, I think that's a very reasonable pronunciation. The words look similar, but have very different functional meaning.
robocaptain
It sounds so natural that I immediately started thinking I had actually been saying it wrong all these years.
kangax
fwiw, it's the way everyone in Russia pronounces it
danabramov
Yup. I thought it was part if the joke (that in a few generations, we might pronounce old language names differently).
Kiro
How do you usually pronounce it?
madeofpalk
I think I'm going to adopt this new pronounciation.
IgorPartola
This is actually how you say it in Russian. Try Google Translate.
ricket
pronunciation*

(it's one of the few weird words that change spelling when you add a suffix, such as fridge/refrigerator)

tinco
The reason why METAL doesn't exist now is that you can't turn the memory protection stuff off in modern CPUs.

For some weird reason (I'm not an OS/CPU developer) switching to long mode on an x86 CPU also turns on the MMU stuff. You just can't have one without the other.

There's a whole bunch of research done on VM software-managed operating systems, back when VMs started becoming really good. Microsoft's Singularity OS was the hippest, I think.[0]

Perhaps ARM CPUs don't have this restriction, and we'll benefit from ARM's upward march sometime?

[0] http://research.microsoft.com/en-us/projects/singularity/

pjmlp
In a way that is no different from the older Xerox PARC systems or the Oberon based ones at ETHZ.

All of them are based on the concept of using memory safe languages for coding while leaving the runtime the OS role.

Except for C and C++, language standard libraries tend to be very rich and to certain extent also offer the same features one would expect from OS services.

As such, bypassing what we know as standard OS with direct hardware integration, coupled with language memory safety, could be an interesting design as well.

That is why I follow the Mirage, Erlang on Xen and HaLVM research.

gary_bernhardt
I didn't want to go into this level of detail in the talk, but... I think you still want the MMU enabled, just not used for process isolation. With virtual memory totally disabled, a 1 GB malloc takes 1 GB physical memory even if it's not touched, you can't have swap at all, memory fragmentation kills you dead, etc. It still has a lot of utility outside of isolation.

I don't have a good sense of how the performance cost of hardware isolation breaks down into {virtual memory enabled,TLB thrashing,protection ring switching}. That's one of the reasons that I reduced the speed-up from "25-33%" in the MSR paper down to 20% in METAL. Maybe the speed-up would be less than that if virtual memory were still enabled.

Unfortunately, that distinction may have been blurred in the talk. That is, I may have implied that METAL would turn the MMU off entirely. If so, it was an oversight. I've done the talk end-to-end at least fifty times, which is how I smooth my execution out. Occasionally it can "smooth" the ideas out a bit too, leading to small inaccuracies. It's sort of like playing the telephone game with yourself (which is a very strange experience).

The MSR paper that I quote came from the Singularity team, so your reference is right on. Reading "Deconstructing Process Isolation" in fall of 2012 was probably the germ of the core narrative of the talk.

microcolonel
There's a new CPU architecture in the pipe (it should have good silicon within five years, if their projections can be trusted), and it has a very good design around system calls, and also in terms of virtual memory.

It has a single 64-bit address space and only protection contexts, and due to the general design of the system it doesn't require any register push (or registers at all, at least in the traditional sense). In addition, it has primitives which would allow programs to call directly into drivers and kernel services without a context switch.

Anyway, I don't mean to sound like an advertisement, and we've yet to see any silicon, so the jury's out.

Aside: Starting process address spaces at 0 is not really a convenience as far as I know (other than offering consistent addresses for jumping to static symbols); it's a way to enable PAE on 32-bit machines so that single contexts (typically processes) can use the whole address space.

sparkie
There's an even bigger revolution in CPU design, with Ivan Sutherland's Fleet - which does away with the clock and sequential execution of instructions - instead, the programming model is based on messaging and traffic control - you direct signals to the units in the chip which perform the computations you want, asynchronously by default - if you need synchronicity, you need to program your own models for it.

While these probably won't be available in the next 5 years, and probably won't be acknowledged by existing programmers for decades - I think these ideas will take over.

http://arc.cecs.pdx.edu/publications

https://www.youtube.com/watch?v=jR9pAaQlVRc

wolf550e
You forgot to provide a link: http://millcomputing.com/docs/
tinco
After the Operating Systems and Computer Organization courses in my first year of university, I became a little obsessed with the idea of software-managed operating systems. Cool to hear Singularity inspired you as well.

Now, I didn't read the paper, but I think the 20% is purely the MMU; I think the protection ring switching is much less significant, so if you leave the MMU on, that 20% gain is still very optimistic.

Now, if you forget about compiling C (which defeats the purpose of your talk) and just compile managed languages like regular JS, the garbage collector can build a great model of memory usage. Therefore I think it could be much better to let the garbage collector manage both the isolation, and the swapping. So everything in software.

The swapping process would suffer some performance, but that's just CPU cycles, as everyone knows persistent data access isn't even in the same league as CPU memory access.

So yeah, that would mean that you would have to run all untrusted code in managed mode. And with untrusted I would mean code you can't trust with full physical memory access.

nteon
On Linux, system calls don't result in a TLB flush - kernel data structures and code live in a different portion of the virtual address space (starting from the top of virtual memory, if I remember right) that is tagged as not accessible from ring 3. So system calls are quite fast.

EDIT:

Kernel memory begins at PAGE_OFFSET, see here: https://www.kernel.org/doc/gorman/html/understand/understand...

Kernel memory lacks the flag _PAGE_USER so that it isn't accessible from userspace: https://www.kernel.org/doc/gorman/html/understand/understand...

gary_bernhardt
I didn't know that! It certainly makes sense. Context switches still thrash the TLB, though. The performance cost of that has gotten better as time has gone on, but I wonder how many transistors (and how much power) CPUs are burning for that mitigation. The "how computers actually work" digression originally had a section on context switches, but I removed it early on because I felt like that section was dragging.

To try to paint a very rough picture of the larger thoughts from which this talk was taken: I think that microkernels and the actor model are both the right thing (most of the time). When implemented naively, they both happen to take a big penalty from context switch cost. But Erlang can host a million processes in its VM, and we're using VMs for almost everything now anyway.

The obvious (to me) solution is to move both the VM and an Erlang-style, single-address-space scheduler into the kernel. Then you can have a microkernel and a million native processes without the huge overhead of naive implementations. There are surely many huge practical hurdles to overcome with that, and maybe some that can't be overcome at all, but it sure sounds right when written in two paragraphs. ;)

jerf
You know about http://erlangonxen.org/ ? Also something like http://corp.galois.com/halvm . That's probably still not quite low enough level to turn off the MMU, but it's getting there.
None
None
nly
What you seem to be missing re: asm.js is that, while the JIT to native code gets you your super-fast integer operations, it's still critically incomplete with regard to memory access. Every single memory access has to be bounds-checked or pushed through some other indirection inside the runtime. Google demonstrated similar ideas with NaCl, which achieved safety with a similarly restricted native code and a just-in-time verification step. Even if these memory accesses could be made as efficient as those performed by the CPU's access protection, you're still not gaining anything you don't already have.

Regarding context switches: a full CPU context switch on x86 (not to ring 1, but between two arbitrary points within a single userland address space) takes a few dozen instructions and about 40-80 cycles. A single cache-line miss resulting in a load from main memory, on the other hand, takes at least twice that (~200 cycles). Again, hits from jumping around in memory will dominate.

How significant is a 20% overhead from virtual memory? Probably about the same as getting 1% more of your memory accesses back in to high level caches.

cornholio
I agree, and I think the whole premise of the performance gain is based on a "have your cake and eat it too" fallacy. Sure, the virtualized syscall to the virtualized OS will be free, but the painting of the font on the screen or the reading of the socket data will be done by the actual bare-metal OS, which the VM will invoke to get the actual job done.

So as long as we are talking about interprocess communication there will be a gain, but not for the actual hardware facing operation.

Then again, you are trading hardware-enforced isolation, which is simple and proven, for isolation enforced by a complex and fragile VM.

andhow
On x64 in Firefox, at least, there are no bounds checks; the index is a uint32; the entire addressable 4GB range is mapped PROT_NONE, with only the accessible region mapped PROT_READ|PROT_WRITE; out-of-bounds accesses thus reliably turn into SIGSEGVs, which are handled safely, after which execution resumes. Thus, bounds checking is effectively performed by the MMU.
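For context, asm.js funnels every load and store through one big typed array, so safety reduces to checking a single index; engines without the 4GB-mapping trick must bounds-check each access, which JavaScript semantics happen to make safe by default. A toy sketch of that memory model (a made-up 64 KB heap, not Firefox's actual implementation):

```javascript
// One flat buffer plays the role of "all of memory" for compiled code.
const heap = new ArrayBuffer(1 << 16);     // toy 64 KB heap
const HEAP32 = new Int32Array(heap);

function store(ptr, value) {
  HEAP32[ptr >> 2] = value;  // engine bounds-checks this index
}

function load(ptr) {
  // Out-of-range reads yield undefined; |0 coerces that to 0,
  // so a bad pointer can never escape the sandbox or crash.
  return HEAP32[ptr >> 2] | 0;
}

store(8, 1234);
console.log(load(8));       // → 1234
console.log(load(1 << 20)); // out of bounds: → 0, never a fault
```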
nly
Interesting approach, might have to look at the code. Nonetheless it highlights how useful the MMU is and how none of this is free.
spyder
Looks like Erlang is already getting one step closer to the metal:

http://erlangonxen.org/ http://kerlnel.org/

Also there is another project that can be related to that goal:

"Our aim is to remove the bloated layer that sits between hardware and the running application, such as CouchDB or Node.js"

http://www.returninfinity.com/

vanderZwan
I guess this is in a way a response to Bret Victor's "The Future of Programming"?

https://vimeo.com/71278954

exodust
Thanks for link. Liked the 70s vibe and humour.

From about 14:40 he gets animated, basically conducting! Would love to know a programmer's explanation for the function or purpose of arm waving and hand signals in a presentation. Not knocking, just curious!

guard-of-terra
Well, he just tries to reinforce that we have two symmetrical interconnected systems and yet they have to figure out how to talk to each other. What he has on the screen.
gary_bernhardt
It is in a sense. I had an early form of the idea that became this talk in the spring or early summer of 2013. Bret's talk (which I loved!) was released shortly after. That made me think "I have to do this future talk now in case the past/future conceit gets beaten into the ground."
jerf
It's not far off my predictions: https://news.ycombinator.com/item?id=6923758

Though I'm far less funny about it.

jongalloway2
Coincidentally, I just released a podcast interview with Gary right after he gave this talk at NDC London in December 2013: http://herdingcode.com/herding-code-189-gary-bernhardt-on-th...

It's an 18 minute interview, and the show notes are detailed and timestamped. I especially liked the references to the Singularity project.

mgr86
I'm missing some obvious joke...but why is he pronouncing it yava-script.
tambourine_man
I thought it was supposed to be some future pronunciation thing, imagining the way languages evolve. I've seen SciFi movies where in the future english is heavily influenced by spanish.
philangist
I thought it was supposed to be a callback to this scene in Anchorman, but I'm not sure. https://www.youtube.com/watch?v=N-LnP3uraDo
mianos
No hard 'J's in many languages (Like Slavic languages). It's pronounced 'y'. Anyone have a list?
jarek-foksa
The sound ʤ seems to occur in most Slavic languages [1], I guess the primary reason why "Java" is read as "Yava" is because people tend to apply local pronunciation rules to commonly used foreign words, either because of lack of knowledge of native pronunciation or because native pronunciation sounds just silly.

[1] http://en.wikipedia.org/wiki/Voiced_palato-alveolar_affricat...

gurkendoktor
YavaScript is a very common pronunciation in Germany, the dj sound only appears in "loannames" like Jennifer. My grandfather always told me to find a nice yob :)
bttf
Ask a Hispanic friend.
mgr86
gotcha.
nkozyra
Rather, ask a Scandinavian friend.
JacksonGariety
My scandinavian friends call it yay-va-script.
Kiro
I'm from Scandinavia and have never heard anyone pronounce it like that.
Hansi
I'm Icelandic and we use yava-script, I find it hard to figure out how yay-va sounds.
aaronem
Perhaps it would help to see that pronunciation rendered in IPA for English [0]: /ˈjeɪ.və.ˌskrɪpt/

Note particularly the phoneme eɪ, which corresponds to the "long A" sound in English, e.g., the 'a' in 'face'. I'm unfamiliar with the Icelandic language, but according to the English equivalents listed on Wikipedia's page on IPA for Icelandic [1], the corresponding phoneme in that language appears to be ei.

[0]: http://en.wikipedia.org/wiki/Help:IPA_for_English

[1]: http://en.wikipedia.org/wiki/Help:IPA_for_Icelandic

JacksonGariety
YAY as opposed to nay VUH as in vagina SCRIPT pronounced normally
SimeVidas
At the 8:00 mark, he accidentally pronounces it correctly for a moment, and then "corrects himself" by mispronouncing it :-)
paul_f
I'm assuming the original pronounciation was lost in the war.
Kiro
How else would you pronounce it?
jetsnoc
He's in character: it's 2035 and the pronunciation was lost/changed.
100k
I was hoping he'd drop in some reference that would explain it, like the takeover of world government by Norway after the war (sort of like Poul Anderson's Tau Zero: http://en.wikipedia.org/wiki/Tau_Zero). But I guess he just wanted it to be inscrutable.
lomnakkus
I think you're probably right -- he almost slips up at one point, but corrects himself before pronouncing the "va".
cjbprime
For context, this was one of the most enjoyed talks at PyCon this year.
cbhl
I was fortunate enough to get to see this at CUSEC (the Canadian University Software Engineering Conference) and would similarly agree that this was one of the most enjoyed talks there, too.
TazeTSchnitzel
JavaScript at PyCon?
SSLy
Well, you need JS for the client side even if you use python for server one (eg. flask or django)
wiredfool
Yep. It was on the schedule.
chris_mahan
I think you'll find that the python ecosystem is very large and varied.
clebio
apropos, Bokeh.
granttimmerman
> xs = ['10', '10', '10']

> xs.map(parseInt)

[10, NaN, 2]

Javascript is beautiful.

TazeTSchnitzel
It's due to parseInt having an optional second parameter, the radix, and map passing the index as the second parameter, hence:

  xs = [
    parseInt('10', 0),
    parseInt('10', 1),
    parseInt('10', 2)
  ]
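A quick sketch of the pitfall and the usual fixes (wrap parseInt so it only sees one argument and an explicit radix, or use `Number`):

```javascript
const xs = ['10', '10', '10'];

// Array.prototype.map passes (element, index, array), and parseInt
// treats the second argument as a radix, hence [10, NaN, 2].
console.log(xs.map(parseInt));             // → [ 10, NaN, 2 ]

// Fix 1: pass a one-argument function with an explicit radix.
console.log(xs.map(s => parseInt(s, 10))); // → [ 10, 10, 10 ]

// Fix 2: Number takes exactly one argument, so map's extras are ignored.
console.log(xs.map(Number));               // → [ 10, 10, 10 ]
```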
davidkassa
Thank you. I read many comments to find an explanation.
octatone2
It's not optional if you lint your code :)
jisaacks
And since 0 is falsy we get 10 for base 0.
TazeTSchnitzel
Which is odd. I wonder why they checked for falsiness and not the argument being undefined.
kevincennis
It's not exactly checking for falsy values. Although all falsy values will lead to a radix of 10 being applied.

parseInt internally uses the ToInt32 abstract operation on the radix parameter. Once it has that value, it explicitly looks to see if the value is 0. If it is, it uses a radix of 10.

https://people.mozilla.org/~jorendorff/es6-draft.html#sec-pa...

Edit: I hope that doesn't come off as pedantic. My point wasn't to disagree as much as it was to just add some further explanation.
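A few concrete cases of the radix handling described above — every falsy radix is coerced to 0 and falls back to decimal, while an out-of-range radix yields NaN:

```javascript
console.log(parseInt('10', 0));         // → 10  (0 means "default to base 10")
console.log(parseInt('10', undefined)); // → 10  (missing radix, same default)
console.log(parseInt('10', NaN));       // → 10  (ToInt32(NaN) is 0)
console.log(parseInt('10', 2));         // → 2   (binary)
console.log(parseInt('10', 1));         // → NaN (valid radixes are 2..36)
```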

meowface
This is HN, there's no such thing as being pedantic. :)
Thirdegree
Or rather, there is but it's thoroughly welcome.
mturmon
Although keep in mind that excessive pedantry is frowned upon.
conradk
Just like with any language, as long as you read the docs of the stuff you use, you don't get this problem (you might get others, with automatic type conversion and missing arguments, as the speaker says, but not this)... this is just stupid.

Try this:

int subtract(int b, int a) { return a - b; }

int test = subtract(5, 3); // != 2, just read the damn docs

Oh, C sucks now !

The talk is quite fun and interesting to watch though.

And the end is pretty cool.

DonHopkins
By "this problem" do you mean "the this problem"?

And by "this is just stupid" do you mean "this === just stupid"?

If you think that's bad, you should see this!

coldtea
The thing is, good language design means you don't have to read the docs.

The number one thing taught at user interaction / usability courses is that users don't read the documentation. Or skim it and go directly to one or two parts they want to check (sure, some bizarro outliers do read it all).

Besides, a golden rule from the UNIX era is the "principle of least surprise". Don't define stupid behavior as default, as in this case (both for parseInt and Map).

nathansobo
Both behaviors make sense in isolation. It's not always so easy.
tolmasky
Arguably the issue is not with the behavior but rather a deeper design problem within the language itself. Notice that in Obj-C no one would ever get confused regarding the second parameter of parseInt:withRadix:.
gary_bernhardt
Juggling these interactions is exactly what makes good language design so difficult and time consuming. In the talk, I mention JS' ten-day design time several times for exactly this reason. Language design is hard and ten days just isn't enough time to carefully consider how everything will fit together. Try to imagine a programmer, even a brilliant one, noticing the map/parseInt interaction ten days after starting to learn JS, especially in 1995 when these high level languages were far less common. Seems unlikely!
elwell
> good language design means you don't have to read the docs

Let's assume JavaScript is a poorly-designed language, and Clojure is a well-designed language. In the first month of language use, the user of Clojure will have looked at the docs many more times.

1stop
But the user of javascript will have more bugs.
coldtea
That's because:

1) Clojure has a larger API -- Javascript doesn't have 1/10 that.

2) Javascript has a familiar (to many) Algol-derived braced syntax and lots of common C/C++/Java/etc keywords. Clojure is only familiar to Lisp/Scheme users.

If those things were equal, Clojure would win the "don't have to look caveats up" contest, because its design is more coherent, and doesn't give you unexpected results and undefined behavior like Javascript does.

Obviously, you somehow need to first know that "parseInt" is called "parseInt()" and not "atoi()" for example. But I wasn't implying never reading anything, including function reference. Just being able to code without needing to study and/or memorize lots of arcane edge cases.

nathansobo
I just wrote this line about two hours ago and my tests weren't thorough enough to catch the bug it introduced. Just when I thought I knew JavaScript. Thanks for saving me some time.
iamthepieman
I'm sure you know this by now but you can keep the syntax and use Number instead

xs.map(Number)

matb33
I think xs.map(Math.floor) takes the cake if I recall speed tests properly
_random_
Always remember - it's not a language but a loosely parsable texty thing.
bsder
Always give the base to parseInt in Javascript.

Always. The moment you don't, all kinds of bugs follow.

I have to go cry now at the number of times this has bitten me.

TazeTSchnitzel
Are there any other problems aside from octal?
ajanuary
Not just JavaScript: a whole bunch of languages' parseInt implementations will interpret the base from a leading zero, etc.
ajanuary
A useful function

    function overValues(f) { return function(x) { return f(x); } }
Then you can do

    ['10', '10', '10'].map(overValues(parseInt));
However, usually you're going to want to do the equivalent of this

    ['0101', '032'].map(function(s) { return parseInt(s, 10); })
Because Javascript interprets a leading zero as an indicator of base.
nollidge
I think you accidentally a word...

> as an indicator of base.

As an indicator of base 8 (octal).

v413
You need:

['10', '10', '10'].map(Number);

octatone2

  var xs = ['10', '10', '10'];
  xs.map(function (str) {
    return parseInt(str, 10);
  });
  
  > [10, 10, 10]
Fixed that for you. Why?

  map callback params: (value, index, originalArray)
  parseInt params: (string, radix)
Your code is passing the map array index to parseInt's radix.
medikoo
Where people use parseInt, they usually should use Number: ['10', '10', '10'].map(Number); // [10, 10, 10]

;)
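The Number/parseInt swap isn't a pure win, though; the two differ on partial parses and empty strings (a quick sketch of the trade-off):

```javascript
// Number converts the whole string; parseInt stops at the first
// non-digit character. Which one you want depends on the input.
console.log(['10', '10', '10'].map(Number)); // [ 10, 10, 10 ]

console.log(parseInt('10px', 10)); // 10  (stops at 'p')
console.log(Number('10px'));       // NaN (whole string must parse)

console.log(parseInt('', 10));     // NaN
console.log(Number(''));           // 0   (empty string coerces to 0)
```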

None
None
just2n
There are so many good WTFs in JS, but this is not one. parseInt expects 2 arguments and Array.prototype.map provides 3 to the callback it is given. Both of these facts are very well documented and known.

    var mappableParseInt = function(str){
        return parseInt(str, 10);
    };

    ['10', '10', '10'].map(mappableParseInt);
I'd suspect this snippet is more a snipe at people who don't know JS very well and expect parseInt to be base-10 only.
CatMtKing
You could say it's a snipe at the weak type system that Javascript has.

I dunno, as someone without much experience with Javascript, it is a little odd that arrays return the index alongside the value by default.

xiaomai
arrays don't do that, but map() does. Normally in js you can just ignore arguments you don't care about, but it does lead to surprises like this one.
anaphor
It has little to do with the type system. Variadic arguments can be typed given a type system that supports it.
None
None
clebio
Alternatively, if a function expects two arguments, the language could take exception at the fact that three were handed in. Quietly accepting arbitrary arguments could be considered breaking contract. It does have a wtf-ey whiff.
ahoge
A bit less annoying with ES6:

  >>> ['10', '10', '10'].map(x => parseInt(x, 10))
  [10, 10, 10]
oscargrouch
With a function named "parseInt" anyone would expect the input to be base 10; otherwise it should be called "parseHex" for base 16, or at least "parseBytes(input, base)".

The programmers are not the ones to blame for that; this is really a bad contract between the language and the programmer.

It's the equivalent of a function named "getStone()" returning you a "Paper{}" :)

djur
"int" doesn't imply anything about the base.
None
None
Iftheshoefits
It does for human beings. We use base-10 for basically everything. This is true even for most programmers in most situations. Human beings aren't computers, and we aren't abstract math processing units who by default consider numbers abstracted (e.g. as elements of Rings). This goes double for string representations of numbers--in the majority of cases a number represented by a string is a number meant for human consumption in a normal human context; not some machine running in base-2 (or -8 or -16). It is certainly reasonable to expect "parseInt" to parse an integer out of a string in base-10 by default, and entirely unreasonable to expect to be required to provide a base as anything except an optional argument, and certainly it is unreasonable to expect that that second optional argument is treated as not optional in a composition operation.
djur
I agree with your conclusions. However, the poster I was responding to was suggesting that the category of "int" necessarily excludes non-decimal representations in the same sense that the category of "stone" excludes "paper".

I think in this case it's not parseInt that's at fault, it's the fact that map optionally passes additional arguments.

ajanuary
> It is certainly reasonable to expect "parseInt" to parse an integer out of a string in base-10 by default

It does.

> and entirely unreasonable to expect to be required to provide a base as anything except an optional argument

It is optional.

> and certainly it is unreasonable to expect that that second optional argument is treated as not optional in a composition operation

I don't understand what you're saying here. It's never treated as required, it's just that map supplies a parameter in that position, so it gets used. That's how optional parameters work.

The wat (if there is one) is that map provides extra arguments.

None
None
nollidge
>> ...parse an integer out of a string in base-10 by default

> It does.

Not quite: in some browsers (IE), a string starting with '0' gets interpreted as octal. So parseInt('041') === 33 in IE.

Guess how I found out about that.

hcarvalhoalves
I don't think `parseInt` accepting an optional second argument is the surprising behavior there. The real WTF is `map` passing more than one argument, and the loose behavior of JS regarding argument passing overall.
briantakita
It's only WTF because it's not the same as other implementations of map. Once you can internalize the map implementation, it's no longer WTF & actually makes sense.
rplnt
That's the weird part. Why does array provide three arguments? But I agree, that's something you can learn. I guess.

But it's WTF anyway. I have a function that takes either one or two arguments, I provide three, and everyone seems to be OK with that.

just2n
Well that decision is pretty necessary when you realize that JS has no syntax to indicate a function is variadic (we use the arguments magic variable, but use of it does not necessarily indicate that a function is variadic) and that implementation supplied functions are not required to have their arity exposed via Function.prototype.length (http://es5.github.io/#x15.3.5.1).

There's no way to know, even at run-time, whether a function is being called with too few or too many arguments, since that's equivalent to the halting problem. So the sensible alternative is just to default everything to undefined, and silently ignore extraneous arguments.

But yes, if JS was strict with how it handled argument definition lists and had support for indicating infinite arity, I'd agree, this would be a WTF, or at least strange. But I think it makes a lot of sense, all things considered.

gnuvince
So much for abstraction if you need to understand the implementation of every function you'll ever use in JavaScript.
yourad_io
> understand the implementation of every function

Rather: remember three things that make up the majority of Array iterators' callback functions' signatures:

Element, Index, Array.

Shared by: .map, .every, .forEach, .filter, and probably some that I am forgetting. The exception I think is just .reduce[Right], which by definition requires its previous return value, so you have (retVal, elem, i, arr).

Quite literally, if you remember .map callback, you remember .every callback :)

Javascript deserves shtick for its truly bad parts (with, arguments, ...) and some missing parts, but .map and its friends aren't it.

meowface
It makes sense but it still strongly violates the principle of least surprise. No other language I know of does this, nor do I think this would be a particularly desirable feature.
vorg
I suspect Nashorn, the just released edition of JavaScript for the JVM, will be heavily promoted by Oracle and become heavily used for quick and dirties manipulating and testing Java classes, putting a dent into use of Groovy and Xtend in Java shops. After all, people who learn and work in Java will want to learn JavaScript for the same sort of reasons.
dsparry
Very impressive to have been recorded "April 2014" and released "April 2013." Seriously, though, great presentation.
icameron
Agreed! I too was wondering isn't the discovery of time travel the bigger story here? /s
saraid216
No, at the end of the day, the discovery of time travel ended up being a really trivial achievement because of paradox. Now, the scientific knowledge we picked up en route was monumental, but that's something else.
joelangeway
He says several times that JavaScript succeeded in spite of being a bad language because it was the only choice. How come we're not all writing Java applets or Flash apps?
bsder
Because Java, Flash, etc. couldn't easily manipulate the DOM.

Javascript won for this reason.

dpweb
or VBScript for that matter... I think there's some confusion about why JS won. JS couldn't easily manipulate the DOM either until jQuery in 2005-2006.

The fact that Java, ActiveX, etc. had full control of the system and caused security problems was an issue, but it is not the reason why JS beat them all.

Don't discount the power of 1) free and 2) easy to use software that is 3) not controlled by a single corporation. JS is the only web programming language that is all of these.

Yea, maybe Python or Clojure in the browser would be cool. I would argue Clojure is absolutely more difficult for a novice to learn, and Python provides what additional benefit? JS was there first.

The only reason plugins existed is that you couldn't do these kinds of things in the DOM. jQuery, and the subsequent advances in browser technology, HTML, CSS, JS - made it so you can. Also, other things being equal, programmers will choose elegance over bloat, fewer layers of abstraction over more. Plugin architectures became just an unnecessary layer between the programmer and the browser after HTML/JS/CSS caught up.

JS did not become ubiquitous by accident, or because it was the only choice. There were many choices (all being pimped by big well-funded companies). JS won because it was better than the alternatives.

solomatov
The DOM at that time wasn't as fancy as it is now. The real reason is security.
TazeTSchnitzel
There were other advantages. To write JS you just need a text editor, and it's easy to pick up. To write Flash requires spending several hundred dollars. To write Java requires the JDK and to learn Java.
freditup
Used to do a good amount of Flash development - you could actually do it with just a text editor and a compiler (which was free). There were also quite nice free IDEs, like FlashDevelop.
ANTSANTS
We're talking late 90s/early 2000s here. If anything like that existed during Flash's heyday, I certainly wasn't aware of it.
freditup
If I had to pinpoint it, I'd say Flash's primetime was around 2005-2008 perhaps, and FlashDevelop was available then. Guess we probably define it's prime differently haha, I'm thinking more of when it matured - AS3 as a language, lots of tooling choices, etc.
ANTSANTS
I wasn't ever anything close to a professional Flash developer, I'll take your word for it if you say that was the best time to be developing for it.

I was thinking about the days of Homestar Runner, Weebl and Bob, Newgrounds, and so on, when flash cartoons and games were (for kids, at least) a huge part of internet culture, and everyone wanted to be a Flash animator. Youtube kinda killed the Flash cartoon medium, sadly. Sure, videos are simpler and don't rely on a proprietary binary blob, but there's nothing like loading up a Strong Bad email and clicking random things (or, uh, holding down tab) trying to find secrets.

freditup
Ah, don't give me too much credit haha, was more of a side-project thing for me, definitely wasn't a professional, especially at the animation side of things (as opposed to the programming side). I was also more involved with the games side of Flash, which Flash became much stronger at as ActionScript 3 came out which coincided with much better Flash performance. Flash advertising and simple animations were probably stronger earlier.

I'm just interested in the topic because it's kind of neat to look back at the internet and observe its history and the changes it's gone through. Just did a little wikipediaing for fun - here are when a few different websites / notable games were released:

Newgrounds: 1995

Homestarrunner: 2000

Miniclip: 2001

Armor Games: 2005

Kongregate: 2006

Fancy Pants Adventures: 2006

Desktop Tower Defense: 2007

Turing_Machine
Flash and Java also required a compile.

Javascript just required that you click refresh.

Especially on 1995 technology, that mattered. Compiling Java took a while. I didn't use Flash enough to retain an impression of speed, but it sure wasn't instantaneous.

muyuu
It's also the reason why Flash was so prevalent until recently and is still installed in 90-something % of desktop computers: it's faster. Significantly faster, and very specially so in the 90s and early 2000s.
saraid216
While security is the main answer, it was also that Java and Flash aren't necessarily available. That is, getting them to run on another machine was frequently a huge issue, especially if you tried to put in any kind of complexity.

Javascript, on the other hand, was omnipresent and comparatively accessible. It was the least bad option by a wide, wide margin. For a different comparison, I switched from Java applets to PHP in the early 2000s. I didn't really get into Javascript until many, many years later around 2009: before that, Javascript was mostly a way to make Flash work properly.

Turing_Machine
Oh, yeah, especially after Microsoft stopped shipping Java.

There was also the version issue to worry about. "Pardon me, Mr./Ms Customer/User -- would you mind terribly going and downloading and installing a 20 MB Java update on your 14.4k dialup connection before using this page?"

Nightmarish, it was.

MichaelGG
I always found it a bit hilarious how Sun, after getting Microsoft rather onboard the Java train, albeit with their necessary native extensions, decides to sue them and put an end to it. And promptly kills off Java distribution and adoption by the largest software developer in the world.

Even stranger is how Sun, a hardware/platform company, decided making a popular platform that's hardware and platform independent would help their business. Sometimes I wonder if there was a really well thought-out plan, or people were just doing things.

Yorrrick
The "necessity" of those extensions is debatable, and they meant that code wouldn't be portable to Sun's implementation. There was real cause for concern, and there weren't a lot of other options for fixing it.

Sun probably also realized that they weren't about to compete directly with mighty Microsoft on platform lockin of all things, so they played a different game.

firlefans
Flash still powers Youtube for most users, Silverlight for Netflix and Unity's plugin is required for most 3D games on Chrome's Marketplace (not sure where else to look for successful HTML5 games).
bonede
because a lot of bad programmers used it to write page effects, like alert("log in required"), not apps.
cbhl
Well, about ten or fifteen years ago, "we all were" would have been the answer. Except that back then, there were multiple choices -- plug-ins meant you could choose Java, or Flash, or ActiveX (Visual Basic 6, anyone?), or VRML for that matter.

The number of security issues that plug-ins have had in the last two decades makes most of them non-starters nowadays, although there are still plenty of sites that use them extensively (say, children's game websites like Neopets and Nick Jr.'s website) depending on the target audience.

Carioca
Also, apparently internet banking and ecommerce in South Korea relies heavily on ActiveX

http://www.washingtonpost.com/world/asia_pacific/due-to-secu...

steveklabnik
Consider the relationship between Chromebooks and METAL.

(I'm typing this from my Pixel...)

cbhl
Bernhardt later tweeted:

"I gave The Birth & Death of JavaScript seven times and no one ever asked why METAL wasn't written in Rust."

https://twitter.com/garybernhardt/status/456875300580651009

EdSharkey
It was assumed because that was/will be a foregone conclusion.
igravious
Stellar stuff. Hugely enjoyable. Very interesting thought experiment. I won't spoil it for any of you, just go and watch! Mr. Bernhardt, you have outdone yourself sir :)
nkozyra
Extraordinarily entertaining and well presented.
Sivart13
Where did you get the footage of Epic Citadel used in the talk?

http://unrealengine.com/html5 seems to have been purged from the internet (possibly due to this year's UE4 announcements?) and I can't find any mirrors anywhere.

Which is a shame, because that demo was how I used to prove to people that asm.js and the like were a Real Thing.

sefjklsffsdfjkl
https://web.archive.org/web/*/https://www.unrealengine.com/h...
Sivart13
Not Sure If Serious, but this doesn't work at all in any browser I've tried it in. I don't think archive.org especially knows how to mirror a giant weird experimental single page app.
atmosx
I have a question, because this video confused me. I don't have background to follow through all the assertions Gary Bernhardt did, but I'll try to watch it again, since it was fun.

I want to become a full stack developer. I can program and write tests in Ruby, I can write applications using Sinatra and now I am learning Rails. I bought a book to start learning JavaScript because it's the most popular language and basically will allow me to write modern applications. After I'm done with JS I'll probably jump into something else (Rust, Go, C, C++, Java, whatever helps do the stuff I want).

But watching this video, I'm confused: I avoided CoffeeScript because I read in their documentation that in order to debug the code you have to actually know JavaScript, so I figured that the best thing to do is learn JS and then use an abstraction (i.e. CoffeeScript) and tools like AngularJS and Node.js... Is my approach wrong? :-/

tragic
You can get around it to some extent with source maps and so on - just make sure you're generating them with whatever build process you use.

In practice, however, all that lovely Coffeescript syntax can easily trip you up; often something will compile successfully, but not to the 'right' Javascript. I wouldn't recommend CS until you get your head around the fundamentals of JS. In particular, CS does some very 'clever' things with the Javascript this object; I have certainly lost my scope at unexpected points in CS programs (often in loops). When you're optimising code, furthermore, you definitely need a strong sense of what JS code you'll get out the other end.

I'd recommend Reginald Braithwaite's Javascript Allonge as an overview of JS semantics - the material on scopes and environments is very useful, given that JS behaviour on that score is ... idiosyncratic. https://leanpub.com/javascript-allonge/read

alexandercrohde
I guess I don't really get the point here. This video walks a line between comedy and fact where I'm not really satisfied in either.

I can't always tell what's a joke. Does he actually believe people would write software that compiles to asm.js instead of JavaScript because there are a few WTFs in JS's "hashmaps"? More likely a newer version will come out before 2035? Or was that a joke?

I also feel like poking fun at "yavascript" at a python conference is cheap and plays to an audience's basest desires.

Really I see a mixture of the following:

- Predictions about the future, some of which are clearly jokes (e.g. the 5-year war)

- Insulting JavaScript, preferring Clojure

- Talking about weird shit you could, but never would, do with asm.js

- Talking about a library that allegedly runs native code 4% faster in some benchmarks, with a simplistic explanation about ring0-to-ring3 overhead.

pookiepookie
I'm not sure I understand the claims toward the end of the talk about there no longer being binaries and debuggers and linkers, etc. with METAL.

I mean, instead of machine code "binaries", don't we now have asm blobs instead? What happens when I need to debug some opaque asm blob that I don't have the source to? Wouldn't I use something not so unlike gdb?

Or what happens when one asm blob wants to reuse code from another asm blob -- won't there have to be something fairly analogous to a linker to match them up and put names from both into the VM's namespace?

None
None
camus2
Nice, nice. Ultimately languages don't die, unless they are closed source and used for a single purpose (AS3). In 2035, people will still be writing JavaScript. I wonder what the language will look like, though. Will it get type hinting like PHP? Or type coercion? Will it enforce strict encapsulation and message passing like Ruby? Will I be able to create ad-hoc functions just by implementing call/apply on an object? Or subclass Array? Anyway, I guess we'll still be writing a lot of ES5 in the 5 years to come.
_random_
I think there is a good chance that an alpha version of ES6 will be tentatively rolled out by 2035, 2036 at the latest.
testrun
AS3 is not dead, and it is now open source
camus2
source?
testrun
http://flex.apache.org/
nkozyra
Dead for all practical purposes.
scotth
Adhoc functions can be written using ES6 Proxy [1].

[1]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

leichtgewicht
I like that he mentions "integer". It is still very incredible how JavaScript can work well without an integer construct. Or threads and shared memory. Or bells and whistles.
JoeAltmaier
Yes as a high-level scripting language it has many uses. You're never gonna write a kernel in it admittedly.
pekk
And it would work better with integers. Or are we claiming now it was a good decision to force all numbers to be floats because look how awesome node is?
oberhamsi
modern browsers support webworkers so you get "threads" but still no shared memory.
_random_
"JavaScript can work well" - depends on what is understood by 'well'. Some craftsmen are capable of building cars from junk.
None
None
base698
I wish some of those talks were available for purchase on their own and not in the season packets. Definitely a few I'd buy since I liked this talk and the demo on the site.

Guy has good vim skills for sure.

clebio
The pricing model used to be different -- $8/mo. He changed it when he stopped producing the series. I agree the current pricing doesn't make sense. I feel slighted for having subscribed for several months, but would now have to pay _more_ for content that I used to have access to. Ah well, a pity. That said, the material was compelling enough to buy at the time!
Kiro
A bit OT but what is the problem with omitting function arguments?
wcummings
They'll have the value undefined which will do god knows what after some implicit type coercion
GeneralMayhem
Not necessarily anything as such, but it's the sort of thing that can easily lead to bugs if you don't know what you're doing. It's the only way to overload a function with multiple signatures, though, so most libraries and frameworks make heavy use of it.
yoamro
I absolutely loved this.
ika
I always enjoy Gary's talks
jokoon
I want a C interpreter
mattgreenrocks
LLVM ships with lli, which can interpret LLVM bitcode generated by clang, a C compiler.
ahoge
http://en.wikipedia.org/wiki/CINT
jokoon
why not put that in browsers ?
Yorrrick
To make a complete platform, you also need APIs, so there's more to it than just picking a language. You also need to figure out how to sandbox it; CINT appears to give programmers access to unrestricted pointers. You also want to get multiple browser vendors to agree, so some kind of specification is desired; CINT targets its own unstandardized subset of C. And you ideally want it to go fast, but CINT appears to be pretty slow:

http://benchmarksgame.alioth.debian.org/u32/compare.php?lang...

So there'd be some work to do. You could also compile the code, but complete C compilers are not fast, in browser terms.

slashnull
ha-zum yavascript
jr06
Video tl;dw:

Gary Bernhardt (rightly) says that JavaScript is shit (with some other insights).

HN comments tl;dr:

50%: "Waahhh, JavaScript is awesome and Node.js is wonderful, shut up Gary Bernhardt."

25%: Smug twats talking about how they're too busy changing the world with JavaScript to even bother to comment.

25%: Pedants and know-it-alls having sub-debates within sub-debates.

Pretty standard turnout. See you tomorrow.

pekk
Thanks for the summary, but I didn't exactly see that he was saying JavaScript is shit so much as that it was imperfect (10 days, etc.) but that didn't even matter.
angersock
It's been kind of fun watching JS developers reinventing good chunks of computer science and operating systems research while developing node.

This talk has convinced me that their next step will be attempting to reinvent computer engineering itself.

It's a pretty cool time to be alive.

Fasebook
"I get back to the DOM"
h1karu
somebody tell this to the node.js crowd
_random_
Can so many lemmings be wrong?
inglor
This is actually not a bad lecture. Very interesting, a nice idea and surprising.
adamman
"It's not pro- or anti-JavaScript;"

OK

nkozyra
Did you watch it?
jliechti1
For those unfamiliar, Gary Bernhardt is the same guy who did the famous "Wat" talk on JavaScript:

https://www.destroyallsoftware.com/talks/wat

vor_
Classic video, though it's wrong at times. For instance, the audience member who corrected him was right.
robert-boehnke
couldn't quite make it out, what did he correct?
vor_
The second JavaScript example, when he told someone in the audience, "No, that's just an object." It was a string.
gary_bernhardt
I knew that it was a string. If you listen closely, you'll hear that he asked "is that an array of object?" He probably asked that because it's in square brackets. I said "No, it's just an object".

I've probably seen twenty people call this "wrong", which frustrates me. It's not wrong. It was a stringified object! I didn't say "stringified" because it wasn't relevant to the question of whether the object was in an array!

There are other things in Wat that are genuinely wrong, though, like the fencepost error about "16 commas", which mistake will haunt me forever.

saraid216
Maybe worth releasing a transcript, at this point?
hetid
It's only 15 commas man, 15 commas. That extra comma could kill someone.

Speaking of which, why does WAT do different things on node.js?

espadrine
> why does WAT do different things on node.js?

Wrapping the same input in parentheses (e.g. '({} + [])') yields different results.
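A quick sketch of why the parentheses matter:

```javascript
// At statement position, `{}` parses as an empty block, so `{} + []`
// is really just `+[]`, i.e. 0. Parentheses force expression parsing:
console.log(eval('{} + []'));   // 0
console.log(eval('({} + [])')); // '[object Object]'

// The "[object Object]" string is just Object's default toString:
console.log(String({}));        // '[object Object]'
```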

Turing_Machine
"That extra comma could kill someone."

Only in old versions of Internet Explorer.

calibwam
It is wrong because it is the toString of the object, produced because the + operator wants to do string concatenation. You were misleading the audience, both by using a shell which doesn't show strings with quotes, and by saying that the toString of the object is 'just an object'.

And that is not the only thing that is misleading, as you clearly said that {} was an object. Yes, the syntax in JS is weird as it looks like an object, but it isn't. Again, a better shell would not let you do this.

nilliams
As Gary explained, he wasn't wrong in his response to the question because the audience member was asking whether [object Object] means the object is in an array. It doesn't. The string point is moot.

I do agree the {} + [] example has always felt a bit unfair to me (for the reason that {} is a block), but whatever, it's a light-hearted talk.

fuzzythinker
Funniest tech video I've seen! Actually maybe even minus the "tech".
kevinwang
The video seems to be down now? Anyone have a mirror?

edit: nevermind, I clicked the download link. But I'm still wondering why the video's unplayable on the site.

thousande
Works with Chrome not Firefox. Yay! 2014 ;)
Yhippa
That was really funny. First time I've laughed all day. Thanks for sharing.
hazelnut
Brilliant - everybody working with JavaScript should watch this!
briantakita
> Javascript is a bad choice

JavaScript is great when used in a "good" way. It's flexible & it's almost everywhere. Spending all your time complaining about how "bad" JavaScript is is kind of pointless. If you use JavaScript, learn how to use it well.

Master sushi chefs don't sit there complaining how bad knives are because the knives need to be constantly sharpened.

If you are cooking spaghetti, learn how to strain the noodles, instead of using silly examples assuming people are incompetent at learning a tool.

https://www.youtube.com/watch?v=rbA9KAc5gZs

rplnt
The fact that something is "almost everywhere" doesn't make it good. It makes it useful at most.

And your examples make sense, javascript doesn't (in some cases) so it's very appropriate to complain.

briantakita
> so it's very appropriate to complain

In that case, it's appropriate to complain about gravity & being restricted to the speed of light?

No, it's better to learn about and use these properties to your advantage.

loup-vaillant
Are you seriously equating the laws of physics and human artefacts? That would be ludicrous. While the laws of physics are set in stone, human artefacts can be remade.

That changes everything

I'm sure you have the skills required to, say, write a preprocessor for whatever language you are using, and add some special constructs in it. Missing feature? Done in a few days. So…

If the laws of physics suck, suck it up.

If your tools suck, change them.

briantakita
> Are you seriously equating the laws of physics and human artefacts?

Yes I am. The property that they share is they will not be changed or avoided in the near future.

Another property is that despite certain limitations, you can still accomplish many things. If you focus on these limitations, you will accomplish less.

> If the laws of physics suck, suck it up.

Not in all cases. Physics is just a model of our understanding of physical existence. Einstein demonstrated that.

> If your tools suck, change them.

I guess if it's worth it to spend that much effort, then go ahead. Just know that the frequent examples of javascript's "problems" are easily surmountable, that is if you don't dwell on these "problems". Javascript has some great attributes to it.

Indeed, it does not "suck". That's like saying the human body sucks because we have this ridiculous tail bone and wisdom teeth. No accounting for taste, I suppose.

I choose to focus on that and progress in mastery of my craft. If you want to complain and/or change your tools, go ahead. I don't judge you.

loup-vaillant
> Yes I am. The property that they share is they will not be changed or avoided in the near future.

You vastly overestimate the effort it takes to change your tools. When I was talking of a few days to add a feature to a language, that was a conservative estimate. With proper knowledge it's more like hours. And I'm not even assuming access to the implementation of the language. Source-to-source transformations are generally more than enough.

Heck, I have done it to Haskell and Lua. And it wasn't a simple language feature, it was Parsing Expression Grammars (the full business). I used no special tools. I just bootstrapped from MetaII (I wrote the first version by hand, then wrote about 30 compilers to the nearly final version). (For Haskell, I took a more direct route by using the Parsec library.)

Granted, writing a full compiler to asm.js is a fairly large undertaking. But fixing bits and pieces of the language is easy. Real easy.

> Not in all cases. Physics is just a model of our understanding of physical existence.

Oh, come on, don't play dumb. You know I was talking about the way the universe really works, not the way we think it works.

> I choose to focus on that and progress in mastery of my craft. If you want to complain and/or change your tools, go ahead. I don't judge you.

I'm not sure what you're saying. It sounds like you want to focus on particular programming languages. This would be a mistake, pure and simple. You want to master the underlying principles of programming languages. It can let you pick up the next big thing in a few days. It can let you perceive design flaws (such as dynamic scoping). It can let you manipulate your tools, instead of just using them.

Your way leads to obsolescence.

---

My advice to you: if you haven't already, go learn a language from a paradigm you don't know. I suggest Haskell. Also write an interpreter for a toy language. Trust me, that's time well spent. For instance, knowing Ocaml made me a better C++ programmer.

briantakita
First, I'd like to point out that your tone is attacking & condescending. Why?

> You vastly overestimate the effort it takes to change your tools.

Cool! If you don't mind the asset overhead, having to recreate the existing javascript ecosystem, & the abstraction mapping, & the other unknown unknowns, then it's all good. Are there any well-known production sites that use such techniques? I don't doubt there will be, but are such techniques "ready for prime time"?

I personally have not experienced enough pain to be motivated to do all that.

> Oh, come on, don't play dumb. You know I was talking about the way the universe really works, not the way we think it works.

The thing about existence is we don't know about it in its entirety. Even if we know the rules, there are many mysteries to explore. It's wonderful :-)

> It sounds like you want to focus on particular programming languages. This would be a mistake, pure and simple. You want to master the underlying principles of programming languages.

I am mastering the underlying principles of programming languages.

I want to focus on getting better, faster, & smarter. For the web, it's nice to have everything in one language. Lots of sharing of logic. Keeping DRY. Being efficient with time. Smaller team sizes. More stuff getting done.

Maybe compiling to javascript will help for other languages.

I'm a fan of dynamic languages. There's more than one way to master the craft. Asserting your one true way is a failure of imagination.

> Your way leads to obsolescence.

I doubt it. You vastly underestimate my ability to adapt & evolve ;-)

> My advice to you: if you haven't already, go learn a language from a paradigm you don't know. I suggest Haskell.

Maybe one day. In the meantime, I'm focusing on becoming a more fully rounded thinker. That means subjects outside of programming. Learning yet another language has diminishing returns.

I'm humble enough to not give you unsolicited advice, which would only serve my ego.

Ooh, and I agree. OCaml, Erlang, & Lisp are fun languages. Javascript is also fun.

MichaelGG
With this logic you can defend anything. PHP's great if used in a good way: Zuckerberg's a billionaire. Right? Why is anyone even bothering with PL these days?

Sushi knives don't decide to cut you because the rice came from a different origin. And I'd guess that most craftsmen, outside of a ritual and tradition would love for their tools to have less disadvantages.

If you want to draw an analogy to spaghetti (?), it'd be like complaining that the only kind of spaghetti you can buy locally cooks only at a specific temperature, and even a bit more turns it to mush. And the reason is because the local government passed a bylaw with only a few hours of consultation that ended up banning imports of better kinds of spaghetti.

While it might be "pointless" to spend all your time complaining, there's certainly value in asking "wat" and pointing out absurdity.

briantakita
The "wat" is a good first step in identifying problems. However, what I see is people getting stuck on "wat" and not moving forward. People would rather win an argument than advance knowledge & the practice. Lots of ego, programmers have.

In the meantime, one can learn to appreciate & use javascript's strengths. It can be quite fun, liberating, & useful. Anecdotally, I have not run into these crazy issues, and I program in javascript every day. I also have a large app and the framework I built is custom.

I liken this to using C++, Unix, & bash as a base. Yes, you could say these tools suck and spend time creating, marketing, & community-building for a new tool. Or you can iterate & improve upon these existing tools. There's no wrong answer. What do you want to accomplish?

> Sushi knives don't decide to cut you because the rice came from a different origin.

That analogy seems like a stretch. Care to explain? Javascript works with different locales. There are many international websites that use javascript.

Also, javascript does not "decide" to create a bug in your program. You create that bug by misusing the tool. You will get further if you take some responsibility and improve your practice.

> And I'd guess that most craftsmen, outside of a ritual and tradition would love for their tools to have less disadvantages.

I agree with that. Usually the improvements are iterative. One could use a laser cutter (which does not need sharpening) to cut sushi, but that would also burn it. Here's a good talk (Clojure: Programming with Hand Tools).

https://www.youtube.com/watch?v=ShEez0JkOFw&safe=active

Ritual & tradition is a social tool to propagate knowledge, idioms, & practices across generations. It makes sense to challenge ritual & tradition so they improve over time. It does not make sense to whine about it without doing anything.

> it'd be like complaining that the only kind of spaghetti you can buy locally cooks only at a specific temperature, and even a bit more turns it to mush

Not getting your analogy. This seems like a stretch, similar to the person who cannot strain the spaghetti noodles on the video. Care to explain?

nothiggs
The following perfectly describes my sentiments about all the complaints people have about JavaScript, PHP, <name your favourite hated language>

https://www.youtube.com/watch?v=uEY58fiSK8E

lomnakkus
So we should just give up completely on trying to make better things?

I don't think that was the point of the video (some amount of gratitude for the things we have) -- but then maybe I'm misunderstanding your point.

nothiggs
On the contrary. We should strive to correct all the "wats" that obviously exist in all these languages, but most of what I see is just complaints, most of them ignoring the amazing things that can be done with these technologies. Over 30 years I've been programming in more languages than I care to count, and I don't remember at any point having a specific language stop me from achieving my goal because it has some traps or design flaws. Always made sure to know about them and make use of the language's strong points instead of concentrating on the weak.

And we both understood the point of the video. The fact that there is much to be grateful for does not mean that we shouldn't improve on what needs improving. But for heaven's sake, if you're not going to improve on it, stop whining about it and be grateful for the amazing things it does enable.

Dewie
tl;dr: you can't necessarily change the troublesome technology, so you might have to leave. But in order to have a viable alternative to that "bad" technology, you need other people (case in point: mindshare of JS). In order to get more people to "your side", you might need to point out what is wrong with the original technology.

> But for heaven's sake, if you're not going to improve on it, stop whining about it and be grateful for the amazing things it does enable.

Sometimes you're not in a position to even be able to change something, even if you wanted to. The ideas you have in mind for a technology might fly in the face of how the community around that technology, or the guardians/maintainers of it, thinks of it - introducing these changes might break too much stuff that is dependent on it, the changes might fly in the face of the culture around that technology.

So if you have some technology that you think is flawed - subjectively, or even somewhat objectively if you have conviction enough - and you can not do anything about it, you only have two choices. Embrace it and try to work with it despite its flaws, or abandon ship.

But if you want to abandon ship, you probably want to find a safe harbor, eventually. ie a place where you can develop or utilize some other technology. But that place might be sparsely populated, because everyone else is working with that other technology. So what do you do? You suggest that others jump ship. :)

Assuming that there is actually some kind of objective merit to complain about a specific technology, it might be wise to complain to others about that technology. That way they can hopefully use that info to make an informed choice, and perhaps abandon their current technology for another technology. In time, you might even get enough people to come over to this other technology that that community is big enough to support that technology as a valid alternative to the "bad" technology. But what if everyone just stfu'ed about what their "negative" thoughts are on a technology? Would that other technology be able to get enough "acolytes" in order to be a viable alternative? Probably not, because everyone was too "positive" and polite to point out how that technology might be better than the old technology.

Would JS even be so controversial if it wasn't for that it is so entrenched in Web development? Is that not a great example of how important mindshare can be?

jerf
"I don't remember at any point having a specific language stop me from achieving my goal because it has some traps or design flaws."

I have to admit, a language has never stopped me personally. But it most assuredly has hurt me when trying to program with other people, who do not have a direct psychic hotline into my brain that tells them what preconditions must hold before my code will work properly, and what things they can and can not do with a certain library, and most importantly, why they can and can not do those things. Languages that allow me to encode more of those things into the program itself, instead of the ambient documentation-that-nobody-ever-reads-even-when-I've-put-tons-of-work-into-it, work better.

And as my memory isn't all that great, it turns out that if I'm away from my own code for long enough, I become one of those people who don't have a direct hotline to my-brain-in-the-past.

briantakita
> But it most assuredly has hurt me when trying to program with other people, who do not have a direct psychic hotline into my brain that tells them what preconditions must hold before my code will work properly, and what things they can and can not do with a certain library, and most importantly, why they can and can not do those things

Programming is hard. It's an ongoing process of mastery. This is true with any programming language. There is no silver bullet.

> Languages that allow me to encode more of those things into the program itself

There are plenty of tools that almost every language provides for you. It's an architectural concern to ensure that there is as little mapping as possible between the domain and the code.

I personally find Javascript to be flexible, which allows me to architect my software in a way that is communicative of the domain, without many restrictions.

> I become one of those people who don't have a direct hotline to my-brain-in-the-past

A story is a great way to communicate information. Automated functional (black box) testing is also good. Also, try to reduce the mapping between the domain and the software. Ideally, the software (naming) should have a 1-1 map to the domain.

Also, keep the structures flat, as this idiom tends to reduce complexity.

Keep consistent & iterate on architectural idioms between projects.

These are some ways to improve communicability of the codebase & to have insight into the business domain logic.

jerf
"business domain logic"

Ah, you see, there's the problem... this wasn't business logic. To put it in Haskell terms, I had code that was not in IO, but I couldn't actually encode that restriction in the language.

Most of your post amounts to "program better", which is vacuous advice. We've spent decades telling each other to "program better". We've proved to my satisfaction that's not enough. Have you used languages not from the same tradition as Javascript? It is possible, even likely, that you are not aware of the options that are available out there, even today.

briantakita
> Ah, you see, there's the problem... this wasn't business logic.

What is "this"?

> Most of your post amounts to "program better", which is vacuous advice

No it's not. It's certainly better than dwelling on some edge case shortcomings and limiting your growth by blaming the tools.

No tool is perfect. Learn to use it better. Master it. Improve it. If you want to use a different tool, then use a different tool. There's no need to spread negativity.

There has been plenty of progress in Javascript idioms & programming idioms in the past few decades. You can accomplish many things with Javascript and the environment will only continue to improve. Programmers will continue to get better from the ecosystem & practices that have been learned over time.

Even your mighty Haskell is not perfect. Time to accept non-perfection & evolve :-)

> Have you used languages not from the same tradition as Javascript?

Yes, I have. I also draw inspiration from other languages & environments.

> It is possible, even likely, that you are not aware of the options that are available out there, even today.

Yes, I'm aware. When they prove themselves, I'll consider using them. In the meantime (and always), I'm happily mastering my craft free of unnecessary angst.

gary_bernhardt
In that video, Louis is talking about himself in the third person: he's the one complaining on the plane. It's not about some group of "others" who are "bad" and don't appreciate the world; it's about our nature as humans.
briantakita
> he's the one complaining on the plane.

Hmm, "the guy next to me goes 'pfft, this is bullsh*t'".

> it's about our nature as humans

Well, it's about our current generation of Americans (maybe Westerners). This complaining seems like unnecessary stress to me. I understand, because I used to do it.

gary_bernhardt
I read (or watched) Louis say that he was the guy on the plane, but this was a couple of years ago and I'm failing to google it now.

I'm not convinced that this behavior is specific to Americans or Westerners; it may just be that we're most attuned to our own ways of expressing it. I'm also not convinced that it's a general property of the species, though; it would be arrogant for me to claim that kind of fundamental knowledge of how the human mind works. That kind of arrogance is the bread and butter of Hacker News, of course, so this is now necessarily a bad HN comment. ;)

I should've said something more like "it's about the way that we all act towards technology".

liviu
Brendan Eich covered this subject at O'Reilly Fluent conference in 2012:

http://youtu.be/Rj49rmc01Hs?t=5m7s

1stop
Seems he took the Wat talk a little personally, but I'm not sure why he defends {} + [] by saying the first { is a statement... wat?
_mhr_
It's automatic semicolon insertion. The browser translates {} + [] into {}; + [], so + [] === 0 too. {}; is undefined.
codeflo
Being pedantic: it's NOT semicolon insertion. Your actual point is correct, the {} is an empty block statement, and the +[] is a separate expression statement. It's equivalent to "{} 0".

However, semicolon insertion is only triggered when there's a newline at a position where there would otherwise be a syntax error. Here, neither is the case: blocks don't have to be terminated by semicolon (so no syntax error), and there's no newline in the source code!
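codeflo's parse can be checked directly: no semicolon is inserted, the leading `{}` is simply a complete (empty) block statement, and `{} + []` is grammatically the same program as `{} 0`. A quick sketch (using `eval`, which returns the completion value of the last statement):

```javascript
// The leading {} is an empty block statement; no semicolon insertion is
// involved, because a block needs no terminating semicolon.
console.log(eval("{} + []")); // 0 — the remaining statement is `+[]`
console.log(eval("{} 0"));    // 0 — same structure: empty block, then `0`
console.log(+[]);             // 0 — unary plus on [] is Number("") === 0
```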

drostie
It's because they wanted to say both:

    if (a) b;
and

    if (a) { b; c; }
which tends to make you think of curly braces as a syntactic feature which can appear anywhere, turning many lines of code into one line of code. If you think that way then these should possibly also be valid:

    {b; c}
    {b}
    {}
but, since JS scope is function-oriented and in other places (e.g. functions) the braces are ultimately needed anyways, even for one-liners, it seems like this was a stupid choice and we should have just rejected the form:

    if (a) b;
and then the reuse of {} for lightweight (if non-robust) hashmaps would perhaps be unambiguous again.
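As an aside, bare blocks like those above are in fact legal statements in JavaScript, which is exactly what makes a leading `{}` ambiguous with an object literal (a small sketch; the scoping shown relies on ES2015 `let`):

```javascript
// A bare block is a valid statement; since ES2015, let/const bindings
// are scoped to it, so blocks are occasionally useful on their own.
{
  let b = 1, c = 2;
  console.log(b + c); // 3 — b and c are not visible outside this block
}
{} // an empty block — the same parse {} gets at the start of `{} + []`
```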
kbenson
Which is exactly what Perl did, but to alleviate the clumsiness of single statement if conditionals they added a post-conditional if statement of the form STATEMENT if CONDITION; (which has the benefit of being how some people express simple conditionals in real life. "Go left if you see the blue house.")
celebril
Brendan Eich is a homophobe and whoever links to his videos are complicit in his bigotry.

All his opinions should be discarded.

tsotha
You have no idea whether he's a homophobe or not, and clearly when the subject is Javascript his opinions should be considered very carefully.
archagon
Nobody's opinions should be discarded based on their behavior. Otherwise, we'd have a scant few scientists, artists, and thinkers in history actually worth discussing. (Also, people who link to him are complicit? Are you serious?)
MichaelGG
If by cover you mean he essentially shrugs and says "it was the 90s" and moves on to ES6.
skrebbel
Which seems a pretty appropriate reaction, no? :-)
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.