Hacker News Comments on
Øredev 2014 - James Mickens - LIFE IS TERRIBLE: LET'S TALK ABOUT THE WEB
Øredev Conference · Vimeo · 25 HN points · 2 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video.

I was very fortunate to be in a class taught by James Mickens in my undergrad. As he does in this article, he shared his wealth of cybersecurity experience, but presented it in a very entertaining way. If you like this paper, he also has many other stories from CS academia that are approachable and fun to read [1]. One of my favorite talks by Mickens is a 30-minute roast of Javascript and web security [2].
Here's a stupid, simple Vimeo downloader instead of using the massive, slow-starting youtube-dl:

    #!/bin/sh
    # usage: curl https://vimeo.com/123456789 | $0
    # Find the embedded player URL in the page, fetch its /config,
    # and take the last .mp4 URL listed there.
    x=$(curl -s `grep -m1 -o https://player.vimeo.com/video/[^\"?]*|sed 's>$>/config>'`|grep -o https://[^\"]*mp4|sed -n \$p)
    # Name the file after the second-to-last path component, then download.
    y=${x%/*};y=${y##*/};exec curl -so $y.mp4 $x

James Mickens on Javascript:
https://www.usenix.org/system/files/1403_02-08_mickens.pdf
More from Mickens:
https://www.usenix.org/legacy/events/webapps10/tech/full_pap...
Unlike Mickens, I cannot save the world, and I am not telling anyone else what to do or not to do, but I made the web fast for myself. Hence I am very skeptical of claims that "the web is slow". Web servers, the network and computers are plenty fast and still getting faster. I do not define "the web" as certain popular browsers, CSS, Javascript, etc. or whatever web developers tell me it is. Those are someone else's follies. I define it as hyperlinks (thus, a "web") and backwards-compatible HTML. Stuff that is reliable and always works. To "make the web fast", I follow some simple rules. I only load resources from one domain, I forgo graphics, and I do not use big, complex, graphical, "modern" web browsers to make HTTP requests.
I do not even use wget or curl (only in examples on HN). I generate the HTTP myself using software I wrote in C for that purpose and send using TCP clients others have written over the years. There are so many of them. With a "modern" SSL-enabled forward proxy or stunnel, they all work with today's websites. "Small programs that do one thing well", as the meme goes.
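The hand-built-HTTP approach described here can be sketched in shell. This is a minimal illustration, not the commenter's actual C program: `mkreq` is a hypothetical name, and `nc` stands in for whichever TCP client happens to be on hand.

```shell
#!/bin/sh
# Build a minimal HTTP/1.1 request by hand ("mkreq" is an illustrative
# name, not the commenter's tool). $1 = host, $2 = path.
mkreq() {
    printf 'GET %s HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n' "$2" "$1"
}

# Hand the raw request to any TCP client, e.g.:
#   mkreq example.com / | nc example.com 80
# or route it through an SSL-enabled forward proxy or stunnel for https sites.
mkreq example.com /
```

The point of keeping the request generation separate is exactly the "small programs that do one thing well" idea: any of the many existing TCP clients can carry the bytes.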
Obviously, I still need the ever-changing, privacy-leaking, security-risk-creating, eye-straining, power-consuming, time-wasting, bloated, omnibus browsers for any sort of serious transaction done with the web, e.g., commerce, finance, etc. However, that is a small fraction of web use in my case.
For me, using the web primarily comprises searching, reading and downloading. I never need Javascript for those tasks. I can do those tasks faster without the popular browsers than I can with them. The less interaction the better. I use automation where I can because IMO that is what computers were made for. "The right tool for the job", as the meme goes.
To think how much time and energy (kWh) has been devoted to trying to make Javascript faster as a way to make websites faster is, well, I won't think about it. Those working in the "software industry" and now "tech" are highly adept at creating the problems they are trying to solve. Unfortunately, today, as we try to rely on software and the web for important things, we all have to suffer through that process with them.
By not using the popular browsers for a majority of web use, I have minimised the suffering of one user: me. The web is fast.
⬐ 1vuio0pswjnm7
The title of this blog post refers to "the web" but it mainly discusses "rendering". IMHO, those are two different things. The latter is concerned with graphical Javascript- and CSS-enabled browsers. The former is concerned with web servers.
⬐ 1vuio0pswjnm7
Forgot about the most important way I make the web fast, rule #1: eliminate DNS lookups. I gather the DNS data in bulk before I start reading and add it to custom zone files that are served from localhost authoritative servers. For example, I gather all the DNS data for all sites posted to HN before I start reading. One way to do this, if encryption is desired, is to use DoH + HTTP/1.1 pipelining. I wrote some custom programs to speed up the Base64URL encoding. This way, there is no DNS traffic leaving the network. The only lookups preceding a TCP connection to a website are to the loopback.
That is the number one way I "make the web fast". If retrieving a web page were a chemical reaction, DNS would be the "slow step". In most cases, I eliminate it.
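For the DoH step mentioned above, RFC 8484 sends the raw DNS query in a `dns=` GET parameter, Base64URL-encoded without padding, which is the encoding the commenter's custom programs speed up. A minimal sketch, assuming GNU or BSD `base64` is available; the query bytes are a hand-built A-record lookup for example.com:

```shell
#!/bin/sh
# Base64URL per RFC 8484: standard base64 with '+/' mapped to '-_'
# and the '=' padding stripped.
b64url() { base64 | tr '+/' '-_' | tr -d '=\n'; }

# Hand-built DNS wire-format query: ID=0, RD flag set, one question:
# "example.com IN A" (12-byte header, length-prefixed QNAME, QTYPE, QCLASS).
dns_q=$(printf '\000\000\001\000\000\001\000\000\000\000\000\000\007example\003com\000\000\001\000\001' | b64url)

# This value would go into a pipelined HTTP/1.1 GET such as:
#   GET /dns-query?dns=$dns_q HTTP/1.1
printf '%s\n' "$dns_q"
```

Setting the query ID to zero is what RFC 8484 recommends for GET-based DoH, so identical lookups produce identical URLs and can be cached.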
⬐ AlexanderDhoore
High on humour, low on actual content. Very entertaining, though.
⬐ acqq
I actually waited to hear something, anything, interesting. Remained disappointed.
⬐ beerbajay
Yeah, I attended this talk at Øredev and had the same reaction. However, this year's conference was a general disappointment, so this was unfortunately one of the best talks I saw.