HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Invited Talk - Guy Steele

ClojureTV · YouTube · 195 HN points · 11 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention ClojureTV's video "Invited Talk - Guy Steele".
YouTube Summary
Guy L. Steele has been instrumental in designing and documenting several programming languages. Guy has served on accredited standards committees X3J11 (C language) and X3J3 (Fortran), and served as chairman of X3J13 (Common Lisp). He was also a member of the IEEE committee that produced the IEEE Standard for the Scheme Programming Language, IEEE Std 1178-1990. At Thinking Machines Corporation he co-developed the languages Connection Machine Lisp, C*, and Connection Machine Fortran. Whilst at Sun Microsystems (and now at Oracle) he has advised on the evolution of the Java programming language. He is author or co-author of five books: Common Lisp: The Language, C: A Reference Manual, The Hacker’s Dictionary, The High Performance Fortran Handbook, and The Java Language Specification. He has been given the Grace Murray Hopper Award (1988), the ACM SIGPLAN Programming Languages Achievement Award (1996), and the Harry H. Goode Memorial Award (2007).
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Excellent talk by Guy Steele explaining the history of maths/comp notations that can probably lead you to the right path through original sources: https://m.youtube.com/watch?v=dCuZkaaou0Q

Then have a look at this extremely underrated talk by Alexander Stepanov for basically falling in love with maths (he gives a long view of maths in general adorned with beautiful commentary): https://m.youtube.com/watch?v=fanm5y00joc

lichtenberger
Thanks, will watch later on :-)
Apr 05, 2022 · 2 points, 0 comments · submitted by Tomte
For a deeper exposition on language and notation, Guy Steele[0] is the definitive authority :-) - with some discussion here[1].

[0] https://www.youtube.com/watch?v=dCuZkaaou0Q

[1] https://news.ycombinator.com/item?id=15473199

Aug 15, 2021 · 2 points, 0 comments · submitted by tosh
Jul 29, 2021 · 1 point, 0 comments · submitted by Tomte
Nov 11, 2020 · 12 points, 0 comments · submitted by Tomte
Nov 09, 2020 · 1 point, 0 comments · submitted by yuribro
Oct 28, 2020 · 2 points, 0 comments · submitted by tosh
Coincidentally, I ran into this talk by Guy Steele today. He presents the development of the Computer Science Metalanguage.

https://youtu.be/dCuZkaaou0Q

The context here isn't the average American, but a high school student taking an AP test involving writing math -- where scoring well counts for the same credit as having passed the equivalent entry level course(s) in college.

Besides, math is already pseudo-language on top of pseudo-language. Learning notation is part of math. I'll concede that typing an expression on a computer is a different notation than writing it with a pen, but if you can't learn that, why learn math? Each domain does its own thing too, and isn't even necessarily consistent. Guy Steele gave a great talk a few years ago on "Computer Science Metanotation", and that's just one area of math: https://www.youtube.com/watch?v=dCuZkaaou0Q (Edit: And to conclude, learning to read and write a notation tends to be easy; it's learning what the math actually means that is the hard part.)

Mirioron
>but if you can't learn that, why learn math?

You can say this, but you won't like the response. The response is probably "okay" and they just won't learn math. I don't think the world is better off when that happens. It's not a question of whether people are capable of doing this; it's a question of whether this is worthwhile.

It's part of an arcane knowledge that is seldom reified.

"It's Time for a New Old Language" – Guy Steele [video] (youtube.com)

https://www.youtube.com/watch?v=dCuZkaaou0Q

https://news.ycombinator.com/item?id=15473199

He calls it "Computer Science Metanotation".

Found it! "It's Time for a New Old Language" – Guy Steele [video] (youtube.com)

https://news.ycombinator.com/item?id=15473199

https://www.youtube.com/watch?v=dCuZkaaou0Q

Jun 23, 2019 · 2 points, 0 comments · submitted by tosh
Jan 02, 2019 · 2 points, 1 comment · submitted by rfreytag
dang
Discussed at the time: https://news.ycombinator.com/item?id=15473199
You would probably enjoy this talk by Guy Steele where he discusses the language of mathematics that computer science people use in their papers (Computer Science Metanotation) to talk about other languages. Funnily enough, it's untyped and lacks a rigorous specification... https://www.youtube.com/watch?v=dCuZkaaou0Q
jwhitlark
There's a reason why we teach math in kindergarten and proofs in college. There are lots of things that are useful to do without the rigor of formal proof.
Not directly math, but Guy Steele's talk at Clojure Conj [0] on computer science type notation touches on related issues and is a great watch.

0. https://youtu.be/dCuZkaaou0Q

> I didn't really understand his talk I think he was too abstract

Then you better stay away from another talk at the same conference, by Guy Steele [0] !

[0] https://www.youtube.com/watch?v=dCuZkaaou0Q

systems
I did watch half of it... I left when I realized that he wasn't going to move on to another topic; he really was just talking about notations, comparing them, and telling stories about how they evolved.

I watched in anticipation that he would use this to make a point about Clojure or programming models or techniques, but he wasn't going to.

It was truly just a speech about notations, and I don't really care about that.

Oct 15, 2017 · 6 points, 0 comments · submitted by tosh
Here's a slightly edited rendition given at the latest Clojure/conj: https://www.youtube.com/watch?v=dCuZkaaou0Q
DonHopkins
Thanks, this is a fascinating talk!

TIL not to confuse these people:

https://en.wikipedia.org/wiki/Emil_Leon_Post

https://en.wikipedia.org/wiki/Emily_Post

Oct 15, 2017 · 8 points, 0 comments · submitted by ozy
Oct 14, 2017 · 152 points, 45 comments · submitted by zengid
heinrichhartman
Reminds me of Donald Knuth's talk at MSRI in 2004 on "Mathematical Notation" and the choices he made for TAOCP:

https://archive.org/details/lecture10604#reviews

"The speaker presents numerous examples of situations where good notations enhance mathematical problem solving, as well as a few that have been less helpful."

TeMPOraL
I've finished watching the talk, and I'm not sure whether this is serious or Guy Steele is just trolling people in a sophisticated way. The TL;DR as I understand it is:

- Computer science papers have this ad-hoc, ever evolving notation that started out in mathematics, that is not well defined (every paper ends up using a different flavour, sometimes even a bunch of mutually contradicting flavours at the same time), and that is subject to weird evolutionary pressures (like page limit of CS papers leading to extreme conciseness).

- Particularly interesting is the evolution of overline notation, and the ubiquitous but never defined use of ellipsis.

- Guy thinks this is a mess and should be cleaned up. His first contribution is solving overline notation abuse by introducing underlines, which cancel out overlines, thus implementing quasiquoting in mathematical notation. His second contribution is formalizing the meaning of ellipsis by defining a transformation from ellipsis form into over/underline form.

Maybe it's because of the way he presented it, or maybe because of my unfamiliarity with the space, but overall this talk really felt like a one-hour long practical joke aimed at CS academics.
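(To make the quasiquoting analogy concrete: this is not code from the talk, only a minimal Common Lisp sketch of the mechanism Steele is borrowing. Backquote builds a template and comma escapes back to evaluation, the way an underline is proposed to escape an overline.)

    ;; Backquote marks a template; comma "unquotes" one expression inside it,
    ;; analogous to an underline cancelling an overline in the notation.
    (let ((tau 'Int))
      `(has-type x ,tau))   ; => (HAS-TYPE X INT)

    ;; Without the comma, TAU stays an unevaluated symbol, just as a
    ;; metavariable without an underline stays part of the overlined template.
    (let ((tau 'Int))
      `(has-type x tau))    ; => (HAS-TYPE X TAU)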

TuringTest
I have not watched the talk in depth, so I can't tell whether it's trolling, but I have worked with academic CS papers. I would support an attempt to develop a new symbolism for describing CS theory, one closer to modern programming languages, or at least to a LISP-like domain-specific language.

It's true that the language of mathematics was optimized for a different medium: written formulas on paper. Algorithms are better expressed in pseudo-code, but to describe the exact effects of a language's instructions we have to resort to formal semantics (whether denotational, operational or axiomatic), which basically describe the workings of a virtual machine in terms of a temporal logic expressed with mathematical symbols.

The ellipsis in maths is usually an ambiguous symbol whose meaning changes depending on context. In CS, it typically represents either vectors in memory or successive recursive calls. Trying to provide a precise meaning for its usage is a legitimate endeavor; it appears that Steele is using symbols from vector maths to achieve it. What I get from skimming the video is that the combination of overlines and underlines would be a way to represent how you would combine your nested loops in imperative programming to traverse the data structure.
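(For readers who haven't seen formal semantics, here is the kind of inference rule being referred to: a standard call-by-value lambda-calculus fragment in the usual notation, not anything specific to Steele's talk.)

    % Two small-step operational-semantics rules for call-by-value application:
    % evaluate the function position first, then substitute the argument value.
    \frac{e_1 \longrightarrow e_1'}
         {e_1 \; e_2 \longrightarrow e_1' \; e_2}
    \qquad
    \frac{}{(\lambda x.\, e)\; v \longrightarrow e[v/x]}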

cryptonector
Watch the video. It's great!
throwaway7645
I'm guessing Ken Iverson's APL wouldn't work here? It can show algorithms just fine. I'm not sure about type theory though.
cryptonector
He's being dead serious. He says papers on this are coming up. I don't think he means that this is a "mess", as you put it, but rather, a ripe opportunity to make papers more readable and their CSM code runnable in various ways. That last part is very important. A great deal of material is locked up in these non-machine-readable algorithm descriptions that could be made useful for things like: code generation, actual implementation, correctness proofs, and so on. This is fantastic. It's exciting. I'm jumping up and down with joy at the thought. I'm thanking my lucky stars I watched this video today. I can't overemphasize nor overestimate just what a wonderful innovation this is. Just wow!

Edit: have I mentioned that I'm wowed by this?!

festidious
This talk makes perfect sense if you know a bit about Steele's recent research history.

It was a main focus of Fortress (https://en.wikipedia.org/wiki/Fortress_(programming_language...), for example, which was interesting but never quite got off the ground in the same way as other competitors, in part because of the focus on notation implementation.

If anything, my experience following Fortress led me to believe that what Steele was discussing in his talk is both a tremendous opportunity and much more difficult than the impression he creates.

For what it's worth, I thought the talk was really interesting and I would like to see it taken more seriously, if not as an immediate goal, then a long-term one.

samth
While we spent a lot of time on it, I don't think the failures of Fortress were about the time spent on notation.

(Source: I wrote the first parser for Fortress, as well as the first type checker and interpreter.)

titzer
I'm pretty familiar with this notation. No, he isn't trolling, it's not a practical joke, and even though it might seem esoteric and trivial to the outside world (hell, what academic discipline isn't?), he has a good point. The foundations of programming languages should use a consistent notation with well-defined semantics. Of course, theorem provers can encode most of these kinds of rule systems, and do have a concrete syntax that is not nearly as concise, but they do force the formalism into an automatic checker. Some academics really hate the "programming" of using a theorem prover, but I find it to be an incredible boon to the field.
chubot
Yeah I think it's because you haven't spent hours and hours of your life reading conference papers like he has.

I read a lot of programming language papers, but not type theory papers.

It definitely seems like there is a problem, but solving it would probably only help me a little. That said, I hope that people listen to what he has to say, because it seems very well-considered.

eric-hu
I also took the underbar suggestion as a joke/troll. Earlier in the talk he mentioned that horizontal lines don't work well with OCR. He also said parenthesis-based syntax worked better for typesetters, which I feel still holds true with LaTeX.

I think his overall point is meant to be serious though: that the language of formal CS should be more consistent.

kd0amg
Underline seems like a perfectly sensible thing to me. I've run into problems like his "same Γ, different x and τ" example many times. In PLT Redex, I have to either make sure the thing I intend to be non-subscripted is already bound elsewhere or make an after-the-fact check that all of the subscripted versions of it are equal. Both options are awkward, and which one I have to do varies by situation. An `unsequence' notation to complement the `sequence' notation (which is actually done in Redex by appending an ellipsis) would let me directly say what I mean. The OCR issue is strictly an aspect of the marks put on the page rather than the ideas they describe, and a completely serial notation could be used as well (similar to backtick and comma for quasiquotation).
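(As I understand the proposal, the underline acts as an escape within an overlined sequence, which is exactly what the "same Γ, different x and τ" case needs. A sketch of the intended reading, in my own rendering rather than Steele's exact notation:)

    % The underlined Gamma is held fixed while the overline replicates x and tau:
    \overline{\underline{\Gamma} \vdash x : \tau}
    \quad\text{standing for}\quad
    \Gamma \vdash x_1 : \tau_1,\ \ldots,\ \Gamma \vdash x_n : \tau_n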
tonyg
I took it as completely serious. Variation in notation for this kind of thing is a severe obstacle to understanding a paper. It also means the authors are missing out on useful tooling for checking the correctness of their notation, let alone their models.

Redex takes this in an interesting direction, allowing you to write your language models in S-expressions, run them, test them, and get LaTeX out automatically. See https://redex.racket-lang.org/ and https://eecs.northwestern.edu/~robby/lightweight-metatheory/... .

le-mark
Notation is so important. This was a highly memorable presentation by Gerald Sussman, and mathematical notation was a big part of it: "Programming for the Expression of Ideas". Very good to watch:

https://www.infoq.com/presentations/Expression-of-Ideas

gumby
...and of course Sussman and Steele did a lot of work together (including, most notably, the Ultimate papers).
grabcocque
Extending the old adage "the best programming language is the one you know the best" to "the best programming language is the one you created the best".
baldfat
That is why Macros and creating your own DSL is so powerful :)
JohnBooty
This is purely my opinion but after working in the Ruby world for a few years I've concluded that DSLs are harmful as all hell.

A DSL is almost never as well-implemented as a "real" language, à la Ruby/Java/Haskell/whatever. This is to be expected, as orders of magnitude more engineering time has been invested in those "real" languages.

Specifically, DSLs tend to be very leaky abstractions. I always find myself needing to look at what's happening in the bowels of the library providing the DSL anyway; a DSL essentially just adds another level of indirection.

Particularly for beginners, but also for more experienced coders, it can be difficult to tell where the language and its core libraries end and the DSL begins. If your DSL "foo" seamlessly adds the command "bar" to language "baz" and it's not working for me, where do I look for information? In "foo" or "baz?" The obvious example here is Ruby on Rails.

Some may say that Rails is a poor example of a DSL. While that may be, it raises the question: are there better examples out there?

Given that the goal of a DSL is to blend into the underlying language, I'm not sure how any implementation would avoid those pitfalls. To me, the best-case scenario is that DSLs generally save a moderate amount of typing in exchange for a considerable amount of cognitive overhead.

davexunit
I don't think it's fair to judge DSLs (embedded DSLs, or EDSLs, to be more precise) as a whole based on Ruby alone. I write Ruby at my day job and I think its metaprogramming features are lacking. Languages like Clojure, Scheme, and Common Lisp are much better places to look. Designing languages is an inherent part of programming, but not being able to extend the syntax of your host language (as is the case in many languages) is very limiting.

In the years I've spent writing ES5 JavaScript code, I've constantly wished there was a macro system I could use to stop writing this:

    (function($) { doSomeJQueryStuff(); })(jQuery)
and instead write something like this, which is what I could do in any other sane lexically scoped language:

    let($ = jQuery) { doSomeJQueryStuff(); }
Instead I had to wait for the language design gods to add similar syntax to ES6. I think any programmer should be able to put on their language designer hat and add some syntax if they want.
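(For contrast, in the Lisps davexunit mentions, that kind of binding form is something any user can define with a short macro. A minimal Common Lisp sketch; the macro name with-alias and the *jquery-like-library* variable are made up for illustration.)

    ;; WITH-ALIAS is user-defined syntax: a binding form added by an
    ;; ordinary library macro instead of waiting for a new language revision.
    (defmacro with-alias ((name value) &body body)
      `(let ((,name ,value))
         ,@body))

    ;; Hypothetical usage, assuming some *jquery-like-library* object exists:
    ;; (with-alias ($ *jquery-like-library*)
    ;;   (do-some-jquery-stuff $))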
flavio81
>Some may say that Rails is a poor example of a DSL. While that may be, it raises the question: are there better examples out there?

You are understanding DSL as "ok, let's import this library that defines a DSL".

In the Lisp world, one favored approach is to express the problem in the most natural way possible (i.e. as if describing the problem in English, or in any case in whatever DSL best suits the problem). Then you use Lisp macros to implement this DSL. This has the side effect of making your program work, and thus your problem gets solved. And thus the problem itself is specified in a high-level DSL, where it's easier to understand and modify.

Note also that in a Lisp-family language, creating a DSL is very easy, since metaprogramming is very similar to regular programming. (In Lisp, metaprogramming is not some advanced topic; quite the opposite, it is taught to beginners as well.)

So, for example, if I want to create a library that outputs HTML, I define my DSL, which can be something like:

   (with-html-output
      (:html
         (:head
            (:title my-page-title))))
or perhaps

   (do-html-output
      (do-html-head :title my-page-title
                    :includes js-include-list))
Then, later, I'll create the functions and macros that make my language work. This solves my problem.

As you can see, anybody who knows the problem domain of "let's do HTML output" will easily understand this DSL. In any case, if you are importing a library that defines a DSL, and the library is good, then by definition the DSL expresses the problem in the clearest way possible, and thus it should be an easy library to use.

Perhaps it has something to do with the syntax? In Lisp, the syntax is very uniform (s-expressions) while Ruby has a far greater variety of symbols and constructs.
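(To make the "then I'll create the functions and macros" step above concrete, here is a toy Common Lisp implementation of a with-html-output along those lines. It is my own minimal sketch, not an existing library, and it skips attributes and escaping.)

    ;; Helper used at macro-expansion time: a keyword-headed list becomes a
    ;; tag that wraps its children; any other form is evaluated and printed.
    (eval-when (:compile-toplevel :load-toplevel :execute)
      (defun expand-html (form)
        (if (and (consp form) (keywordp (first form)))
            (let ((tag (string-downcase (symbol-name (first form)))))
              `(progn
                 (format t "<~a>" ,tag)
                 ,@(mapcar #'expand-html (rest form))
                 (format t "</~a>" ,tag)))
            `(princ ,form))))

    (defmacro with-html-output (&body forms)
      `(progn ,@(mapcar #'expand-html forms)))

    ;; (let ((my-page-title "Hello"))
    ;;   (with-html-output (:html (:head (:title my-page-title)))))
    ;; prints <html><head><title>Hello</title></head></html>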

cujic9
> Note also that in a Lisp-family language, creating a DSL is very easy.

I'd argue that this is a bug, not a feature. Yes, DSLs can make for elegant code, but that elegance comes at a high cost in the form of a hidden dependency: the state of the DSL author's brain.

In order for us to share a codebase with a DSL, I now need to understand your unique, high-level view of the problem, and be able to speak it fluently.

And if your view of the problem is incorrect, I need to be able to hold two conflicting viewpoints in my head and find a way to resolve them. (Something that humans find quite difficult.)

argv_empty
A quick `s/DSL/library/g` and the result is just as true and just as much of an issue.
cujic9
I disagree.

A library works within the style and bounds of the base language.

A DSL intends to re-define the base language and invent new keywords and a unique control flow.

heavenlyblue
Yes, and that is exactly what programmers do: they create hierarchies. We started with assembly, and now we've got cloud platforms that scale easily to multiple computers and also need to be programmed.

God bless generations of programmers doing the same thing over and over again.

It's funny how you can't even agree that there _might_ be languages that do allow you to create your own DSLs within the style of those languages; there _isn't_ a base language.

JohnBooty
I don't agree at all.

On a small project that utilizes a single DSL or a single library, sure, the difference is negligible.

On a large project with many DSLs and/or libraries (i.e. Rails) it's not easy to tell if a given thing is part of Ruby or part of the many-headed beast that is Rails. (or which of those heads provided it...)

lmm
A good DSL is just a library - or at least, there's no clear sharp line between them. Likewise, I think good programming looks a lot like creating a DSL (or rather a succession of them, with a tower of interpreters from the business logic at top level down to lower-level DSLs until you get to one that can be interpreted directly by the machine). The difference between a good DSL and a bad DSL is that a good DSL lets you reason about it the way you reason about normal, plain old code: you can factor out repeated function calls, inline constants that are only used in one place, and so on, and it will all just work the way you'd expect. A good DSL should just be plain old code, as should most of the language standard library, and then it doesn't really matter whether a given function is part of the language or a library.

I've been very happy with the Scala DSL experience; I'd suggest something like https://github.com/http4s/rho as an example.

visarga
Reminds me of the "no free lunch" theorem from ML. To paraphrase, there is no language that is better than every other language on all tasks.

https://en.wikipedia.org/wiki/No_free_lunch_theorem

TeMPOraL
I view this as shifting complexity around.

All Turing-complete languages are essentially equivalent, and there is some essential minimum complexity in any computation you'd like to describe. Any given language we use makes trade-offs, which cause some programs to be expressed simply in it, but the cost is that some other programs will become much more complex to express in it.

tormeh
... which is why a certain class of programs is most elegantly described in brainfuck?

Certainly there are tradeoffs to be made, but this does not mean that some languages are not simply better than others.

TeMPOraL
> ... which is why a certain class of programs is most elegantly described in brainfuck?

Possibly, though for sure those aren't programs you'd usually want to write.

The key phrase of my comment was "essential complexity". You can pile up extra complexity that's not helping anything, but assuming you got rid of it, all you can do to make things easier is to shift the remaining complexity around the problem space.

tormeh
I'd argue that no language in existence is even remotely close to having no extra complexity.
TuringTest
Maybe, but comp-sci does not study all possible programs with equal probability. A notation should be optimized for the kind of problems that are more often studied, and make expressing those easier.

If this pushes complexity towards the expression of randomly generated programs, that's a good thing (except for the few guys that study randomly generated programs, that is).

TeMPOraL
Yes. This is the general mental model I use when explaining that the design goal of a good general-purpose language is to shift the complexity to the areas almost no one will care about. Also, your corollary from a sibling comment can be restated as: you can get extra simplicity for a particular task by designing a DSL that shifts the complexity outside its domain.
sampo
> https://en.wikipedia.org/wiki/No_free_lunch_theorem

This is a very theoretical result that people often overinterpret because it has a catchy name. But it has no relevance for real-world data.

DonbunEf7
No Free Lunch is very real, and occurs in many different guises under different masks. Here is an example constructive instantiation of No Free Lunch: https://blog.acolyer.org/2017/09/12/universal-adversarial-pe...
sampo
That article doesn't mention no free lunch at all. Also, they're just calculating small perturbations that send the classifier over decision boundaries.
stcredzero
As I noted below, it could be argued that x86-64 is a universal programming language that currently serves all purposes across a very wide swath of all computation, modulo a number of interpreters, compilers, and VMs. (Even that doesn't take care of literally all, however.) So maybe there's no free lunch, but you can get lunch for a "coupon" in the form of a compiler or interpreter, which for many is close enough to free.
carapace
All Turing-Complete languages are universal.

No Turing-Complete language is best for all problems.

(That's why CPUs get GPUs, for example.)

stcredzero
> All Turing-Complete languages are universal. No Turing-Complete language is best for all problems.

But that doesn't mean they are or are not "fit for purpose." That brings in a lot of subjective and human factors.

TuringTest
As a corollary, if you're using a general purpose language, there will always exist some domain where a Domain Specific Language will be equal or better.
tomp
That obviously applies to all domains of life, but when it's actually said out loud, it's most often a cop-out. It's very improbable that the current-era languages are at (or anywhere near) the local optima of their respective niches. Sure, you don't want to issue database queries in C++ or write a high-frequency trading engine in JavaScript, but there's still a lot of progress being made, and to be made, on every front of programming language development (syntax, type systems, semantics, libraries, compilation strategies, memory/resource management, ...).
jasode
>improbable that the current-era languages are at (or anywhere near) the local optima of their respective niches.

Your statement is true (e.g. Javascript/C++/etc. syntax is suboptimal and could be better). However, it's a separate concept from what the grandparent posters grabcocque and visarga are talking about. If we engage with their point, it's not a "cop-out" but a statement of an important fundamental truth about the syntax of all programming languages. Many beginners do not know this truth, as can be seen from the following questions in the wild:

- Why can't there be an “universal” programming language that serves all purposes? : https://softwareengineering.stackexchange.com/questions/4889...

- ELI5: Why isn't there a universal programming language? : https://www.reddit.com/r/explainlikeimfive/comments/j2v84/el...

- Why are there so many programming languages? : https://stackoverflow.com/questions/4334954/why-are-there-so...

- Why there are so many programming languages? Can't we create one language to do everything? : https://www.quora.com/Why-there-are-so-many-programming-lang...

The top-voted answers in each case try to explain it using analogies (e.g. a hammer is the wrong tool for driving screws, etc.). However, I'm not sure they actually provide the insight needed to make the questioner understand that creating the "One Universal Programming Language" is mathematically impossible. It's not possible to express multiple disparate concepts using finite characters with minimum string length for ease of typing and reading. Not all desirable concepts can simultaneously share the same minimal syntax for convenience. (This impossibility is also not solved by splitting concepts via language-vs-library as in "language reserved keywords" vs "library function calls".)

(As an educational exercise, we could ask the questioner to try to invent a "One Universal Programming Language". He would soon run into contradictions rooted in trying to express multiple concepts via finite symbols. Eventually, some of the concepts he desires will end up being encoded with noisier and less convenient syntax.)

stcredzero
> Why can't there be an “universal” programming language that serves all purposes?

It could be argued that x86-64 is a universal programming language that currently serves all purposes across a very wide swath of all computation, modulo a number of interpreters, compilers, and VMs. Even that doesn't take care of literally all, however.

mbroncano
Not really: you don't type assembly to perform any of those tasks. The point is convenience, readability, etc.
onlinetimesheet
Worth reading information, it helps me a lot Great post
cryptonector
Wow, this is a very nice talk. I particularly liked the Q&A, where Guy Steele went back and explained the type checker shown earlier. And I also loved the LISP macro backtick-comma inspiration for using underline to "escape" overline. Brilliant!
Oct 14, 2017 · 5 points, 0 comments · submitted by smnplk
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.