HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
"Propositions as Types" by Philip Wadler

Strange Loop Conference · YouTube · 28 HN points · 18 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Strange Loop Conference's video "Propositions as Types" by Philip Wadler.
YouTube Summary
The principle of Propositions as Types links logic to computation. At first sight it appears to be a simple coincidence---almost a pun---but it turns out to be remarkably robust, inspiring the design of theorem provers and programming languages, and continuing to influence the forefronts of computing. Propositions as Types has many names and many origins, and is a notion with depth, breadth, and mystery. Learn why functional programming is (and is not) the universal programming language.

Philip Wadler
UNIVERSITY OF EDINBURGH
@PhilipWadler

Philip Wadler is Professor of Theoretical Computer Science at the University of Edinburgh. He is an ACM Fellow and a Fellow of the Royal Society of Edinburgh, past chair of ACM SIGPLAN, past holder of a Royal Society-Wolfson Research Merit Fellowship, and a winner of the POPL Most Influential Paper Award. Previously, he worked or studied at Stanford, Xerox Parc, CMU, Oxford, Chalmers, Glasgow, Bell Labs, and Avaya Labs, and visited as a guest professor in Copenhagen, Sydney, and Paris. He has an h-index of 60, with more than 18,000 citations to his work according to Google Scholar. He contributed to the designs of Haskell, Java, and XQuery, and is a co-author of Introduction to Functional Programming (Prentice Hall, 1988), XQuery from the Experts (Addison Wesley, 2004) and Generics and Collections in Java (O'Reilly, 2006). He has delivered invited talks in locations ranging from Aizu to Zurich.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
A semi-related video about the relationship between mathematics and computer science:

https://www.youtube.com/watch?v=IOiZatlZtGU "Propositions as Types" by Philip Wadler

Notoriously, Philip Wadler says that computer science has two problems: computer and science.

It's not about computers and "You don't put science on your name if you're a real science"

He prefers the name informatics.

source: https://youtube.com/watch?v=IOiZatlZtGU

wisemang
Kinda like countries that feel the need to put “democratic” in their name.
phaedrix
Or "People"...
behnamoh
Or "Republic"...
Barfieldmv
Or "Of China"....
viraptor
Also applies to newspapers which include some version of "truth" in the title. (Russian propaganda "Pravda", Polish tabloid "Fakt", etc.)
Frost1x
"Information science" is basically long form of "informatics" so that breaks it I'd say. Also, "information" tends to imply a focus on state and places computational aspects (operations performed on information) as second hand.

I've yet to find a classification I really like but this is an interesting take. I still tend to like CIS (Computing and Information Sciences). The problem with CS is it focuses on computation and puts state as second class. The problem with IS is it focuses on state and puts computing as second class. To me, both are equally important.

peoplefromibiza
TBF informatics is "the study of computational systems"

I have studied informatics, but don't call myself an informatician, because I am not doing any research.

I call myself a programmer, because I am not doing engineering either.

lr1970
> It's not about computers and "You don't put science on your name if you're a real science"

"Computer science is no more about computers than astronomy is about telescopes." - Edsger W. Dijkstra [0], [1].

[0] https://en.wikipedia.org/wiki/Edsger_W._Dijkstra

[1] https://en.wikiquote.org/wiki/Computer_science

anamax
Materials Science is about as sciency as you can get.
rsj_hn
An economist friend of mine told me that once, so this isn't just a CS quip.
kragen
Informática is a common term in Spanish, as you probably know.
dmlorenzetti
You don't put science on your name if you're a real science

Having flashbacks to when a close friend was getting an MS in Political Science, and spent the first semester in a class devoted to whether or not political science is a science.

HeyLaughingBoy
Well? Don't keep us hanging.
People interested in this topic might also enjoy this video by Philip Wadler describing the way logicians and computer scientists converged towards equivalent ideas.

https://www.youtube.com/watch?v=IOiZatlZtGU

vishnugupta
This is a fantastic talk. I've watched a couple of different versions of this.

Towards the end he spoke about something in passing that's stayed with me.

Throughout the early/mid 20th century, different mathematicians created their own theories of mechanised computation: Turing machines, lambda calculus, SK combinators, etc. However, over time they all turned out to be equivalent (the Church-Turing thesis, for example). So this leads one to believe strongly that mechanised computation is "discovered" rather than invented. It is also a good indication that computing theory is on a solid foundation.

However, when it comes to distributed and parallel computing, theories have been proposed but no equivalence has been found (or at least proved). It's an active field which he hopes will result in a similarly solid foundation for distributed computing. At the moment it seems to be on shaky ground.

kenjackson
What is a distributed or parallel computing theory in this context? They can’t compute anything above and beyond sequential computers.
zozbot234
> However, when it comes to distributed and parallel computing theories have been proposed but no equivalence has been found

Is this really the case? It seems that game semantics ought to be able to model these concerns. The logical side of it would then be some sort of multi-modal logic.

felixyz
This three-part series of (long) posts is a very thorough and fascinating counterpoint to the idea that computation was discovered and that there is anything surprising about the confluence of lambda calculus, Turing machines etc, mostly using primary sources from Leibniz onwards:

"Finite of Sense and Infinite of Thought: A History of Computation, Logic and Algebra"

https://pron.github.io/posts/computation-logic-algebra-pt1

Aug 19, 2021 · 2 points, 0 comments · submitted by avinassh
May 16, 2021 · namelosw on Shirts of Peter Norvig
My favorite is Philip Wadler - during a Strange Loop talk[0], all of a sudden he tore his shirt, showing his "lambda man" suit underneath.

[0] https://youtu.be/IOiZatlZtGU?t=2014

May 31, 2020 · 3 points, 0 comments · submitted by tosh
Nov 03, 2019 · 1 points, 0 comments · submitted by tosh
Oct 13, 2019 · 2 points, 0 comments · submitted by furcyd
Philip Wadler showcases and comments on some of this history in his conference talk "Propositions as Types" https://www.youtube.com/watch?v=IOiZatlZtGU
proc0
This talk is responsible for my outrage whenever people say math is invented.
Feb 19, 2019 · BucketSort on Metamath
Proof assistants are like that. Check out Coq or Agda. If you aren't aware of it already, your spirit is calling for the stuff they talk about a lot in the functional programming community. See https://youtu.be/IOiZatlZtGU for example. Maybe not the best video or intro to the themes though. Wadler also just put out a book about programming language foundations in Agda - https://plfa.github.io/. https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon... being one of the profound ideas here.
Propositions as Types by Philip Wadler

A really in-depth look into why functional languages stand out as a programming paradigm. The short of it is that they parallel mathematical logic perfectly (almost proof for proof!).

https://www.youtube.com/watch?v=IOiZatlZtGU
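
As a rough, hedged illustration of that parallel (my own minimal Haskell sketch, not taken from the talk): read an implication as a function type, a conjunction as a pair, and a disjunction as Either; a proof of a proposition is then simply a program of the corresponding type.

    -- Minimal sketch of the propositions-as-types reading (illustration only):
    --   implication A -> B   ~   a function from proofs of A to proofs of B
    --   conjunction A and B  ~   a pair (a, b)
    --   disjunction A or B   ~   Either a b

    -- Modus ponens: from A -> B and A, conclude B.
    -- As a program, the proof is just function application.
    modusPonens :: (a -> b) -> a -> b
    modusPonens f x = f x

    -- Conjunction is commutative: (A and B) -> (B and A).
    -- The proof swaps the components of the pair.
    andComm :: (a, b) -> (b, a)
    andComm (x, y) = (y, x)

    -- Disjunction is commutative: (A or B) -> (B or A).
    orComm :: Either a b -> Either b a
    orComm (Left x)  = Right x
    orComm (Right y) = Left y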

Sep 09, 2018 · 2 points, 1 comments · submitted by tosh
gus_massa
The youtu.be domain is autokilled. Try resubmitting using the youtube.com version of the link.
I know it seems like one of those Lispers' idiosyncrasies--"discovered" instead of "invented".

I first heard it explained by Philip Wadler, and upon reflection it was one of those "enlightenment" moments that Lispers always go on about: lambda calculus is a discovery, and should we ever encounter an alien space-faring species, we could read their programs in spite of their alien processing architectures (given that they use a functional language, but let's face it, they have the tech to cross space, they know what they are doing, i.e. they're in the functional camp).

"Propositions as Types" by Philip Wadler, Strange Loop. (https://youtu.be/IOiZatlZtGU?t=1678)

s-shellfish
Yes, there's a real reason for using the word discovered instead of invented. Depends on how webbed in you are intellectually to people around you. Which is often highly dependent on language, logic, and math.

Lambda calculus. Enlightenment. Discovered, not invented. If it already exists somewhere... Everything coming together in the same patterns. Influence.

agumonkey
Then you can add a layer of Kay: "Lisp is the Maxwell equation of computing".

I do agree that lisp being super tiny and quite mathematical at heart is probably less a social invention than the rest.

Professor Philip Wadler has a fantastic talk titled "Propositions as Types" that introduces the Curry-Howard correspondence, some history behind it and ideas following it.

In fact it's one of my favorite talks. Well worth a watch if you haven't seen it: https://www.youtube.com/watch?v=IOiZatlZtGU&t=1s

Jul 08, 2018 · 3 points, 0 comments · submitted by tosh
Philip Wadler's 'Propositions as Types': https://youtu.be/IOiZatlZtGU

Bonus: that was the talk that introduced Lambdaman to the world.

Nov 21, 2017 · 2 points, 0 comments · submitted by fagnerbrack
Came here to mention MyPy and optional typing, the preferable term; Gradual Typing means something distinctly different.

http://mypy-lang.org/about.html

Dropbox employs the creator of MyPy, but it isn't a creation of Dropbox.

Type Systems as Macros http://www.ccs.neu.edu/home/stchang/pubs/ckg-popl2017.pdf

Pluggable Type Systems http://bracha.org/pluggableTypesPosition.pdf

Propositions as Types http://homepages.inf.ed.ac.uk/wadler/papers/propositions-as-...

https://www.youtube.com/watch?v=IOiZatlZtGU

tedmiston
> Came here to mention MyPy and optional typing, the preferable term; Gradual Typing means something distinctly different.

Are you sure that gradual typing is different than what Mypy supports?

The Wikipedia page for gradual typing cites mypy as an example.

https://en.wikipedia.org/wiki/Gradual_typing

The Mypy docstrings specifically state:

> Mypy has a powerful type system with features such as type inference, gradual typing, generics and union types.

https://github.com/python/mypy/blob/272d0c4c50918a36b57e7c0c...

The section on gradual typing in PEP 483 seems consistent with this as well.

https://www.python.org/dev/peps/pep-0483/#summary-of-gradual...

sitkack
Do they really insert extra dynamic checks?

Is Sound Gradual Typing Dead? https://news.ycombinator.com/item?id=11041272

http://prl.ccs.neu.edu/gtp/research.html

It was my understanding that MyPy was purely compile time with no runtime checks. But I could be wrong. Gradual Typing as implemented in the original paper would lead to much much slower code. The weight of a static language bolted onto a dynamic runtime for the worst of both possible worlds. Sufficiently abstract Python code is almost un-typable.

I think eventually we will get there, where one can explore a program in a dynamic language and ship a correctly statically typed one. Types are only the beginning.

tedmiston
I believe you're right that Mypy only offers compile-time type checking, not runtime checks.

It looks like there are some active projects for runtime type checking in Python and while I don't believe Mypy is related to any of them, enforce [1] looks to also use the same standard type hinting syntax from PEP 484.

[1]: https://github.com/RussBaz/enforce

Curry-Howard and Wadler's "Propositions as Types" talk at StrangeLoop* opened my eyes pretty early on to the sweet, sweet utility of logic in my everyday programming. It sounds odd to start with, but I never realized why these things mattered prior to viewing this talk.

props to wadler. best ~40 minutes I've ever spent.

*https://www.youtube.com/watch?v=IOiZatlZtGU

protomikron
Thank you, that was a delightful talk; I did not know Wadler was so enthusiastic (but I have to say, most logicians and theoretical computer scientists are).
Dec 27, 2016 · 6 points, 0 comments · submitted by tosh
The lambda calculus teaches you how to build computation from a very simple evaluation rule called "beta reduction" which is just a fancy name for "search and replace" with function parameters. If you can find a system that can rewrite terms in such a fashion, you can compute whatever you want using the tricks you learn in lambda calculus.
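
To make "search and replace" concrete, here is a minimal Haskell sketch of a single beta step over a toy term type (my own illustration; substitution is deliberately capture-naive, so it assumes all bound variable names are distinct):

    -- A toy untyped lambda calculus: beta reduction really is
    -- "search and replace" of a bound variable with an argument.
    data Term
      = Var String        -- x
      | Lam String Term   -- \x. body
      | App Term Term     -- f a
      deriving Show

    -- subst x s t replaces free occurrences of x in t with s
    -- (capture-naive: assumes bound names are all distinct).
    subst :: String -> Term -> Term -> Term
    subst x s (Var y)
      | x == y    = s
      | otherwise = Var y
    subst x s (Lam y body)
      | x == y    = Lam y body               -- x is shadowed; stop here
      | otherwise = Lam y (subst x s body)
    subst x s (App f a) = App (subst x s f) (subst x s a)

    -- One beta step: (\x. body) arg  reduces to  body[x := arg].
    betaStep :: Term -> Maybe Term
    betaStep (App (Lam x body) arg) = Just (subst x arg body)
    betaStep _                      = Nothing

    -- Example: (\x. x) y  steps to  y
    --   betaStep (App (Lam "x" (Var "x")) (Var "y"))  ==  Just (Var "y")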

For example, C++ templates were not meant to be Turing-complete, but they work by rewriting terms [1], so you can encode lambda calculus in it, and run arbitrary programs at compile time.

Philip Wadler does a better job than me at getting excited by lambda calculus. He calls it "the omniversal programming language", and claims that aliens will probably have discovered it [2].

[1]: http://matt.might.net/articles/c++-template-meta-programming...

[2]: https://www.youtube.com/watch?v=IOiZatlZtGU

Propositions as Types: https://www.youtube.com/watch?v=IOiZatlZtGU

I think this can really, really change how we look at everyday programming tasks, everywhere from the type of tooling we choose to how we approach problems.

Sep 11, 2016 · 1 points, 0 comments · submitted by djfm
Jul 20, 2016 · mietek on Very Long Proofs
If you’re using the right language, proofs are programs; it’s as simple as that. “Compressing” a proof, then, amounts to doing what good programmers are already doing — noticing patterns and eliminating redundancy.

I’ll refer you to an authority on the subject, Phil Wadler. If you like watching videos, try <https://www.youtube.com/watch?v=IOiZatlZtGU>. If you’d rather read an entertaining paper, try <http://homepages.inf.ed.ac.uk/wadler/papers/propositions-as-....
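
A small, hedged illustration of that "compression" in Curry-Howard terms (my own Haskell sketch, not from the linked talk or paper): two proofs that repeat the same reasoning can be shortened by factoring the shared pattern out as a lemma, just as a programmer extracts a helper function.

    -- Two propositions proved with the same repeated pattern:
    --   A -> (A and A)    and    (A and B) -> ((A and B) and (A and B))
    dupA :: a -> (a, a)
    dupA x = (x, x)

    dupAB :: (a, b) -> ((a, b), (a, b))
    dupAB p = (p, p)

    -- "Compressing" the proof: notice the shared pattern and state it
    -- once as a lemma (a polymorphic helper).
    dup :: c -> (c, c)
    dup x = (x, x)

    -- Both original results now follow from the lemma by instantiation.
    dupA' :: a -> (a, a)
    dupA' = dup

    dupAB' :: (a, b) -> ((a, b), (a, b))
    dupAB' = dup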

pron
That doesn't make proofs any shorter or any more elegant, though. The same "noticing patterns" can be done on the proof in non-FP form. C-H is basically an observation about encoding.
sedachv
The Curry-Howard correspondence does not apply here though. The coloring theorem is not encoded in the type of the program that enumerates and checks the possible colorings. The program itself is a restatement of the coloring theorem, with the result type being boolean (all the enumerated colorings satisfy the property, or not). So the proof would consist of proving that the program must terminate with a true value. Of course that is not actually useful as it does not simplify anything.

This is a simple mechanization of the proof steps, on par with using a calculator for arithmetic.

vilhelm_s
Depending on what programming language you use for your Curry-Howard correspondence, there is actually a related thing. Probably the most well known realization of C-H is dependent type theories like Coq and Agda. And those support a proof method called "proof by reflection".

The idea is that in those languages, types can contain program terms as subexpressions, and two types are considered "the same" if they evaluate to the same expression. Because the languages enforce that all expressions terminate, it's still decidable whether a program is well-typed or not, so the language can still be considered a proof system, but the type checker can take almost arbitrarily long time to check a program because it has to evaluate subexpressions to values in order to check type equality.

Anyway, the point is that this can be used to write down short programs that act as proofs. For example, suppose you want to prove there are no counterexamples to Fermat's last theorem for exponent 5 and numbers less than 100000000. In this case, you would write down a function

    f : nat -> bool
which evaluates the expression for all numbers less than the argument and returns whether there is a counterexample, then you would prove a lemma

   soundness : forall n, (f n = false) 
              -> forall x y z, 
                   (x < n) -> (y < n) -> (z < n) 
                   -> (x^5 + y^5 ≠ z^5)
and then the proof of the final theorem is just

   soundness 100000000 eq_refl
In order to check the eq_refl part, the type checker would evaluate (f 100000000) and verify that it evaluated to false. But all that computation is not recorded in the proof term. So the program that represents the proof is very small, and constant size in the bound.

This is actually very useful in practice, because storing and manipulating big proof terms is slow and memory-hungry, so it's useful to be able to push the computation under the carpet in this way. On the other hand, it is sometimes considered a little philosophically dubious.

May 19, 2016 · 1 points, 0 comments · submitted by gulda
If you're finding the paper too dense, Philip gave an extremely entertaining talk on the same subject at Strange Loop this year: https://www.youtube.com/watch?v=IOiZatlZtGU
jcox92
+1 Probably my favorite talk at Strange Loop this year. It's worth watching just for the minute or so starting here: https://www.youtube.com/watch?v=IOiZatlZtGU&t=33m
Oct 02, 2015 · 3 points, 0 comments · submitted by ericdykstra
Sep 28, 2015 · 2 points, 0 comments · submitted by smosher_
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.