Hacker News Comments on "Propositions as Types" by Philip Wadler
Strange Loop Conference · YouTube · 28 HN points · 18 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video.

A semi-related video about the relationship between mathematics and computer science: https://www.youtube.com/watch?v=IOiZatlZtGU ("Propositions as Types" by Philip Wadler)
Notoriously, Philip Wadler says that computer science has two problems: computer and science. It's not about computers, and "you don't put science on your name if you're a real science."
He prefers the name informatics.
⬐ wisemang Kinda like countries that feel the need to put “democratic” in their name.
⬐ phaedrix
⬐ Frost1x Or "People"...
⬐ behnamoh
⬐ viraptor Or "Republic"...
⬐ Barfieldmv Or "Of China"...
Also applies to newspapers which include some version of "truth" in the title (Russian propaganda "Pravda", Polish tabloid "Fakt", etc.).
"Information science" is basically the long form of "informatics", so that breaks it, I'd say. Also, "information" tends to imply a focus on state and treats computational aspects (operations performed on information) as secondhand.
I've yet to find a classification I really like, but this is an interesting take. I still tend to like CIS (Computing and Information Sciences). The problem with CS is that it focuses on computation and makes state second class. The problem with IS is that it focuses on state and makes computing second class. To me, both are equally important.
⬐ peoplefromibiza
⬐ lr1970 TBF, informatics is "the study of computational systems".
I have studied informatics, but I don't call myself an informatician, because I am not doing any research.
I call myself a programmer, because I am not doing engineering either.
> It's not about computers and "You don't put science on your name if you're a real science"

"Computer science is no more about computers than astronomy is about telescopes." - Edsger W. Dijkstra [0], [1]
⬐ anamax Materials Science is about as sciency as you can get.
⬐ rsj_hn An economist friend of mine told me that once, so this isn't just a CS quip.
⬐ kragen Informática is a common term in Spanish, as you probably know.
⬐ dmlorenzetti > You don't put science on your name if you're a real science

Having flashbacks to when a close friend was getting an MS in Political Science and spent the first semester in a class devoted to whether or not political science is a science.
⬐ HeyLaughingBoy Well? Don't keep us hanging.
People interested in this topic might also enjoy this video by Philip Wadler describing the way logicians and computer scientists converged towards equivalent ideas.
⬐ vishnugupta This is a fantastic talk. I've watched a couple of different versions of this. Towards the end he spoke about something in passing that's stayed with me.
Throughout the early/mid 20th century, different mathematicians created their own theories of mechanised computation: Turing machines, the lambda calculus, SK combinators, etc. However, they all turned out to be equivalent; see, for example, the Church-Turing thesis. This leads one to believe strongly that mechanised computation is "discovered" rather than invented. It is also a good indication that computing theory rests on a solid foundation.
However, when it comes to distributed and parallel computing, theories have been proposed but no equivalence has been found (or at least proved). It's an active field which he hopes will result in a similarly solid foundation for distributed computing. At the moment it seems to be on shaky ground.
⬐ kenjackson What is a distributed or parallel computing theory in this context? They can't compute anything above and beyond sequential computers.
⬐ zozbot234 > However, when it comes to distributed and parallel computing theories have been proposed but no equivalence has been found

Is this really the case? It seems that game semantics ought to be able to model these concerns. The logical side of it would then be some sort of multi-modal logic.
⬐ felixyz This three-part series of (long) posts is a very thorough and fascinating counterpoint to the idea that computation was discovered and that there is anything surprising about the confluence of lambda calculus, Turing machines, etc., mostly using primary sources from Leibniz onwards: "Finite of Sense and Infinite of Thought: A History of Computation, Logic and Algebra"
My favorite is Philip Wadler: during a Strange Loop talk [0], all of a sudden he tore open his shirt, revealing his "lambda man" suit underneath.
Philip Wadler showcases and comments on some of this history in his conference talk "Propositions as Types" https://www.youtube.com/watch?v=IOiZatlZtGU
⬐ proc0 This talk is responsible for my outrage whenever people say math is invented.
Proof assistants are like that. Check out Coq or Agda. If you aren't aware of it already, your spirit is calling for the stuff they talk about a lot in the functional programming community. See https://youtu.be/IOiZatlZtGU for example. Maybe not the best video or intro to the themes though. Wadler also just put out a book about programming language foundations in Agda - https://plfa.github.io/. https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon... being one of the profound ideas here.
Propositions as Types by Philip Wadler

A really in-depth look into why functional languages stand out as a programming paradigm. The short of it: they parallel mathematical logic perfectly (almost proof for proof!).
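As a toy illustration of that parallel (my sketch, not something from the talk): read through Curry-Howard, a total function of type A -> B is evidence for "A implies B", so ordinary combinators double as proofs of logical laws. In Python-flavored typed code:

```python
from typing import Callable, Tuple, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    """Proof that implication is transitive: from A->B and B->C, get A->C."""
    return lambda a: g(f(a))

def swap(p: Tuple[A, B]) -> Tuple[B, A]:
    """Proof that conjunction commutes: (A and B) implies (B and A)."""
    a, b = p
    return (b, a)

print(compose(lambda x: x + 1, lambda x: x * 2)(3))  # (3 + 1) * 2 = 8
print(swap((True, "ok")))                            # ('ok', True)
```

The proof/program pun is exact only for total functions; Python cannot enforce totality, which is why the talk's examples live in proof assistants instead.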
⬐ gus_massa The youtu.be domain is autokilled. Try resubmitting using the youtube.com version of the link.
I know it seems like one of those Lispers' idiosyncrasies: "discovered" instead of "invented". I first heard it explained by Philip Wadler, and upon reflection it was one of those "enlightenment" moments that Lispers always go on about: lambda calculus is a discovery, and should we ever encounter an alien space-faring species, we could read their programs in spite of their alien processing architectures (given that they use a functional language; but let's face it, they have the tech to cross space, they know what they are doing, i.e. they're in the functional camp).
"Propositions as Types" by Philip Wadler, Strange Loop. (https://youtu.be/IOiZatlZtGU?t=1678)
⬐ s-shellfish Yes, there's a real reason for using the word discovered instead of invented. Depends on how webbed in you are intellectually to people around you. Which is often highly dependent on language, logic, and math. Lambda calculus. Enlightenment. Discovered, not invented. If it already exists somewhere... Everything coming together in the same patterns. Influence.
⬐ agumonkey Then you can add a layer of Kay: "Lisp is the Maxwell equation of computing". I do agree that Lisp, being super tiny and quite mathematical at heart, is probably less of a social invention than the rest.
Professor Philip Wadler has a fantastic talk titled "Propositions as Types" that introduces the Curry-Howard correspondence, some history behind it, and ideas following it. In fact it's one of my favorite talks. Well worth a watch if you haven't seen it: https://www.youtube.com/watch?v=IOiZatlZtGU&t=1s
Philip Wadler's 'Propositions as Types': https://youtu.be/IOiZatlZtGU

Bonus: that was the talk that introduced Lambdaman to the world.
Came here to mention MyPy and optional typing; the preferable term "gradual typing" means something distinctly different. http://mypy-lang.org/about.html
Dropbox employs the creator of MyPy, but it isn't a creation of Dropbox.
Type Systems as Macros http://www.ccs.neu.edu/home/stchang/pubs/ckg-popl2017.pdf
Pluggable Type Systems http://bracha.org/pluggableTypesPosition.pdf
Propositions as Types http://homepages.inf.ed.ac.uk/wadler/papers/propositions-as-...
⬐ tedmiston > Came here to mention MyPy and optional typing, the preferable term Gradual Typing means something distinctly different.

Are you sure that gradual typing is different from what Mypy supports?
The Wikipedia page for gradual typing cites mypy as an example.
https://en.wikipedia.org/wiki/Gradual_typing
The Mypy docstrings specifically state:
> Mypy has a powerful type system with features such as type inference, gradual typing, generics and union types.
https://github.com/python/mypy/blob/272d0c4c50918a36b57e7c0c...
The section on gradual typing in PEP 483 seems consistent with this as well.
https://www.python.org/dev/peps/pep-0483/#summary-of-gradual...
⬐ sitkack Do they really insert extra dynamic checks?

Is Sound Gradual Typing Dead? https://news.ycombinator.com/item?id=11041272
http://prl.ccs.neu.edu/gtp/research.html
It was my understanding that MyPy is purely compile-time, with no runtime checks. But I could be wrong. Gradual typing as implemented in the original paper would lead to much, much slower code: the weight of a static language bolted onto a dynamic runtime, for the worst of both possible worlds. Sufficiently abstract Python code is almost un-typable.
I think eventually we will get there, where one can explore a program in a dynamic language and ship a correctly statically typed one. Types are only the beginning.
⬐ tedmiston I believe you're right that Mypy only offers compile-time type checking, not runtime. It looks like there are some active projects for runtime type checking in Python, and while I don't believe Mypy is related to any of them, enforce [1] looks to also use the same standard type hinting syntax from PEP 484.
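A minimal sketch of that compile-time-only point (the function name here is made up for illustration): CPython stores annotations as metadata but never enforces them, so a call that mypy would flag still runs.

```python
def shout(message: str) -> str:
    # mypy checks this signature statically; CPython does not enforce it.
    return f"{message}!"

# A static checker flags this call (int is not str), yet at runtime it
# succeeds, because annotations are just metadata on the function object:
result = shout(42)
print(result)                 # 42!
print(shout.__annotations__)  # {'message': <class 'str'>, 'return': <class 'str'>}
```

Runtime enforcement would require a separate tool that reads those annotations and inserts checks, which is exactly the dynamic-check overhead the sound-gradual-typing debate is about.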
Curry-Howard and Wadler's "Propositions as Types" talk at Strange Loop opened my eyes pretty early on to the sweet, sweet utility of logic in my everyday programming. It sounds odd to start with, but I never realized why these things mattered prior to viewing this talk. Props to Wadler. Best ~40 minutes I've ever spent.
⬐ protomikron Thank you, that was a delightful talk. I did not know Wadler was so enthusiastic (but I have to say, most logicians and theoretical computer scientists are).
The lambda calculus teaches you how to build computation from a very simple evaluation rule called "beta reduction", which is just a fancy name for "search and replace" with function parameters. If you can find a system that can rewrite terms in such a fashion, you can compute whatever you want using the tricks you learn in lambda calculus. For example, C++ templates were not meant to be Turing-complete, but they work by rewriting terms [1], so you can encode lambda calculus in them and run arbitrary programs at compile time.
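A quick illustration of "computation from application alone", using the standard Church encodings (nothing here is specific to the linked article): numbers become functions, and arithmetic is nothing but function application, Python's stand-in for beta reduction.

```python
# Church numerals: n means "apply f n times to x".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul  = lambda m: lambda n: lambda f: n(m(f))

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
print(to_int(mul(two)(three)))  # 6
```

Every step above is a function call substituting arguments for parameters, which is exactly the search-and-replace rule the comment describes.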
Philip Wadler does a better job than me at getting excited by lambda calculus. He calls it "the omniversal programming language", and claims that aliens will probably have discovered it [2].
[1]: http://matt.might.net/articles/c++-template-meta-programming... [2]: https://www.youtube.com/watch?v=IOiZatlZtGU
Propositions as Types: https://www.youtube.com/watch?v=IOiZatlZtGU

I think this can really, really change how we look at everyday programming tasks, everywhere from the type of tooling we choose to how we approach problems.
If you’re using the right language, proofs are programs; it’s as simple as that. “Compressing” a proof, then, amounts to doing what good programmers are already doing — noticing patterns and eliminating redundancy.

I’ll refer you to an authority on the subject, Phil Wadler. If you like watching videos, try <https://www.youtube.com/watch?v=IOiZatlZtGU>. If you’d rather read an entertaining paper, try <http://homepages.inf.ed.ac.uk/wadler/papers/propositions-as-....
⬐ pron That doesn't make proofs any shorter or any more elegant, though. The same "noticing patterns" can be done on the proof in non-FP form. C-H is basically an observation about encoding.
⬐ sedachv The Curry-Howard correspondence does not apply here, though. The coloring theorem is not encoded in the type of the program that enumerates and checks the possible colorings. The program itself is a restatement of the coloring theorem, with the result type being boolean (all the enumerated colorings satisfy the property, or not). So the proof would consist of proving that the program must terminate with a true value. Of course that is not actually useful, as it does not simplify anything. This is a simple mechanization of the proof steps, on par with using a calculator for arithmetic.
⬐ vilhelm_s Depending on what programming language you use for your Curry-Howard correspondence, there is actually a related thing. Probably the most well-known realization of C-H is dependent type theories like Coq and Agda, and those support a proof method called "proof by reflection". The idea is that in those languages, types can contain program terms as subexpressions, and two types are considered "the same" if they evaluate to the same expression. Because the languages enforce that all expressions terminate, it's still decidable whether a program is well-typed or not, so the language can still be considered a proof system, but the type checker can take almost arbitrarily long to check a program, because it has to evaluate subexpressions to values in order to check type equality.
Anyway, the point is that this can be used to write down short programs that act as proofs. For example, suppose you want to prove there are no counterexamples to Fermat's last theorem for exponent 5 and numbers less than 100000000. In this case, you would write down a function

    f : nat -> bool

which evaluates the expression for all numbers less than the argument and returns whether there is a counterexample, then you would prove a lemma

    soundness : forall n, (f n = false) -> forall x y z, (x < n) -> (y < n) -> (z < n) -> (x^5 + y^5 ≠ z^5)

and then the proof of the final theorem is just

    soundness 100000000 eq_refl

In order to check the eq_refl part, the type checker would evaluate (f 100000000) and verify that it evaluated to false. But all that computation is not recorded in the proof term. So the program that represents the proof is very small, and constant size in the bound.
This is actually very useful in practice, because storing and manipulating big proof terms is slow and memory-hungry, so it's useful to be able to push the computation under the carpet in this way. On the other hand, it is sometimes considered a little philosophically dubious.
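A sketch of the same trick in Lean 4 rather than Coq (an assumption on my part, and with a deliberately tiny bound so kernel evaluation stays fast): the `decide` tactic asks the kernel to run a decision procedure for the bounded statement during type checking, so the stored proof term stays constant-size regardless of how much computation was performed.

```lean
-- Proof by reflection: the kernel evaluates the bounded check at
-- type-checking time; only a tiny reflective proof term is recorded.
example : ∀ x < 10, ∀ y < 10, ∀ z < 10,
    0 < x → 0 < y → x ^ 5 + y ^ 5 ≠ z ^ 5 := by
  decide
```

The positivity hypotheses rule out the trivial 0^5 + 0^5 = 0^5 case; scaling the bound up only grows the kernel's evaluation time, not the proof term.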
https://youtu.be/IOiZatlZtGU?t=22s
If you're finding the paper too dense, Philip gave an extremely entertaining talk on the same subject at Strange Loop this year: https://www.youtube.com/watch?v=IOiZatlZtGU
⬐ jcox92 +1. Probably my favorite talk at Strange Loop this year. It's worth watching just for the minute or so starting here: https://www.youtube.com/watch?v=IOiZatlZtGU&t=33m