HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Vectors | Chapter 1, Essence of linear algebra

3Blue1Brown · YouTube · 6 HN points · 18 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention 3Blue1Brown's video "Vectors | Chapter 1, Essence of linear algebra".
YouTube Summary
Beginning the linear algebra series with the basics.
Help fund future projects: https://www.patreon.com/3blue1brown
An equally valuable form of support is to simply share some of the videos.
Home page: https://www.3blue1brown.com/

Correction: 6:52, the screen should show [x1, y1] + [x2, y2] = [x1+x2, y1+y2]

Full series: http://3b1b.co/eola

Future series like this are funded by the community, through Patreon, where supporters get early access as the series is being produced.
http://3b1b.co/support

If you want to contribute translated subtitles or to help review those that have already been made by others and need approval, you can click the gear icon in the video and go to subtitles/cc, then "add subtitles/cc". I really appreciate those who do this, as it helps make the lessons accessible to more people.

Music: https://vincerubinetti.bandcamp.com/track/grants-etude

------------------

3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted about new videos, subscribe, and click the bell to receive notifications (if you're into that).

If you are new to this channel and want to see more, a good place to start is this playlist: https://goo.gl/WmnCQZ

Various social media stuffs:
Website: https://www.3blue1brown.com
Twitter: https://twitter.com/3Blue1Brown
Patreon: https://patreon.com/3blue1brown
Facebook: https://www.facebook.com/3blue1brown
Reddit: https://www.reddit.com/r/3Blue1Brown
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
There are many good resources! A few, depending on your inclination:

- For intuition: https://www.youtube.com/watch?v=fNk_zzaMoSs

- For rigor: https://ocw.mit.edu/courses/18-06-linear-algebra-spring-2010...

- For code: https://codingthematrix.com/

- For numerical/algorithmic details: https://people.maths.ox.ac.uk/trefethen/text.html

generationP
Rigor and Strang are not two words I'd put in the same sentence. Neil Strickland's notes https://neilstrickland.github.io/linear_maths/ (for the matrix point of view) or Jim Hefferon's book https://joshua.smcvt.edu/linearalgebra/index.html (for a vector-spacey treatment) are what comes to my mind when I think "rigor" (along with all sorts of older textbooks like Hoffman/Kunze); Jean Gallier's long set of notes https://www.cis.upenn.edu/~jean/math-deep.pdf goes even further in that direction.
Essence of linear algebra is an absolutely wonderful series. It gave me an intuition for the subject in a matter of hours, in a way that years of university never did.

https://youtu.be/fNk_zzaMoSs

lagrange77
Yes. The moment, when the background grid gets distorted by the matrix. Really helped me to calibrate my mental models.
Here's another vote for 3Blue1Brown's series on linear algebra [1]. Spending a few hours on this series will easily save you dozens of hours when going through a comprehensive LA textbook.

[1] https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...

Nov 01, 2021 · truly on SoME1 results
For anyone wondering what 3Blue1Brown is, it's a top YouTube channel with very nicely made explanatory videos for math.

The videos on linear algebra in particular are worth it, especially if you do any sort of machine learning: https://www.youtube.com/watch?v=fNk_zzaMoSs.

The author of the channel is Grant Sanderson. He has recently given a talk at SIGGRAPH 2021: https://www.youtube.com/watch?v=gvck7ssg9dE.

As part of developing his YouTube channel, he has written and open-sourced manim, a Python library for programmatically generating animations: https://github.com/ManimCommunity/manim/.

memco
Grant also co-lectured for an MIT course on computational thinking using Julia: https://computationalthinking.mit.edu/Fall20/
mushishi
He also started a podcast this year. In each episode he discusses mainly math- and education-related topics with different guests. I have liked it quite a lot.

https://www.3blue1brown.com/podcast

LeonM
And for anyone wondering where the name 3blue1brown comes from: Grant Sanderson has a condition called heterochromia, his right eye is 3 parts blue, and 1 part brown, like his logo.

You can see it for example in this video: https://www.youtube.com/watch?v=-bc9EWhmDZg

lifeisstillgood
oh. I always assumed it was some maths problem - "if you remove 3 brown balls from the bag and 1 blue, what is the probability the islanders will kill you"

(I may be confusing maths problems, riddles and bad jokes)

morjom
Isn't heterochromia where both eyes are a different colour?
LeonM
According to Wikipedia [0]:

Heterochromia of the eye is called heterochromia iridum or heterochromia iridis. It can be complete or sectoral. In complete heterochromia, one iris is a different color from the other. In sectoral heterochromia, part of one iris is a different color from its remainder. In central heterochromia, there is a ring around the pupil or possibly spikes of different colors radiating from the pupil.

So, in case of Grant Sanderson, it would be sectoral heterochromia.

[0] https://en.wikipedia.org/wiki/Heterochromia_iridum

kzrdude
I wonder how this is resolved for places that ask for eye color (various people-identification registries). If the information could be recorded accurately and understandably (in a short database column entry), it would be quite a distinctive identifier.
apricot
75% blue and 25% brown: "Brue eyes"

Or maybe take a weighted average of RGB values and convert to a word, which would give "teal eyes". Radical.

I strongly recommend 3blue1brown's intro to linear algebra[0] for quick intuitions, followed by Gilbert Strang's MIT open course[1][2] for (much) deeper detail. Strang is a genius, but Sanderson's graphics and intuitions can be helpful early on.

[0] https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...

[1] https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra...

[2] https://www.youtube.com/playlist?list=PL49CF3715CB9EF31D

Exuma
thanks!! I skimmed Strang's video just now and it looks super good, what a great suggestion!
To answer your main question: "what level of maths do you need" is really about which topics. A couple of introductory classes in Calculus, Linear Algebra, and Discrete Mathematics will make that sentence understandable.

I'll give some definitions and resources, but think about how you would explain multiplication/division/algebra to a kid. At some point, you just have to work through some problems and the math just makes sense. So I'll explain the terms above, but best bet is to just take a few courses on the topics I mentioned above.

Quick definitions:

- Exponential: the constant e raised to the power of some number. e, like pi, is a ubiquitous constant in mathematics and nature. You run into pi when doing geometry, and e when doing calculus.

- A function in mathematics is similar to a function in programming, but not exactly. A course in Discrete Mathematics helps here.

- f-apostrophe is notation used in calculus to relate one kind of function, called a derivative, to another: f and f' are related. How, though, is better explained by taking calculus.

- An eigenfunction exists in a system of equations. Think of it like a 'balance' point. Linear Algebra will help make sense of this term.

Resources:

- Essence of Calculus: https://www.youtube.com/watch?v=WUvTyaaNkzM
- Essence of Linear Algebra: https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...

Those two video series will do a good job of giving you the 'intuition' behind Calculus and Linear Algebra. Like programming, though, you just have to actually work some problems out by hand for the stuff to sink in. For that, do something like MIT OpenCourseWare or a local online college course.
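
To make those quick definitions concrete, here is a minimal Python sketch (assuming NumPy is available; the constants, the function f, and the matrix are purely illustrative):

```python
import numpy as np

# e shows up as the limit of compound growth: (1 + 1/n)^n -> e as n grows.
n = 1_000_000
e_approx = (1 + 1 / n) ** n            # close to 2.71828...

# f' (the derivative) measures how f changes: approximate it with a small step h.
f = lambda x: x ** 2
h = 1e-6
fprime_at_3 = (f(3 + h) - f(3)) / h    # close to 6, since d/dx x^2 = 2x

# An eigenvector of a matrix A is a direction A merely stretches: A v = lambda v.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)    # eigenvalues 2 and 3, axis-aligned eigenvectors
```

None of this replaces working problems by hand, but it lets you poke at the definitions interactively.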

Maybe you would get something out of 3blue1brown's series on linear algebra [1]. It will give you an intuitive, geometric interpretation for concepts such as a vector, a basis, an eigenvalue, a linear transformation, etc.

As for why you would want to know these things, well, linear algebra is finding applications all over the place these days. Everything from computer graphics to machine learning is jam-packed with linear algebra. And I'm not just talking about basic concepts such as matrix-vector multiplication. The singular value decomposition has applications in discrete optimization, image compression, the PageRank algorithm [2], computer vision [3], and machine learning [2] [4].

Having said that, you may find it very difficult to understand something like SVD without a firm grounding in the topic of vector spaces, linear transformations, spanning, linear (in)dependence, subspaces, eigenvalues, eigenvectors, diagonalization, and determinants. This is why SVD is one of the last things you learn in a linear algebra course (indeed, it's not covered until lecture 29 of Gilbert Strang's course).

Ultimately, it all depends on how relevant these things are to you. I would assume (hope) that because you clicked on this HN discussion that you're interested in learning linear algebra because you think it might be useful to you.

[1] https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...

[2] http://www.cs.cornell.edu/courses/cs4850/2010sp/Course%20Not...

[3] http://cs.rkmvu.ac.in/~sghosh/public_html/nitw_igga/talk.pdf

[4] https://medium.com/@jonathan_hui/machine-learning-singular-v...
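
As a concrete taste of the SVD applications mentioned above (image compression is exactly this), here is a minimal NumPy sketch of low-rank approximation; the matrix is random and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))        # stand-in for an image or data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values: the best rank-k approximation
# in the spectral norm (the Eckart-Young theorem).
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The approximation error equals the first dropped singular value.
err = np.linalg.norm(A - A_k, 2)       # matches s[2] up to floating point
```

Storing U[:, :k], s[:k], and Vt[:k, :] instead of A is where the compression comes from.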

Frankly, coupled with this book, it hardly gets better. You still have the 3blue1brown[1] video series, but it just brushes the surface.

[1] https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...

This is ok but nothing is as intuitive as 3B1B's series on YouTube that has been posted hundreds of times on HN [0].

Linear algebra is really about linear transformations of vector spaces, which is not captured in this blog post.

[0] https://www.youtube.com/watch?v=fNk_zzaMoSs

OrwellianChild
3B1B does great work explaining these concepts, but I can't help but ask "why not both?" when it comes to teaching them. Turns out, linear algebra is great for working with matrices, vector spaces, approximating non-linear systems, and more... Let's embrace multiple ways of teaching it and gaining intuitions rather than keeping score, eh?
sandov
Playlist link: https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...
foxx-boxx
Often these articles are written by students who either failed their exam or are scared of failing it. People who actually know the subject get paid either for lectures or for real work.
gitgudnubs
Then you failed to comprehend the subject. The point is that a wide array of problems and models are really the same thing.
vecter
Failed to comprehend which subject, linear algebra? I would argue no, and other people who are more on the pure mathematics side would agree [0][1].

snicker7 said it very succinctly:

> However, in mathematics proper, it is absolutely the case that linear algebra is about linear transformations. Indeed, this is the only interpretation that remains meaningful when trying to generalize (e.g. to functional analysis / multilinear algebra).

If your point is that I failed to comprehend matrices, then I don't think you have enough data to make that claim, since I don't really talk about matrices. I kind of address that in my other comment [2].

I don't follow your point around "a wide array of problems and models are the same thing". That's a very vague general statement that I certainly comprehend (not sure how you inferred otherwise). Specifically, I don't see how that point relates at all to the claim I made about linear algebra.

[0] https://news.ycombinator.com/item?id=22419018

[1] https://news.ycombinator.com/item?id=22417764

[2] https://news.ycombinator.com/item?id=22417595

nightcracker
> Linear algebra is really about linear transformations of vector spaces, which is not captured in this blog post.

I... disagree. Some of linear algebra is about that. And viewing it that way is probably good when learning.

But some of my current work (coding theory) involves linear algebra over finite fields. We use results from linear algebra, and interpret our problem using matrices, but really at no point are we viewing what we're doing as transforming a vector space, we're just solving equations with unknowns.

quietbritishjim
> > Linear algebra is really about linear transformations of vector spaces, which is not captured in this blog post.

> I... disagree.

This is literally the definition of the term "linear algebra".

> really at no point are we viewing what we're doing as transforming a vector space, we're just solving equations with unknowns.

You may not see what you're doing as transforming vector spaces with linear operators, but that is what you're doing. It's worth pointing out that the definition of vector spaces allows any field, including finite ones, though it's true that the intuition won't be exactly the same.

Another way to say this: if you're working on a problem with no connection to linear transformations, then it isn't a "linear algebra problem without an obvious connection to linear transformations"; by definition, it isn't a linear algebra problem at all.
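
As a small illustration that the same machinery works over a finite field, here is a NumPy sketch of parity checking over GF(2), using the parity-check matrix of the (7,4) Hamming code (the codeword is just an example):

```python
import numpy as np

# Over GF(2), vector addition is XOR, so all arithmetic is taken mod 2.
# H is the parity-check matrix of the (7,4) Hamming code.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

codeword = np.array([1, 0, 1, 0, 1, 0, 1])   # a valid codeword
syndrome_ok = (H @ codeword) % 2             # all zeros: no error detected

corrupted = codeword.copy()
corrupted[2] ^= 1                            # flip bit 3 (1-indexed)
syndrome_err = (H @ corrupted) % 2           # equals H's column for bit 3
```

Read as a binary number (first row as the low bit), the syndrome [1, 1, 0] is 3, pointing at the flipped bit. "Just solving equations with unknowns", yet it is a linear map H acting on a vector space over GF(2).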

snicker7
Linear algebra is a shared field across multiple disciplines. So I'm sure that there are many valid and useful interpretations as to what "linear algebra" is essentially about.

However, in mathematics proper, it is absolutely the case that linear algebra is about linear transformations. Indeed, this is the only interpretation that remains meaningful when trying to generalize (e.g. to functional analysis / multilinear algebra).

graycat
20+ years ago I took a grad course in coding theory, e.g.,

W. Wesley Peterson and E. J. Weldon, Jr., Error-Correcting Codes, Second Edition, The MIT Press.

-- gee, people are still studying/learning that?

The prof knew the material really well, but to up my game in the finite field theory from other courses, I used

Oscar Zariski and Pierre Samuel, Commutative Algebra, Volume I, Van Nostand, Princeton.

which did have a lot more than I needed!

My 50,000-foot overview of linear algebra is that the subject still rests on the apparently very old problem of the numerical solution of systems of simultaneous (same unknowns) linear equations, e.g., via Gauss elimination (it's really easy, intuitive, powerful, and clever, surprisingly stable numerically, and is fast and easy to program; someone might want to type it in from, say, just an English language description!). Since the subject of linear equations significantly pre-dates matrix theory, the start of matrix theory was maybe just easier notation for working with systems of linear equations. In principle, everything done with matrix theory could have been done with just systems of linear equations, although often at the price of a notational mess. In particular, as I outline below, there are now lots of generalizations of systems of linear equations that use different notation and not much matrix theory.
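
In the spirit of "typing it in from an English language description", here is a minimal Python sketch of Gauss elimination with partial pivoting (illustrative only, not production numerics):

```python
def gauss_solve(A, b):
    """Solve A x = b by Gauss elimination with partial pivoting."""
    n = len(A)
    # Augmented matrix [A | b], copied so the inputs are not modified.
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # Eliminate everything below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution, from the last equation upward.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # → [1.0, 3.0]
```

A couple dozen lines, as promised; for real work use a library routine (e.g. numpy.linalg.solve), which wraps the same idea with far more care.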

What's amazing are the generalizations, all the way to linear systems (e.g., their ringing) in mechanical engineering, radio astronomy, molecular spectroscopy, frequencies in radio broadcasting, stochastic processes, music, mixing animal feed, linear programming, oil refinery operation optimization, min-cost network flows, non-linear optimization, Fourier theory, Banach space, oil prospecting, phased array sonar, radar, and radio astronomy, seismology, quantum mechanics, yes, error correcting codes, linear ordinary and partial differential equations, ..., and then

Nelson Dunford and Jacob T. Schwartz, Linear Operators Part I: General Theory, ISBN 0-470-22605-6, Interscience, New York.

robpal
Linear algebra IS about linear transformations and vector spaces.

The thing is that the field over which the space is defined can be quite arbitrary (finite, infinite, not algebraically closed etc.) which has immense consequences on the behavior of such objects.

When one drops the assumption on finite number of dimensions, the story becomes wild (and is known as functional analysis, beautiful and extremely useful branch of mathematics).

xscott
I think this is spot on. Depending on what you're doing, a matrix can be:

    - A linear transformation
    - A basis set of column vectors
    - A set of equations (rows) to be solved
       - (your example: parity equations for coding theory)
    - The covariance of elements in a vector space
    - The Hessian of a function for numerical optimization
    - The adjacency representation of a graph
    - Just a 2D image (compression algorithms)
    ... (I'm sure there are plenty of others)
For some of these, the matrix is really just a high dimensional number. You rarely (ever?) think of the covariance in a Kalman filter as a linear transform, but you still need to take its eigenvectors if you want to draw ellipses.
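
That covariance-ellipse point can be sketched in a few lines of NumPy (the 2x2 covariance here is made up for illustration):

```python
import numpy as np

# Eigenvectors of a covariance matrix give the ellipse axes; eigenvalues give
# the squared semi-axis lengths (up to a confidence scale factor).
cov = np.array([[4.0, 1.0],
                [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: symmetric input, sorted ascending
semi_axes = np.sqrt(eigvals)             # semi-axis lengths of the 1-sigma ellipse
angle = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # orientation of the major axis
```

From semi_axes and angle you can hand the ellipse straight to a plotting library; no "linear transformation" mindset required, even though eigendecomposition is doing the work.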
vecter
Great points. I wrote my comment in response to the article claiming to be an intuitive guide to linear algebra, not an intuitive guide to matrices. According to wikipedia:

> Linear algebra is the branch of mathematics concerning linear equations, linear functions, and their representations in vector spaces through matrices. [0]

The Venn Diagram of Linear Algebra and Matrices definitely has a lot of non-overlap, of which your list covers some. This article should be renamed to be about matrices and not linear algebra, because it's not.

[0] https://en.wikipedia.org/wiki/Linear_algebra

smallnamespace
A covariance matrix naturally transforms from the measured space to a space where things are approximately unit Gaussian distributed. This is identical to the Z transform in the 1D case.

This can be useful in, say, exotic options trading - a natural unit of measurement is how many ‘vols’ an underlier has moved, e.g. a 10-vol move is very large.

FabHK
Not really the covariance matrix, though, but its Cholesky decomposition (which exists, as a covariance matrix is symmetric positive (semi)definite, as otherwise you could construct a linear combination with negative variance). Useful stuff.

And vice versa, btw - take iid RV with unit variance, hit them with the Cholesky decomposition, and you have the desired covariance. Used all over Monte Carlo and finance and so on.
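
A minimal NumPy sketch of that trick (the target covariance here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
target_cov = np.array([[1.0, 0.8],
                       [0.8, 1.0]])

L = np.linalg.cholesky(target_cov)   # lower triangular, target_cov == L @ L.T

# Hit iid unit-variance normals with L to get the desired covariance.
z = rng.standard_normal((2, 100_000))
x = L @ z

sample_cov = np.cov(x)               # close to target_cov
```

This is the standard way correlated scenarios are generated in Monte Carlo pricing and risk code.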

threatofrain
How I view it is the matrix is a 2-dimensional indexed data structure, but when conditions are right the matrix becomes an object of Linear Algebra.
JadeNB
> - A basis set of column vectors

Let's leave the word 'basis' out, since the column vectors may well be linearly dependent.

FabHK
Or not span the space, for that matter.
JadeNB
Well, it depends on what "the space" is. Every set of vectors (in a common ambient space) spans some space—often called the column space of a matrix, if they are the column vectors of the matrix.
FabHK
Sure, every set of vectors will span the space it spans. But the requirement that a basis span the space refers to the space it's in, not the space it spans (otherwise every linearly independent set of vectors would be a basis, spanning the space it spans). I could go on :-)
obastani
The first three can reasonably be thought of as defining linear transformations. For linear systems of equations A x = b, x is an unknown vector in the input space that is mapped by A to b.

Both covariance matrices and Hessians are more naturally thought of as tensors, not matrices (and therefore not linear transformations). That is, they take in two vectors as input and produce a single real number as output.

As for graph adjacency matrix, this can actually be thought of as a linear transformation on the vector space where the basis vectors correspond to nodes in the graph. Linear combinations of these basis vectors correspond to probability distributions over the graph (if properly normalized).

2D images... Yes, these cannot really be interpreted as linear transformations. But I'd say these aren't really matrices in the mathematical sense.
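
The random-walk reading of an adjacency matrix can be sketched as follows (a tiny made-up graph, with NumPy):

```python
import numpy as np

# A column-normalized adjacency matrix acts on probability vectors:
# one multiplication advances a random walk by one step.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)   # node 0 linked to nodes 1 and 2

P = A / A.sum(axis=0)                    # column-stochastic transition matrix
p = np.array([1.0, 0.0, 0.0])            # walker starts at node 0

p_next = P @ p                           # distribution after one step
```

Here p_next splits the walker's probability evenly between nodes 1 and 2; iterating P @ p toward its fixed point is the core of PageRank-style computations.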

xscott
If you squint hard enough, you can see all of them as linear transformations (even the 2D images :-).

I politely disagree about covariance and Hessians. I can squint and say that the Hessian provides a change in gradient when multiplied by a delta vector. Similarly for covariance... Or you could look at it as one half of the dot product for a Bhattacharyya distance, which is just a product of three matrices (row vector, square matrix, col vector). No need for tensors yet.

That is unless you decide to squint hard enough to see everything as tensors! :-)

If you haven't seen it, it might be worth checking out 3Blue1Brown's Linear Algebra YouTube series (https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...).

I think it's exactly that kind of "intuition of what Linear algebra is and is for".

i recommend this series

https://www.youtube.com/watch?v=fNk_zzaMoSs

idonotknowwhy
Thanks, I'll check it out. This is one of those things which I never knew I didn't know about.
rohan_shah
Goes very well with your username.
psv1
I would recommend 3blue1brown only if you've already covered the material in another way. It's a great way to get some new intuition about things, the videos can help something 'click' and are a pleasant watch with an obviously high production quality. I just don't think they are great for being your first exposure to a topic.
If you want to get an intuitive, visual understanding of linear algebra -- including eigenvectors/eigenvalues -- 3blue1brown's playlist on the subject is just ... perfect. https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...
Koshkin
The problem with a visual understanding of linear algebra is that it is no understanding at all when it comes to linear spaces over the complex numbers (which are an important tool in quantum mechanics, for example) or infinite-dimensional spaces. Attempts (or the habit) to use the intuition gained from being exposed to elementary examples often lead to gross misunderstanding and logical errors.
eigenloss
It seems like you're suggesting the added complexity of complex numbers makes visual understanding unhelpful; do you care to explain why visualizing complex numbers or multidimensional spaces is so impossible?
3Blue1Brown's Essence of linear algebra series is the go to place for learning this material. I've included the link to the series below so people don't get lost in all his other wonderful videos along the way.

https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...

throway88989898
No offense to 3b1b; the production quality is impressive and the content is genuinely insightful.

However, it is ultimately a very limited linear format covering the topic only lightly.

tokyodude
I see you used a throwaway because there are so many fans. I might not be as harsh but while I enjoyed the pretty animations they didn't actually help me learn linear algebra. They're still full of jargon written for people who already know the topic.
3Blue1Brown has a great series on Linear Algebra. His explanations are so clear that by the 2nd or 3rd video you'll already understand how it applies to computer graphics.

https://youtu.be/fNk_zzaMoSs

kevinskii
I agree. I've had quite a bit of exposure to linear algebra through other textbooks and online courses, and the 3Blue1Brown series explains the intuition better than just about anyone. In particular, the video on determinants really crystallized the concept for me.
Dec 26, 2018 · 3 points, 0 comments · submitted by furcyd
The 3B1B series on Linear Algebra is by far the most welcoming and informative introduction to the topic I've ever seen: https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...
aidos
Any linear algebra post I come here to comment the same (if someone else hasn’t done it already). Seriously, this series is absolutely wonderful.
YouTube math god 3Blue1Brown has a lovely series of videos that visualize linear algebra: https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...
If you haven't already studied Linear Algebra, and want to get a headstart on that, check out the "Coding The Matrix" book/videos from Brown.

http://codingthematrix.com/

https://cs.brown.edu/video/channels/coding-matrix-fall-2014/

https://www.amazon.com/Coding-Matrix-Algebra-Applications-Co...

Also, see the Gilbert Strang video series on Linear Algebra:

https://www.youtube.com/playlist?list=PL49CF3715CB9EF31D

and the amazing 3blue1brown "Essence of Linear Algebra" series:

https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQ...

Dec 22, 2016 · 3 points, 0 comments · submitted by rosstex
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.