HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Holographic Reduced Representation: Distributed Representation for Cognitive Structures (Volume 150) (Lecture Notes)

Tony A. Plate · 1 HN comment
HN Books has aggregated all Hacker News stories and comments that mention "Holographic Reduced Representation: Distributed Representation for Cognitive Structures (Volume 150) (Lecture Notes)" by Tony A. Plate.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
While neuroscientists garner success in identifying brain regions and in analyzing individual neurons, ground is still being broken at the intermediate scale: understanding how neurons combine to encode information. This book proposes a method of representing information in a computer that would be suited for modeling the brain's methods of processing information. Holographic Reduced Representations (HRRs) are introduced here to model how the brain distributes each piece of information among thousands of neurons. It had previously been thought that the grammatical structure of a language could not be encoded practically in a distributed representation, but HRRs overcome the problems of earlier proposals. Thus this work has implications for psychology, neuroscience, linguistics, computer science, and engineering.
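
For a concrete sense of the mechanism the summary describes, here is a minimal sketch of HRR role-filler binding in Python/NumPy. Binding by circular convolution and unbinding by circular correlation follow Plate's construction; the vector names (`agent`, `mary`, etc.), the sentence, and the dimensionality are illustrative assumptions, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024  # dimensionality; more dimensions -> less crosstalk noise

def rand_vec():
    # i.i.d. Gaussian components with variance 1/n, as in Plate's HRRs
    return rng.normal(0.0, 1.0 / np.sqrt(n), n)

def bind(a, b):
    # circular convolution (via FFT): HRR's binding operation
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n)

def unbind(a, b):
    # circular correlation: approximately inverts binding
    return np.fft.irfft(np.fft.rfft(a).conj() * np.fft.rfft(b), n)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Encode "Mary ate the fish" as a superposition of role-filler bindings;
# the whole structure stays a single n-dimensional vector.
agent, verb, patient = rand_vec(), rand_vec(), rand_vec()
mary, ate, fish = rand_vec(), rand_vec(), rand_vec()
sentence = bind(agent, mary) + bind(verb, ate) + bind(patient, fish)

# Query the agent role: unbinding returns a noisy version of the filler,
# which a clean-up memory (here, brute-force comparison) identifies.
noisy = unbind(agent, sentence)
for name, item in [("mary", mary), ("ate", ate), ("fish", fish)]:
    print(name, round(cosine(noisy, item), 2))  # mary scores highest
```

The decoded filler comes back noisy, which is why HRR systems pair unbinding with a clean-up memory over the known items; this is the "reduced" part of the representation that lets fixed-width vectors carry grammatical structure.
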
HN Books Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
You might also be interested in the recent work on the "resonator networks" VSA architecture [1-4] from the Olshausen lab at Berkeley (P. Kanerva, who created the influential SDM model [5], is one of the lab members).

It's a continuation of Plate's [6] and Kanerva's work in the 90s, and of Olshausen's groundbreaking work on sparse coding [7], which inspired the popular sparse autoencoders [8].

I find it especially promising that they found this superposition-based approach to be competitive with the optimization so prevalent in modern neural nets. Maybe backprop will die one day and be replaced by something more energy-efficient along these lines. (A sketch of the resonator iteration follows the references below.)

[1] https://redwood.berkeley.edu/wp-content/uploads/2020/11/frad...

[2] https://redwood.berkeley.edu/wp-content/uploads/2020/11/kent...

[3] https://arxiv.org/abs/2009.06734

[4] https://github.com/spencerkent/resonator-networks

[5] https://en.wikipedia.org/wiki/Sparse_distributed_memory

[6] https://www.amazon.com/Holographic-Reduced-Representation-Di...

[7] http://www.scholarpedia.org/article/Sparse_coding

[8] https://web.stanford.edu/class/cs294a/sparseAutoencoder.pdf
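
As a rough illustration of the resonator-network iteration described in the comment above, here is a toy sketch under assumed settings (bipolar vectors, elementwise Hadamard binding, three factors, codebook sizes picked for illustration); see [1-3] for the actual formulation and its analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 2000, 30  # vector dimension, codebook size per factor

def bsign(x):
    # bipolar sign with ties broken toward +1
    return np.where(x >= 0, 1, -1)

# Three codebooks of random bipolar code vectors (one row per item).
A, B, C = (rng.choice([-1, 1], size=(m, n)) for _ in range(3))

# Composite to factorize: elementwise (Hadamard) binding of one item
# from each codebook; the network must recover ia, ib, ic from s alone.
ia, ib, ic = rng.integers(m, size=3)
s = A[ia] * B[ib] * C[ic]

# Start each estimate as the superposition of its whole codebook.
a_hat, b_hat, c_hat = bsign(A.sum(0)), bsign(B.sum(0)), bsign(C.sum(0))

for step in range(100):
    # Unbind the other two current estimates from s, then project the
    # result back onto the codebook -- a search "in superposition".
    a_hat = bsign(A.T @ (A @ (s * b_hat * c_hat)))
    b_hat = bsign(B.T @ (B @ (s * a_hat * c_hat)))
    c_hat = bsign(C.T @ (C @ (s * a_hat * b_hat)))
    if (np.array_equal(a_hat, A[ia]) and np.array_equal(b_hat, B[ib])
            and np.array_equal(c_hat, C[ic])):
        print(f"all three factors recovered after {step + 1} iterations")
        break
```

Because every update tests all codebook items at once through the superposed estimates, the search avoids explicitly enumerating the m³ possible combinations; the iterative settling toward a fixed point is the "concentrating on a task" behavior the next comment asks about.
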

pshc
Thank you for the great reading material! From a skim, my take is that resonator networks are able to sift through data and suss out features (factors) from the noise, and even decode data structures like vectors and mappings. And RNs can be made to iterate on a problem, much like a person might concentrate on a mental task. Is that a fair summary of their capabilities?
HN Books is an independent project and is not operated by Y Combinator or Amazon.com.