Hacker News Comments on The Society of Mind · 5 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this book.

"Novel" is relative. However, this concept dates back at least to the 1980s with Marvin Minsky's "The Society of Mind":
https://www.amazon.com/Society-Mind-Marvin-Minsky/dp/0671657...
Also 3 more "easy overview" type books:

The New Turing Omnibus by A K Dewdney (https://www.amazon.com/New-Turing-Omnibus-Sixty-Six-Excursio...)
The Society of Mind by Marvin Minsky (https://www.amazon.com/Society-Mind-Marvin-Minsky/dp/0671657...)
Creating Mind: How the Brain Works by John E Dowling (https://www.amazon.com/Creating-Mind-How-Brain-Works/dp/0393...)
I'm curious how close the research community is to general AI

Nobody knows, because we don't know how to do it yet. There could be a "big breakthrough" tomorrow that more or less finishes it out, or it could take 100 years, or - worst case - Penrose turns out to be right and it's not possible at all.
Also, are there useful books, courses or papers that go into general AI research?
Of course there are. See:
https://www.amazon.com/Engineering-General-Intelligence-Part...
https://www.amazon.com/Engineering-General-Intelligence-Part...
https://www.amazon.com/Artificial-General-Intelligence-Cogni...
https://www.amazon.com/Universal-Artificial-Intelligence-Alg...
https://www.amazon.com/How-Create-Mind-Thought-Revealed/dp/0...
https://www.amazon.com/Intelligence-Understanding-Creation-I...
https://www.amazon.com/Society-Mind-Marvin-Minsky/dp/0671657...
https://www.amazon.com/Unified-Theories-Cognition-William-Le...
https://www.amazon.com/Master-Algorithm-Ultimate-Learning-Ma...
https://www.amazon.com/Singularity-Near-Humans-Transcend-Bio...
https://www.amazon.com/Emotion-Machine-Commonsense-Artificia...
https://www.amazon.com/Physical-Universe-Oxford-Cognitive-Ar...
See also the work on various "Cognitive Architectures", including SOAR, ACT-R, CLARION, etc.:
https://en.wikipedia.org/wiki/Cognitive_architecture
"Neuroevolution"
https://en.wikipedia.org/wiki/Neuroevolution
and "Biologically Inspired Computing"
https://en.wikipedia.org/wiki/Biologically_inspired_computin...
⬐ hhs: These are useful references, thanks.
There's a GREAT book that discusses this phenomenon called "Society of Mind": https://www.amazon.com/Society-Mind-Marvin-Minsky/dp/0671657...

Every HN'er should read it. Basically, we have all these "mini-mentalities" in our brain, but they "talk" (or don't talk) together in different ways. Definitely interesting if you're an AI guy.
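The "mini-mentalities" idea above can be made concrete with a toy sketch. This is an illustrative simplification, not Minsky's actual architecture: each agent has one narrow competence and no global plan, and coherent behavior emerges only from their interaction through shared state. The agent names and the "thirst" scenario are invented for the example.

```python
# Toy "society of mind" sketch (illustrative only, not Minsky's model):
# tiny agents, each with a single trigger and a single action, acting
# on a shared state. No agent holds a plan for the whole behavior.

class Agent:
    def __init__(self, name, trigger, action):
        self.name = name
        self.trigger = trigger   # predicate over the shared state
        self.action = action     # small update to the shared state

    def propose(self, state):
        return self.action if self.trigger(state) else None

def run_society(agents, state, steps):
    """Each step, every triggered agent applies its small update.
    Sequencing falls out of the interactions, not a central controller."""
    for _ in range(steps):
        for agent in agents:
            action = agent.propose(state)
            if action:
                action(state)
    return state

# Hypothetical "thirst" scenario: separate agents for reaching and drinking.
state = {"thirsty": True, "holding_cup": False, "drank": False}
agents = [
    Agent("reach", lambda s: s["thirsty"] and not s["holding_cup"],
          lambda s: s.update(holding_cup=True)),
    Agent("drink", lambda s: s["thirsty"] and s["holding_cup"],
          lambda s: s.update(drank=True, thirsty=False)),
]
final = run_society(agents, state, steps=3)
print(final)  # thirst resolved with no central planner
```

The point of the sketch is only that the "drink" sequence was never written down anywhere: it emerges from two agents that each know one small thing.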
Take a look at Marvin Minsky's The Society of Mind. Consciousness as an emergent property of the communication between simple agents is a large arm of cognitive science. http://www.amazon.com/The-Society-Mind-Marvin-Minsky/dp/0671...
⬐ hharrison: I am aware of Minsky's work; see my other reply. I was too flippant in my comment. Perhaps I should have said intelligent behavior, rather than intelligent structures, is self-organized. But I don't think any approach to consciousness can be considered a large arm of cognitive science. Most cognitive scientists don't want to touch consciousness with a ten-foot pole. But of course you're right that connectionism lives on.
Let me try to explain the kind of self-organization I have in mind. Consider the fundamental question: how is behavior organized? The behaviorists pointed to organization in the environment. Cognitivists point to organization of internal representations. Connectionists and similar approaches point to organization of neural structures. Yes, something intelligent emerges from simple, perhaps self-organized, components in this scheme. But they are unwilling to take self-organization to the level of behavior.
In my opinion, a true self-organizational approach to behavior is to say that behavior emerges from the interaction between organism and environment. This is the level at which we need to accept self-organization. It is far from the mainstream. The mainstream approach to vision, for example, starts with the retinal image and asks what can be inferred from it. Yes, maybe they say that this inference engine is itself a self-organized structure. But it still reifies an input-process-output view of cognition: the sensory system receives input and constructs a model of the world, the "higher cognitive" centers formulate plans from this model, and the action system instantiates these plans.
To bring it back to the ants: the ants demonstrate what can be done without explicit planning. Modern cognitive science studies explicit planning, even if its practitioners agree that this capability emerges from simple components.
As I said in the other post, I could provide literature if you are interested in any of these specific debates.