HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Ted Nelson struggles with uncomprehending radio interviewer, 1979

TheTedNelson · Youtube · 299 HN points · 8 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention TheTedNelson's video "Ted Nelson struggles with uncomprehending radio interviewer, 1979".
YouTube Summary
Max Allen of CBC radio asks over and over
how a computer could possibly be useful
for thinking and visualizing. He absolutely
does not get it.

With unprecedented patience, Ted answers
over and over and over and over.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Jun 21, 2022 · 5 points, 0 comments · submitted by pmoriarty
Also see this Ted Nelson interview from 1979:

https://www.youtube.com/watch?v=RVU62CQTXFI

Aug 20, 2021 · shawnz on Dumb Phone
The author's arguments kind of remind me of those of the interviewer in this 1979 clip with Ted Nelson: https://www.youtube.com/watch?v=RVU62CQTXFI
dredmorbius
Great interview, one of my faves.

It's also appropriate, though possibly not for the reasons you chose it.

Max Allen, Nelson's interviewer, struggled to think far enough ahead to comprehend how a tool he thought of as a glorified calculator might possibly be of use to the average person. The problem, of course, was that he was mis-categorising the tool, and greatly underestimating its capabilities and applications.

The issue I'm seeing in this thread, and amongst communications device and software designers, is a similar failure of comprehension and imagination, of being unable to look back in time and understand how a glorified bit of papyrus and leaky grease tube (pen and paper) could possibly serve the functions modern smartphones do. Again, it's a misunderstanding of the capabilities, workflow, possibilities, and affordances.

Thanks for illustrating the point so clearly!

shawnz
Of course I think you have it wrong here. (And I don't really appreciate being told that I lack comprehension, nor having my argument misrepresented in such a way. Perhaps try to consider the counterarguments being made in this thread more charitably.)

I think Nelson's point was that the computer is a strict superset of those previous forms of writing. It doesn't need to replace every single instance of pen and paper, or Rolodex, or secretarial staff, but there's no reason why it shouldn't in any case where the cost is low enough.

I think that the argument that the smartphone's UI is more synchronous than those past technologies is true in some ways but it's a weak one: that's easily solved by just having multiple computers, or multiple terminals (or most likely a desktop and a smartphone). If that is not yet cheap enough, that's still fine, because you could simply continue to use the smartphone together with a notepad or secretary until it is cheap enough (as many do today).

If people are choosing to forgo those past technologies in exchange for a more synchronous UI today, I don't think it's because they are not imaginative enough to understand that notepads can still be used. It is simply because they are willing to accept a somewhat more synchronous interface in exchange for a vastly more efficient ability to execute those tasks synchronously. Or, perhaps they simply don't find it to be too synchronous: after all, we only have two ears and two eyes, so there is a limit to the advantage of having multiple channels, and perhaps modern smartphone users are just very effective at maximizing the bandwidth of that interface (considering how many have pointed out here that there are many options for multitasking on smartphones).

dredmorbius
Again: That really is a favourite interview. I'm familiar with Nelson's work. I agree with many of his views, differ strongly on others.

I also stand by what I said: it's easy to focus on what you know, and to ignore or discount foreign experiences, whether local or distant, or future or past.

One of Nelson's problems is that he's not an effective communicator when talking to the general public. He has advanced ideas. He certainly feels strongly about them. Some are valid, though I suspect others ... not quite so much.

In particular he falls into one of the classic technologist's fallacies: seeing the potential benefits of a tool, process, or technology, but not its drawbacks. Most especially the emergent drawbacks. If you look at the contemporary litany of complaints about technology (spam, advertising, censorship, propaganda, surveillance, manipulation, clickbait, adware, attention fragmentation, an ever-expanding security attack surface and threat model, platform complexity, an obsolescence treadmill, fragile systems, business-model / monetisation challenges, etc., etc.), I think you'll find that Nelson didn't anticipate them, and even now doesn't much address them. These are inherent to the type and level of technology we've adopted.

By contrast, the limits of paper reduce much of that.

The limits of paper do pose numerous other challenges. This is a trade-off.

But that is my point: to consider the specific affordances and drawbacks of each approach, to look closely at the overall structure and system, and to think critically about which of those aspects are truly valuable, and which constitute blocks to true value (to the end-user, as opposed to value-extraction by a hardware vendor or service provider).

My comment above does have its cheeky aspect. But I'm completely serious about the overall point: that there's a blindness to the unfamiliar, and that it works both ways.

Maybe in another 20--30 years you'll have a better understanding of this.

Sep 13, 2020 · 71 points, 30 comments · submitted by davisr
ggrrhh_ta
Such an immensely articulate person... His "Computers for Cynics" series is also very recommendable, the blockchain/bitcoin one particularly so for outsiders who want to know or need to communicate on related matters (e.g. journalists). Happy to have been able to hear him on YouTube; he feels so close one feels one knows him personally.
angleofrepose
I'll second Computers for Cynics. I don't think one of that series is about blockchain, but he does have other videos about blockchain.

For those who don't know, Computers for Cynics is his series about questioning the origins of the status quo and considering alternative futures with different foundations.

https://www.youtube.com/playlist?list=PLTI2Kz0V2OFlgbkROVmzk...

ggrrhh_ta
It's the seventh in the series: https://youtu.be/3CMucDjJQ4E (published in Sep 2014, six years ago) and a must-watch account for any outsider even in 2020. Enjoy it!
angleofrepose
Ah I didn't even realize the list I sent wasn't from his account. Thanks for the correction!
ggrrhh_ta
That 2014 video is so revealing... I think it deserves a post of its own :-)
zandorg
I worked with Ted for a few years from 2001, and though we kind of stopped working together, I knew him! So I do know him personally. He's a great guy, though intolerant of nonsense.
ggrrhh_ta
He gives that impression from his videos. He seems to speak his mind honestly and rightfully demands reciprocity.
bborud
I discovered that Ted Nelson has a Patreon: http://www.Patreon.com/thetednelson . I decided to sign up and perhaps you would like to help out with a few bucks too.

I've seen talks by him a few times and I think he is worth listening to. He has interesting ideas well worth understanding.

wombatmobile
13:40 "The two commonest digital notation languages in the world are musical notation, and knitting."

Knitting is the perfect analogy for describing how DNA works to your grandmother.

"Knit 1 purl 2 often enough and you end up with a sweater."

dang
If curious see also

a thread from 2018: https://news.ycombinator.com/item?id=17376753

dustingetz
LMAO 8:10 Interviewer: This seems to me first and foremost to be a technology looking for a problem to solve
konjin
He's right. The same way that writing was. We had perfectly good oral traditions before writing and they were good enough to get us to the bronze age.

I'm a bit annoyed at Ted Nelson for not making that comparison.

the-dude
An [audio] tag would be more appropriate.
angleofrepose
It's been a little while since I've heard this one, but one thing I remember hitting me is that the same language used by the interviewer is used today about privacy, end-user programming, and any other more powerful technology, programming language, or paradigm. It seems that as a culture we're always able to go so far, but not all the way. We see the path between start and end, there's no genius needed for the last push, but after so much progress we reduce ambition towards the end goal and instead develop arguments against continuing.

At some point we just don't think people need help with paper based tasks, "look around you, it's how everything is done" yet here we are with the PC 40 years later. And people look around and think that there is no chance everyone could be a programmer "look around you, they're all consumers, they couldn't understand how to make the computer do what they like". In 40 years there's no doubt this viewpoint will be wrong, but the popular opinion on the matter can't see that future.

See Bret Victor's history of computing. The biggest adversary we have to overcome towards progress is the mainstream experts of our own field.

We have apps which seem like starting from scratch every time, which can't have capabilities shared by all because they aren't prepackaged by the devs ahead of time. Every app reinvents a minimal subset of sorting and search. If you have a better idea or a different connection you want to make, it's just not possible in the app.

Stop pretending that debilitating users is actually good for them in the silly word games we play. Give users power.

danbolt
I feel like we don't give tools like Microsoft Excel, Game Maker, Photoshop, or the web enough credit for how they empower users of computers. Without any formal training or education, people are able to use computers to their own ends.
konjin
Give Excel to an uncontacted tribe and see how well they go with it.

That we assume reading, numeracy and fine motor skills which until 5 centuries ago were the preserve of less than 1% of the population in the West are not part of formal education or training should tell you all you need to know about how much cultural knowledge we assume people need to have in their daily life to function at the level of a 10 year old.

That 4 years to learn to read is considered normal but reading a 100 page manual is considered unreasonable shows us how much popular culture is lagging behind our tools. Given how technocentric our culture is, this is a situation as ridiculous as Mongols complaining that they need to learn to ride horses.

danbolt
oh ok
webwanderings
Good for the interviewer for making counterarguments at that time. Some of the things he said are still true: we still have to come up with categories in our heads. This is true regardless of advancements in ML.
angleofrepose
Good point! I might give it another listen. I remember it being an interesting respectful conversation with no ground given on either side. Hence the title, I suppose.
himinlomax
> there is no chance everyone could be a programmer

There is no chance, even if by "everyone" you just meant "a majority". It's intellectually challenging, just for one.

angleofrepose
I think the other two comments make good points. There is every chance that everyone will learn to use a small set of fundamentally composable digital tools in the future. That's programming. I think "intellectually challenging" just means poorly explained or resulting from poor access. Anyone can program, it's just artificially hard to do it today.
himinlomax
> There is every chance that everyone will learn to use a small set of fundamentally composable digital tools in the future. That's programming

That's not, unless you stretch the meaning of the word far into meaningless. But if you insist on doing so, then yes, most people should be able to "compose digital tools" for a small enough number of digital tools and a wide enough meaning of "compose." Although, on second thought, it appears so many people had issues with "programming a VCR" back in the day, and that wasn't anywhere close to my meaning of "programming."

So let me rephrase it, "there is no chance everyone or even majority could become a minimally proficient user of a minimally useful programming language for novel tasks beyond a sequential list of actions."

dustingetz
Imagine if "everyone" was literate, what a world that would be /s
himinlomax
Is being able to read at the minimum level comparable to programming?

Do you seriously expect 99% of the population to be able to understand something as simple as the demonstration of Euclid's theorem? And yet any programming is more complicated than that, and more analogous.

jstanley
Reading and writing is intellectually challenging. Everyone can do it now, because we make sure to teach them to do it, and because it's necessary to live in modern society.
himinlomax
Almost anyone can walk, not everyone can climb mount Everest, or even a much smaller mountain face. Bad analogies are bad.
AnIdiotOnTheNet
Are you seriously trying to claim that programming, of any useful sort, is so hard that only really smart people, presumably like yourself, can do it? Christ. I really hope the software development industry gets its ego kicked in really hard in the near future so everything can stop sucking because of elitist asshats.
himinlomax
Well on one hand, it could be my massive ego, and on the other it could be the truth.

You're the one arguing on the basis of value. I don't. Let me just ask you this; do you think that being a good enough finance trader requires a high IQ? I'm pretty sure of it, and I'm also of the opinion that what they're applying their smarts to is a net negative for society as a whole.

Another question, have you read Steven Pinker's The Blank Slate?

anotherhue
> The biggest adversary we have to overcome towards progress is the mainstream experts of our own field.

“Science advances one funeral at a time.” - Max Planck (Apparently)

angleofrepose
This Idea Must Die is an interesting take on that concept.
Ted Nelson: Computer Lib / Dream Machines (1975) [pdf] (worrydream.com)

https://news.ycombinator.com/item?id=19249556

http://worrydream.com/refs/Nelson-ComputerLibDreamMachines19...

For what it's worth, YC is helping Ted Nelson sell his "Computer Lib / Dream Machines" book:

https://twitter.com/nolimits/status/1087770718878687232

This book is truly unique and worth owning in hardcopy format.

https://news.ycombinator.com/item?id=19058137

Ted versus The Media Lab [video] (youtube.com)

https://news.ycombinator.com/item?id=22169775

https://www.youtube.com/watch?v=qH4Kr3Gsadc

Interview with Ted Nelson (notion.so)

https://news.ycombinator.com/item?id=19057331

https://www.notion.so/tools-and-craft/03-ted-nelson

Ted Nelson on What Modern Programmers Can Learn from the Past [video] (ieee.org)

https://news.ycombinator.com/item?id=16222520

https://spectrum.ieee.org/video/geek-life/profiles/ted-nelso...

Ted Nelson struggles with uncomprehending radio interviewer (1979) [audio] (youtube.com)

https://news.ycombinator.com/item?id=17376753

https://www.youtube.com/watch?v=RVU62CQTXFI

Ted Nelson’s published papers on computers and interaction, 1965 to 1977 (archive.org)

https://news.ycombinator.com/item?id=16245697

https://archive.org/details/SelectedPapers1977

Ask HN: What is the best resource for understanding Ted Nelson's ZigZag?

https://news.ycombinator.com/item?id=22518401

http://www.xanadu.com.au/ted/XUsurvey/xuDation.html

http://mimix.io/getting-to-xanadu

Alan Kay's tribute to Ted Nelson at "Intertwingled" Fest (how the script of Tron was the first movie script to ever be edited by a word processing program, on the Alto computer)

https://www.youtube.com/watch?v=AnrlSqtpOkw

"Silicon Valley Story" — a Very Short Romantic Comedy by Ted Nelson

A playful story about the microcircuitry of love, with Ted Nelson as an absentminded genius, featuring Doug Engelbart as Ted's father and Stewart Brand as the villainous CEO.

Closing song: "Information Flow", sung by Donna Spitzer and the auteur.

With Timothy Leary as the Good Venture Capitalist!

https://www.youtube.com/watch?v=AXlyMrv8_dQ

Ted Nelson's Channel

https://www.youtube.com/channel/UCr_DXJ7ZUAJO_d8CnHYTDMQ

lioeters
Oh, joy - the play "Silicon Valley Story" by Ted Nelson is so funny, weird, and self-consciously awkward in the best theatrical sense. Brilliant. I had never seen that before.

Saved the whole list in study/ted-nelson.txt. Thank you for gathering the links and sharing. (I'm a long-time fan of your work!)

Ted Nelson pointed out that knitting (or was it weaving?) is one of the most popular programming languages back in 1979.

https://m.youtube.com/watch?v=RVU62CQTXFI

Edit to add: I wouldn’t be surprised if there are more knitting programmers today than programmers of all computer programming languages combined. Just the size and pervasive presence of Walmart's or Joann’s yarn and fabric sections makes me almost certain.

dylan604
The really nice yarn is not found at either of those places. You want to see serious knitters, find a dedicated yarn store, or check out a yarn convention. It's like an xmas movie where the shoppers are fighting over the last skein of yarn from a well known dyer.
Jun 22, 2018 · 199 points, 103 comments · submitted by pmoriarty
Jedd
While the host, Max Allen, is certainly a bit obtuse and doesn't share the same technical insights and visions as Ted Nelson (hardly surprising given they had significantly different experience and careers), he's respectful of his guest (doesn't interrupt or talk over), concedes various points willingly, appears to genuinely want to understand, and actually enjoys the discussion.

The nostalgia sensation isn't just around Ted's prescience.

andyidsinga
I really appreciate that too - the interview / discussion is very interesting; draws a lot of good stuff out of Ted Nelson.
EGreg
That’s how interviews used to be.

FOX News pioneered the interview where the interviewer spends most of the time talking, asking leading questions to the respected person they invited, then cutting them off, shouting them down and attacking them for being socialist or whatever. And on to the next.

It’s more like watching gladiator fights than really learning people’s answers to questions.

taberiand
His viewpoint is also rather reasonable considering the context of the interview, at the dawn of the information age. It's a little depressing however that it's not uncommon to still encounter that point of view today.
DonHopkins
I love his criticism of IBM (and the literary adjective that perfectly describes IBM that I learned today):

Q: Tell me about IBM?

A: What would you like to know? OMINOUS LAUGH

Q: Well, I'd like to know what's wrong with them, to start with, since that's what you want to talk about.

A: Ok. IBM is first and foremost a very slick sales organization, which was created in the image of Thomas J. Watson, a supreme despot, and very imaginative salesman who managed to create an organization with less fat in -- pardon me -- less local fat than any other corporation that ever happened.

And one that has over the years learned to devote teams to getting things done. That's the positive side. Get these things done with dispatch and with earnestness.

Now whether they're done the way one would like to see them done if one contemplated the real nature of a problem, this is an entirely different matter. And critics of such things as IBM's 360 and 370 computers would say that they were Brobdingnagian, clumsy, and surrounded by unnecessary difficulties.

And this of course is why many people like the kids who showed me around the University of Waterloo this morning far prefer to use systems like Digital Equipment machines which are much more accessible to the sophisticated user.

Anyway, the question is why has IBM prevailed in its way, and the answer is that they have a sort of monopoly. And one which obviously has a political side and a technical side. And the problem is now that as they are swinging their new communication system into place, it seems increasingly likely that this communication system is more built to maintain the monopoly than it is built to satisfy the needs of what people ought to have.

Q: What communication system?

A: Oh there's something called Satellite Business Systems, which IBM and Aetna Life Insurance and I think a few other partners have created in a joint venture.

thechao
Brobdingnagian is the opposite of Lilliputian, from Gulliver's Travels. It means stupendously gargantuan.
JetSpiegel
Brobdingnag is the "Gulliver's Travels"[1] country where everyone is a giant. Famous for having a breastfeeding woman horrifying the narrator, and him sword fighting a rat.

[1]: https://en.wikipedia.org/wiki/Gulliver%27s_Travels#Part_II:_...

DonHopkins
whoosh -- That's the second literary reference that went over my head when transcribing an interesting Ted Nelson video. I originally guessed he was saying that "Franklin Brouwer Zeus" invented the internet, when he actually said people thought "it sprang from the brow of Zeus". Here is the corrected version: ;)

https://www.youtube.com/watch?v=edZgkNoLdAM

When we look at the past of computerdom, it's through a lens that's peculiar, because things have changed so much so fast.

That to me the 50 years since I've been in the computer field have gone so quickly that the past seems ever-present.

In the 60's and 70's, a lot of young people have started communes. And it was a combination of free love, which is a term you don't hear any more because it's taken for granted, and pot, and LSD, and idealism, and hope for a new kind of economy.

And that spirit of that age leaked into the computer world. There was a sense of possibility at the beginning that is different because we thought computing would be artisanal.

We did not imagine great monopolies. We thought the citizen programmer would be the leader.

When I say we I mean I, but of course I had a sense that I was sharing this with a lot of people.

We had visions of democratization, of citizen participation, to create vistas of possibility for artistic expression, and artistic expression in software. And software is an art form, though not generally recognized as such.

And because of Moore's Law, which has been stated to me not as Moore's Law, but just as a general principle: things will get faster and cheaper. We will be able to afford it. Right now a computer with a screen is $35,000. Tomorrow, who knows. It will be $100 some day.

Now is the time to start thinking about what will be the documents of the future.

As I would abstract it now, the two concepts were:

We can have parallel connections between visible documents. So you can have two pages with a connection saying "this sentence is connected to that paragraph" and see it as a visible strap or bridge.

And you can't do that yet. So that was one of my hypertext concepts.

And the other hypertext concept was being able to click on something and jump to it.

So as the hypertext concept developed and deteriorated over the years, only the jump link became popular in the hypertext systems of the 60's and the 70's, and then Tim Berners Lee created the World Wide Web, which was the sixth or seventh hypertext system on the internet.

People think it sprang from the brow of Zeus, in fact it was just a clean job that had the clout of CERN behind it. How to see the possibilities when there are so many things around you that are a certain way?

I don't know. The future is an unknown place. There are a lot of scary things about it.

What aspects you are going to approach? Are you going to go on thinking about leisure, or about the terrible problems that confront the world?

All I can say is: "Close your eyes, and think what might me."

My first software designs were largely done with my eyes closed. Thinking: "Now if I hit that key, what should happen? If I hit this key, what should happen?"

I was able to imagine -- they say this can't be done, but when my interfaces were built, they always felt the way I knew they would.

And the people at Xerox PARC said "That's never possible. You never know how it's going to be." But I did.

TheTedNelson
Thanks for the transcription, but please change one word--

>All I can say is: "Close your eyes, and think what might me."

That last word should be "be".

(Apparently I also created a Ycombinator account under the name "Ted Nelson", but they won't tell me the password.)

aportnoy
https://en.wikipedia.org/wiki/Brobdingnag
dluan
The following short segment is great.

"What's swinging into place Max, is that we have great communication networks now coming about for the transmission of digital information.

Now by digital we just mean symbols. There's this mistake that digital means in numbers. That's wrong. The two commonest programming languages are musical notation and knitting instructions.

That's not fu... why are you laughing."

Knitting was really big back then.

EGreg
He forgot recipes :)
fapjacks
Knitting is really big still. One of the biggest lobbies for copyright and IP legislation is (no kidding) the industry that creates patterns for sewing and knitting. Put out feelers on social media, see how many of your associates knit which you didn't know about. It was surprising to me for some reason to find out how pervasive it is.
tomatotomato37
Wait, by sewing pattern industry, do you mean the fashion industry?
fapjacks
Haha that's a great question, but no, there is actually a different set of companies selling patterns for sewing and knitting (traditionally to their target demographic of older ladies). Companies that typically sell patterns to places like Hobby Lobby or Michaels, this kind of thing, and not companies like Old Navy and Forever21 and the like. In that industry, patterns are sort of like patents in a way, and these companies have amassed huge libraries of patterns and aggressively defend them with litigation.

I'll dig a bit to see if I can find information about the specific events I'm thinking of, but the gist of it was that alongside Napster, I believe, one of the biggest and earliest instances of copyright/IP litigation was against little old ladies who were using P2P to fileshare sewing and knitting patterns. This was before the RIAA started suing customers, and I believe it actually partly served as the inspiration for the RIAA stupidly choosing to go that route. It's been some years since I've read the story, so I'm likely not remembering some details correctly.

Edit: I can't find a link to the story I'm thinking of, but you can actually find some side effects of this litigation, for example this "copyright notice" (especially read the actual PDF, as this notice was pretty clearly issued to protect the interests of these pattern companies I'm talking about): https://www.gov.uk/government/publications/copyright-notice-...
pmoriarty
This reminds me of trying to convince a FidoNet BBS sysop in the late 80's to try the Internet. He adamantly refused, saying FidoNet would be all he'd ever need.

It also reminds me of arguing in the 80's with an IBM PC user that the Amiga's 4096 colors were desirable. He insisted that 16 colors were all he'd ever need.

I also tried to turn my dad on to the Internet and Usenet newsgroups in the late 80's. That failed too. He just wasn't interested.

I guess I'm just not a very good salesman.

unimpressive
The funny thing is that early on in this interview the host tells Nelson that he's the only one saying anything about using computers to store and retrieve text. But that's simply not true, certainly not in 1979. For example, John McCarthy had been describing what largely came to be our digital literary future as early as 1970:

https://news.ycombinator.com/item?id=10370990

Even within the contemporary environment, a significant fraction of computer use was not 'number crunching', but the sorting, retrieval, and printing of essentially textual information (i.e., records). Records are tiny documents, which to me makes this early portion, where Ted accepts the point without argument, a large missed opportunity to get the point across.

https://www.youtube.com/watch?v=HMYiktO0D64

8bitsrule
1979

Even 10 years later. In the mid-80s (by which time BYTE magazine was as big as a Sears catalog) I wrote a piece on the approach of the PC era and offered it to a smaller (100K) town's newspaper editor. His response: 'That's kind of a nitch product isn't it?'

The bigger newspapers were already moving completely to computers, but such facts hadn't really reached such guys.

shagie
Started listening, ok, sounds like an interesting person. Check the Wikipedia page on him...

> Theodor Holm "Ted" Nelson (born June 17, 1937) is an American pioneer of information technology, philosopher, and sociologist. He coined the terms hypertext and hypermedia in 1963 and published them in 1965.[1] Nelson coined the terms transclusion,[1] virtuality,[2] and intertwingularity (in Literary Machines), and teledildonics[3].

tele... is that what I think it is? Click link. Yep.

The interview is really interesting, as is how much he got right about the past 30 years.

oh_sigh
https://archive.org/stream/Mondo.2000.Issue.02.1990#page/n53...
miguelrochefort
This is how I feel about computers today. They're a mess. The fragmentation of hardware, operating systems, apps, websites, programming languages and frameworks is horrible.

I frequently ask people how they feel about the current state of software. Most people think it's fine, they don't have a problem with it. They can't imagine how else it could be. This drives me mad.

I have 100 apps on my phone. None of them talk to each other. Every new appliance, restaurant and event has its own app. These 100 apps will quickly turn into 1000. Clearly, the application paradigm doesn't scale. Where is it going to end?

We're still emulating the physical world in software. File systems are still trees. Programming is still done with text. Paragraphs are still copy-pasted, rather than dynamically embedded. We barely made any progress since that radio interview. Xanadu is still vaporware.

What will it take for a software revolution to take place?

mrkstu
OLE embedding and OpenDoc utterly failing has been one of my great frustrations in the computer field. I should be able to have a universal document format with intelligent encapsulated data that I can manipulate in flexible ways.

Excel is almost there as a standalone island, but I should be able to embed Excel's intelligence in all my documents.

rsync
"OLE embedding and OpenDoc utterly failing has been one of my great frustrations in the computer field. I should be able to have a universal document format with intelligent encapsulated data that I can manipulate in flexible ways."

I think we have had this in the form of the UNIX shell environment and various command line primitives like cat/awk/sed/grep/wc/strings.

It isn't sexy and it certainly isn't a WIMP[1] paradigm like Windows (which I associate exclusively with OLE) but if you can work with ASCII text (and you should...) then I think you have that environment and those abilities.

[1] https://en.wikipedia.org/wiki/WIMP_(computing)

perl4ever
I'm not saying that paradigm isn't useful, but it seems like you're saying 70s technology is still good enough. Shouldn't we have come up with some improvement in the last 40 years? Really, all we've done is regain the abilities of big computers from long ago in modern microcomputers.

I was just playing around with Selenium, and it was neat what you could do with it, but it was kind of depressing that it is basically a project to do for one application (or group of applications) what people used to have a vision for doing with all applications back in the 1990s (with AppleScript for instance).

We keep redoing things we've done before in slightly different ways, scaling down the grand visions and retrying parts of them.

sverige
AI is going to invent that revolutionary software, but unfortunately you and I won't understand it or be able to manipulate it. Be careful what you wish for.
miguelrochefort
Is it coming soon? I'm not sure I can wait 5 more years.
sverige
Oh, well, probably not 5 years. More like 20 years away for human-level AI, according to everyone I've read since the 80s. Fusion power will be coming online right around then, too, so that's good. And the true value of the blockchain will finally be evident in about 20 years, too. Which is really good, because that's around the time we are likely to discover the secrets of immortality.
calebh
Imo gradient descent is not going to get us to general AI. I'm a programming languages theory person, and I also know a fair bit about machine learning. The current problem with neural networks is that they suck at processing variable length data, and they have problems with remembering the past. Programs can be represented by trees, graphs, or text. Deep learning isn't great at dealing with graphs or trees, and it's not that good at text either.

The other issue is that deep learning works great for recognizing common patterns, but it sucks when faced with novel situations. So I don't think that we're going to have programs that program anytime soon. The first applications of AI to programming will probably be with programming assistants or with AI guided proof solvers. Programmer productivity will improve, but we're not going to see everyone losing their jobs.

nmca
The latter of those two points is much more valid than the former. Your comment about generalisation difficulty is pretty accurate, but as for "sucks at variable length data", I think neural machine translation [0] and the fact that schemes including RNNs just won M4 [1] indicate that this is incorrect. Your point about remembering the past is true (it's hard), but people are actively working on it. Unitary neural nets and the fast/slow weight paradigm are very different angles that seem promising.

As for handling trees + graphs, this actually works very well. Thomas Kipf is pushing this area forward, GATs [2] are a nice random example of how dominant differentiable programming (eg NNs) can be in this area. Unfortunately these graph approaches don't parallelise as nicely on GPUs as CNNs.

Your predictions (assistants) seem likely to me.

[0] https://arxiv.org/abs/1804.09849 [1] https://www.m4.unic.ac.cy [2] https://arxiv.org/abs/1710.10903

bachbach
Something like Urbit probably.

Those sorts of ideas sound improbable - it's hard to say whether it takes 10 years or even a different civilization. The Romans never managed to pass land reforms.

The Chinese tech sector has gotten around some of it with the assistance of the government - there's glimpses of a different possible future but without full stack revolution it's hard to imagine new types of institutions being created.

I'm really not sure Silicon Valley is able to pivot out of the tar pit it has itself created.

The closest sign of real progress would be if somebody did something that destroyed a ton of Silicon Valley jobs.

I know people think here: AI. I don't think so - as of yet I haven't seen any evidence for autonomous executive complex coordination. It's possible for a machine to do it - that I believe - but it doesn't follow that computers are going to be the machines that do that.

andyidsinga
this! Interviewer: "I don't understand what the necessity of an electronic device called a computer, or anything else like that - a computer screen to use your word - is in making people smarter. I don't think another gadget is going to have any effect at all"

... I can't tell you how many times I've heard this gadget/toy statement since I first got a computer when I was around 10.

On the flip side - there are so many people with open minds willing to give things a shot (and often a not inexpensive one at that) ...thanks for giving me those computers mom - and then enjoying watching what happened :)

(edit : typos)

thsowers
This is a great listen. The interviewer doesn't seem to understand that there are hard limits to humans' ability to search, compute, organize, notice patterns, etc.
samueloph
Ok, so I didn't know Ted Nelson. From his Wikipedia page: "Theodor Holm "Ted" Nelson (born June 17, 1937) is an American pioneer of information technology, philosopher, and sociologist. He coined the terms hypertext and hypermedia in 1963 and published them in 1965."

I'm impressed.

Rapzid
Really great interview. Ideas should be challenged, and the value we, the audience, get from the responses to challenge is great. Not enough of this in the mainstream today.
EthanHeilman
Ted Nelson is continuing a thread of argument first put forth by Ada Lovelace:

"Again, it might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine. Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent."

Sketch of The Analytical Engine Invented by Charles Babbage By L. F. MENABREA of Turin, Officer of the Military Engineers With notes upon the Memoir by the Translator ADA AUGUSTA, COUNTESS OF LOVELACE https://www.fourmilab.ch/babbage/sketch.html

cm2187
To be honest I wouldn't be convinced myself that I would need a personal computer to better index my pieces of text at home.

Searching through un-indexed text would have been a far more powerful argument. I remember reading the memoir of a DGSE analyst (the DGSE is the French equivalent of the CIA). As always, the reality of secret services is far less sexy than their popular image, and the book has an entire chapter about the importance of good (pre-computer era) archives of everything: every conversation any analyst had with anyone, any book or article written by any public figure, any time someone is mentioned in any anecdote, etc. And how the essence of being a good analyst is knowing these archives inside out, and knowing how to quickly browse through them when looking for what we know about someone or a topic. On a personal computer that would have been a simple CTRL-F.
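(Today that CTRL-F over an entire archive is a one-liner. For instance, a recursive, case-insensitive search across a directory of plain-text files; the directory and the name searched for here are made up:)

```shell
# Search every file under notes/ for a name, ignoring case,
# printing the file name and line number of each match.
grep -rin 'dupont' notes/
```

What took the analyst years of memorizing the archive's layout is now an instant scan of every file, which is exactly the argument the interviewer never got.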

Doxin
To be fair, that "un-indexed text" is still indexed. The indexing process is just hidden from the user and fully automated.
KirinDave
Wow, it's interesting that Ted says they used a computer "during the war" for trajectory calculations. I wonder if he believes that, or if he's simply avoiding a very complex conversation about information theory and cryptography that in the 70s was both less well-understood and less popular.

I wonder if instead he's referring to some other specific event that in the subsequent 50 years we've decided is insignificant to the formation of computers.

hchasestevens
One consideration is that much of Bletchley Park's efforts were kept hidden from the public until 1974[1] - it's possible that in 1979 the computer's role in breaking German cyphers was still not widespread knowledge.

[1] https://en.wikipedia.org/wiki/Ultra#Postwar_disclosures

TheTedNelson
Bletchley Park was publicly unknown at the time of the interview. Whereas Eckert and Mauchly, with ENIAC, had been computing those tables. (That was a secret too, but it was no longer a secret when I knew him in his last years.)
ipsin
They used computers during WW II for rangekeeping [1]. They were complicated and heavy electromechanical beasts.

[1] https://en.wikipedia.org/wiki/Mark_I_Fire_Control_Computer

KirinDave
Right but this was NOT the most historically significant application of computers in WW2 by a long shot.

That's why I'm wondering if this is a visible example of how history tends to redefine significance and refocus on different events over time.

stan_rogers
Calculating trajectory tables for artillery was ENIAC's primary job as well.
miguelrochefort
We're still very far from the connected and unified future Ted Nelson had in mind, yet I can't seem to find many people interested in solving that.

Where are these people?

dustingetz
I'm in that space, link in my profile (not that it is recognizable as a hyperdata system yet, nonetheless it is). Why do you ask – are you?
xamuel
Interesting how they mention the coming war between the "computer centers" and the personal computer. And now looking back, we see that's a war that's been fought time and time again, and now we're fighting it all over again with the whole Cloud thing. Next week Cloud'll be out and desktops back in.
braindead_in
Here's a transcript, if you want to skim through it.

https://scribie.com/transcript/290d2d5d863e4353b3843816aade9...

Kaius
"It is possible to insist that every change is merely a small change in degree, rather than a change in kind."

Good quote from Ted.

jonahx
The vicarious frustration in this listen was deep.

It's not only a fascinating time-capsule, but a palpable reminder of how revolutionary ideas can be received. The clarity of Ted's thought and vision here is striking even by today's standards, and despite his patience and articulate explanations he might as well be talking to a brick wall.

The Swift quote came to mind:

> When a true genius appears in the world, you may know him by this sign, that the dunces are all in confederacy against him.

abecedarius
http://habitatchronicles.com/2004/04/you-cant-tell-people-an...
miguelrochefort
I spent the last 10 years trying to tell people my vision. Nobody is getting it.

Now I understand why. I wish someone told me this 10 years ago.

Thank you.

rocky1138
> When people ask me about my life’s ambitions, I often joke that my goal is to become independently wealthy so that I can afford to get some work done. Mainly that’s about being able to do things without having to explain them first, so that the finished product can be the explanation. I think this will be a major labor saving improvement.

Honestly this is so good. I feel exactly the same.

DanBC
It's a radio interview, so he may be talking to a brick wall in the studio but it's likely a bunch of people listening suddenly "got it" from this interview: computers aren't about manipulating numbers, but are general purpose machines that manipulate symbols.
booleandilemma
“Don't worry about people stealing your ideas. If your ideas are any good, you'll have to ram them down people's throats”

- Howard H. Aiken

ehnto
That title had me thinking he was trying to unthink an epiphany the radio presenter had enlightened him with.
andyidsinga
I'm listening to this for the second time - I think there will be more.
redwood
Shadow IT goes back a long way!
BenjiWiebe
"Remarkable that humanity survived without [a computer] all this time." -sarcastic

"Yes it is!" -serious

dqpb
I found it striking that the majority of Max Allen's questions/arguments were along the lines of "But isn't everything fine the way it is?"
sverige
He wasn't wrong, either. Having lived the first half of my life without a computer at home and the second half with more and more of them, the only thing I can say is a real improvement is that I can transfer funds from my account to my wife's at any time. The rest isn't that great. Probably hard to understand if you never lived without them.
mwcampbell
That's overly pessimistic. Isn't it awesome that computers have enabled worldwide communities like this one? Asynchronous communication through text, enabled by computers, is also a great equalizer for people with disabilities and other differences. Blind people, deaf people, quadriplegics, people with speech impediments (e.g. stuttering), non-native speakers of the language, etc. can all communicate and work together, without even being aware of each other's difficulties.

To be fair, all of that was enabled by the earliest BBSes, and I guess we haven't made as much progress in the meantime as some people (like Engelbart and Nelson) hoped.

sverige
You are correct about asynchronous communication, of course. That is a great thing, and I should have included IRC / texting on the plus side.

The worldwide communities? I enjoy them, but it seems to me that they are one of the fundamental causes of the increasing fragmentation of society. Hannah Arendt described this as 'atomized' society. Totalitarian regimes have flourished in such environments.

In the old days, if you didn't fit in, you had to make an effort to find common ground with others physically near you to enjoy a social life. This led to a lot of serendipity and built social cohesion. Nowadays, it's a lot easier to hate your neighbor who disagrees with you politically or culturally, because there's always a virtual means of socializing just a click away.

Nelson was a visionary to coin 'virtualization', but I think he and many others have been far too optimistic about the end results of that ongoing experiment. It is far easier to manipulate people into committing acts of physical violence against their neighbors than is commonly believed in a society that has become 'atomized', and virtual communities are a terrific means of creating an atomized society.

BenjiWiebe
Hilarious. The poor interviewer just can't comprehend at all what a computer could be used for. Also, Ted Nelson was pretty good at predicting that a computer would be used as a writing medium, a filing cabinet, a musical instrument, and a few more things I don't remember specifically. As I was listening I was just silently agreeing that yep, we have that, and that, and that already.
hobls
Even hit on what we could now call “smart home” devices. He managed to come up with a lot of examples to try to illustrate to the host what computers would be used for!
hindsightbias
I'd wonder if HN readers would have gotten it in the context of 1979.

It took nearly three years for someone to wonder what would happen if you connected something like HyperCard to a network. And it wasn't Bill Atkinson.

putlake
Bitcoin today is where computers were in 1979. A little embarrassed to admit it but I feel like the short-sighted interviewer who fails to see the applications of the technology.
kevinpet
If you looked back to 1979 you'd find a dozen other technologies that didn't pan out.
inteleng
Just because something new-ish exists doesn't mean it's the root of something society-changing. Bitcoin might be a steam engine.
shoo
> Just because something new-ish exists doesn't mean it's the root of something society-changing. Bitcoin might be a steam engine.

this is a great comment, i can interpret it at least two entirely different ways.

by "might be a steam engine" do you mean that bitcoin might be one of the key drivers of a major process of economic/industrial revolution?

inteleng
I mean it might be completely obsolete in five years. It has caused a great deal of innovation, even if much of that doesn't come to fruition.
rinze
Bitcoins are tulips.
inteleng
How original. I imagine you can show how the tulip craze fomented great innovation in the floristry field, and spawned a few actually good products despite being inherently misguided?
pinewurst
Or Cabbage Patch dolls
inteleng
Cabbage Patch Kids and tulips wear out and disintegrate like any other organic matter. We will be lucky if bitcoin wallets from a decade ago are able to be cracked a thousand years from now.
davidgay
> Bitcoin might be a steam engine.

I think it's fairly safe to say that steam engines revolutionised society (the industrial revolution and all that), arguably more so than the engines that replaced them.

inteleng
Sure. But they never worked on airplanes, and they're only a distant ancestor to the jet and rocket engine. You don't see people investing in steam engines these days.
pavs
Are you saying that bitcoin has the potential to revolutionize the world economy and create trillions of dollars' worth of industry across many fields? Create hundreds of millions of jobs? Create new forms of art and new ways of communicating, help find cures for illness, and improve the human condition in hundreds of ways that I can't even comprehend at 5 in the morning, before my coffee?

No, I don't think bitcoin has any such potential. Yes, some people made quick bucks at the expense of some other gullible people losing quick bucks. In the realm of human history, bitcoin/altcoins will be considered a mildly interesting idea: a lot of noise but eventually a major dud. No sane and major government will give up central control of currency. If bitcoin eventually ends up being centrally controlled (if not banned first), it defeats the whole point of having cryptocurrency.

It's an extremely environmentally inefficient way to screw people over, so please stop propping up cryptocurrency. Let it die.

Bitcoin isn't rocket science; a lot of people understand what it is and its potential benefits. It's just that people are confused about why we need this in the first place.

nicklaf
It would be interesting to ask whether or not Ted Nelson's vision for personal computing has been achieved. In the interview, personal computing was to be the means of liberation from a centralized priesthood on the one hand, and from ignorance (i.e., lack of access to the means of learning) on the other.

Personal computers are now ubiquitous, and we've been liberated from central priesthoods which had restricted access to (expensive) computers in the past. But new forms of centralization emerge, whether they arise in the form of bloated, draconian corporate software foisted on workers within a company, or as public products such as Google and Facebook, which have been gobbling up the open web and replacing it with means of control and manipulation. [1]

In fact, this latter concern of manipulation begs the question of what people really need computers for, and brings into question Ted Nelson's presumed answer to it: i.e., that people have an inherent need to use computers to facilitate their own creativity. But what about the possibility that the majority of people prefer to be passive consumers? Neil Postman warned that we would become a trivial society, rife with distraction, like in Huxley's Brave New World. [2]

Perhaps computers have succeeded in universally capturing our imagination, but corporations (centralization!) have once again captured computing, and by extension our imagination as well, as Tim Wu laments [1]: am I more likely to open up my personal computer at odd moments of the day to check Facebook or Hacker News to momentarily capture my imagination? Or have I instead acquired the habit of pulling out a tablet to accumulate further progress in a creative work, which perhaps only requires internet connectivity for the purpose of reference? And if the answer is the former, is the reason because we don't have actual Dynabooks [3], but instead a more asymmetric device skewed toward consumption and away from creative expression? Or is it because the majority of people in fact prefer passive consumption anyway?

[1] http://www.timwu.org/AttentionMerchants.html

[2] https://en.wikipedia.org/wiki/Amusing_Ourselves_to_Death

[3] https://en.wikipedia.org/wiki/Dynabook

miguelrochefort
The problem is that the computer doesn't help me figure out what to do next. It only provides a grid of icons, through which I cycle repeatedly until the real world grabs my attention.

What we need is an OS designed to help people get things done. The best such framework is probably Getting Things Done [1]. Build the 5 steps (Capture, Clarify, Organize, Reflect, Engage) right into the system, and you'll have a winner.

I've been repeating this idea for the past decade. Perhaps I should implement it myself.

[1] https://en.wikipedia.org/wiki/Getting_Things_Done

wool_gather
Interesting thought. Home assistant devices (Echo/Google Home) might be a first step in this direction, since they're entirely task-oriented. Not quite the same as a GTD system, of course: they just do a thing immediately. But the non-visual interface means that you can't idly flip through Facebook posts even if you tried.

On the other hand, they don't have any support for in-depth tasks. If the things you wanted to get done were writing a letter or editing a photo, you're not going to get far without a screen.

mwcampbell
> you're not going to get far without a screen.

Tell that to blind people. Not for editing a photo, of course, but they can certainly do other in-depth tasks without a screen.

wool_gather
Sure, but for a sighted person used to interacting with things visually, this is learning some entirely new mode of control. One of the strengths of computer GUIs is that they heavily leverage humans' most favored sense (for those of us that can use it).
8bitsrule
> whether or not Ted Nelson's vision for personal computing has been achieved.

If you listen to one of Ted's recent videos, you'll find the answer in the weeping and gnashing of teeth. (The dumbing down of what could-have-been.)

cornholio
A reference is appropriate: https://www.youtube.com/watch?v=gDrHkNgGQDs
bborud
The impression I got is that he is more of a technology philosopher than an actual technologist. He dreams up things, but he doesn't realize them and he hasn't managed to convince others to implement his vision either. In terms of software there is precious little to show for 50 years of thinking about this.

And it isn't like this is some recent brainwave. By the time the web rolled around he'd had 30 years to do something. And as the web came into being he had an excellent chance to make it a vehicle for his ideas, but he didn't.

One thing I remember from a (longish) chat I had with him is that some elements of his ideas exist today, but perhaps not where you would think about them. For instance in non-destructive media editing software you have concepts that remind me of some of his ideas.

I think the one key takeaway I got from his talk and the chat afterwards is that only what you do really counts. Ideas really are worth less though not worthless.

AndrewKemendo
> only what you do really counts. Ideas really are worth less though not worthless.

It's a matter of degree right?

So for example, Erdos didn't actually do anything - in the sense we're talking about - such as creating products or works of physical art. His mathematics are ideas. Which people then used, but, critically, referenced Erdos as their originator.

From what I can tell of Ted Nelson, the design and cybernetics ideas were obvious enough that many other people thought of them also, or were not so obscure as to be canonical. So like many ideas without fathers, it's whomever can create a product around them which get the credit.

So for example, "Steve Jobs invented the iPhone." Anyone with even a shallow historical understanding of handheld computing knows that this statement isn't true in any concrete sense, but because Apple, under Jobs' management, popularized the smartphone, the attribution goes as such. Many people thought of the smartphone, and wrote about it or depicted it in science fiction - so it was an obvious enough idea (by then) that execution is what mattered.

Interestingly, if you push all the way back to writers like HG Wells, they predicted much of the technology we see today including smartphones, but they aren't credited with "inventing" them or even being important in their development really. So the right time, right place, right idea is still the combination needed to stand out.

hindsightbias
> So the right time, right place, right idea is still the combination needed

And the right people.

bborud
> So for example Erdos didn't actually do anything

He most certainly did. In his field, doing the work which is then published in papers _is_ doing something. However, just having an idea for how to work out a mathematical proof, but not doing it would represent, well, not doing it.

I don't see how this would be unclear.

delbel
> But new forms of centralization emerge, whether they arise in the form of bloated, draconian corporate software foisted on workers within a company, or as public products such as Google and Facebook, which have been gobbling up the open web and replacing it with means of control and manipulation. [1]

This reminds me of a book that came out in 1996 called Silicon Snake Oil: Second Thoughts on the Information Highway, by Clifford Stoll. In it, he predicted the internet would all turn into crap. At the time it was the most contrarian viewpoint ever; I can't help but think I need to find a copy now, in light of the negative effect Facebook and mobile phones have had on society.

Before that he wrote an amazing book called The Cuckoo's Egg, which basically everyone in Silicon Valley from that era has probably read and can relate to.

nicklaf
> This reminds me of a book that came out in 1996 called Silicon Snake Oil: Second Thoughts on the Information Highway, by Clifford Stoll. In it, he predicted the internet would all turn into crap. At the time it was the most contrarian viewpoint ever; I can't help but think I need to find a copy now, in light of the negative effect Facebook and mobile phones have had on society.

Very interesting. The book certainly sounds like something I ought to read. In fact, perhaps while simultaneously reading David Gelernter's 1992 book, Mirror Worlds: or the Day Software Puts the Universe in a Shoebox...How It Will Happen and What It Will Mean, whose optimistic vision of an immersive virtual reality out of networked computers would make for a rather stark contrast!

ataturk
David Brin was pretty on the mark with "Transparent Society" as well. Brin had more optimism than what came to be, but he nailed the "cameras everywhere" status quo where people whip out smart phones to capture footage of an event now. It's just that he also nailed the Orwellian side of it, too.
KC8ZKF
I remember reading that book in 1996 and finding it incredibly elitist. His alternatives to the internet were visiting libraries, bookstores, and art film theaters, going to concerts, strolling through parks... But he was living in Berkeley!

The internet was a godsend to most of the population, who didn't live in the intellectual centers of the world. Still is.

Jedd
It invites or raises the question rather than begs it, but your point is understood.

You mention Tim Wu -- his book 'The Master Switch' focuses on the history of technologies as a cycle of invention --> egalitarian use & access --> centralised control and ownership by the elite.

My feeling is that things like the trends away from net neutrality globally, and towards greater government surveillance / suppression of encryption, and corporate interest in encouraging consumption via technology rather than (as per your example) 'creative work', are fulfilling that prediction in both obvious and subtle ways.

nicklaf
> It invites or raises the question rather than begs it, but your point is understood.

Hmm. I admit it's tenuously implied, but my thinking was that it does beg the question! Because what we're talking about is the ostensible resolution of Ted Nelson's dilemma of wanting 'generalized paper', but then observing that what people actually use it for in practice might lead us to question some of his original assumptions about what people need computers for after all. But OK, maybe this isn't quite correct usage of the term.

Thanks a lot for pointing that out about The Master Switch, which I should probably read.

redwood
I'm impressed that you're both familiar with the rhetorical roots of the term.
Obi_Juan_Kenobi
> trends away from net neutrality globally, and towards greater government surveillance / suppression of encryption ...

Is that a trend?

It strikes me that these are ongoing points of conflict, at least in particular with encryption. Whether it's illegal numbers, Wassenaar, or the UK sitting on a version of RSA for over two decades, keeping it top secret. Or the enduring controversy over TOR, a tool of government surveillance.

I think the only thing that's changed is the degree and immediacy with which policy related to these issues affects the public.

More generally, for all the 'consumption' on Facebook and Instagram, you also have Arduino and 'maker' culture, Soundcloud rappers, YouTubers of all stripes (from the old dude in his garage machine shop, inexplicably addressing an audience of tens of thousands a few times a week, to the flavor-of-the-month pop-drama garbage), an endless list of self-directed learning resources, wikis, etc.

I certainly see the merit in the 'Wu' observation, but it strikes me how robust and resistant the internet is -- has been -- to this erosion.

Jedd
> Is that a trend?

It feels like it - but I don't have a scatter plot that I can provide to prove it.

Snowden revelations felt like the start of the wave of awareness, at least amongst those with even just a modicum of interest and savvy.

In AU we tend to follow the trends (no matter how bad they are) set by other regimes. Two recent examples spring to mind. In 2017 a fairly horrific data retention law was passed that requires all ISPs to record and retain (for two years) all user metadata. In 2016 the AU Bureau of Statistics ran the latest census; there was a significant policy change regarding the long-term retention of this PII data, cynically announced a few days before Christmas the previous year (a profoundly quiet time in AU, as most take their summer holidays around then).

Watching news from the UK and the US - but less so the EU, at least - is discouraging in terms of the continued eroding of privacy and freedom. Separating large institutions such as FB / Google / Microsoft from nation states misses the point - their interests more often align with each other than with their citizenry / customers(products).

> More generally, for all the 'consumption' Facebook and Instagram, you also have Arduino and 'maker' culture ...

I love the arduino, but I don't think it's going to save us from the activities of bad actors.

> I certainly see the merit in the 'Wu' observation, but it strikes me how robust and resistance the internet is -- has been -- to this erosion.

Read the book. There's a trend of people thinking 'this is how it will always be' discovering later, too late, that that's not the case.

noblethrasher
> It invites or raises the question rather than begs it, but your point is understood.

By my reading, 'nicklaf used the phrase in the technically correct way.

But, as a recovering philosophy major, I'll just point out that the phrase "begging the question" is just a bad 16th century English translation of the Latin phrase "petitio principii" that literally meant "requesting a postulate", and which itself was a medieval translation of the original Greek phrases "τὸ ἐν ἀρχῇ αἰτεῖσθαι" and "τὸ ἐν ἀρχῇ λαμβάνειν", which meant, respectively, "asking the original point" and "assuming the original point."[1]

We say that the English translation is "bad" because the translators were jumping through hoops to avoid using the fancy Latin in favor of the vernacular (or vulgar) English.

I caucus with the folks who say that we should just use the Latin phrase, "petitio principii", as we do with other well-known fallacies.

[1] http://languagelog.ldc.upenn.edu/nll/?p=2290

deegles
> computers have succeeded in universally capturing our attention

This seems so ominous to me. It’s the end goal of the “attention economy,” right?

twic
> It would be interesting to ask whether or not Ted Nelson's vision for personal computing has been achieved.

Well:

I DON'T BUY IN

The Web isn't hypertext, it's DECORATED DIRECTORIES!

What we have instead is the vacuous victory of typesetters over authors, and the most trivial form of hypertext that could have been imagined.

http://essaysfromexodus.scripting.com/tedNelsonWebHypertext

KirinDave
> In fact, this latter concern of manipulation begs the question of what people really need computers for, and brings into question Ted Nelson's presumed answer to it: i.e., that people have an inherent need to use computers to facilitate their own creativity. But what about the possibility that the majority of people prefer to be passive consumers? Neil Postman warned that we would become a trivial society, rife with distraction, like in Huxley's Brave New World.

We're in a novel time to be introspective about the use of computers because in a very real sense, accessible information systems that treat all information equally have brought about an information overload apocalypse. With accessibility and creativity everywhere, we're starting to see what happens when any group of people can assemble social campaigns (bot networks spreading lies in social media frameworks) and achieve parity with the former "information priesthood" of centralized media.

It's so easy to make and share information that people are inundated with the material of ethos (with the lies it cherishes and truths followers cling to), language, and creed. While other culture shock events have occurred in history, the fundamental difference in the recent decade has been the lack of any central actor for legacy power systems to coerce or stomp out. Because of this, we're having to find ways to cope at the individual level with an excess of information, including unwanted information we don't even realize we're consuming.

I'm not sure Nelson or even Huxley truly foresaw this. To someone who has not grown up in a sea of information with algorithms helpfully (and even desperately) shoving information into our faces to try and satisfy our needs before we realize we have them, it's difficult to describe the idea of struggling not to be manipulated by subtle bias. Prior to the information saturation we enjoy now, such attempts to totally control information tended to look a lot more like North Korea's press. Aspects of this even exist in less centralized media, such as when and where western press decides to "tell both sides of the story" as opposed to simply informing a position as fact.

Counter to Huxley's rather puritan vision of us medicating and fornicating into irrelevance (which we've always done, let's be real), the idea that a minority viewpoint (e.g., "America should be a racially motivated literal monarchy") can be given the optics of having parity with ideas actually held by tens of millions of people seems to have energized people. Folks see a venue for increased social importance and control, and they're seizing it.

jdietrich
>But what about the possibility that the majority of people prefer to be passive consumers?

By any reasonable measure, we're in a renaissance of creativity. Most people in the creative industries are complaining of a total oversaturation of talent - too many bands, too many authors, too many artists, too many indie game developers, too many stand-up comedians.

Maker culture and 3D printing have created a consumer market for CAD/CAM software, which no-one in the industry foresaw even a decade ago. By some measures, musical instruments and recording equipment are a bigger market than the recording industry. It has never been cheaper or easier to turn an idea into reality; there's abundant evidence that people are grabbing that opportunity with both hands.

YouTube is full of crap, but it has also revealed a huge amount of demand for deep and meaningful content. My YouTube subs box is full of artisans, mathematicians, poets and monks. Some of them have seven-figure subscriber counts and five-figure Patreon revenue, some of them upload simply for the joy of sharing something that they love. It's easy to be cynical, but YouTube is a place of boundless magic and wonder.

The internet isn't perfect, but it's not all doom and gloom either. The internet is full of assholes calling each other nazis, but it's also full of people sharing skills and making meaningful connections. Sturgeon's Law isn't going away, but we really have removed the stultifying layer of gatekeepers that controlled access to the media. There's an uncomfortable degree of consolidation and centralisation on the modern internet, but MySpace and Snapchat stand as testaments to how fickle and fragile that power is.

To quote Ted Nelson in this interview: "The computer is a projective device, it is a Rorschach test. Anyone will see in it that which is of the most concern to him." If we choose to spend our time on the internet decrying the negative rather than contributing to the positive, we will get exactly what we deserve.

WJW
Interestingly, both problems could be happening _at the same time_! If before (say) 1% of all people got a chance to publish their work and the internet has increased that tenfold, there is still space for 90% of the population to be passive consumers. That 10% might still be 'too much' for some types of content of course (how many bands, standup comedians or indie game developers do you need before you hit diminishing returns anyway?)

In fact, I suspect that most people fall into both groups: They might be 'creators' for some types of content and only 'passively consume' some different types. For example, I sometimes contribute to OSS but have never uploaded a Youtube video in my life.

ataturk
These are all good and interesting observations, particularly the ones about Youtube. On the other hand, I still feel like something is very wrong with the Web and the Internet in general. I think it is the incessant propagandizing and ad-mongering. I can't even visit most websites these days without "shields up" on my Brave browser. The NY Times and Washington Post websites are full of slant, required by their owners. Journalistic ethics are completely out the window. Independent news sites are totally unreliable for different reasons, some only because of lack of experience(?). It's an unruly mess.

You mentioned CAD/CAM software for regular people--but the prices suggest these companies haven't gotten the idea yet.

These days, basic free-market, small-government, libertarian-minded people are referred to as Nazis. It is not at all comforting that such a label could be so badly misapplied while we witness the US Government dropping a bomb every 12 minutes in multiple undeclared wars, mostly for the sake of corporate profits, and yet people like me are the bad guys? I don't get that. It is all completely counter-productive.

KineticLensman
> By any reasonable measure, we're in a renaissance of creativity

There is a really interesting historical pattern here where new communications technologies create an initial splurge of independent creators, but after some time the technology is consolidated into a smaller number of platforms (by regulation or monopolisation or sometimes both) that provide the main route for content to its consumers. These platforms tend to stifle peer competitors and are only replaced by disruptive rivals. Creativity on the platform depends on how much it stifles the content, the classic example being the Hollywood film industry of the 1940s-1960s, which was highly vertically integrated and thus became an excellent vehicle for enforcing the so-called Hays Code [2], which defined morally acceptable content and attitudes.

This idea is explored really well in the 2010 book ‘The Master Switch’ by Tim Wu [1] who shows how this pattern is played out with Radio, Films, Television and latterly the Internet. It would be really interesting to see an update to this excellent book to reflect the consolidation of Google, Amazon, Facebook, etc since it was published.

My own concern is that because the internet is perhaps more pervasive than the prior media, any lockdown could have more severe chilling effects. As an extreme, consider future cloud-based compute appliances that don’t allow side-loaded software and whose content creation tools delete or report inappropriate language.

[Edit - just noticed a similar comment from Jedd below. Next time I should read a bit further before commenting!]

[1] https://en.wikipedia.org/wiki/Tim_Wu#The_Master_Switch

[2] https://en.wikipedia.org/wiki/Motion_Picture_Production_Code

Apr 14, 2018 · 18 points, 0 comments · submitted by codetrotter
My favorite Ted Nelson interview: https://youtu.be/RVU62CQTXFI

It gives a great perspective on Ted's prophetic vision for computers as well as the intense skepticism it raised in many.

mentos
Incredible. I agree with the top comment on that video, the interviewer was being pessimistic and not open minded but it served to make Ted Nelson expand and give very comprehensive answers.
Oct 27, 2017 · 2 points, 0 comments · submitted by mpweiher
As an adjunct, here's a journalist in 1979 struggling to comprehend Ted Nelson's vision that the computer would become a creative tool for the masses, https://www.youtube.com/watch?v=RVU62CQTXFI
From the same period, a radio interview where Ted Nelson tries to convince, unsuccessfully, a skeptical radio host that home computing has a future: https://www.youtube.com/watch?v=RVU62CQTXFI
Feb 14, 2017 · 1 point, 0 comments · submitted by davisr
Dec 23, 2016 · 3 points, 0 comments · submitted by da02
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.