Hacker News Comments on
Xerox Star User Interface (1982) 1 of 2

VintageCG · Youtube · 12 HN points · 16 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention VintageCG's video "Xerox Star User Interface (1982) 1 of 2".
Youtube Summary
First half of a demonstration of the Xerox Star user interface from 1982. The Lisa and Mac interfaces that would follow a couple years later are dismissed by some today as being merely a direct ripoff of Xerox's work...but in watching this it becomes apparent that Apple's UI was in fact an evolution of these concepts, not a 1:1 copy. The Star UI has little to no direct manipulation, nor any visual distinction for radio buttons or check boxes. In other places, such as networking and printing, the Star was many years ahead. The Star UI laid the foundation but it was not copied whole cloth as some will claim.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
> How did IBM and Apple take over and Xerox end up as a has-been also-ran?

They were too far ahead of their time. The component costs needed to come down first.

In 1981, the "starter kit" for their office system consisted of one Xerox Star for a user, a second Star to act as a file and print server, and a laser printer for about a quarter of a million dollars in today's money.

Each additional networked Star that you added was about $50,000 in today's money.

Even at those prices, the system wasn't exactly snappy. I do wonder what would have happened if they had kept iterating on the product until faster, cheaper components arrived.

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

AlbertCory
They did keep iterating on the product until faster, cheaper components arrived. They had a product called GlobalView, which lasted all through the 80s and early 90s. Of course I know some people who stuck with it.

What they couldn't wrap their heads around was: no one cared anymore. The world had moved on.

FullyFunctional
I don't think that was the fundamental issue. It was one of mindset. Xerox tried to sell to their existing customers, with the same sales structure and pricing they were used to. They simply weren't structurally set up to deal with mass-market consumers. The Macintosh team targeted a completely different set of customers, with a less ambitious and much cheaper product.

The IBM story is a bit of an outlier, but Microsoft saw the same potential Apple did.

TheOtherHobbes
I think it was both. No one in Xerox management understood that they could grow a new market - possibly with loss leader hardware sales - instead of trying to farm their existing market.

So the Alto was never properly productised. It was essentially a boutique product at a big-mini price. That made it competitive with DEC and IBM minis, but not with the budget $10k S100 systems that were just about - barely - powerful enough for a small business.

To be fair to Xerox, six figure systems weren't unusual for business customers at that time. Which could be why the company never understood that it had the key to popular budget semi- and non-pro commodity computing.

DEC died for much the same reason. DEC management were used to selling to fellow nerds and business people, and couldn't imagine super-affordable commodity products.

MS, Apple, and the IBM clone makers saw the new market for what it was. So did some of the new software houses.

IBM itself almost did, but couldn't quite shake its monopoly-oriented culture. So MS and the clones killed it in the commodity space.

ncmncm
The Alto was super-slow, even by standards of the time. It had a very, very limited CPU running an emulation of a more powerful ISA that app code was compiled to.

Things we expect to have dedicated hardware for -- network interface, display screen painter, disk interface, laser printer driver -- were just interrupt routines all on the one CPU. Programs competed for CPU time with feeding pixels out to the CRT-scanning electron beam, and with bits moving directly to and from disk-head R/W coils.

It did switch between those tasks pretty quickly, not like most machines then or today: there was no foolishness about saving and restoring registers. Each interrupt task was assigned a couple of registers permanently, including its own program counter, so an interrupt only ever just flipped which program counter was in charge.

The app-level instruction interpreter got to use what few registers remained.

So, on an interrupt, it would execute a half-dozen instructions to move one machine word, and yield. There were 16 PCs that ran in strict priority order, with the user-program instruction interpreter last in line. Each "yield" put the current PC back to the entry point for the next word to be processed, and the next live lower-priority PC determined the instruction to run next.

The OS was just one of those user programs.

It resembled the I/O processor on CDC mainframes of the time. It just ran the programs, too, not only the I/O.
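
A rough Python sketch of that scheduling scheme, just to make the idea concrete (the names and numbers are illustrative, not the real Alto microcode):

  # Each task keeps its own program counter forever; "switching" is just picking
  # which PC fetches the next microinstruction. Nothing is saved or restored.
  # The emulator task (lowest priority) runs whenever no higher-priority I/O task
  # has requested a wakeup.
  class MicroTask:
      def __init__(self, name, priority, program):
          self.name = name
          self.priority = priority      # higher number = served first
          self.program = program        # list of "microinstructions" (callables)
          self.pc = 0                   # this task's private program counter
          self.wakeup = False           # raised by (simulated) hardware

      def step(self):
          # run one short burst (here: a single instruction), then implicitly yield
          self.program[self.pc % len(self.program)]()
          self.pc += 1

  def run(tasks, cycles):
      emulator = min(tasks, key=lambda t: t.priority)   # always runnable
      for _ in range(cycles):
          ready = [t for t in tasks if t.wakeup]
          task = max(ready, key=lambda t: t.priority) if ready else emulator
          task.step()
          task.wakeup = False           # hardware raises it again when more data arrives

  # e.g. a "display" task the hardware keeps waking up, plus the instruction emulator
  tasks = [MicroTask("display", 15, [lambda: print("feed a word to the CRT")]),
           MicroTask("emulator", 0, [lambda: print("run one user instruction")])]
  tasks[0].wakeup = True
  run(tasks, 4)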

But it really is amazing what they got it to do. Later machines, e.g. the Dorado and Dolphin, had more stuff done in hardware and so could be faster.

Worth remembering: the original windowed interface on Xerox machines was a view into underlying system objects. It was designed around a unified vocabulary of interactions that allowed the user to message those objects and also to direct inter-object communication:

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

The Xerox -> Apple -> Microsoft interface transfer preserved nothing of those core concepts. UI became a crutch developers grudgingly added to the system for "those stupid users". Thus, most software engineers today are still convinced that a teletype emulation is the best possible interface to the underlying OS. Also, normal users are treated as second-class citizens in their own systems.

g4d
A recent example of this for me is using Abaqus (an FE solver). When using it to set up models I usually set all of the boundary conditions and contact interactions in a script. It wasn't until I was helping a colleague (who was using the GUI) with a problem that I realised some of the options just aren't available through the GUI.
swiley
> Also, normal users are treated as second-class citizens in their own systems.

I would argue that’s an artifact of corporate software development culture and not GUI design.

ChrisSD
Settings > System > Clipboard

Make sure "Clipboard history" is turned on. It will also tell you how to access the manager: "Press the Windows logo key + V".

Jun 05, 2019 · 2 points, 0 comments · submitted by heshiebee
Oct 09, 2018 · gambler on 12 Factor CLI Apps
Most people who claim that the command line is the best tool for power users know nothing about the history of GUIs. They think that the laughable garbage Windows calls a user interface is "how it's supposed to work", and proceed to smugly lecture everyone on how the command line is the only interface that can be composed and easily recorded.

(The following paragraph is not directed at the author of the parent post. It's fully rhetorical.)

Did you know that icons were supposed to be live representations of in-memory objects? That objects were more fundamental to the OS than files? Did you know that windows were views onto those objects? Did you know that interactions with and between icons were synonymous with OOP polymorphism?

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

And this isn't the best UI, it's simply the first modern UI.

twic
And how do you operate on the data in those objects? How do you filter, project, sort, stash for later, combine with data from elsewhere, edit, etc? How do you incrementally build up a composition of operations from repeated experimentation? How do you record that composition in a script?
gambler
>And how do you operate on the data in those objects?

You don't. Your question makes as much sense as asking how to do polymorphism in shell commands. Not operating on data was the whole point of OOP (or at least one of its key points).

http://userpage.fu-berlin.de/~ram/pub/pub_jf47ht81Ht/doc_kay...

  "I wanted to get rid of data. The B5000 almost did this via its 
  almost unbelievable HW architecture. I realized that the 
  cell/whole-computer metaphor would get rid of data, and that "<-" 
  would be just another message token (it took me quite a while to 
  think this out because I really thought of all these symbols as 
  names  for functions and procedures." -- Alan Kay
The way to integrate unrelated objects in a fully OOP UI would be by making them send messages to one another, either directly or indirectly. The way to store this integration for later use would be by creating, modifying and serializing an object that represents it.

Look into Pharo or Squeak, at least watch some demos on YouTube.
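
To make that last point a little more concrete, here is a small Python sketch of the idea (the class and method names are invented for illustration): two unrelated objects are integrated purely by sending messages, and the integration itself is an object that can be serialized and replayed later.

  import json

  class Document:
      def __init__(self, text):
          self.text = text
      def render(self):
          return self.text.upper()

  class Printer:
      def output(self, content):
          print("printing:", content)

  class Link:
      """Represents 'send the result of source.<getter> to target.<action>'."""
      def __init__(self, getter, action):
          self.getter, self.action = getter, action
      def run(self, source, target):
          getattr(target, self.action)(getattr(source, self.getter)())
      def save(self):
          return json.dumps({"getter": self.getter, "action": self.action})
      @classmethod
      def load(cls, blob):
          return cls(**json.loads(blob))

  # Build the integration once, store it, replay it later on any compatible pair.
  saved = Link("render", "output").save()
  Link.load(saved).run(Document("hello"), Printer())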

> At the time, it was a huge leap forward in desktop computing.

After seeing some old demos of Xerox Star, I'd question that.

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

The thing that horrifies me about all of this is that instead of criticizing or fixing issues, most people adapt what they do and even how they think to match all the deficiencies in software. Often, users simply can't imagine systems that work fundamentally differently and better. Often, when deficiencies are pointed out, people start defending the mess and pointing out all the clever (half-assed, really) workarounds and hacks they came up with. They are proud of all the time they spent (wasted) learning about obscure and counter-intuitive software functionality. They are proud of the barely-working plugins/extensions/add-ons they found and set up. All of that to do things that should be trivial to begin with.

One counter-point, though.

> I make no secret of hating the mouse.

If you look at the original uses of the mouse it was great. Especially in systems like Xerox Star. Star allowed people to perform complex tasks with almost no learning curve.

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

(Note how they weren't shy about using the keyboard either. There are dedicated hardware buttons for standard commands like copy, find, repeat, "properties" and even for common text editing actions. Meanwhile, our keyboards don't have dedicated keys for undo, redo, cut, copy and paste - operations that are used in almost every application today.)

Trouble is, we lost most of the driving ideas behind Xerox-style interfaces. Using a predefined set of generic, powerful commands. Object-oriented UI. Uniformity of representations. Modern systems have those things only as vestigial traits and in very limited contexts.

I don't think there have been any quantum leaps in conceptual UI design since Xerox PARC times. There were some minor improvements in very specialized apps and significant regression in software that's used universally. For example, phones and tablets almost completely lost drag-and-drop functionality and generic file UI.

For example, drag-and-drop is a very powerful concept, because it allows you to perform actions by combining things you already know about - and those "things" figure out how to interact in the best way possible. So, for example, instead of having N "Print" buttons in N applications you can have a single drag-file-onto-printer-icon action that does different things based on the type of the file. [BTW, this is also the key idea behind the original notion of OOP.] Unfortunately, that's not how it works in modern UIs. They don't use either the keyboard or the mouse to their full potential.
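
A tiny Python sketch of that drag-onto-printer idea, with made-up class names: one generic "drop" verb, and each kind of document decides for itself what printing means.

  class Spreadsheet:
      def print_to(self, printer):
          printer.emit("rendering cells as a table")

  class Photo:
      def print_to(self, printer):
          printer.emit("rasterizing the image at printer resolution")

  class PrinterIcon:
      def emit(self, message):
          print(message)
      def drop(self, obj):
          # one action, many behaviours: the dropped object knows how to print itself
          obj.print_to(self)

  icon = PrinterIcon()
  icon.drop(Spreadsheet())   # -> rendering cells as a table
  icon.drop(Photo())         # -> rasterizing the image at printer resolution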

snerbles
People get used to doing things a certain way. When they have to overcome a terrible interface, an attachment forms. It's no different than the pride of those that live in physically harsh environments.

Habituation is a powerful thing.

Aug 11, 2017 · 2 points, 0 comments · submitted by ak39
Way too much to cover in depth. In short, it's all terrible. We're mostly using tools from the 70s, and we're still to this day plagued by use-after-free and out-of-bounds errors, null-pointer exceptions, awful debuggers, race conditions, slow compilation times, and bad primitives for multithreaded work, just to name a few. We've known for decades that we can do better, and ways to do better have been studied in depth by academia.

That's just the basics. Then there are the futuristic things: Why can't we move running processes from one physical machine to another? Where are the debuggers that allow you to go back and forward in time? Why can't we visualize the data structures in our program and trivially change the data layout to see how that affects performance?

When I look at the demo of the Xerox Star interface (1982) and compare that to where we are today it's pretty clear to me we messed up big time somewhere along the road: https://www.youtube.com/watch?v=Cn4vC80Pv6Q

TeMPOraL
> We're mostly using tools from the 70s

We're using the shittier tools from the 70s, because the better tools died off - which is sad, because we still haven't managed to recreate some of the features those tools offered. The success of UNIX and C seems to be caused by the same phenomenon that causes Electron to succeed and bloat to proliferate - what wins aren't good solutions, but those which get popular quickly.

chii
> aren't good solutions, but those which get popular quickly.

But by the judgement of a majority, those tools are good! That's why they got popular. Any personal sense of aesthetics from you (or anyone) doesn't matter at all here, and most people who call tools that failed "good" are only judging them by personal aesthetics.

TeMPOraL
I'm not talking aesthetics here; I'm talking about shitty engineering. Often legitimized by nonsense phrases like "worse is better" these days.
nostrademons
For people who have an exceptionally high tolerance for learning what people actually care about and why the shitty technological solutions won in the marketplace, this is probably an opportunity.

When I entered the workforce in 2005 after learning all about bygone languages (Common Lisp, Smalltalk, Dylan, Haskell, Erlang, Ocaml, etc.) in college, it was really, really painful going back to Java. Java had gotten so many fundamental language features wrong when better alternatives were well-known in research literature. Monitors instead of CSP, null types instead of Maybe, inner classes instead of closures, manifest typing instead of type inference, no first-class functions, mutability everywhere, conflation of modules & classes, not even a semblance of pattern-matching. But I could use it to write software that performed well, that could be deployed with tooling that everybody used, that had a wide base of available libraries, that had IDEs and other tooling available, and that other programmers could maintain.

It's 12 years later, and all of my recent projects have been in some combination of Swift, Rust, Kotlin, or ES6+React. And most of these languages have: promises & message-passing instead of explicit locks; Optional types or at least some form of null-safety; lexical closures; type inference; first-class functions; immutable-by-default; module systems that don't require that you shove everything into a class; and pattern-matching. But they also did the hard work of figuring out an interoperability story with some large existing body of code; of writing decent package managers and easy build systems; of making the language reasonably performant for most everyday tasks; and of (in Swift & Kotlin's case) providing full support within an IDE. All the language features that were common in research languages of 2005 but missing in industrial programming of the time are now very usable in the industrial programming of now.
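
(For readers who never used those features, here is what two of them - explicit optional results instead of nulls, consumed with pattern-matching - look like, sketched in Python 3.10+ for brevity rather than in any of the languages named above:)

  from dataclasses import dataclass

  @dataclass
  class Found:
      value: str

  @dataclass
  class Missing:
      pass

  def lookup(table: dict, key: str) -> Found | Missing:
      # the absence of a result is an explicit case, not a null that can be forgotten
      return Found(table[key]) if key in table else Missing()

  match lookup({"name": "Star"}, "name"):
      case Found(value):
          print("got", value)
      case Missing():
          print("not there")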

I wouldn't be surprised to see a similar renaissance in tooling over the next decade, with exotic features like time-traveling debuggers, fully-inspectable systems, and migratable processes becoming standard in the next generation of IDEs.

Feb 08, 2017 · pjmlp on Grappling with Go
Fair enough, so prepare for a history lesson. :)

Mesa/Cedar was created at Xerox PARC, as an extension of the Xerox Star 8010 system, which was programmed in Mesa, the language that inspired Niklaus Wirth to create Modula-2.

http://www.digibarn.com/collections/systems/xerox-8010/

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

https://www.youtube.com/watch?v=ODZBL80JPqw

Mesa evolved to Mesa/Cedar, which allowed for automatic memory management via reference counting and a tracing collector for the cycles.

https://archive.org/details/bitsavers_xeroxparcteCedarProgra...

The environment had a REPL and debugger inspired by the Smalltalk and Lisp systems at Xerox.

Think of Swift Playgrounds, kind of.

This system inspired Niklaus Wirth to create Oberon; you can read about how the OS was built in his book.

http://www.ethoberon.ethz.ch/books.html

In 2013, Niklaus Wirth revised the language and updated the book for an FPGA-based computer.

http://people.inf.ethz.ch/wirth/ProjectOberon/index.html

There were several generations of the Oberon OS; the last one before it stopped being relevant was the Bluebottle OS, coded in the Active Oberon variant.

http://www.progtools.org/article.php?name=oberon&section=com...

It had an unusual UI, using a mix of text and mouse input. Basically, all applications were dynamic modules, and procedures/functions with a special signature could be invoked from the UI.

So you could select text or objects from applications, and invoke such operations on the selection.
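
A loose Python sketch of that command mechanism (the names are invented; the real thing resolves "Module.Procedure" names in compiled Oberon modules): any registered procedure can be invoked from the UI on the current selection.

  commands = {}

  def register(name, proc):
      """Expose a procedure to the UI under a 'Module.Procedure' style name."""
      commands[name] = proc

  def upper(selection):
      print(selection.upper())

  def count(selection):
      print(len(selection.split()), "words")

  register("Edit.Upper", upper)
  register("Edit.Count", count)

  def invoke(name, selection):
      # what clicking a command name in a text would trigger
      commands[name](selection)

  invoke("Edit.Upper", "the star user interface")
  invoke("Edit.Count", "the star user interface")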

The Oberon System 3 and Oberon V4 with their Gadgets framework were the best experience, before AOS was designed.

The most relevant OS written in Modula-3 was the SPIN OS:

http://www-spin.cs.washington.edu/external/overview.html

There were a few others, this is just a small sample.

Sep 02, 2016 · sedachv on PC-BSD Evolves into TrueOS
IMO the Xerox Star UI and NeXTSTEP/GNUstep:

https://www.youtube.com/watch?v=Cn4vC80Pv6Q https://www.youtube.com/watch?v=uXHFfc3btCI

IncRnd
As a registered NS developer, I had a NS Turbo Color. What about NS do you find superior to OS X?
sedachv
Less chrome, no shadows, no rounded corners. Scroll bars on the left side. Two button mouse (not three, but good enough).
protomyth
Not OP, but also an NS developer (custom-built Pentium), and I must say I like the menu system, digital librarian, workspace manager, and EOF much better. A separate Interface Builder was much better, but that's more of a developer tool thing.

[edit: left side scroll bar also]

IncRnd
For sure I liked the separate IB/PB. Good call! It was faster to develop in NS than in anything else for me.

At the time, I loved the digital librarian and menu system. After leaving and going back a few years later the WS seemed a little too dated for me. I use the context menus in BlackBox on my linux box. OS X seems to be my comfort zone now, which tends to color memory. Though I also found the IB/PB split much faster to work with. That's for sure.

You've got me fired up. I'm installing 4.2 ent in virtualbox now, which will be the newest version I've used :) Thank you!

protomyth
I still have my 3.3 discs and the 4.x discs with the wall of documentation. I'm tempted to install it again, if for nothing else to get at some very old e-mail I sent.

I think some of the "dated" is that it wasn't very high resolution, or put another way, it really didn't get to evolve into the high resolution / anti-aliased world we live in today. I've seen multiple attempts at redoing the icons in a modern setting and many of them look great. I particularly like the old folders. The menus suffer from the same problem, but more from a modern font choice scenario.

For me, OS X stripped a lot of the NeXT goodness to appease Mac lovers. I'm just not a fan of a UI designed for a 9" screen that really should have a better menu system. Recently, I've seen some 3rd party software that simulates part of the old NeXTSTEP menu system. Sadly, it doesn't seem to do the tear-offs.

I guess OS X is loads better than Windows, but it sure isn't my favorite. My two big "not working" projects are an agent-oriented programming language and some form of modernized NeXTSTEP UI on OpenBSD. Sadly, I'm more likely to do the former rather than the latter.

IncRnd
I installed it last night, making sure to get the patches in order to use VESA not just SVGA. That gave me color. NS sure is beautiful. My color slab wasn't this good looking, since the slab had 16 bits of color.

The dated part for me is having to set up the OS for some workflows that have come about since NS days - things like scroll wheels not getting recognized out of the box, and other polish. WS is great, though I'd change the Drag and Drop just a little. My memory of WS must have tarnished or faded some over the years, because it was and is really easy to work with.

I'll likely install the developer cd this weekend.

Thanks, again, for getting me to reinstall.

On a side note, there are commercial projects that use GNUstep. I use Hopper, a disassembler, that uses GNUstep.

From the developer training I still remember one statement in particular, "Even if NS isn't around in a few years, there will be a company that deeply understands what we do and they will do it." I guess that prophetically was OS X (at least in part).

protomyth
What guide did you use to do the install?
IncRnd
This one looks pretty good. http://openstep.bfx.re/

I haven't been able to figure out the networking. I have the ne2k driver, but it doesn't seem to install. I suppose the copy I have is somehow corrupted.

karma_vaccum123
You can run a variant of GNUstep today, yet no one does.
sedachv
Before switching to Lubuntu a few years ago, Window Maker was all I ran. LXDE is unobtrusive enough that I don't mind it.
cwyers
That's not timeless, that's very very timed.
Aug 07, 2016 · 3 points, 0 comments · submitted by mouzogu
Apr 10, 2014 · pjmlp on Shadow DOM
Mainly videos

Self http://www.youtube.com/watch?v=Ox5P7QyL774

Xerox Star User Interface http://www.youtube.com/watch?v=Cn4vC80Pv6Q http://www.youtube.com/watch?v=ODZBL80JPqw

Tektronix 4404 Smalltalk http://www.youtube.com/watch?v=8yxCJfayW-8

The Smalltalk-80 Programming System http://www.youtube.com/watch?v=JLPiMl8XUKU

The Final Demonstration of the Xerox 'Star' computer http://www.youtube.com/watch?v=_OwG_rQ_Hqw

Symbolics Hypertext Document http://www.youtube.com/watch?v=7DxYj32cvoE

Ygg2
Thanks :D
Except the original Lisp and Smalltalk environments are much more than a simple REPL.

Sometimes I wish people would learn about computing history.

Presentation from Kalman Reti about Lisp Machines - check the interactivity starting at 00:44:00 and how to introduce mouse-sensitive images at 01:00:00:

http://www.loper-os.org/?p=932

Xerox presentations about Smalltalk-80

http://www.youtube.com/watch?v=JLPiMl8XUKU

http://www.youtube.com/watch?v=Cn4vC80Pv6Q

http://www.youtube.com/watch?v=ODZBL80JPqw

est
For me "more-than-REPL" just looks like a code visualization tool where you can jump between symbolics. Mouse sensitive images looks just like a prototype of AutoCAD

Screenshots, if anyone is interested:

http://i.imgur.com/FjaDFgi.png

http://i.imgur.com/sGNN3la.png

Bret Victor's work is much, much more impressive: you can change variable values and see results in real time. In that Mario game example, you can see Mario's trajectory and adjust values to see how physical parameters affect the height and distance Mario can jump, and modify the gravity ticking equation to see Mario jump and walk upside-down.

lispm
You could do that on a Lisp Machine, too. Symbolics sold a complete interactive animation and game development system. Nintendo used it in the early years. The software later got ported.
pjmlp
I am not saying his work is not great, far from it.

What I mean is that these ideas were already present in such environments.

The "more-than-REPL" stated by me, means that these environments allowed for an interactive type of work that went further than a simple textual REPL.

Many young HN readers tend to associate the REPL with the purely textual version they have access to while using vi and emacs on their UNIX boxes.

Xerox Star User Interface (1982) -- http://www.youtube.com/watch?v=Cn4vC80Pv6Q
simonh
Which doesn't have any of the features I listed. E.g. look at those windows - they can't be moved, resized or overlapped.
Aug 29, 2012 · 4 points, 0 comments · submitted by shawndumas
Yep, Xerox never invented the GUI. This entire product and all promotional videos and memories of it existing are a complete fabrication:

http://www.youtube.com/watch?v=Cn4vC80Pv6Q

You've made similar posts in the past, and they've been similarly filled with rambling inaccuracies. I'll just leave it at that.

nirvana
I never said this video was a fabrication, or that Xerox never invented any key technologies, like, for instance, the Mouse.

Pretending that I did is a bald-faced lie on your part. You should consider the fact that, if you have to rely on outright fabrication about the position of your opponent, you don't have a very strong position to stand on yourself.

At best you're knocking down a strawman.

Either you're too ignorant of the history of computer science to understand what a GUI is, or you're simply being dishonest. Either way, I find the tendency of people such as yourself to post lies and then links to "supporting evidence" that don't even address the issue, while down voting everyone you disagree with to be... less than compatible with useful discussion.

The thing I learned a long time ago about liars is this: It doesn't matter how many times you prove them wrong, they'll just make up another lie!

Pewpewarrows
So are you disputing the part that the Xerox product was graphical? Or that it was a user interface? In what possible definition that you've contrived in your mind does what I just linked you not qualify as a GUI?

I honestly can't tell if you're trolling or not.

nirvana
The Apple II came with video built in. It also had a user interface in the form of a command line. Thus, when people used the computer, graphics were drawn on the screen, sometimes in the form of pictures (such as for games) and sometimes in the form of glyphs and even icons, to represent concepts to the user of the command line. Your position implies you believe this to be a "Graphical User Interface".
howeyc
The problem here is that you're discounting that these things are incremental in nature and saying the invention happened at Apple.

Where does the GUI start? At the ncurses-like menu-driven interface? At the point where this interface gains more than two colors? At the point where there are icons? When there is a menu? When there is a paint-like drawing tool? When there is an apple logo on it?

Some draw that line at Xerox, you draw that line a bit later and attribute the "ground-breaking invention" of a GUI to Apple.

I don't see it either way; they are all refinements along the continuum from the command line to the GUIs we have today.

Apple happened to hit the mix that a large number of people grew to embrace, and some dismiss all that came before as garbage and attribute the win to Apple. Which I think is extremely misguided, but what can you do but shrug it off.

Not sure if I am replying to a troll (plus much of the above is copy/paste or very similar to an earlier comment from you).

>they've just heard the lie that Apple stole the GUI from Xerox (impossible since there was no GUI at Xerox at the time, in fact)

The Apple haters also seem to be uploading fake videos of an old Xerox GUI on YouTube! /sarcasm

http://www.youtube.com/watch?v=AYlYSzMqGR8

http://www.youtube.com/watch?v=Cn4vC80Pv6Q&feature=relat...

> Both Apple and Google release the operating system as open source, and both Apple and Google keep as closed source the application layer where their proprietary apps live. For Apple, that's the UI, for google that's the Google Apps

Sorry, Google Apps is not an application layer, however much you wish it were.

myspy
They had no real GUI in the '70s; it was a terminal with mouse control. The GUI came at the end of the '70s / beginning of the '80s, and the guys from Apple came over at that time to see the concepts and prototypes.

From there, they started their own GUI metaphor.

I like the part in the video, when the guy inserts this big magnetic storage :D

As for Apple, the kernel is open source, but the whole Cocoa layer is closed. As for Android everything must be open, apart from the Google applications. Or not?

There are a few demos of the Xerox Star floating around on YouTube. Seeing it in action gives you a much more complete picture of the system.

http://www.youtube.com/watch?v=Cn4vC80Pv6Q

The "look" was obviously hugely influential, the "feel" not so much. It is quite an odd beast, as the mouse is actually only used for selecting objects.

The Lisa GUI prototypes are also an interesting bit of GUI trivia. It's easy to see the Star's influence on the final shipping version of the Lisa GUI compared to its prototypes.

http://www.jeremyreimer.com/apple_screens.html

pavlov
> It is quite an odd beast, as the mouse is actually only used for selecting objects.

It's an interesting design. The mouse appears to be strictly reserved for establishing a source object and a target location on-screen. The action to perform using these two inputs is determined by physical buttons on the keyboard -- Copy, Move, Show Properties...

(It is slightly confusing because some operations don't need two inputs. Does "Show Properties" act upon the selected source object, or whatever is under the location of the mouse cursor?)
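
A schematic sketch (in Python, with invented names) of that noun-verb flow: the mouse only establishes the selection, and sometimes a destination, while a dedicated key supplies the verb.

  class Workstation:
      def __init__(self):
          self.selection = None           # set by pointing with the mouse

      def select(self, obj):
          self.selection = obj

      def press(self, key, destination=None):
          if key == "COPY" and destination is not None:
              destination.append(self.selection)        # two-input verb: selection + target
          elif key == "PROPS":
              print("properties of", self.selection)    # one-input verb, acts on the selection

  desk, folder = Workstation(), []
  desk.select("Quarterly Report")
  desk.press("COPY", destination=folder)   # noun first (mouse), then the verb key
  desk.press("PROPS")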

The mouse-oriented design that won in the market (introduced with Mac, adopted by Windows) ended up avoiding the keyboard in favour of much more complicated mouse action gestures such as double-clicking, drag'n'drop, right-click menus.

I wonder if the Star design would have been more user-friendly in the end. When I've helped older people with their computers, it seemed that they don't make use of drag'n'drop or right-click menus at all. With this limitation, they couldn't do something like duplicating a document without starting the producer application and performing a "Save As" from its main menu.

radiowave
The idea of "select things with the mouse, perform operations with keys under your left hand" goes back to Doug Engelbart's NLS system. The Xerox folks will undoubtedly have known about this work.

From 1968... http://www.youtube.com/watch?v=JfIgzSoTMOs

guelo
> The mouse-oriented design that won in the market (introduced with Mac, adopted by Windows) ended up avoiding the keyboard in favour of much more complicated mouse action gestures such as double-clicking, drag'n'drop, right-click menus.

Nitpicking here, but right-click was obviously not introduced with the single-button Mac mouse. And the awful double-click gesture was made necessary because of this single-button mouse, the original Apple sin.

pavlov
Sure. What I meant was that Apple diverged from the Xerox UI path by designing the Mac around the mouse, so that the keyboard was not necessary for any actions. Microsoft basically took up that idea wholesale, simply extending it with more widespread keyboard shortcuts (Alt key for menu accelerators) and new mouse-centric gestures like right-clicking (borrowed from elsewhere, of course).

In PC history, there were two opportunities to redesign the input hardware for a new kind of GUI...

First was in 1987 when IBM introduced the PS/2. It was a play for IBM to regain control of the wild west PC clone market, and failed as that, but the PS/2 keyboard and mouse connector standards lived on for more than 15 years. If IBM had revamped the keyboard at that point with something like the Star action buttons, they would have become a part of the OS/2 GUI and subsequently Windows.

Another junction was in 1995, when Microsoft launched Windows 95 and introduced new keyboards with the Windows key. At this point Microsoft carried so much weight with OEMs that I think they would have been able to push through something more radical on the input device front, if they'd wanted.

HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.