HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
The Most Powerful Computers You've Never Heard Of

Veritasium · Youtube · 131 HN points · 9 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Veritasium's video "The Most Powerful Computers You've Never Heard Of".
Youtube Summary
Analog computers were the most powerful computers for thousands of years, relegated to obscurity by the digital revolution. This video is sponsored by Brilliant. The first 200 people to sign up via https://brilliant.org/veritasium get 20% off a yearly subscription.

Thanks to Scott Wiedemann for the lego computer instructions – https://www.youtube.com/watch?v=5X_Ft4YR_wU

Antikythera Archive & Animations ©2005-2020 Images First Ltd. https://www.youtube.com/watch?v=1ebB0tyrMa8 "The Antikythera Cosmos" (2021) follows the latest developments from the UCL Antikythera Research Team as they recreate a dazzling display of the ancient Greek Cosmos at the front of the Antikythera Mechanism.

Tides video from NASA – https://climate.nasa.gov/climate_resources/246/video-global-ocean-tides/

Ship animation from this painting – https://ve42.co/Agamemnon

Moore’s Law, the op-amp, and the Norden bombsight were filmed at the Computer History Museum in Mountain View, CA.

▀▀▀
References:

Freeth, T., Bitsakis, Y., Moussas, X., Seiradakis, J. H., Tselikas, A., Mangou, H., ... & Edmunds, M. G. (2006). Decoding the ancient Greek astronomical calculator known as the Antikythera Mechanism. Nature, 444(7119), 587-591. – https://ve42.co/Freeth2006
Freeth, T., & Jones, A. (2012). The cosmos in the Antikythera mechanism. ISAW Papers. – https://ve42.co/Freeth2012
Cartwright, D. E. (2000). Tides: a scientific history. Cambridge University Press. – https://ve42.co/tides
Thomson, W. (2017). Mathematical and physical papers. CUP Archive. – https://ve42.co/Kelvinv6
Parker, B. B. (2007). Tidal analysis and prediction. NOAA NOS Center for Operational Oceanographic Products and Services. – https://ve42.co/Parker2007
Parker, B. (2011). The tide predictions for D-Day. Physics Today, 64(9), 35-40. – https://ve42.co/Parker2011
Small, J. (2013). The Analogue Alternative. Routledge. – https://ve42.co/Small2013
Zorpette, G. (1989). Parkinson's gun director. IEEE Spectrum, 26(4), 43. – https://ve42.co/Zorpette89
Tremblay, M. (2009). Deconstructing the myth of the Norden Bombsight (Doctoral dissertation). – https://ve42.co/Tremblay
Gladwell, M. (2021). The Bomber Mafia. Little, Brown and Company. – https://ve42.co/Gladwell2021
Mindell, D. A. (2000). Automation’s finest hour: Radar and system integration in World War II. Systems, Experts, and Computers: The Systems Approach in Management and Engineering, World War II and After. Edited by A. C. Hughes and T. P. Hughes, 27-56. – https://ve42.co/Mindell
Haigh, T., Priestley, M., & Rope, C. (2016). ENIAC in Action. The MIT Press. – https://ve42.co/Eniac2016
Soni, J., & Goodman, R. (2017). A mind at play: how Claude Shannon invented the information age. Simon and Schuster. – https://ve42.co/Soni
Haigh, T. & Ceruzzi, P. (2021). A New History of Modern Computing. The MIT Press. – https://ve42.co/ModernComputing
Rid, T. (2016). Rise of the Machines: a Cybernetic History. Highbridge. – https://ve42.co/Rid2016
Ulmann, B. (2013). Analog computing. Oldenbourg Wissenschaftsverlag. – https://ve42.co/Ulmann2013

▀▀▀
Special thanks to Patreon supporters: Dmitry Kuzmichev, Matthew Gonzalez, Baranidharan S, Eric Sexton, john kiehl, Daniel Brockman, Anton Ragin, S S, Benedikt Heinen, Diffbot, Micah Mangione, MJP, Gnare, Dave Kircher, Edward Larsen, Burt Humburg, Blak Byers, Dumky, Evgeny Skvortsov, Meekay, Bill Linder, Paul Peijzel, Mac Malkawi, Michael Schneider, Ludovic Robillard, jim buckmaster, Juan Benet, Ruslan Khroma, Robert Blum, Richard Sundvall, Lee Redden, Vincent, Stephen Wilcox, Marinus Kuivenhoven, Clayton Greenwell, Michael Krugman, Cy 'kkm' K'Nelson, Sam Lutfi, Ron Neal

Written by Derek Muller, Stephen Welch and Emily Zhang
Filmed by Derek Muller, Emily Zhang and Raquel Nuno
Animation by Fabio Albertelli, Jakub Misiek, Mike Radjabov, Iván Tello, Trenton Oliver
Edited by Derek Muller
Additional video supplied by Getty Images
Music from Epidemic Sound and Jonny Hyman
Produced by Derek Muller, Petr Lebedev and Emily Zhang
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
It's a big area of research in AI.

Veritasium has two great videos on it:

https://www.youtube.com/watch?v=GVsUOuSjvcg

https://www.youtube.com/watch?v=IgF3OX8nT0w

I found a hackernews discussion for the second one: https://news.ycombinator.com/item?id=29645610

Basically, we can do things like matrix operations and differential equations using specialized hardware that performs computation over analog signals rather than building discrete systems. The end result is that we don't need anywhere near the level of compute to do the job, since it's done using an analog of the physical process.

Here are two YouTube videos that explain it much better than I could. https://www.youtube.com/watch?v=IgF3OX8nT0w&ab_channel=Verit... https://www.youtube.com/watch?v=GVsUOuSjvcg&ab_channel=Verit...

Suffice it to say, your average housefly exhibits behavior that absolutely dominates our best drones, and with a fraction of the circuitry.

In the intro to this video he mentions analogue computers predicting tides. His video before this one[1] goes into detail on that, and it was incredible.

In the 1800s, Lord Kelvin spent years working on tidal rise and fall patterns, applying Fourier analysis to break them down into ten individual sine waves, then combining those sine waves back together to predict future tides. He built analog computers to do all the integration, multiplying and summing involved. It's a part of the history of computing I'd never heard of, despite hearing quite a lot about the dawn of electro-mechanical computers in the early 1900s.

[1] https://www.youtube.com/watch?v=IgF3OX8nT0w
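
As a rough sketch of what the machine computed (the amplitudes and phases below are made up; real ones come from fitting a particular port's tide-gauge record, and Kelvin's machine summed ten constituents rather than four), predicting a tide is just summing a handful of constituent cosines:

  import numpy as np

  # Hypothetical harmonic constants: (amplitude in metres, speed in degrees
  # per hour, phase in degrees). The speeds are the standard astronomical
  # ones; amplitudes and phases here are invented for illustration.
  CONSTITUENTS = {
      "M2": (1.20, 28.9841042, 110.0),  # principal lunar semidiurnal
      "S2": (0.40, 30.0000000,  95.0),  # principal solar semidiurnal
      "K1": (0.30, 15.0410686, 200.0),  # lunisolar diurnal
      "O1": (0.20, 13.9430356, 185.0),  # lunar diurnal
  }

  def tide_height(t_hours, mean_level=2.0):
      # h(t) = Z0 + sum_i A_i * cos(w_i * t + phi_i); each term was one
      # pulley on Kelvin's machine, and the rope did the summing.
      h = mean_level
      for amp, speed, phase in CONSTITUENTS.values():
          h += amp * np.cos(np.radians(speed * t_hours + phase))
      return h

  for t in np.arange(0, 25, 3):
      print(f"t = {t:4.1f} h   height = {tide_height(t):5.2f} m")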

ggm
The machines (or later derivatives) are in the Science Museum. One variant uses cones; I imagine they allow a huge range of variation in the parameters of the Fourier transform. Another uses pulleys, but changing out a pulley is a lot more work than moving the point where two cones are coincident, rubbing to transfer motion.

Along with a Meccano numerical analyser, and bits of Babbage's original work.

The product presented: https://the-analog-thing.org/ made by https://anabrid.com/

The company presented: https://www.mythic-ai.com/

This is the second video (of a series of two) by Veritasium about analog computers. The first one was https://www.youtube.com/watch?v=IgF3OX8nT0w and discussed on Hacker News at https://news.ycombinator.com/item?id=29645610

This was an excellent video on how this all works:

https://www.youtube.com/watch?v=IgF3OX8nT0w

The author mentions they were inspired by the Veritasium video on the same topic (without the Lego). [1] I thought I'd link the previous HN discussion. [2]

[1] https://www.youtube.com/watch?v=IgF3OX8nT0w

[2] https://news.ycombinator.com/item?id=29645610

You might be interested in the analog computers that were used to compute tide tables back in the day. It's quite fascinating how they even managed to do basic Fourier transforms without electricity:

https://www.youtube.com/watch?v=IgF3OX8nT0w

There is an upcoming Veritasium video about the "comeback of analogue computers".

Part one (teaser): https://www.youtube.com/watch?v=IgF3OX8nT0w

sshlocalhost98
Yup, I saw it. What is he actually trying to say? How can an analog computer supersede digital, when analogue is built too specifically for only one task?
kortex
If that "1 task" happens to be performing matrix multiplication (or even merely fused multiply add), you can do a heck of a lot with that. You still need digital circuitry to support the IO, but the key idea is doing linear algebra in a way that is faster and/or generates less heat per unit compute.
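
For a concrete sense of why that helps, here is a toy numerical model of the textbook crossbar idea (not any vendor's actual design): the matrix is stored as conductances, inputs arrive as voltages, and Ohm's and Kirchhoff's laws perform the whole multiply-accumulate in one physical step:

  import numpy as np

  # Toy model of an analog matrix-vector multiply on a resistive crossbar.
  # Each crosspoint passes current I = G * V (Ohm's law), and the output
  # wires sum those currents "for free" (Kirchhoff's current law).
  rng = np.random.default_rng(0)
  G = rng.uniform(0.0, 1.0, size=(4, 8))  # conductances: the stored weights
  V = rng.uniform(-1.0, 1.0, size=8)      # input voltages

  I = G @ V  # what the analog array computes in a single physical step

  # A digital chip does the same job as 4*8 multiplies plus 4*7 adds:
  I_digital = np.array([sum(G[r, c] * V[c] for c in range(8)) for r in range(4)])
  assert np.allclose(I, I_digital)
  print(I)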
JumpCrisscross
We have tonnes of computing power dedicated to repeatedly solving a simply-parameterised problem.
rawoke083600
Stupid question... with today's memory capacities, at what limit/size do we stop using matrix ops and simply use lookup tables to do 'matrix math'?
adgjlsfhk1
TL;DR: you can't. For a very simple example, storing the products of all 3x3 matrices with all length-3 vectors in Float16 precision would take on the order of 2^193 bytes (which is obviously impractical).
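
The back-of-the-envelope arithmetic behind that figure (my reconstruction; it lands within a couple of powers of two of the number above, depending on how you count the result bytes):

  import math

  # 3x3 matrix + length-3 vector, all Float16: (9 + 3) * 16 = 192 input bits,
  # so the lookup table needs one slot per distinct input...
  input_bits = (9 + 3) * 16
  entries = 2 ** input_bits
  bytes_per_entry = 3 * 2  # ...each holding a length-3 Float16 result.
  total_bytes = entries * bytes_per_entry
  print(f"~2^{math.log2(total_bytes):.1f} bytes")  # ~2^194.6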
jeffbee
Not a stupid question. Economically, memory density will hit a brick wall soon. Developers should prefer to waste time and save space, since parallel computation will not hit a similar limit in the foreseeable future. Memory-to-core ratio is going to be falling.
aidenn0
Your CPU has dedicated circuitry for performing CRC32 and AES rounds. That's as specific as any analogue computer...
littlestymaar
And your mobile SoC likely has dedicated H.264, H.265 and VP9 circuits, so the approach works for quite complicated tasks too.
Dec 22, 2021 · 129 points, 77 comments · submitted by jdkee
amatic
To me, the greatest appeal of analog computing is in representing equations in mechanical, visual, directly modifiable, 'hands on' forms. Having different concrete representations or implementations of abstract concepts makes them so much easier to understand. I've read somewhere that the technicians working at MIT with Vannevar Bush's mechanical differential analyzer quickly learned advanced calculus and "debugged" the math of university professors that would use the machines for various computations. It is not the continuous nature of the variables that is so interesting, it is the concept of computing by analogy and the benefits to understanding that come with that.
dredmorbius
Seeing the individual sine-wave functions comprising a tide table was an eye-opener for me.

I've seen and used tide tables. I'd never even stopped to think that these were probably a Fourier function based on a set of waves, though that's blindingly obvious the second it's mentioned.

amatic
There is a series of four fantastic videos about a successor to Thomson's tidal analyzer, Michelson's Harmonic Analyzer; here is the first video: https://www.youtube.com/watch?v=NAsM30MAHLg
dredmorbius
Thanks!
pjmorris
I don't know if I was the last kid in the US to buy a slide rule, but playing with it is how I came by my first understanding of exponents and logarithms in 8th grade.
analog31
An analog computer that I'm not giving up is my dial caliper. Accuracy aside, the dial "updates" itself continuously, and you can even anticipate where it's going to settle. It takes a very high quality (expensive) digital caliper to match the responsiveness of the dial.

I'm not giving up the nice analog stereoscopic microscope in my workshop. Digital microscopes exist (still the optics are analog) but actually making a user interface that's as snappy as human vision remains a hard problem, and companies like Keyence who succeed at it aren't making it cheap.

These are user interface issues rather than accuracy issues. A problem with bringing many archaic technologies into the digital domain is actually understanding what problem they were solving.

(I'm also not giving up my nice Mitutoyo vernier caliper, but that's more of an aesthetic thing).

MisterTea
> (I'm also not giving up my nice Mitutoyo vernier caliper, but that's more of an aesthetic thing).

My digital Mitutoyo was only $250 USD or so for the solar version (it has its drawbacks in dark corners, but a flashlight works, or you can go dial). I also have a Starrett dial and verniers in 6, 10 and 16 inch lengths. The Mitu does most of the measuring though.

Ductapemaster
One of the coolest things I ever built was a "bouncing ball simulator" [0] for my analog electronics class in college. I've built a lot of stuff with embedded processors throughout my life, but that one project changed how I viewed electronics.

Assembling a bunch of op-amps, FETs, and passives and having it output a physical representation of a ball being dropped and bouncing off of the ground was just magic to me. I don't do much with analog these days, but when I get into a project that needs it, I still draw from the confidence that project instilled in me. Before then I struggled to see analog circuits as tools to solve problems, but for some reason after building the simulator, it just clicked and I saw the possibilities.

I wish I had soldered the thing together. I still remember the day I ripped up my 6 breadboards for the next lab... we built a theremin though, so at least that was cool.

[0] A cleaner and more functional example of what I built: https://hackaday.com/2009/01/07/bouncing-ball-analog-compute...

xyzzy21
In many cases analog solutions are STILL the most powerful. My company is working on one such solution right now. The standard digital implementation can't remotely keep up and is literally 100x larger in volume just to come close, even if you could implement it as SoCs. It may never be possible to implement it efficiently in digital.

Which is fine - it's about achieving the purpose, not celebrating a particular technology or design assumption! The best solution to the problem doesn't care about the implementation, only about what works best.

hasmanean
Analog is especially useful for neural networks, where small and cheap matters and you can deal with the “errors” through training and backpropagation.

Human brains are inherently analog…as evidenced by the fact that thinking becomes much more precise after a cup of coffee. Analog effect.

bgroat
I was waiting for someone to bring up human brains.

I'm reading Dune now, and much of the HN crowd knows that in the Duniverse they don't have computers because of a past AI war.

What's a little bit less known is that one of the factions in the universe (the Bene Tleilax) actually dabbled with building computers within the time-span of the narrative.

They concluded that it just wasn't worth it. Trained human intellects (augmented with spice) were just better.

Now, obviously we don't have access to spice in our reality... but we also aren't doing FTL travel calculations. So maybe we don't need it.

jerf
Recall that Dune was published in 1965. It took a bold science fiction author to predict the degree to which computers would advance even in the 1980s (Neuromancer, 1984, for instance), and it wouldn't fully penetrate sci-fi for quite a while. Prior to that you'd get a lot of things called "computers" that had very strange performance characteristics.

One of my favorites in hindsight is Asimov's Robot series. This is a universe where the Eniac flashing-light-style computer is the height of computing technology, but we can build sentient, human-class brains to stick in robots, and occasionally use outside of robots. A very, very weird technology landscape if you think about it for a while.

In isolation, there's a lot of interesting-in-hindsight "hits" in scifi that predict this or that aspect of the modern computerized world, but in totality I'd say the power computers have, the speed with which they acquired them, and their widespread availability was a "miss" for science fiction. Arguably they were even a little late to the party, having to see some of these come from real engineered products before they started showing up in stories. But I don't really criticize them for it... rationally and abstractly, predicting that much exponential advance across that many fronts of performance was arguably not a smart bet. It happened, so it's true, but even in hindsight I'm not sure I could prove that it was some sort of inevitability that everyone should have seen coming.

In 1965, it would still be possible to believe that humans, especially bred for the task for a few thousand years and augmented with drugs, would outdo computers forever. In 2021, I find the premise less compelling. I don't care how you try to arrange it, the current people staffing AWS couldn't themselves replace AWS, not even with hundreds of years of breeding for the purpose and a steady feed of drugs. Too many exaFLOPS.

Maursault
> Human brains are inherently analog

No, they're not inherently analog. If they were, thinking of a bird would produce the analog of a bird within the brain, and this definitely does not occur. The brain is neither analog nor digital, but includes a signal processing paradigm that has properties of both. Signals sent through and around the brain are either/or states that are similar to binary. A neuron fires or it does not. These pulses are the most basic language of the brain, and they're all or nothing, so the brain is computing using something like binary signals. But biochemical pathways are similar to analog. Neurons also perform internal electrical signal integration in analog. But the spatiotemporal pulses of the neural code look a lot like digital signaling.

hasmanean
Ok it’s mixed signal.
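
A leaky integrate-and-fire model (the standard textbook abstraction; all parameters below are arbitrary) captures that mixed-signal character in a few lines: continuous analog integration inside the cell, all-or-nothing pulses out:

  # Leaky integrate-and-fire neuron: analog on the inside, digital-ish out.
  dt, tau, threshold, v_reset = 1e-3, 20e-3, 1.0, 0.0

  def spikes_per_second(input_current, steps=1000):
      v, spikes = 0.0, 0
      for _ in range(steps):  # 1000 steps of 1 ms = one second
          v += dt / tau * (-v + input_current)  # continuous leaky integration
          if v >= threshold:                    # all-or-nothing pulse
              spikes += 1
              v = v_reset
      return spikes

  print(spikes_per_second(1.5))  # weaker drive, lower firing rate
  print(spikes_per_second(3.0))  # stronger drive, higher firing rate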
jerf
Analog and analogical have diverged and aren't the same anymore, despite the implicit claims of the video. "Analog" now simply means "not digital". You can build analog computers that are not analogical, especially if you're an engineer just trying to do a job and don't mind hybridizing in some digital as needed.
Phillipharryt
Very interested to watch the second part of this video; off the top of my head I can't come up with a situation in which analogue computation or signals are better than digital ones. Digital's versatility means we make two signals represent an infinite number of other possible values, so there is certainly an inefficiency there, but the analogue signal's propensity for degradation and uncertainty is another hurdle I would find hard to overcome while producing a better computer.
foldr
High frequency signal processing is an obvious example of a case where an analogue computer can be superior under certain conditions. Say you want to detect when a signal has risen above a certain average magnitude over a particular time window. You can quite easily do that using a few op amps and passive components, even up to GHz frequency signals. To do the same thing digitally would require high end ADCs and either a very fast CPU or an FPGA. If your budget is tight then even frequencies of 1MHz might prove challenging to process digitally.

This is probably one of the reasons why analogue fly by wire flight control systems existed quite a way into the digital age. The original Su-27 had an analogue fly by wire flight control system, for example.
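
A rough digital rendering of that detector (the sample rate, time constant and threshold below are arbitrary choices): rectifier, one-pole RC low-pass, then comparator, the same three stages an op-amp circuit gives you with a handful of parts, except here you first need samples from a fast ADC:

  import numpy as np

  fs = 10_000_000                     # assume a 10 MS/s ADC
  t = np.arange(0, 0.001, 1 / fs)     # one millisecond of signal
  signal = np.sin(2 * np.pi * 1e6 * t) * (t > 0.0005)  # 1 MHz burst from 0.5 ms

  rectified = np.abs(signal)          # the diode / precision-rectifier stage
  alpha = 1 / (fs * 10e-6)            # one-pole low-pass, ~10 us time constant
  envelope = np.zeros_like(rectified)
  for i in range(1, len(rectified)):  # the RC filter stage
      envelope[i] = envelope[i - 1] + alpha * (rectified[i] - envelope[i - 1])

  detected = envelope > 0.3           # the comparator stage
  print(f"detected at t = {t[detected.argmax()] * 1e3:.4f} ms")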

nyoomboom
https://youtu.be/vHlbC74RJGU

I watched this talk, which describes the current von Neumann computer architecture as "analog communication with digital computing". This consumes more energy than digital communication with analog computing, which is what projects like Neurogrid, Intel's Loihi chip, and pretty much any system that can efficiently run spiking neural networks are built around.

Neuromorphic computing is where this is going.

zokier
The resurgence of analog computing is a recurring hype theme. I guess this is the latest iteration, which the video is referring to:

https://spectrum.ieee.org/analog-ai

Here is another random article from 2019 https://semiengineering.com/using-analog-for-ai/

Gravityloss
Analog washing machines had one nice practical feature: you could force the "program" counter forward or backward. This was especially practical if you were on a tight schedule and the program contained unneeded parts; you could skip them manually. Of course you had to know what you were doing, like not opening the door with water in the machine.
codeflo
It’s been a while, but didn’t the program knob in those machines turn in discrete steps? If so, then that system was — to be pedantic — a mechanical digital computer, not an analog one.
dredmorbius
The most direct analogue (ahem) would be a music-box dial or perhaps a Jacquard loom.

The washer cycle(s) were driven by a clock which rotated a drum or cylinder with pegs that would start and stop specific actions. So, fill, agitate, drain, spin, rinse (fill, agitate, drain, spin), and spin-dry. The mechanisms were bog simple.

Whether you consider these analogue gear logic, or digital pin memory is somewhat arbitrary and a semantic distinction. Either way, the "programme" is fixed, and there is no interactive logic, only a pre-defined behaviour which is followed. Fill and drain were controlled via float switches, I believe.

Users could modify the routine somewhat by selecting different sections of the dial (which programmed different wash cycles) and by where within each the wash started (longer or shorter pre-soak), by selecting fill levels, and by selecting water temperature.

ithkuil
There is nothing that prevents a digital machine from exposing such UX to the user.
Gravityloss
Ah, but there is: corporate culture and design culture!

Of course, there aren't many technical barriers; maybe some minor complexity in actually showing it and providing an interface.

jonsen
Open source! In the old days, when taking off the lid or back plate you would find a full schematic of the analogue computer inside.
xyzzy21
The "leading edge" of most corner technologies are usually better in analog. For example, SDRs in radio are only effective up to a certain data rate and frequency bandwidth. At some point analog signal processing (in this case classic "analog radio") is more effective and often the only possible implementation.

Thankfully I work on the leading edge of several technologies and I'm trained in analog so I see this stuff all the time.

analog31
Indeed, something like converting the frequency of a laser to a usable clock signal has to be done in the analog domain, and not necessarily even in the electronic domain. Also, (as Horowitz and Hill pointed out) getting higher performance digital electronics to work requires understanding analog techniques.

I do some analog work too, but today's mantra is: Get it into the digital domain as soon as possible.

tmcb
I have no experience with analog computers at all, but I think those issues could be less of a problem today. You could plug a bunch of digital sensors/controllers/actuators into the analog computing unit to monitor for them, which was simply not possible in the 60s. Also, you can check their accuracy against their digital equivalents or simulations, which are less efficient but yield better results.
Phillipharryt
Doesn't the inclusion of the digital accuracy checkers then decrease the efficiency, and mean you might as well use a completely digital computer? Just supposing here, but interfacing digital with analogue is probably a poor middle ground between the versatility and ubiquity of purely digital computers (countless existing systems can do whatever you want, with optimised algorithms and chips to work with) and purely analogue ones (which presumably gain an efficiency advantage by not having to cater to versatile use cases).
tmcb
> Doesn't the inclusion of the digital accuracy checkers then decrease the efficiency, and mean you might as well use a completely digital computer?

Not really. Whatever output the analog computer returns can be digitized with no detriment to its performance, pretty much in the same way a sensor which measures a physical property can have its output fed into a digital system with negligible interference over the original measurement.

Also, the same rationale can be used to probe intermediate steps and automatically check for their accuracy, even if only during validation phase. This is a possibility that was definitely not available, say, 60-odd years ago.

bajsejohannes
Pure speculation ahead.

It doesn't have to be better in an absolute sense; being good enough at a cheaper price, with lower power usage, a smaller footprint, etc., can be enough.

I think a lot of floating point calculations could fall into this. For example in neural nets, maybe there are analog versions to calculate the weights, sigmoid function and so on.

And for graphics, you don't really need the exact color value of each pixel. Maybe those could be estimated in analog functions too.

Phillipharryt
I certainly agree with the idea of not being better in an absolute sense, but I'm not sure I agree with both use cases. Graphics are built around digital representations of colours and shapes; vectors are an incredibly easy way to represent 2D graphics and are very easy for digital computers to manipulate. Polygons were quickly discovered as a memory-efficient way of doing the same thing in 3D space. Analogue graphics representation and manipulation became outdated very quickly. For example, https://www.youtube.com/watch?v=0wxc3mKqKTk&ab_channel=VICET... shows how much old analogue machinery is required to replicate what could currently be done by most phones. I don't know enough about your other possible use case to comment on it.
bajsejohannes
What I was imagining was the scene still being represented digitally with polygons, but the shader could still benefit from analog functions. Say you could compute functions like sine and logarithm faster/better/cheaper; you'd get the same image, but with some added noise. Again, it's just pure speculation on my side.

That video was amazing, by the way!

tromp
TIL about the Norden Bombsight https://en.wikipedia.org/wiki/Norden_bombsight

a good example of an analog computer too powerful for its own good, as the precision of its machined parts couldn't keep up with the complexity of its computation.

mikewarot
It is amazing what you can do with mechanical computing; you can even precisely compute irrational numbers like sqrt(2) to their limits of precision in O(1).

While it might be tempting to use analog computing in a neural network chip to take advantage of the improvements in transistor count from Moore's law, something tells me that digital computing fabrics will still outpace even the most clever chip. You have to abandon the von Neumann architecture to do so.

charcircuit
>you can even precisely compute irrational numbers like sqrt(2) to their limits of precision in O(1).

Can you elaborate? I have a feeling they would also be O(1) on digital computers.

cyber_kinetist
Draw a square (using a compass), calculate its diagonal length, voila. Though it really depends on the accuracy of your compass and ruler, and it's kinda meaningless to do any complexity analysis (what even is 'N' in this case?)

For digital computers, square-root algorithms that calculate digit-by-digit would take O(N), if you take N as the number of digits. If your precision is fixed, then you can get an approximate solution by using iterative methods (like Newton-Raphson). In that case, the complexity would be more like O(log(N) f(N)) to calculate up to N digits (where f(N) is the cost of doing one iteration with N-bit numbers)
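
A quick Newton-Raphson sketch of the fixed-precision case; the number of correct digits roughly doubles each iteration, which is where the logarithmic iteration count comes from:

  from decimal import Decimal, getcontext

  # Newton-Raphson for sqrt(2): x_{k+1} = (x_k + 2/x_k) / 2
  getcontext().prec = 60
  x = Decimal(1)
  for k in range(7):
      x = (x + Decimal(2) / x) / 2
      print(f"iteration {k + 1}: {x}")
  # correct digits per iteration: roughly 1, 3, 6, 12, 24, 48, ...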

charcircuit
If you have m-bit numbers you can simply loop over all of them and find which one, when squared, is closest to 2. This algorithm will run in O(1).

Alternatively you can do the same thing by drawing a square and measuring the distance between the center and the corner. Perhaps it's a little cheaty, since you have to include a precomputed sqrt(2) for the diagonal length of a pixel.

codeflo
Supposedly, one application of analog computing is simulating systems governed by differential equations.

The discrete time step in a typical numerical simulation introduces some artifacts, often seen as an error at high frequencies. In an analog computer, you set up a system of differential equations by using integration components that you plug back into the system to literally form a feedback loop. Because such a setup is physical, it can simulate the target system without any temporal artifacts.

The CCC hosted a great (English language) talk on this that really blew my mind: https://media.ccc.de/v/saal_mp7_og_-_2013-07-07_14:00_-_anal...
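
A minimal digital sketch of the same feedback-loop idea, assuming the classic patch for x'' = -x (two integrators feeding each other): the discrete version below drifts slightly off the true amplitude over one period, which is precisely the time-step artifact the physical setup avoids:

  import numpy as np

  dt = 0.01
  x, v = 1.0, 0.0                          # start displaced, at rest
  for _ in range(round(2 * np.pi / dt)):   # integrate over one full period
      a = -x        # "summing amp": the feedback that defines x'' = -x
      v += a * dt   # first integrator: v is the integral of a
      x += v * dt   # second integrator: x is the integral of v
  print(x, v)       # not quite back to (1.0, 0.0): the discretization error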

Aldipower
I would like to suggest Bernd Ulmann here, who's an expert in modern and also past analogue computing.

https://www.youtube.com/watch?v=sVKmiCy4LA8

And here is a very opinionated promo video: https://www.youtube.com/watch?v=j1wZ8zU1ZGI

freemint
Or his start-up https://www.anabrid.com/
Borrible
Not to forget his Analogmuseum!

http://www.analogmuseum.org/english

Aldipower
Absolutely.

Sadly this site is only reachable without TLS. It works only with http, not https.

Borrible
Grmph....

Modern Times.

Changed the link, thank you.

ofrzeta
The Deutsches Museum in Munich exhibits a mechanical Gezeitenrechenmaschine:

https://www.youtube.com/watch?v=Oyq2zVXLKmY

calclambda
"In some ways, new Navy computers fall short of the power of 1930s tech" [1]

[1] https://arstechnica.com/information-technology/2020/05/gears...

TheOtherHobbes
I'm finding it very hard to believe that small machined metal parts have more accuracy than 64-bit floating point math, which has around 16 digits of precision. Or that you can't run fp fast enough on GHz hardware to model ship movements, which run at a few tens of Hz at most.

If you really want higher accuracy 128-bit and 256-bit fp formats are defined in IEEE-754.

kayodelycaon
I think one of the big issues is there just isn't the need for accurate long-range fire from ship-board guns like there was in World War 2. The electro-mechanical systems used on various warships during the war were accurate enough that other environmental factors became more of a problem than the fire-control system.

Improving on a system that was already the peak of performance is difficult. And this is made even more difficult when long-range gun accuracy is less of a concern because naval guns are no longer the only weapons available to a warship. Missiles have become so common and powerful that modern warships have little to no armor protection. Point-defense weapons like Phalanx are used instead.

There is still a place for artillery, both naval and land-based. It's just a smaller role than it was a hundred years ago.

mannykannot
That subtitle is misleading: the actual claim being made is that the electromechanical fire control computers were more than precise and accurate enough for their purpose of firing long-range guns. The remaining inaccuracies were elsewhere in the process, and required guided munitions to improve upon.

"...but take away the fancy GPS shells, and the AGS and its digital fire control system are no more accurate than mechanical analog technology that is nearly a century old."

gorgoiler
Nice try Veritasium, but I'm afraid that, as clever as these pen wiggles are, the defining characteristic of computers[cience] is the calculation of discrete, discontinuous functions…

  if sunday:
      store_open = False
  else:
      store_open = True
These analog “computers”, despite their historic name and ability to compute continuous functions, don’t meet the bar for what it means to be a computer.

It’s like saying a spring balance is an atomic computer because it adds up the masses of all the atoms on the balance to give you a mass in grams.

That’s not to say that non-electronic logic circuits haven’t been built in the past. They have been built with fluid, marbles, redstone etc. but I don’t know if they were ever used in antiquity to perform calculations. All the examples I know of were built in the computerized modern era.

(Let’s not get started on whether a machine with separate memory for data and code can count as a computer.)

esarbe
A spring balance is a computer in the same way a lever is a machine.
DonHopkins
You should read Stephen Wolfram's "A New Kind of Science", and you'll get a much deeper and wider appreciation for just what a computer is and how Turing completeness can apply to so many situations. Even the simplest systems can be universal computers!

https://en.wikipedia.org/wiki/A_New_Kind_of_Science

>Generally, simple programs tend to have a very simple abstract framework. Simple cellular automata, Turing machines, and combinators are examples of such frameworks, while more complex cellular automata do not necessarily qualify as simple programs. It is also possible to invent new frameworks, particularly to capture the operation of natural systems. The remarkable feature of simple programs is that a significant percentage of them are capable of producing great complexity. Simply enumerating all possible variations of almost any class of programs quickly leads one to examples that do unexpected and interesting things. This leads to the question: if the program is so simple, where does the complexity come from? In a sense, there is not enough room in the program's definition to directly encode all the things the program can do. Therefore, simple programs can be seen as a minimal example of emergence. A logical deduction from this phenomenon is that if the details of the program's rules have little direct relationship to its behavior, then it is very difficult to directly engineer a simple program to perform a specific behavior. An alternative approach is to try to engineer a simple overall computational framework, and then do a brute-force search through all of the possible components for the best match.

Even a reservoir of water (or a non-linear mathematical model of one) can be used to piggyback arbitrary computation on the way liquid naturally behaves.

Here's a paper about literally using a bucket of water and some legos and sensors to perform pattern recognition with a "Liquid State Machine" (see Figure 1: The Liquid Brain):

https://www.semanticscholar.org/paper/Pattern-Recognition-in...

>Pattern Recognition in a Bucket. Chrisantha Fernando, Sampsa Sojakka. Published in ECAL 14 September 2003, Computer Science.

>This paper demonstrates that the waves produced on the surface of water can be used as the medium for a “Liquid State Machine” that pre-processes inputs so allowing a simple perceptron to solve the XOR problem and undertake speech recognition. Interference between waves allows non-linear parallel computation upon simultaneous sensory inputs. Temporal patterns of stimulation are converted to spatial patterns of water waves upon which a linear discrimination can be made. Whereas Wolfgang Maass’ Liquid State Machine requires fine tuning of the spiking neural network parameters, water has inherent self-organising properties such as strong local interactions, time-dependent spread of activation to distant areas, inherent stability to a wide variety of inputs, and high complexity. Water achieves this “for free”, and does so without the time-consuming computation required by realistic neural models. An analogy is made between water molecules and neurons in a recurrent neural network.

This idea can be applied to digital neural networks, using a model of a liquid reservoir as a "black box", and training another neural network layer to interpret its output in response to inputs. Instead of training the water (which is futile, since water will do what it wants: as the apologetics genius Bill O'Reilly proclaims, "Tide goes in, tide goes out, never a miscommunication."), you just train a water interpreter (a linear output layer)!

https://www.youtube.com/watch?v=NUeybwTMeWo

Reservoir Computing

https://en.wikipedia.org/wiki/Reservoir_computing

>Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir.[1] After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output.[1] The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed.[1] The second is that the computational power of naturally available systems, both classical and quantum mechanical, can be used to reduce the effective computational cost.[2]

>History: The concept of reservoir computing stems from the use of recursive connections within neural networks to create a complex dynamical system.[3] It is a generalisation of earlier neural network architectures such as recurrent neural networks, liquid-state machines and echo-state networks. Reservoir computing also extends to physical systems that are not networks in the classical sense, but rather continuous systems in space and/or time: e.g. a literal "bucket of water" can serve as a reservoir that performs computations on inputs given as perturbations of the surface.[4] The resultant complexity of such recurrent neural networks was found to be useful in solving a variety of problems including language processing and dynamic system modeling.[3] However, training of recurrent neural networks is challenging and computationally expensive.[3] Reservoir computing reduces those training-related challenges by fixing the dynamics of the reservoir and only training the linear output layer.[3]

>A large variety of nonlinear dynamical systems can serve as a reservoir that performs computations. In recent years semiconductor lasers have attracted considerable interest as computation can be fast and energy efficient compared to electrical components.

>Recent advances in both AI and quantum information theory have given rise to the concept of quantum neural networks.[5] These hold promise in quantum information processing, which is challenging to classical networks, but can also find application in solving classical problems.[5][6] In 2018, a physical realization of a quantum reservoir computing architecture was demonstrated in the form of nuclear spins within a molecular solid.[6] However, the nuclear spin experiments in [6] did not demonstrate quantum reservoir computing per se as they did not involve processing of sequential data. Rather the data were vector inputs, which makes this more accurately a demonstration of quantum implementation of a random kitchen sink[7] algorithm (also going by the name of extreme learning machines in some communities). In 2019, another possible implementation of quantum reservoir processors was proposed in the form of two-dimensional fermionic lattices.[6] In 2020, realization of reservoir computing on gate-based quantum computers was proposed and demonstrated on cloud-based IBM superconducting near-term quantum computers.[8]

>Reservoir computers have been used for time-series analysis purposes. In particular, some of their usages involve chaotic time-series prediction,[9][10] separation of chaotic signals,[11] and link inference of networks from their dynamics.[12]

Liquid State Machine

https://en.wikipedia.org/wiki/Liquid_state_machine

>A liquid state machine (LSM) is a type of reservoir computer that uses a spiking neural network. An LSM consists of a large collection of units (called nodes, or neurons). Each node receives time varying input from external sources (the inputs) as well as from other nodes. Nodes are randomly connected to each other. The recurrent nature of the connections turns the time varying input into a spatio-temporal pattern of activations in the network nodes. The spatio-temporal patterns of activation are read out by linear discriminant units.

Echo State Network

https://en.wikipedia.org/wiki/Echo_state_network

>The echo state network (ESN)[1][2] is a type of reservoir computer that uses a recurrent neural network with a sparsely connected hidden layer (with typically 1% connectivity). The connectivity and weights of hidden neurons are fixed and randomly assigned. The weights of output neurons can be learned so that the network can produce or reproduce specific temporal patterns. The main interest of this network is that although its behaviour is non-linear, the only weights that are modified during training are for the synapses that connect the hidden neurons to output neurons. Thus, the error function is quadratic with respect to the parameter vector and can be differentiated easily to a linear system.
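
To make the "train only the readout" idea concrete, here is a minimal echo-state-network sketch (reservoir size, scalings, and the toy sine input are arbitrary): the recurrent weights stay fixed and random, and only a linear least-squares readout gets fit:

  import numpy as np

  rng = np.random.default_rng(1)
  N = 200                                        # reservoir size
  W_in = rng.uniform(-0.5, 0.5, size=N)          # fixed random input weights
  W = rng.normal(size=(N, N))
  W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # spectral radius < 1

  u = np.sin(0.2 * np.arange(1000))              # toy input signal
  states = np.zeros((len(u), N))
  x = np.zeros(N)
  for t in range(1, len(u)):
      x = np.tanh(W @ x + W_in * u[t - 1])       # fixed, untrained dynamics
      states[t] = x

  # Training touches only the linear readout: predict u[t] from the state.
  train, test = slice(200, 800), slice(800, 1000)
  W_out, *_ = np.linalg.lstsq(states[train], u[train], rcond=None)
  pred = states[test] @ W_out
  print("test RMSE:", np.sqrt(np.mean((pred - u[test]) ** 2)))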

dredmorbius
I'm fairly certain that there exist logic mechanisms for analogue computers. An alarm clock is a fairly trivial example, but might serve as a minimal case.

Various mechanical governors (see Watt's flywheel governor), pressure relief valves, and similar automated controls within hydraulic systems, also come to mind. Similarly, a mechanical thermostat.

gorgoiler
Good point, but these are really just switches.

Can you think of any examples where one switch controls multiple others? The mechanical equivalent of a clockless processor?

dredmorbius
Not offhand. Though I strongly suspect they exist.

Physical analogues of all the basic logic functions (AND, OR, NAND, XOR, NOR, NOT) should all be constructable.
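
A toy threshold model makes the point (pure illustration, not any particular pneumatic or mechanical product): treat a valve as "output on when the summed inputs push past a bias", and every gate composes from that single primitive:

  # One "valve" primitive: pressure out iff the summed inputs exceed a bias.
  def valve(*inputs, bias, invert=False):
      on = sum(inputs) > bias
      return 1 if on != invert else 0

  def NOT(a):     return valve(a, bias=0.5, invert=True)
  def AND(a, b):  return valve(a, b, bias=1.5)
  def OR(a, b):   return valve(a, b, bias=0.5)
  def NAND(a, b): return NOT(AND(a, b))
  def XOR(a, b):  return AND(OR(a, b), NAND(a, b))

  for a in (0, 1):
      for b in (0, 1):
          print(a, b, "->", AND(a, b), OR(a, b), NAND(a, b), XOR(a, b))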

jerf
"Physical analogues of all the basic logic functions (AND, OR, NAND, XOR, NOR, NOT) should all be constructable."

Trivially so. Electronic circuits are not intrinsically digital, nor are integrated chip components. They are intrinsically analog. We get digital behavior out of them by building and driving them in a certain way. The cost/benefits tradeoff on using these components digitally is so nice for us that we do it so often that we can forget this fact, but it is a fact. There are a number of chips that have analog components and behaviors in them; you can find a lot of them in sound synthesis, for instance, especially older ones like the SID chip ("The chip combines analogue and digital circuitry, that cannot be emulated with 100% fidelity even today." https://www.c64-wiki.com/wiki/SID ).

gorgoiler, you are speaking as if you consider analog computing a subset of what digital computing can do, but it's actually the other way around. Digital computing is a subset of analog computing where we deliberately construct a digital computer out of what are still analog parts. Anything a digital computer can do, an analog computer can do, because it can simply function digitally for that portion but then incorporate other analog components. And as is often the case, when you really get down to it the line gets fuzzy... is the Commodore 64 an "analog" computer just because it had an analog part used in a particular set of ways? I think most people would say no... but it certainly wasn't 100% digital.

As I mentioned in another comment, I kinda think the video does a disservice by going too deeply into "analogical" vs. "analog". They've separated in meaning now. The people he shows at the end doing modern analog computers are, as far as I know, building things that look a lot more like a modern computer, except the requirement that all the components be driven by a clock signal and that all voltages stabilize before the next clock is loosened and they permit other analog behaviors of electronics to come in, allowing for programmable analog computers. They're not building things out of cogs with direct and obvious connections to underlying processes... modern digital computers are far better for those sorts of things.

dredmorbius
Right. I'm thinking specifically of cases in which, say, a water-control system might utilise such elements, and just how complex they might get. This isn't my area, so I'm being conservative in statements.

An electronic switch translates to a hydraulic valve.

And a hydraulic valve is a direct analogue of a vacuum tube or transistor: an applied input delivers a controlled output. Often but not necessarily amplified --- there are cases where the input effort might be much larger than the output, in control or precision implementations.

Mind: a valve itself might be water activated, in the sense of a small flow through one channel translating to a large flow in a controlled channel. One obvious example of this is the fantail of a post windmill, where any orientation of the mill's main sails outside the primary wind flow starts spinning the fantail which reorients the mill into the wind. See:

https://upload.wikimedia.org/wikipedia/commons/6/65/Beebe_Wi...

There are numerous cases of interlocks, many of which are mechanical. Some of these are through inherently fail-safe designs. A canal's locks, aeroplane doors, and airlocks all open such that they require pressure equalisation, preventing opening whilst the lock and channel are at different levels, or pressurised and depressurised regions are not at equilibrium. Shift-lever and starter interlocks require that brakes be engaged and vehicles in neutral to start, or shift from reverse.

https://realpars.com/interlock/

And I'm finding a few references specifically to mechanical control logic:

"Mechanical Logic Devices and Circuits" http://www.nacomm09.ammindia.org/NaCoMM-2009/nacomm09_final_... (PDF)

"PDF Pneumatic Logic & Controls - Parker Hannifin" https://www.parker.com/literature/Literature%20Files/pneumat... (PDF)

The second is a catalogue of available products based on pneumatic logic.

One specific domain in which analogue / mechanical controls have been specifically discussed is for future Venus lander missions. Temperature and pressure profiles are too high for virtually all electronic systems (circuit boards, solder, and componentry would thermally degrade or melt).

JPL have specifically explored the concept of a "clockwork rover", AREE (advanced rover for extreme environments) for a Venus or similar mission:

https://www.jpl.nasa.gov/news/a-clockwork-rover-for-venus

credit_guy
Also, all the domain of fluidics, such as

https://en.wikipedia.org/wiki/Paper-based_microfluidics

dredmorbius
Beautiful, thanks!
freemint
One obvious approach is to have a model of the system that runs faster than the system itself, define a fitness function and a fitness gradient depending on the controls, and then do model predictive control.
tagoregrtst
You’ve conceded that logic gates can be built with pneumatic (and other) systems.

Complex logic systems were built; car automatic transmissions come to mind. I don't know if they were Turing complete, since there was no need - these were purposely built for a specific task.

Anyway, analog computers are computers in that any realization of a Turing-complete machine is necessarily created from analog circuits - albeit very non-linear analog circuits. Therefore, Turing-complete machines can be emulated on an analog computer, but the reverse isn't obvious - CFD, DFT, and ab-initio methods can never really get the answer to the general case right (try simulating turbulence, crack propagation, or the transition state of oxygen adsorbed on Pt under an applied external electric field, etc.)

thunderbong
From the video -

> With analog computers, the quantities of interest are actually represented by something physical, like the amount a wheel has turned. Whereas digital computers work on symbols like zeros and ones. If the answer is, say, two, there is nothing in the computer that is 'twice as much' as a one. In analog computers, there is.

agumonkey
Personally this topic made me quite jaded about recent computing. So many interesting ideas with simple devices. React.phys
simonblack
The most common analog computers were slide rules.

Even today, many pilots use their Weems circular flight calculators (otherwise known as dedicated circular slide rules) - https://www.ebay.com/itm/264450416305
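
The underlying trick, for anyone who never used one: the scales are ruled so that distance is proportional to log10, so multiplying is just adding two lengths. A sketch, with the reading resolution as a made-up stand-in for how finely you can eyeball the scale:

  import math

  def slide_rule_multiply(a, b, resolution=0.001):
      # Quantize each log-length to the rule's reading precision, then
      # "slide" one scale against the other by adding the lengths.
      la = round(math.log10(a) / resolution) * resolution
      lb = round(math.log10(b) / resolution) * resolution
      return 10 ** (la + lb)

  print(slide_rule_multiply(2.34, 5.67))  # ~13.27 (exact: 13.2678)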

i_hate_pigeons
There is a light hearted talk from Freeman Dyson that touches on these, I'm not sure how accurate the overall talk is but found it very entertaining

https://www.youtube.com/watch?v=JLT6omWrvIw

esjeon
"Analog" AI computing? Maybe this one?

https://analog-ai-demo.mybluemix.net/

The concept isn't that difficult, and there's a cool demo on the page.

subhro
Analogue computers are fun, and faster (in some instances) for my use case. I still regularly use an E6B, an analogue flight computer, for in-flight calculations, although I fly a glass cockpit.
sinuhe69
I wonder how they could produce gears of such high quality and precision 2000 years ago. Does anybody have more information about this?
willis936
Is it fair to say a quantum computer has more in common with an analog computer than a digital computer?
kohlerm
Great video, saw it yesterday, need to see the second one ASAP
me_me_me
Pulleys as a way of doing summation are a mind-blowingly simple and beautiful solution.

Before he revealed the solution I was thinking about some rail(s) that could be stacked, with the movement of the rail translated onto the next wheel, but his pulleys and rope are a much better solution.

dredmorbius
I went through pretty much precisely the same set of thoughts. Yes, quite elegant.
tontonius
wasn't this up here just a few days ago?
throw8932894
"Most powerful" is quite a stretch. Human computers were far better and more universal.
me_me_me
He literally gives an example of how calculating tides was unfeasible for humans to do by hand.

Unless you are trying to get into a 'definition of' argument.

quantumwannabe
The US Navy made a great video in 1953 explaining how mechanical computers work: https://www.youtube.com/watch?v=s1i-dnAH9Y4
hungryforcodes
Wow. I actually feel SMART now -- why was I watching React videos before? :p
agumonkey
This video is so brilliant
emilfihlman
Thank you for this! I absolutely love old teaching and learning material like this, produced by the US military and government services. Sometimes it feels like the quality peaked long ago.
Dec 21, 2021 · 2 points, 0 comments · submitted by zeristor
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.