HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
The RS-232 protocol

Ben Eater · YouTube · 350 HN points · 0 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Ben Eater's video "The RS-232 protocol".
YouTube Summary
This video explores the electrical and timing characteristics of the RS-232 protocol.

Support these videos on Patreon: https://www.patreon.com/beneater or https://eater.net/support for other ways to support.

------------------

Social media:
Website: https://www.eater.net
Twitter: https://twitter.com/ben_eater
Patreon: https://patreon.com/beneater
Reddit: https://www.reddit.com/r/beneater

Special thanks to these supporters for making this video possible:
Adrien Friggeri, Aleksey Smolenchuk, Alex, Alex Black, Andrew Van Gerpen, anula, Ben, Ben Cochran, Ben Kamens, Ben Williams, Bill Cooksey, Binh Tran, Bradley Stach, Burt Humburg, Carl Fooks, Carsten Schwender, Chai, Chaitanya Bhatt, Chris Lajoie, Chris Sachs, criis, Daniel Jeppsson, Daniel Pink, Daniel Tang, Dave Burley, Dave Walter, David Clark, David Cox, David Dawkins, David House, David Sastre Medina, David Turner, Dean Bevan, Dean Winger, Deep Kalra, Dennis Schubert, Dilip Gowda, Dušan Dželebdžić, Dustin Campbell, Dzevad Trumic, Emilio Mendoza, Eric Dynowski, Erik Broeders, Erik Granlund, Ethan Sifferman, Eugene Bulkin, Evan Thayer, Eveli László, Florian Rian, fxshlein, George Miroshnykov, ghostdunk, GusGold, Humberto Bruni, Ingo Eble, Isaac Parker, Jacob Ford, James Capuder, Jared Dziedzic, Jason DeStefano, Jason Dew, JavaXP, Jaxon Ketterman, jemmons, Jeremy, Jeremy Cole, Jesse Miller, Jim Kelly, Jim Knowler, Joe Beda, Joe Pregracke, Joe Rork, Joel Miller, John Hamberger jn., John Meade, John Phelan, Jon Dugan, Jonn Miller, Joseph Portaro, Jurģis Brigmanis, Justin Graziani, Kai Wells, Kefen, Kenneth Christensen, Kyle Kellogg, Lambda GPU Workstations, Larry, László Bácsi, Leo K, Lukasz Pacholik, Marcos Fujisawa, Marcus Classon, Mark Day, Martin Noble, Mats Fredriksson, Matthäus Pawelczyk, melvin2001, MICHAEL SLASS, Michael Tedder, Michael Timbrook, Michael Weitman, Miguel Ríos, mikebad, Mikel Lindsaar, Miles Macchiaroli, Muqeet Mujahid, Nate Welch, Nicholas Counts, Nicholas Moresco, Nick Chapman, Oli Homer, Ori Shamir, Örn Arnarson, Paul Heller, Paul Pluzhnikov, Phil Dennis, Philip Hofstetter, Porus, ProgrammerDor, Ralph Irons, Randal Masutani, Randy True, raoulvp, real_huitz, ReJ aka Renaldas Zioma, Ric King, Rick Hennigan, Robert Diaz, Robey Pointer, Sagnik Bhattacharya, Scott Gorlick, Scott Holmes, Sean Patrick O’Brien, Sergey Kruk, solderspot, SonOfSofaman, Spencer Ruport, Splashtwist, Stefan Nesinger, Stefanus Du Toit, Stephen Kovalcik, Stephen Riley, Steve Jones, TheWebMachine, Thomas Eriksen, Tim Oriol, Tim Walkowski, Tim Wheeler, Tom, Tom Knowles, Tyler Latham, Vincent Bernat, Walter Montalvo, Warren Miller, Wim Coekaerts, Wraithan McCarroll, xisente, Yee Lam Wan
HN Theater Rankings
  • Ranked #3 this month (apr/may) · view
  • Ranked #18 this year (2024) · view

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Nov 05, 2022 · 350 points, 100 comments · submitted by EgoIncarnate
gorgoiler
One of my proudest hacker moments was also one of my first: debugging a noisy console session whose transport was RS232.

Long story short (it was 30 years ago so I also can’t remember the precise diagnosis) there was corruption on the line in that sometimes when you typed a space it would come out as something random. Few other characters were corrupted as often, if at all. It was almost always spaces that were showing up as random other characters.

My observation was that space was encoded as 00000 — all low, the printed character most susceptible to line noise — and that something wasn’t grounded. Smarter people took my observation, figured out what was wrong, and soon all was fixed. It felt like, for the first time, I understood the system with a depth I’d not had before. It was cool.

Oh, and playing DOOM for the first time over RS232, too. Another precious memory indeed!

birktj
Yesterday I was configuring a USB-to-serial converter to communicate with a CNC controller (over RS232). However, I was unable to transmit any data, so I brought out the oscilloscope to see what was happening. I disconnected my converter and connected the oscilloscope, and all the data was there; my oscilloscope had no problems decoding it with the same parameters I had used with my USB converter. Confused, I disconnected my scope and connected my converter back up again, but still nothing. After a few rounds back and forth I ended up connecting my scope while the converter was still connected, and crucially I also had a cat /dev/ttyUSB0 running. As expected the data was right there at the scope, but to my big surprise it was also there in my terminal! I then tried to disconnect my scope, resulting in no more data being sent to my laptop.

The only explanation I could think of was a faulty ground connection. The scope was grounded through the wall socket same as the CNC and thus had the same ground potential. My laptop on the other hand was running on battery and only shared ground with the CNC through RS232 pin 5 (signal ground). However it seems this pin was not correctly connected on the CNC side and thus I was unable to transmit any data. Experimenting a bit I tried to connect ground to the connector shield instead and everything worked perfectly.

ace2358
I’ve had a similar experience with WS8211 (or whatever they’re called) LEDs not being properly grounded to my arduino. They don’t seem to work if they’re not on the same ground plane as the controller. So if I power the LEDs from a different power source (larger current than the arduino can handle) I have to make sure they’re grounded together.

Took me a long time to figure that one out haha

spfzero
That sounds like the laptop USB ground was not connected to the battery reference (and logic) ground. That's probably intentional but seems like it might cause problems with other USB uses from that laptop as well. I am assuming when power is plugged in the power ground is connected to the USB output ground, but I could be wrong there.

I have used my laptop on battery, with a USB-to-RS232 adapter, to control a telescope remotely and it worked fine IIRC.

cjensen
Yep, referencing signals vs ground is the big flaw in RS-232. That's why a lot of industrial machinery uses RS-422/RS-485 which is sent as a differential pair. 422 gives you the same serial interface, the same signaling, but with a better physical layer that lets you send the signal on much longer runs.
63
> Please don't do things to make titles stand out, like using uppercase or exclamation points, or saying how great an article is. It's implicit in submitting something that you think it's important.

> Otherwise please use the original title, unless it is misleading or linkbait; don't editorialize.

https://news.ycombinator.com/newsguidelines.html

dang
Changed now. Thanks! (Submitted title was "Ben Eater is back with a new video (after almost a year)!")
kragen
This is unfortunate, because it removes the reason the link is worth reading.
dang
That's true, but it's a tradeoff. Something also gets lost when everything is spelled out. This has never been the way that HN works. Part of the assumption that readers are smart enough to figure things out is that they also need to work a little: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...

There's another aspect too: everyone misses a lot of interesting and/or important stories. Even the few who are paid to stare at HN all day. There's too much coming through at all hours. It doesn't make sense to try to avoid that, because it's impossible.

I'm sure this is a highly unsatisfactory answer!

kragen
Sure, there are advantages and disadvantages of any course of action.

You're under no obligation to satisfy me! I was just offering my point of view in hopes that you find it informative, so that you can take it into account in the future when you're thinking about how to change the way that HN works.

OJFord
May I suggest slightly editorialising, in this case? I honestly only clicked in because I thought (post-title-change) 'that sounds like a Ben Eater video, I wonder', and since YouTube isn't working for me on mobile at the moment I came looking through the comments for this.

Tl;dr I think 'Ben Eater is back' is an interesting post in its own right, possibly more interesting than this particular video considering the typical [video] reception.

(Longer-term higher-complexity suggestion: could YouTube links have the channel name somehow, like we now have /<name> for GitHub, Twitter, et al.?)

A concrete suggestion:

> Ben Eater's back: RS-232 protocol [video]

Or to be more subtle, just append '(2022)' (and confuse people who aren't familiar or aware of the absence):

> Ben Eater: RS-232 protocol (2022) [video]

dang
I agree with you that in some cases the editorializing can be helpful for this reason. But once a thread has reached the upper ranks of the front page (or is making its way back down with tons of upvotes and comments), that extra information has sort of achieved its purpose and it's better to return to the mystery.
kragen
Yes, for unhealthy people like me who reload the HN home page a dozen times a day. But not for people who check it at a saner frequency, like a few times a week.
tyingq
Interesting comment on the video:

"The negative voltage requirement of RS-232 is why some ATX power supplies still provide a negative voltage reference on one of the pins. The power supply manufacturers want to provide it in case the end user has a motherboard that uses on board RS-232 (some industrial motherboards still use it). usually the available current for the negative voltage reference is pretty small though (around 1 amp or less)"

bogantech
I'd be surprised if any modern board didn't use a MAX232 or something similar, though
tyingq
The context being that a MAX232 only needs +5vcc, right?

Edit: Apparently called 'Integrated charge pumps'

bloggie
Yes, the current requirements are low, so typical UART-to-RS-232 transceiver ICs include the electronics necessary to provide the higher-voltage ± supplies.
toast0
-12V was also pretty commonly used for sound cards. Not a lot of amps, as noted.
djmips
I have a pile of 5 ATX power supplies of various vintages up to recent. They all had a -12 V supply. Three were 0.8 Amp. One was 0.3 Amp and one was 0.5 Amp.
0xmarcin
Great to hear that. BTW, in the meantime other people have also posted amazing content. Right now I am following this guy (https://www.youtube.com/watch?v=VvyUAaRTsww&list=PLcGZbzUhfc...); he builds an x86 breadboard computer, but does so with much more technical detail and compiler/linker mastery than Ben.

(If you are interested in x86 breadboard stuff, check out The 8088 Project Book, which is a great intro before jumping into more advanced stuff.)

BTW, I have a PoC of using C with the Ben Eater computer - https://github.com/marcin-chwedczuk/ben-eater-6502-ansi-c. I do not have time to dig deeper into it yet, but it looks promising. The only problem with an 8-bit processor like this is that compilers use the first page of memory to pass extra variables because the hardware stack is too small (256 bytes in total, yup).

royjacobs
You might also enjoy llvm-mos, which allows you to use very decent versions of clang on the 6502!

https://llvm-mos.org/wiki/Welcome

scarecrowbob
I will watch this one later, but I can totally understand the excitement.

Just for some background: in college I took an introductory digital logic class and loved it, but then I moved over to a humanities degree and lost a lot of that information. Then I started programming again, but mostly as a "self-taught" web developer -- not to undercut my high school and college work (I was on a HS programming team that was competitive), but most of what I use in my day-to-day life are things I picked up from having to do increasingly difficult projects/tasks as a programmer, interspersed with occasional seasons of motivation where I'd pick up new techniques (functional programming, server management, front-end web rendering frameworks, etc).

Ben Eater's long series on building a computer out of a 65C02 and then his longer series on building a microprocessor from TTL logic chips hit an incredibly sweet spot for me.

I'd spent some time trying to teach my kids how to do basic electronics (building things like 555 timers or little robots with Pis for controllers), so I have some familiarity with the topic.

I suppose if I had the time and were motivated, I could have gotten some textbooks and learned the same material. However, his material was set up in a way that made it very easy to grasp as a passive consumer of the video. It feels like it covered the topic as deeply as I was interested and answered a lot of questions as it went along.

So, as a person who has enough familiarity with the concepts to find it interesting but no pressing need to master the material deeply, I highly recommend his work.

le-mark
I feel like Ben's videos go into things that aren't necessarily in textbooks, like the memory timing diagram video. That's an example of something I would not figure out on my own no matter how hard I tried. Kudos to him, simply remarkable.
stephc_int13
Imagine a world where simple point-to-point data links still used a backward-compatible, improved RS-232 and where everything else used Ethernet.

I would be curious to see an evaluation of the collective cost of USB + Bluetooth over the decades since their inception.

I saw the guts of USB drivers a long time ago; I could not describe the horror.

mmac_
Most people will be familiar with RS-232 from connecting to a computer (COM ports). In the industrial/embedded world, it's still used quite a bit to connect to devices for logs/terminals/programming and setup, etc. Of course, now we rely on USB-serial adapters to connect them to our laptops.

RS-232 is still used for connecting to some embedded chipsets. The best example would be modems/4G modules. You can pick either RS-232 or USB; however, the little microcontroller you're using sometimes won't have USB. If you don't need a huge amount of throughput then the serial port is fine.

RS-485, as others have mentioned, is great. Often you can't run new cables for cat/fibre, so you're stuck with stealing some copper off an old system that has been decommissioned. You'd be surprised how far you can run an RS-485 system at low speed and maintain high reliability.

xattt
> Best example would be modems/4G modules.

Is this just for control of the modem or for data transfer too? How are 4G bitrates supported by this protocol?

blamazon
With modern 'system on a chip' modems, data goes from baseband to application silicon over shared memory via a proprietary interface:

https://en.wikipedia.org/wiki/Qualcomm_MSM_Interface

I think non SoC high performance modems use USB.

Regarding older or less integrated modems that do use the 232 interface, this may be illuminating:

https://fahrplan.events.ccc.de/congress/2011/Fahrplan/attach...

mmac_
yeah this looks correct and it's a different world if you have direct access to the modem IC.

To clarify I was referring to the modules that the average person can get their hands on. If you rip open the module (or look at the FCC paperwork) you'll likely see a Qualcomm chipset underneath. It's just a level of abstraction really.

mmac_
You generally pick one interface and do everything on that. The modules are still using old school AT commands (so you still send ATD to dial a phone number). Yeah the 4G uses the same AT interface over serial.

You picked up the drawback: speed. You can't pump 4G speeds over a serial port. Depending on what you're doing though, the 'slow' speeds you get over a serial port are more than enough to serve your purpose. The 4G part is about connectivity and not speed in many cases. Locally we're seeing 3G shut down and it won't be around for much longer.

Obviously better solutions exist for low data usage over cellular - have a look at NB-IoT or LTE-M.
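
For anyone curious what that AT interface looks like in practice, here is a minimal pyserial sketch; the port name, baud rate, and the module answering "OK" are assumptions that vary per device:

    import serial  # pyserial

    ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)  # assumed port and rate
    ser.write(b"AT\r")      # attention command; a live module typically answers "OK"
    print(ser.read(64))
    ser.write(b"AT+CSQ\r")  # signal-quality query (common, but check your module's AT manual)
    print(ser.read(64))
    ser.close()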

int_19h
Radio is another place where serial interfaces are still very common to configure hardware (e.g. upload channel frequencies, or download logs).
anonymousiam
Nice informational video (after such a long time) and I don't mean to be pedantic, but RS-232 is not a protocol, it is an electrical/physical standard. RS-232 says nothing about the bits on the wire, other than how they might be synchronized as long as both ends of the point-to-point link agree on settings.
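
In other words, the "protocol" is whatever both ends agree on. A minimal pyserial sketch of the settings that have to match on each side of the link (the port name is an assumption):

    import serial  # pyserial

    ser = serial.Serial(
        port="/dev/ttyUSB0",           # assumption: your adapter's device name
        baudrate=9600,                 # no clock line, so both ends must agree on the rate
        bytesize=serial.EIGHTBITS,     # data bits per character
        parity=serial.PARITY_NONE,     # parity bit, if any
        stopbits=serial.STOPBITS_ONE,  # stop bits
        timeout=1,                     # read timeout in seconds
    )
    ser.write(b"hello\r\n")
    print(ser.read(64))
    ser.close()
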
spookthesunset
His homebrew CPU was the best explanation of how microprocessors work I’ve seen!
davidpfarrell
Just FYI, this is the fun thing explored in the video:

> This video explores the electrical and timing characteristics of the RS-232 protocol.

geerlingguy
His visual exploration into various protocols is eye-opening. I love the style and always learn a few things when looking at protocols from a signal-level perspective.
rvense
He's a fantastic teacher. I particularly like his breadboarding videos where he does something obvious but wrong, and then backs up and explains why.
joneholland
I posted this on the YouTube comments as well. I'm pretty sure there is a minor bug in how he uses 7 CPU cycles to try to read from the middle of each pulse. It should be about 50. He divided the 13-cycle wait in half, but the reason he used 13 cycles originally is because the read logic was already using many cycles. In practice this doesn't matter because the state transitions are stable enough with his particular device.
terramex
He does not wait for 7 cycles but for 7 iterations of a loop taking 5 cycles each.
ddalex
That is 35 cycles instead of the 53 cycles he was supposed to wait.

I mean, it's close enough and it works
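
As a rough back-of-the-envelope check (assuming the 1 MHz 6502 clock Ben usually runs and a 9600-baud line; both are assumptions about this particular video):

    CPU_HZ = 1_000_000  # assumed 1 MHz 6502 clock
    BAUD = 9600         # assumed line rate

    cycles_per_bit = CPU_HZ / BAUD  # ~104 CPU cycles per bit
    half_bit = cycles_per_bit / 2   # ~52 cycles to land mid-bit
    delay_loop = 7 * 5              # 7 iterations of a 5-cycle loop = 35 cycles

    print(cycles_per_bit, half_bit, delay_loop)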

unboxingelf
Started my career in software building automation systems around RS-232, IR, and Ethernet. RS-232 was always my favorite because compared to the other protocols it just worked; IR was often unreliable and Ethernet had much more complexity. I think I could still wire a DB9 pinout from memory.
ThrowawayTestr
This is the guy that made a video card from scratch, neat.
geoffeg
This guy also made a video card from scratch. https://www.youtube.com/watch?v=K658R321f7I&list=PLFhc0MFC8M...
louwrentius
This video is made by Ben Eater. He has virtually disappeared online for almost a year now (give or take). I hope he is OK and I really miss his videos.
louwrentius
I thought it was an old video but apparently he’s back!
cafard
The first technical book I ever bought, 35 or 36 years ago, was The RS-232 Solution. I was doing tech support in those days for an outfit that sold systems built on similar but not identical gear, and reading that book considerably increased my efficiency.
hbogert
I should really pick up his vids, they are worth their thumbs up in gold
rektide
Is there a word to describe the +/- signal behavior of rs232? It's like a differential signal, but with a pesky common ground, bother.
Animats
It's not differential, just polarized. -12V or +12V originally. Extracting signal polarity is more reliable than triggering on a threshold, or it was before electronics got good.

It's a telco thing. Polarized signaling goes back to the original Edison stock ticker. Polar relays were once used on Teletype circuits.

robinsonb5
I'd probably go with "anachronistic" - RS232 predates TTL logic!

Interesting that the bit-level protocol is still very widely used today, even if the signalling is now more likely to be 5v or 3v3 TTL signalling bridged to USB.

masswerk
It's up to ±25V, a voltage swing of 50V. (I believe, ±12V was common for TTYs.)
kevin_thibedeau
Bipolar NRZ.
fuzzfactor
Low voltages near zero on any pin, on either the + or - side, are ignored; below a certain threshold of absolute voltage it's no-man's land either way. Signals are only considered real if they are boldly above the threshold, preferably close to +/- 12 VDC either way to reliably qualify as a space or a mark, but at no higher amplitude than +/- 25 VDC.

This helps to compensate for unavoidable differences in absolute ground potential between remote locations.

The data signal is basically just square waves since it's ones & zeros, but hook a speaker up properly and that's the unforgotten audio sound of dial-up internet.
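
As a toy illustration of that dead band in code (the ±3 V figure is the commonly cited receiver threshold; the ±12 V and ±25 V figures are from the comment above):

    def rs232_receiver_level(volts):
        """Classify a line voltage roughly the way an RS-232 receiver does.

        Negative = mark (logic 1, idle line), positive = space (logic 0).
        Between -3 V and +3 V is the undefined dead band; beyond +/-25 V
        is outside the spec.
        """
        if abs(volts) > 25:
            return "out of spec"
        if volts <= -3:
            return "mark (1)"
        if volts >= 3:
            return "space (0)"
        return "undefined (dead band)"

    for v in (-12, -3, 0, 3, 12, 30):
        print(v, "->", rs232_receiver_level(v))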

ajross
My understanding is that it was for driver circuit compatibility with older current-loop TTY devices. Early Teletype et al. terminals were designed to run over potentially long wires with high resistive losses[1], and to be received by 1960's electronics with very low impedances (the ancestral devices were mechanical switches!). So the idea was you'd just switch a big DC supply onto the lines and figure it would be enough to get 20 mA at the other end, so that it could see it.

When RS-232 came along, they deliberately specified a very wide band of allowed voltages so that in many cases manufacturers could just cable the current loop device in parallel with a resistor and be compliant.

Don't quote me though, I don't remember where I read this.

[1] So... yeah, it's basically solving the same problem as differential signaling but at a different scale.

fuzzfactor
Looks like this would be RS-232F.

It was way back in RS-232B when the Mark and Space were reversed.

And the standard called for robust performance, with all circuits not affected by any miswiring up to +/- 25 VDC on any pin at any time. It must not require powering down before connecting/disconnecting.

DCE really meant Data Communication Equipment, i.e. primarily Bell-approved acoustic telephone modems (to be operated back then no faster than 300 baud when using their monopolized land lines).

So a DTE was Data Terminal Equipment which means a regular "dumb" terminal which is just a keyboard and CRT display with an RS-232 input/output having the D-sub (male) pinout according to the DTE pinout scheme. A PC is a lot more powerful but with a terminal app or the correct tty i/o your keyboard, display, and motherboard 9-pin D-sub (if available) will do it better than ever.

In the video, he's connecting only the transmit pair from his terminal; not shown would be the handshake lines, which are, as seen, actually "optional". Not really, but with lots of hardware, if the receiving DCE end were not able to signal Data Carrier Detect (DCD), Data Set Ready (DSR), and Clear to Send (CTS) on the proper handshake lines back to the DTE terminal, there was going to be no data coming out of the terminal from pin 2 for you.

I would expect, in the software of his RS-232 adapter, that he has continuously enabled the handshake lines needed by the PC to recognize that his external device is ready to receive. So he therefore needs no extra conductors between the devices to accomplish that handshake purpose.

The DTE terminal would ideally signal power on and buffer ready on the DTR and RTS pins or the modem would speak not.

On the DTE the outputs are Data on pin 2 and logic on DTR and RTS. The inputs are Data on pin 3 and logic on DSR, DCD, and CTS.

On the DCE the outputs are Data on pin 3 and logic on DSR, DCD, and CTS. The inputs are Data on pin 2 and logic on DTR and RTS.

The default concept was supposed to be a cable with straight connections between pins 1, 2, 3, 4, 5, 6, 7, 8, and 20 of a male and female D-sub 25 connector at each end. On the hardware, the DCE modem had the female D-sub and the DTE terminal had the male D-sub, so it was just a straight cable. Multiple cables could be used as extension cords. Good for hundreds of feet with careful wiring considerations. In the absence of handshakes, only two conductors are needed for one-way communication.

On the first PCs there were two D-sub 25 pin connectors on the back, the female was a (ribbon cable) bus parallel port for the printer, and the male was an RS-232 serial COM port for use to connect to a telephone modem.

When the mouse came along they were RS-232, and most people did not have a modem yet (why would you want to connect to someone else's computer anyway?), so there was an available port. The connector was downsized to 9-pin with partial pin renumbering.

Actually you could always connect two PCs together and bypass the handshake settings in software if your buffers were adequate; you only needed 3 conductors: a ground, with 2 & 3 crossed over. There is also the optional XON/XOFF software handshake, which is signaled within the data stream instead of needing excess conductors in the cable. Or you put a local jumper between some handshake pins at one or both ends of the cable to force the handshake logic if you didn't truly need the extra logic conductors between your target devices.

On any one connector, there should be a couple of handshake pins (other than communication pins 2 and 3) where, in the absence of intentional hardware signal changes, these pins remain at a constant positive or negative voltage. So either voltage is available at either end of the cable for local jumpering in order to fake a handshake if necessary to coerce data flow in one or both directions. A small (sometimes unique) rat's nest of wiring inside one or both ends of "proprietary" RS-232 cables is normal under the hood, to get by with fewer conductors, since most serial devices are not modems any more and have lesser logic needs.

For one-way communication he's wise to use only two conductors between devices and skip the confusing "null-modem" fake-handshake approach I have seen so much of. Fortunately modern (1990!) RS-232 software allows you to enable various handshake lines at the terminal.

Here's some example wiring workarounds showing that almost anything goes:

https://www.perle.com/support_services/cabling/documents/db2...

At slow baud rates like 300 or less if available, your data can be visualized in real time without an oscilloscope. With bypass of the hardware handshakes enabled in your communication software, ideally connect a bipolar LED (having an appropriate current-dropping resistor in series) between only male pins 2 & 5 of the PC's D-sub 9-pin connector. An ordinary (unipolar) LED can be good too, alternatively reversed in polarity, showing only the marks or the spaces either way. This could just be popped into the breadboard in the video. As you press a key when terminal I/O is directed to that COM port, it's almost like Morse code on the LED. These are the same two pins he is using to connect from his COM port to his breadboard.
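
Most of that handshake juggling can be done from software today. A small pyserial sketch (port name assumed) that asserts DTR/RTS and reads back the status inputs, which is often enough to coax a reluctant DCE into talking:

    import serial  # pyserial

    ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # assumed adapter port

    ser.dtr = True  # assert Data Terminal Ready
    ser.rts = True  # assert Request To Send

    # Read back the handshake inputs coming from the far end.
    print("CTS:", ser.cts, "DSR:", ser.dsr, "CD:", ser.cd, "RI:", ser.ri)

    ser.close()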

teh_klev
Perle is a name I've not heard of in a while. In another lifetime during the mid 90's I was a field engineer and I used to look after Perle remote access controllers for IBM mainframe/mini customers.
timonoko
RS-232 has a protocol? You mean "do not send after XOFF". The video is too long to watch just to find out.
astrobe_
Yes. Start bit, stop bit, bit length and sampling. And maybe one can add the RTS/CTS hardware protocol because so many UARTs implement it.
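
For the curious, that framing is easy to sketch: a start bit (space), the data bits LSB first, an optional parity bit, then at least one stop bit (mark). A toy 8N1 encoder:

    def frame_8n1(byte):
        """Return the line states for one 8N1 character, LSB first.

        0 = space (start bit / data 0), 1 = mark (idle, data 1, stop bit).
        """
        bits = [0]                                   # start bit
        bits += [(byte >> i) & 1 for i in range(8)]  # 8 data bits, LSB first
        bits += [1]                                  # stop bit
        return bits

    print(frame_8n1(ord("A")))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
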
EgoIncarnate
His last video was posted Nov 13th, 2021.
bmitc
Having worked with a lot of instrumentation and control stuff, I really, really like RS-232. It just works. I made the mistake on one project of going all USB, and it was a complete nightmare. Basically, anything you can think of went wrong, from USB controllers maxing out on the number of devices (despite the number of devices being a mere fraction of the allowed devices in the USB spec), devices causing other devices not to work if on the same hub, devices disconnecting, etc. I couldn't even get manufacturers to tell me how many USB devices their USB controller supported. I often had to use tools like USB Device Tree Viewer (https://www.uwe-sieber.de/usbtreeview_e.html) to understand what was going on. There was another USB debug tool that I used that I've forgotten the name of (maybe USBDeview). And if USB devices disconnect, the only way to guarantee getting back to them is to restart the OS process, which makes your software very fragile. Same thing with cameras with USB vs something like Camera Link. A camera's USB driver crashing would make you restart your entire program, making it very hard to build systems. Camera Link, another serial protocol, also just works.

RS-232 and RS-485 are just so reliable. The higher voltage of +/-12V makes it more resilient to noise, and the protocols are just simple. It isn't the fastest around, but it can still be pretty fast depending on how the protocols are implemented.

inetknght
> I really, really like RS-232. It just works.

Well... as long as everyone has their configuration correctly hand-configured. As the video states, RS-232 doesn't have any way to transmit a clock. So if one end is talking 9600 baud and another wants to talk at, say, 56000 baud, then... no it doesn't just work.

bmitc
It does because that is solved trivially by documentation. By just work I didn't mean plug and play. (USB isn't really plug and play anyway by virtue of it being terrible.)
camtarn
Sometimes you get a device from a client who got it from another company who got it from etc etc. And it's been configured via internal memory to talk at some baud rate with some parity, and if you're really unlucky, it won't transmit unless it's received a command. And, like somebody else commented above, sometimes the Tx and Rx pins are just the wrong way around.

Your experiences with USB sound incredibly frustrating. But RS232 can also be crap in its own unique ways.

But to be fair, I do still rather like working on well behaved RS232/422/485 equipment, where you plug it in, set it to 9600 baud 8N1, and you just start seeing a stream of easily parsed text scrolling down your terminal :)

brendank310
With RS232 you can hook up an old oscilloscope and measure what the right baud rate should be. Even in the "won't transmit unless it receives" case, a sweep on a waveform generator and well configured trigger will get you on your way. USB you'd need much nicer tools handy to get to the bottom of an issue.

You are right on about the joy of it being 9600 8N1 on the first try.

bmitc
Most of the issues with RS-232 can be front-loaded though, and they are understandable. Once you get it going and document what's what, it's fine. For USB though, reaching an understanding of the actual problem is basically impossible.
sgtnoodle
You can also design the protocol to auto-determine baud rate. Some protocols even transmit 0x55 at the start of every packet, allowing for clock synchronization.
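
The 0x55 pattern (01010101) plus the start bit puts an edge on the line every bit time, so the shortest gap between edges is one bit period. A rough sketch of the idea, with hypothetical edge timestamps:

    def estimate_baud(edge_times_s):
        """Estimate the baud rate from timestamps (in seconds) of line transitions.

        With an 0x55 training byte, the shortest gap between edges is one bit
        period, so its reciprocal is the baud rate.
        """
        gaps = [b - a for a, b in zip(edge_times_s, edge_times_s[1:])]
        return round(1.0 / min(gaps))

    # Hypothetical edges captured on a 9600-baud line (one edge per bit time).
    edges = [i / 9600 for i in range(6)]
    print(estimate_baud(edges))  # 9600
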
megraf
I've done a bunch of work with RS232 in the machining industry (CNC). Getting my mind around both software / hardware control sequences was the most challenging part.

You can overcome the baud challenges with scripts that loop through common baud rates until alphanumeric characters are found.

It's also nice that the same few Windows applications have been in use for 20 years or so (I specifically worked with RS232 to TCP/IP).

funstuff007
With a buffer, it's probably pretty easy to guess the baud rate. Similar to encoding guessers for strings, or csv.Sniffer in the Python standard library.

https://github.com/chardet/chardet

https://docs.python.org/3/library/csv.html#csv.Sniffer
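
A brute-force scan (or sniff) like those described above is only a few lines with pyserial; a sketch, with the port name and candidate rates as assumptions:

    import serial  # pyserial

    PORT = "/dev/ttyUSB0"  # assumption: your adapter
    RATES = [115200, 57600, 38400, 19200, 9600, 4800, 2400, 1200, 300]

    def guess_baud(port=PORT, rates=RATES):
        """Try each rate and return the first that yields mostly printable ASCII."""
        for rate in rates:
            with serial.Serial(port, rate, timeout=2) as ser:
                data = ser.read(64)
            if data and sum(32 <= b < 127 for b in data) / len(data) > 0.9:
                return rate
        return None

    print(guess_baud())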

pdntspa
I've yet to encounter a situation where this is truly an issue.
vardump
Lucky you.

Issues like one device having a slightly different baud rate, enough to make it either not work at all or have a lot of transmission errors?

I have, and I'm sure a lot of other people have their own horror stories.

Having a clock signal, like in I2C, solves that.

Oh and that "oops, just connected 12V RS-232 to a 3.3V device," and see the magic smoke getting away...

Scoundreller
> oops, just connected 12V RS-232 to a 3.3V device

Maybe or maybe not. I ran 12V RS-232 levels into a 5V Atmel 8515 for a year or two 24/7 without issues, and tens of thousands, maybe more, have too. And that's a CMOS part.

(This was for a paytv interface sitting between an iso7816 slot and an x86 based emulator. The Atmel 8515 did the inversion in software).

The RS-232 on the PC side would have no issues with interpreting +5 as logic0 and 0V as logic1.

Some would setup a Max232 or Max233, but it wasn't necessary.

Others would use an MC1489 to at least convert the -12V and +12V into 5V and 0V by doing an input and then output into the chip, but again, it was out of caution and not actually necessary.

exmadscientist
You can sometimes get away with this if your cables are short and your EMC environment is gentle. (An EE wouldn't try it, but only because sticking the transceiver in is quite easy.) You can never get away with this if your cables are long and your EMC environment is hostile. That's why we have 232 and, especially, 422/485: for when the job is hard.
Scoundreller
Did it over ~100' of dollar-store grade RJ-11. (Maybe that's long, maybe that's short, depends on perspective). Was a residential environment, so EMC situation was nothing like some industrial environments.
pdntspa
When I was doing embedded work all our terminals simply remembered the last settings and it was like we would set it once and not have to think about it again.

The voltage thing is real (and real stupid IME) but we never ran into clock issues that would skew baud rates one way or another.

exmadscientist
You do need decently stable clocks for 232 and friends. Anything clocked by an RC oscillator is probably going to cause some sadness eventually.

I2C is terrible in so many ways. (Don't send it off-board. Just don't. Ever. Trust me on this one. I don't care what you read about the bus capacitance spec, do. not. do. this.) Sending out a clock on a 400pF on-card bus isn't too bad, but when you have a kilometer-long cable... yeah, we'd rather not send a clock down that, thank you. Hence the use of self-clocked or pray-we-have-similar-clock-frequencies protocols.

3.3V or 5V devices aren't RS-232 and people need to be careful about that. They're just UARTs with regular old logic levels. RS-232 or 422 or 485 are Serious Business levels to go out and do battle with the mean Real World and the only place they should ever land is at a dedicated transceiver. Full stop.

chasd00
Are you saying beware of i2c on a kilometer long cable?! Why would anyone even think that would be feasible?
bombcar
https://www.nxp.com/docs/en/data-sheet/P82B96.pdf lists 250m as an option for whatever you'd want to do with that.
exmadscientist
No, I'm saying you can't send I2C one meter off-board. Even six inches in anything but a gentle EMC environment can be severe trouble without careful design. And yet I've seen people try to run it ~180cm next to horrifying noise sources. Surprise, surprise, that doesn't work out very well.

The root of the problem is that I2C has serious EMC immunity issues. It's well known and appreciated that its drive is weak and open-drain. The bus buffers and especially differential drivers can and do help there. (And will get you past a meter.) What's less well recognized is that a single glitch pulse on SCK knocks all the internal state machines out of whack and requires a bus reset to fix them. Hope you're doing that when your bus is idle or when you're getting anomalies! Most people don't. The Nexus 4 phones sure didn't; this is why their light sensors went dead or crazy or both after a while of uptime.

All of that gets easier to handle if there's a nice, big, low-impedance ground plane nearby, which is why you don't see so much trouble when it stays on the PCB.

sgtnoodle
My favorite was the time we received a new batch of controllers from a vendor, same revision number, and their RX and TX pins were swapped. When working with RS232, it's best to have a handful of null adapters and gender changers in your pocket.
unboxingelf
This brings back memories. We called them “gender benders”.
Eleison23
We never called them "gender benders" because that term had already been in use in the real world of gender ideology.

Real techs knew them as "gender changers" which was different enough to be recognizable as only a technical term for cable connectors.

randombits0
So how did you reference master/slave? That term is already in use in the real world of human ideology.

I’ll bet you’re not a native English speaker. Nothing wrong with that, btw.

Eleison23
"Gender bender" does not accurately describe what gender changers actually did, so whatever.
simoncion
A gender bender changes the perceived gender of something so that other things that are going to interact with it interact with it appropriately.

Seems pretty spot-on to me.

Eleison23
Yes, it "changes" the gender - so that's why we call them "gender changers"

Sorry if you don't see the pejorative connotations in the other term. We avoided it because we didn't wish to offend people.

simoncion
> Sorry if you don't see the pejorative connotations in the other term. We avoided it because we didn't wish to offend people.

You remind me of one of the tech leads I have the occasional displeasure of interacting with. He'll blow in with some new work for us to do, and a stack of reasons to do it. The topmost in the stack he'll provide, and the rest of the stack he keeps hidden. The problem with his stack of reasons are twofold:

1) It takes no less than five and -typically- fifteen minutes to get through the stack.

2) Only the reason at the base of the stack is non-bogus. All of the others are calibrated to sound _great_ to folks who don't work on the thing (and talk to customers who use the thing) day in and day out.

Next time, start with the reason at the bottom of your stack when you're talking to technical folks like us. ;)

reaperducer
Different item. While Gender benders could optionally be built with a null modem inside, as a single item their purpose was to link incompatible male and female ports.

Having both a null modem and a gender bender end-to-end was common.

dgfitz
Still is! I carry around gender-benders, null modem adapters, and 120-ohm CAN DB9 resistors every day, along with USB-to-serial, PEAK CAN-to-USB, Kvaser-to-USB, and Ixxat-to-USB adapters.

These aren’t dead technologies by any stretch, not that you were implying that.

guenthert
And a breakout box (breakout tester)!
cathdrlbizzare
Null modems.

Laplink cables had DB-9 and DB-25 on both ends with the crossover built-in.

There were some male to male cables that had crossover while others didn’t. Then there were null modem adapter dongles for straight through cables. They were either male-female or male-male depending.

Reminds me about older Ethernet: before Auto-MDI/MDIX on most NICs and switches, crossover RJ-45 cables were needed.

guenthert
> I really, really like RS-232. It just works.

Hmmh, I rather have a love/hate relationship with it. It depends on context, I suppose. In an earlier era, Unix servers (and non-Unix minis before that) and other equipment (some network routers and switches to this day) offered their system console via RS232. 9600 baud 8N1 was common, but not universal. The developers in charge of our enterprise file server were impatient and hard-coded the console to 115200 baud (because that was the maximum speed PCs generally supported at that time), which not all "console servers" were able to cope with ...

Then there was the question: how is it wired? DTE or DCE, i.e. do I need a null cable? Flow control? And if it's not a DB9 or DB25 connector but an RJ11, all bets are off and you need to find the manufacturer's cable.

Scoundreller
I ran a hundred feet or so of RS232 over RJ11 (dollar-store grade). Did it for a year or two without issues. It was -12v and +12v in one direction, but just 0V and +5V the other way.

Just needed RX, TX and GND for our purposes.

Ran at 115k2 too.

tonyarkles
> but a RJ11 all bets are off and you need to find the manufacturer's cable.

Here’s looking at you, Cisco, for using an RJ45 serial connector on devices covered in RJ45 Ethernet jacks.

alerighi
Well, that actually makes sense. For example, you can wire the console port to a port in a patch panel for ease of access. In my current office we did this for the console ports of the access points on the ceiling; this saves you a trip up a ladder if you need to change the configuration (and while doing so the access point doesn't have a valid IP) or if something goes wrong.
ale42
> Camera Link, another serial protocol, also just works.

USB is also a serial protocol on the wire ;-) But there are so many layers of complexity on top that they indeed make it a nightmare in many situations.

I worked with a control system using USB, where the connection to the controller had to last for weeks. Regularly, the device stopped working (usually entirely disappearing from the device list), and I had to add support in the software to transparently allow the device to return (and people had to unplug and replug the board when receiving the "device disappeared" alert). The same stuff on RS-232 just worked without a single issue...

bmitc
I understand USB is a serial protocol, but it's the worst I have ever used. (Was just clarifying for Camera Link.)

Your example is exactly the type of stuff I had in mind. We had the same issue with a camera. We also wanted to power cycle part of the system, since the camera was water cooled, to turn it off, but this was basically impossible with the USB communication without farming out the camera communication to an entirely separate OS process/program such that the communication could be restarted. That manufacturer, for whatever reason, only implemented streaming the images over Camera Link but didn't implement their settings over Camera Link. And I swear to god, another USB camera in the same system wouldn't work through a hub and only worked reliably when directly attached to a specific USB port on the computer. Mindblowingly frustrating.

userbinator
Interesting fact: the macOS kernel still has code to output debug messages to the serial port, and when enabled, does so by directly writing to port 3f8h; this port address has been used for the first serial port ever since the first IBM PC in 1981.
alerighi
Also very inexpensive to implement: a MAX232 or MAX485 chip costs cents and will allow you to connect to any microcontroller. For sensors and similar stuff it's ideal, more so 485 than 232, because with 485 you can have multiple devices on a bus (though unlike 232 it is half-duplex).
II2II
It is best to say it is possible to troubleshoot RS-232 connections, rather than to claim they are more reliable. There is a limited number of parameters to configure, most software exposed those parameters, and a lot of hardware would document the pinout. That is contrary to USB, which is more reliable, yet there is little one can do when things go sideways. That is to say, you are either doing something trivial (e.g. trying a different cable or port) or need to be a developer with a very specialized skillset.
fuzzfactor
>It just works.

No software drivers needed either.

causi
I often wish there was a variant of usb-c with some kind of positive plug retention, like a latch or screw.
fuckstick
I like RS-485 - RS-232 is not particularly electrically robust - I have a hard time calling it reliable except in ideal conditions. +12V single-ended is still 12V single-ended. Ethernet is a pretty good sweet spot of robustness, speed, and cost I think (PoE is nice as well) - and I've ended up using it for a number of industrial asynchronous data communication systems throughout the years. Also: it is so ubiquitous that installation and test equipment is readily available (that moreover can be used by technicians) when things inevitably go wrong.
raggles
I like it too, but as a comms engineer working in electrical substations I have encountered many, many situations where it didn't just work, RS-485/422 too. Issues with inter-character delay, timing of control signals, different earth potentials, electrical interference, and mangling of signals when going through multiplexers or media converters are common. Now that everything is fibre Ethernet, my commissioning times are waaaay faster.
adrianmonk
> mangling of signals

As I recall, there was a 1990s Macintosh where the serial port used a proprietary connector (of course) with fewer pins, so Apple decided to double up and use one pin for both Data Terminal Ready (DTR) and Clear to Send (CTS). (Or we had cables that connected two pins?)

Many modems would hang up if you dropped DTR. Enabling this is a good practice to prevent the modem from accidentally staying connected after you're done.

Enabling hardware flow control is also a good practice. If the Mac can tell the sender to wait for a moment, that's better than dropping data.

Perhaps you can see where this is going. If you enable both of these, everything appears to work fine for a while. That is, until the Mac falls behind (scrolling a lot in a terminal window, for example) and needs to actually use hardware flow control. Then, rather than pausing the flow of data, it hangs up the modem.

And your first thought when a modem hangs up out of nowhere is that it's a modem issue: noise on the phone line, a bad modem implementation, an incompatibility between two different modems, etc. So you waste time looking at those as causes.

The solution was to either disable hardware flow control or to configure the modem to ignore DTR and use +++ ATH to hang up instead. Disabling hardware flow control makes PPP (etc.) perform horribly because packets get corrupted and re-sent. And this is another deceptive problem because the modem speed appears to have plummeted but actually the modem is working fine.
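
For what it's worth, the same knobs are exposed by today's tooling; a pyserial sketch (port, rate, and modem behavior are assumptions) showing hardware flow control, the DTR hangup, and the in-band +++/ATH escape:

    import time
    import serial  # pyserial

    ser = serial.Serial("/dev/ttyUSB0", 57600, rtscts=True, timeout=2)  # assumed port/rate

    # Option 1: hang up by dropping DTR (only if the modem is configured to honor it).
    ser.dtr = False
    time.sleep(1)
    ser.dtr = True

    # Option 2: in-band escape -- guard time, "+++", guard time, then ATH.
    time.sleep(1.5)
    ser.write(b"+++")
    time.sleep(1.5)
    ser.write(b"ATH\r")
    print(ser.read(64))

    ser.close()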

HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.