Hacker News Comments on "These Are Not Pixels: Revisited"
Technology Connections · YouTube · 18 HN points · 4 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video.

So Technology Connections made two videos about that very topic:
https://www.youtube.com/watch?v=dX649lnKAU0
https://www.youtube.com/watch?v=Ea6tw-gulnQ
It's...complicated.
⬐ maven29
That sounds really similar to subpixel rendering as used in AMOLED displays and their diamond/PenTile subpixel arrangements. I found a demo online illustrating this:
http://www.clivemaxfield.com/area51/do-not-delete/pentile-rg...
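Roughly, PenTile trades subpixels for resolution by sharing red and blue between neighboring pixels. A minimal sketch of that idea in Python (the layout is simplified; real diamond-PenTile panels rotate the lattice 45 degrees and vary subpixel sizes):

```python
# Simplified sketch of PenTile RGBG vs a classic RGB stripe. This only
# shows the subpixel-sharing idea, not the real panel geometry.

def rgb_stripe(cols):
    """Classic LCD stripe: every pixel owns a full R, G, B triplet."""
    return ["RGB" for _ in range(cols)]

def pentile_rgbg(cols):
    """PenTile RGBG: every pixel has its own green, but red and blue
    alternate, so each is shared between neighboring pixels."""
    return [("RG" if i % 2 == 0 else "BG") for i in range(cols)]

print(" | ".join(rgb_stripe(4)))    # RGB | RGB | RGB | RGB -> 12 subpixels
print(" | ".join(pentile_rgbg(4)))  # RG | BG | RG | BG     -> 8 subpixels
```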
The electron gun draws through the shadow mask, which is fixed and independent of resolution. It's a debate long enjoyed by nitpickers.
"240p" is not a meaningful statement when referring to CRTs. CRT phosphors are not pixels.For more on that: https://www.youtube.com/watch?v=Ea6tw-gulnQ
⬐ toast0
Phosphors may not be pixels, but 240p doesn't say anything about pixels. The number tells us how many lines, and the p tells us that each screenful of lines covers the whole picture (the p is for progressive, vs i for interlaced). The whole phrase "240p CRT TV" tells us it's a normal-ish NTSC TV, not a hi-res TV with fancier electronics to work with digital TV, which would likely add more processing delay.
⬐ stormbrew
There's also two halves to this: sure, the TV itself might not be made up of clean square pixels like an LCD, but the source image that's being sent to it absolutely does have a discrete horizontal and vertical resolution in square/rectangular pixels.
⬐ perl4ever
I always thought of the intersection of NTSC and computer monitors as being 320x200, not 320x240. The latter is more like quarter VGA or something.
⬐ p1necone
Interestingly, the '240p' signal sent out by video game consoles of that era is really a hack, as 240p wasn't a standard signal supported by TVs of the time. It's actually a 480i signal with the timing fiddled with so that the alternate lines still strike the same part of the screen (this is why games from that era had such noticeable scanlines - the CRT beam is only lighting up alternate horizontal lines).
This also means that a lot of more modern TVs (and even some upscalers marketed for retro gaming) do an extra terrible job of upscaling 240p signals because they run the same logic that they would if it was normal 480i, resulting in unnecessary flickering or dropped frames.
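To make the scanline effect concrete, here is a toy model (illustrative line counts, not exact NTSC timing) of which physical lines each field lights up under normal interlacing versus the console '240p' trick:

```python
# Toy model of which physical scanlines each field illuminates.

VISIBLE_LINES = 480  # physical lines in the visible area

def lines_hit_480i(field):
    """Standard interlace: even fields trace lines 0,2,4,...; odd fields 1,3,5,..."""
    start = 0 if field % 2 == 0 else 1
    return set(range(start, VISIBLE_LINES, 2))

def lines_hit_240p(field):
    """The console trick: identical sync timing every field, so the beam
    retraces the same 240 lines and never fills the gaps between them."""
    return set(range(0, VISIBLE_LINES, 2))

print(len(lines_hit_480i(0) | lines_hit_480i(1)))  # 480 -> full coverage
print(len(lines_hit_240p(0) | lines_hit_240p(1)))  # 240 -> gaps stay dark: scanlines
```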
⬐ gmueckl
Wait, what kind of analog TV signal allows for that kind of control? Are you sure that they didn't just scan out the same framebuffer twice?
⬐ entropicdrifter
Essentially all pre-HD analog TVs allow for that kind of control.
⬐ toast0
The analog TV doesn't have a framebuffer, and neither do most consoles until you get into the 3D era. My understanding is that the timing of the vblank signalling that comes between fields determines whether the next field is an even field or an odd field. If the vblank signalling comes in the middle of the last scanline, the next field is an even field; if the vblank comes aligned with the end of the last scanline, the next field is an odd field.
If you always start vblank signalling in the middle of a scanline, you get all even fields; if you always start vblank signalling at the end of a scanline, you get all odd fields.
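A toy model of that description (which offset produces which field parity is a matter of convention, so the values here are purely illustrative):

```python
# The phase of vertical sync relative to the horizontal scan decides
# the parity of the next field.

LINE_PERIOD_US = 63.5  # approximate NTSC horizontal line period

def next_field(vsync_offset_us):
    """vsync_offset_us: where within the current scanline vsync begins."""
    at_half_line = abs(vsync_offset_us - LINE_PERIOD_US / 2) < 1.0
    return "even" if at_half_line else "odd"

# Interlaced video alternates the offset every field...
print(next_field(LINE_PERIOD_US / 2), next_field(0.0))  # even odd
# ...while a '240p' console keeps it fixed, so every field has the
# same parity and lands on the same scanlines:
print(next_field(0.0), next_field(0.0))                 # odd odd
```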
It absolutely is meaningful. While phosphors are not pixels and the display itself does not have a meaningful concept of a pixel along the horizontal scanline, scanlines themselves are absolutely real. There are 240(-ish) lines in a progressive standard definition signal. The way the television interprets that signal is not some kind of magical accident; there are expected timings for each line and each field involved. That's actually why it's 240p and not 240xN, because N actually varies quite a bit and is a fuzzy concept. But the 240 part is meaningful. Though forcing the CRT to display it as (basically) progressive instead of interlaced (480i) is kind of magical.
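For reference, a back-of-envelope NTSC line budget behind that "240(-ish)" figure (the blanking allowance here is a rough assumption, which is why the real figure is fuzzy):

```python
# Back-of-envelope NTSC line budget.
TOTAL_LINES = 525          # scanlines per full NTSC frame
FIELDS_PER_FRAME = 2

lines_per_field = TOTAL_LINES / FIELDS_PER_FRAME   # 262.5
blanking_per_field = 22.5  # rough vertical blanking + overscan allowance
print(lines_per_field - blanking_per_field)        # 240.0 -> the "240" in 240p
```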
⬐ p1necone
The 'p' in '240p' doesn't stand for pixels, it stands for progressive scan. Also, 240p/480i/480p are the standard accepted terms for these low resolution video signals[0]; nitpicking technical details as a 'gotcha' when people use standard terminology isn't helpful.
> The traditional excuse for stripping away ornament has been about improving legibility, especially for small screens. This is a fallacy

No, serifs really do have legibility issues, but not for small screens; they cause problems at (very) small font sizes where glyphs are only a handful of pixels tall.
> You can render a serif letter in seven pixels
Yes, but that's pushing it, and legibility will depend on the quality of the font hinting, how the font rasterizer antialiases, and the way subpixels are utilized[1] (see the sketch below). Legibility will even depend on the display technology: low resolution fonts on a CRT can be a lot harder to read compared to an LCD (a shadow mask means phosphors do not align 1-to-1 with pixels[2]).
Of course, legibility is in the eye of the beholder, and many people find serifed fonts to be easier to read, while others prefer sans-serif, and some people just don't care. For some people, anything smaller or more complicated than high-contrast 96pt simple sans-serif is unreadable.
> drops the serifs from its logo
A logo, however, is probably (much?) larger than the small font sizes where serifs often have legibility issues.
[1] http://www.antigrain.com/research/font_rasterization/index.h...
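The sketch mentioned above: two hand-drawn 7-pixel-tall capital "I" glyphs, sans and serif, showing how little room the serifs get at that size. The bitmaps are made up for illustration, not taken from any real font:

```python
SANS = [
    "..#..",
    "..#..",
    "..#..",
    "..#..",
    "..#..",
    "..#..",
    "..#..",
]
SERIF = [
    "#####",
    "..#..",
    "..#..",
    "..#..",
    "..#..",
    "..#..",
    "#####",
]

for sans_row, serif_row in zip(SANS, SERIF):
    print(sans_row, "   ", serif_row)
# The serifs occupy a single pixel row each; without good hinting the
# rasterizer may blur them into gray fuzz or drop them entirely.
```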
⬐ jozzas
Who is using CRTs? Mobiles have incredible pixel density, high-pixel-density displays and TVs are becoming the norm, etc. Is this even an issue? Who only has 7 pixels for their text?
⬐ pasta
What about printing on objects? A good logo can be printed on a pen and still be readable and recognizable.
⬐ pdkl95
Printing hardware usually has higher dot density than screens. It depends on the printing method, but >=300dpi is reasonably common.
⬐ pdkl95
Serifs on low resolution fonts didn't gain legibility when LCD market share became dominant.

> Is this even an issue?
Yes, to some people on some devices. While less common than it was a decade ago, old hardware exists[1], poor quality displays exist, and broken hardware exists[2]. However,
> Who only has 7 pixels for their text?
Everybody that used an Apple II, Atari 400/800, or Commodore 64? Most 8-bit computers in the era before framebuffers were common used low-resolution hardware fonts[3] (see the sketch after the footnotes).
[1] The last time I bought any kind of computer hardware was ~2011; my CPU and LCD monitor are vintage 2008. While my LCD displays small fonts very clearly,
[2] I know multiple people that have been stuck using a broken LCD because they cannot afford to replace their old (PowerPC) macbook.
[3] https://damieng.com/blog/2011/02/20/typography-in-8-bits-sys...
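The sketch referenced above: how those 8-bit machines typically stored text, one byte per glyph row and one bit per pixel. The byte values approximate the Commodore 64 character ROM's "A" (assumed values; check a ROM dump for exact bytes):

```python
GLYPH_A = [0x18, 0x3C, 0x66, 0x7E, 0x66, 0x66, 0x66, 0x00]

for row in GLYPH_A:
    # Bit 7 is the leftmost pixel, bit 0 the rightmost.
    print("".join("#" if row & (0x80 >> bit) else "." for bit in range(8)))
```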
⬐ brazzledazzle
At some point you have to be okay with leaving people behind in some way or another. Where you draw that line is going to depend on your goals, market, etc., but if you want progress in design and/or technology you have to draw the line somewhere.

I hope I’m not coming off as brusque with that attitude; I just think it’s practical. I’m intimately acquainted with using older hardware. I was lucky to have a computer and an internet connection at all as a child, but I won’t pretend that it wasn’t a punch in the gut every time a website would take ages to load, crawl when it did load, or a game wouldn’t run at all. It hurt when the line was drawn at broadband or faster CPUs/GPUs, but I’m glad they didn’t wait for me to catch up.
There’s a tug between creators and consumers but the drive toward new technology is a good thing. Eventually it becomes more affordable or can be had for reasonable prices second hand.
That said, I don’t think that precludes reasonable accommodation. If your website isn’t an application it should be readable without JavaScript even if it’s ugly, screen readers should work, users should be able to adjust their font size even if it’s ugly, etc.