Hacker News Comments on How A Blind Developer Uses Visual Studio
Coding Tech · Youtube · 7 HN points · 11 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video.

Visual Studio is quite good: https://youtu.be/94swlF55tVc
Windows has fantastic accessibility properties/support. They even have a blind dev working on Visual Studio (or did)[0] for a while. Apple's accessibility options are not anywhere near on par with Windows. Linux ... I honestly haven't ever tried a screen reader on Linux, and I have a feeling it is 100% up to whichever dev implemented something, with no centralized way of doing things.
Saqib Shaikh, a blind developer at Microsoft, gave this short talk on using Visual Studio: https://www.youtube.com/watch?v=94swlF55tVc

I realize this isn't a console, but I think it is a good demonstration of an expert doing typical developer activities.
My understanding is that blind or visually impaired people tend to use Windows because that's where the two(?) big screen reader programs run; JAWS is one[1], and I forget the other, but it might be NVDA[2]. Microsoft has traditionally had some focus on accessibility; e.g., there's a video from a few years back of a blind programmer demonstrating how they program in Visual Studio[3]. Outside that, Emacspeak[4] is an eyes-free Emacs (I think a machine can boot straight into that environment); it was developed by a blind person and claims:

> Emacspeak introduces several improvements and innovations when compared with screenreaders designed to allow blind users to interact with personal computers. Unlike screenreaders that speak the contents of a visual display, Emacspeak speaks the underlying information. As an example, using a calendar application with a screenreader results in the blind user hearing a sequence of meaningless numbers; in contrast, Emacspeak speaks the relevant date in an easy-to-comprehend manner.
> The system deploys the innovative technique of audio formatting to increase the bandwidth of aural communication; changes in voice characteristic and inflection combined with appropriate use of non-speech auditory icons are used throughout the user interface to create the equivalent of spatial layout, fonts, and graphical icons so important in the visual interface. This provides rich contextual feedback and shifts some of the burden of listening from the cognitive to the perceptual domain.
[1] https://www.freedomscientific.com/products/software/jaws/
[2] https://www.nvaccess.org/download/
Please watch this - https://www.youtube.com/watch?v=94swlF55tVc
I'm reminded of this short video of a blind software developer using Visual Studio. Maybe it'll give you some idea https://www.youtube.com/watch?v=94swlF55tVc
I recommend watching (and listening to) this talk by Saqib Shaikh, a blind developer who shows how he codes and debugs in Visual Studio with his screen reader: https://www.youtube.com/watch?v=94swlF55tVc
This presentation by a blind software engineer at Microsoft has always stuck with me: https://www.youtube.com/watch?v=94swlF55tVc
Here’s a blind programmer using Visual Studio with a ridiculously fast TTS: https://youtu.be/94swlF55tVc
⬐ daenz
This makes me emotional.
⬐ v64
⬐ ilamont
Same. It really makes you realize what we take for granted as sighted developers. I can't imagine what the learning curve was like for him, as he lost his sight at age 7, long before many of these accessibility technologies existed. [1]

[1] https://news.microsoft.com/apac/features/saqib-shaikh-on-tec...
Yes, it was like this video (scroll ahead to the 1 minute mark).
⬐ netcraft
Thank you for sharing that, it was super interesting.
⬐ saagarjha
I'm not blind, but I've tested some of my apps with VoiceOver, and it's just utterly unusable at a "reasonable" speed. You have to pretty much set it to your reading speed for it to be useful, and that happens to be significantly faster than most people are comfortable speaking.
⬐ liability
Start slow, and build up your tolerance to it over time. If you try to jump into the deep end, you'll just make yourself disappointed.
⬐ saagarjha
I'm reasonably good at listening to sped-up audio, personally, so this wasn't really an issue for me. I was just providing anecdotal reasoning for why TTS users may set their audio speed to something that might sound unreasonable: it takes forever to navigate the interface otherwise.
On the other side of this, I found it very interesting how a programmer who can't see codes using text-to-speech: https://www.youtube.com/watch?v=94swlF55tVc
⬐ hestipod
I envy and truly admire people who suffer such adversity and still live good lives. I don't know if it's because they usually never knew any different and so can't compare, or if they just have an innate mental strength I do not. After losing my health and life, I've had an extremely hard time accepting such a huge reduction in QOL that just seems to get worse, and I really wish I had their ability/outlook.
There's a blind developer giving a talk on Visual Studio accessibility here - https://www.youtube.com/watch?v=94swlF55tVc - and you can hear examples of him using text-to-speech starting at about 0m 45s, though he doesn't say how fast it's set.