HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
A Real Life Haptic Glove (Ready Player One Technology Today) - Smarter Every Day 190

SmarterEveryDay · Youtube · 3 HN points · 5 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention SmarterEveryDay's video "A Real Life Haptic Glove (Ready Player One Technology Today) - Smarter Every Day 190".
Youtube Summary
http://www.audible.com/Smarter or text "Smarter" to 500-500
Click here to subscribe for next video: http://bit.ly/Subscribe2SED
A huge thank you to HaptX Inc. for letting me visit!
⇊ Click below for more links! ⇊

I'm a huge fan of Ready Player One. Here's our podcast episode on Ready Player One:
https://www.youtube.com/watch?v=h2s0x9CoegM

~~~~~~~~~~~~~~~~~~~~~~~~~~~~
GET SMARTER SECTION

If you want to know how this marvel works, just watch the next video... it explains Everything you want to know. I chose to break this up into two videos because it was so cool.

I asked HaptX for a blurb to explain their company. This is what they came up with:

HaptX is a multidisciplinary team of engineers based in San Luis Obispo, CA and Seattle, WA that builds advanced haptic technology. Their first product, HaptX Gloves, brings touch feedback to VR with unprecedented realism, enabling a new category of industrial training simulations. Founded by Jake Rubin and Dr. Bob Crockett in 2012, HaptX won’t stop until you can’t tell what’s real from what’s virtual. Learn more at haptx.com.

If you're interested in working for them, here's a link:
https://haptx.com/careers/

~~~~~~~~~~~~~~~~~~~~~~~~~~~~
GET STUFF SECTION:
(If I did this right, these should be working Amazon affiliate links to purchase the stuff I like to use. When people purchase from these links, it will support Smarter Every Day.)

❓Mystery Item (just for fun): https://amzn.to/3aPb1wn

Things I use and like:

📷Camera I use : https://amzn.to/2VSiruw
Favorite Lens: https://amzn.to/2KPDQ1a
Wide-angle: https://amzn.to/2SlPchR
On-camera Mic: https://amzn.to/3aVVbjz
Lav Mic: https://amzn.to/3aRek6r
Hot shoe mount for Lav Receiver: https://amzn.to/35m6uAo
My Tripod: https://amzn.to/2Yl6RtJ
My Multi-tool: https://amzn.to/2zGm5Pz
Favorite SD Card: https://amzn.to/2KQ3Edz
💾How I get footage off my phone: https://amzn.to/2KMem4K
Travel Tripod: https://amzn.to/2zEa9Oi
My Backpack: https://amzn.to/35jveJL
My Headlamp: https://amzn.to/3deYmVt
Favorite Bidet: https://amzn.to/2xnMG3b
World Map: https://amzn.to/3aTFCZT
Favorite Shoes: https://amzn.to/3f5trfV
Everyone needs a snatchblock: https://amzn.to/2DMR4s8
🥽Goggle Up! : https://amzn.to/2zG754g

Also, if you’re interested in a Smarter Every Day shirt etc. they’re at https://www.smartereveryday.com/store
~~~~~~~~~~~~~~~~~~~~~~~~~~

Tweet Ideas to me at:
http://twitter.com/smartereveryday

I'm "ilikerockets" on Snapchat.
Snap Code: http://i.imgur.com/7DGfEpR.png

Smarter Every Day on Facebook
https://www.facebook.com/SmarterEveryDay

Smarter Every Day on Patreon
http://www.patreon.com/smartereveryday

Smarter Every Day On Instagram
http://www.instagram.com/smartereveryday

Smarter Every Day SubReddit
http://www.reddit.com/r/smartereveryday

Ambiance and musicy things by Gordon McGladdery, who also did the outro music for the video.
http://ashellinthepit.bandcamp.com/
The thought is that my efforts making videos will help educate the world as a whole, and one day generate enough revenue to pay for my kids' college education. Until then, if you appreciate what you've learned in this video and the effort that went into it, please SHARE THE VIDEO!

If you REALLY liked it, feel free to pitch a few dollars to Smarter Every Day by becoming a Patron.
http://www.patreon.com/smartereveryday

Warm Regards,

Destin

#smartereveryday
#VR
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Here’s my favorite VR glove video, from Smarter Every Day. They have enough sensors in the palm to let you feel a spider’s legs walking on it.

https://youtu.be/OK2y4Z5IkZ0

Quite Close!

As you implied, it would be expensive to experience the cutting edge right now, but here is what it would look like:

You would want a really high-end headset, a haptic suit, haptic gloves, graphics cards and a PC to run it all, and a game to actually experience it.

- Pimax 8K: $1,500

- Research-grade haptic gloves[0]: $5,000-$10,000?

- Tesla Suit: $2,750

- Gaming PC and scalped 3090s to run in SLI: $10,000
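
Summing that shopping list, taking the glove estimate at its low end, gives a ballpark figure; a quick back-of-the-envelope tally (prices are the rough estimates quoted above, not official figures):

    # Rough total for the setup above; prices are the estimates quoted in
    # this comment, not official ones.
    rig = {
        "Pimax 8K": 1500,
        "Research-grade haptic gloves (low end)": 5000,
        "Tesla Suit": 2750,
        "Gaming PC + scalped 3090s in SLI": 10000,
    }
    print(f"${sum(rig.values()):,}")  # -> $19,250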

You could add a treadmill too, but I don't think that tech is quite there yet, and I question just how much more immersed you would be at that point.

Now, once your discount Iron Man suit is fully equipped, what game do you actually play? You mentioned Skyrim VR; it actually has OK haptic support via mods.

However, what you would really want to play for the full Oasis experience is VRChat, where with your haptic suit you could high-five someone or feel them tap on your shoulder. It's also a thriving community full of bizarre worlds and avatars. Most of the hardware isn't directly supported in VRChat, but you can find guides online from people who have gotten it to work.

[0] https://www.youtube.com/watch?v=OK2y4Z5IkZ0

cardosof
That is indeed closer than I expected! Thank you for the links / names.
> Obviously, using a mouse/trackpad is not the same as tweaking real knobs, but it's much cheaper.

I’m interested to see what the availability of cheap VR does here. With today’s room scale VR, using knobs and sliders in a 3D space is feasible. Being able to develop spatial memory, even without tactile feedback, makes for a compelling alternative to using a mouse and screen. Once VR/AR gets tactile feedback like this glove[0], it gets even closer to parity with hardware.

If the module controls in VCV Rack are standard components (button, knob, slider, etc.), then it should be pretty simple to translate them into 3D controls, though I guess buttons may not work as well as switches; I haven't encountered them in VR yet. Anyway, with those component mappings in place, any VCV module should be displayable in a virtual rack with no extra effort; it's just a procedural UI change, generated from the existing 2D layout, that doesn't impact the audio engine at all.

[0] https://youtu.be/OK2y4Z5IkZ0
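
A minimal sketch of that 2D-to-3D translation, assuming a hypothetical record for each panel control; none of the names below are VCV Rack's real API, they just illustrate that the mapping is a pure layout transform:

    # Hypothetical sketch: project a module's 2D panel layout into 3D controls
    # on a virtual rack face. Field names are illustrative, not VCV Rack's
    # actual data structures.
    from dataclasses import dataclass

    @dataclass
    class PanelControl:
        kind: str        # "knob", "slider", "button", ...
        x_mm: float      # position on the 2D panel
        y_mm: float
        param_id: int    # which audio-engine parameter it drives

    @dataclass
    class VRControl:
        kind: str
        position: tuple  # (x, y, z) in metres on the virtual rack
        param_id: int

    def to_vr(panel, rack_z=0.0):
        # Generate 3D controls procedurally from the existing 2D layout;
        # the param ids (and thus the audio engine) are untouched.
        return [VRControl(c.kind, (c.x_mm / 1000.0, c.y_mm / 1000.0, rack_z), c.param_id)
                for c in panel]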

pmoriarty
I'm not as excited to see traditional interfaces cloned in VR space, because the lack of tactile feedback in VR really hampers such interfaces (it's really annoying to have to look down at your hands and the knobs you're turning or the switches you're pushing to make sure you've got the right one, etc.).

What's much more interesting to me is the exploration of innovative interfaces. For example, with full-body tracking you could map body parts to different synth parameters or algorithm parameters and then allow the user to make music by dancing.
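
As a purely illustrative sketch of that idea (the joint names, ranges, and parameter targets are all invented here, not any real tracking or audio API):

    # Illustrative only: map normalized full-body-tracking values (0..1)
    # onto synth parameters so that movement becomes the control surface.
    def pose_to_synth(pose):
        return {
            "filter_cutoff_hz": 200 + pose["right_hand_height"] * 8000,
            "lfo_rate_hz": 0.1 + pose["left_hand_height"] * 20,
            "reverb_mix": pose["torso_lean"],
            "tempo_bpm": 60 + pose["step_rate"] * 120,
        }

    print(pose_to_synth({"right_hand_height": 0.8, "left_hand_height": 0.3,
                         "torso_lean": 0.5, "step_rate": 0.6}))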

The sky is really the limit in terms of what body/head/arm tracking and VR can allow you to do with innovative interface design.

Nition
One of the better examples of that I've seen was actually back in 2012, done with a couple of Kinects[1], although it's still not much more than the equivalent of pressing a few buttons and sliders, and it's obviously highly pre-programmed.

There's the Hot Hand[2]; the bassist is using one in [3]. There's still a lot more that could be done.

[1] "The V Motion Project" https://vimeo.com/45417241

[2] https://www.sourceaudio.net/hot-hand.html

[3] https://www.youtube.com/watch?v=ZuunY8BTqNs

filoleg
Hopefully, the haptic feedback needed for this will eventually be recreated in VR. VR already has some really good haptic feedback in the default controllers (I'm talking about the Oculus Quest here, but I've experienced the same with Vive setups), where something as simple as reloading a gun feels way more impactful with the haptic engine built into those controllers.

For something as refined as the sensation of twisting a knob, however, I don't think that the current VR controllers are enough. Maybe once VR gains a bit more mainstream adoption, we will get VR glove controllers with a more complex haptic feedback mechanism, which should allow the "twisting the knob" sensation to be implemented, and then it will be a total game changer. I would looove to have my DAW and synths in VR, as the spatiality of VR is perfect for this kind of workflow, but we gotta wait a bit for tech to catch up.

filoeleven
Nice username =)

I think current gen controllers could work well enough here to be useful. The modules might have to be oversized, and you’d need to have a visual indicator that “this is the knob you’ll grab if you pinch now,” with good visual position feedback. But when I play Beat Saber, part of the fun is how my every slight twist is reflected exactly by a freaking lightsaber! The fine motor control is already present.

It’d take some getting used to, but it would still be miles ahead of a mouse/trackpad. The nice thing about VCV being open source is that someone can just build it, and it can be iterated on as the haptics improve.

Another VR app I love that isn't really a game is called SpaceEngine. It started as a flatscreen project that later added VR support. There is a lot of cool stuff like this in VR that was originally developed for another medium and ported to VR, either by the developers or, often, by the community with a mod (Half-Life, Minecraft, Skyrim).

SpaceEngine is Google Earth for the Observable Universe (Google Earth VR is pretty cool too). It places celestial objects that we know of where they belong and procedurally generates what is unknown based on how many stars / planets are estimated to exist. The sense of scale in VR lends itself really well to this experience. You can control where you move and how fast you move.

I was genuinely in awe as I pointed away from Earth into deep space and started accelerating. I know space is vast, but it's just so hard to conceptualize. You increase the throttle and you realize that even travelling at light speed doesn't feel all that fast. At that rate it still takes hours to leave our solar system. But the shocking part is when you increase it far beyond that: cranking it up to 100x the speed of light, then to 1 light year per second, then 100 ly/s, faster still. I don't remember what my final cruising speed was when I realized that I wasn't seeing stars pass by - those were galaxies. Maybe 100M ly/s or something. And as you're going you can stop, pick a galaxy, in that galaxy pick a star, have a look around that system, pick a planet, land on it, advance time, and watch day/night cycles. There was an overwhelming sense of how big the universe is and just how small I am as I watched galaxies fly by at impossible speeds.
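
Those scale jumps hold up to a quick sanity check (the distances below are rough round numbers for illustration, not SpaceEngine's figures):

    # Rough travel times at the speeds mentioned above.
    C_M_S = 2.998e8                     # speed of light, m/s
    heliopause_m = 1.8e13               # ~18 billion km to the edge of the solar system
    andromeda_ly = 2.5e6                # ~2.5 million light years away

    print(heliopause_m / C_M_S / 3600)  # ~16.7 hours to leave the solar system at c
    print(andromeda_ly / 1.0 / 86400)   # ~29 days to Andromeda at 1 ly/s
    print(andromeda_ly / 1e8)           # ~0.025 s to Andromeda at 100M ly/s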

SpaceEngine is buggy. Everything in VR is buggy and half works. Menus are terrible. UIs are baffling. Settings are either hidden and esoteric or oversimplified with no real control. This is part of what makes it seem so promising to me, though. There is so much low-hanging fruit to be picked. Everything I've experienced in VR feels like a castle in the sand, and not just because DMCA takedowns will eventually remove much of the current content. There is so much potential to grow and extend what is currently available. Platforms where users can make their own content currently exist, but it feels like it's almost by accident. You can imagine future systems making it even easier and more open. I think WebVR is one platform trying to do this, but I've never used it.

The low-hanging fruit isn't just the software either. Everything I've mentioned above is with today's consumer hardware. The improvements in the pipeline for VR hardware are crazy. Things that are available as dev kits, prototypes, and commercial hardware blow my mind.

In VRChat you can join via VR or a flatscreen. When people are not connected via VR you can sort of subconsciously tell at all times because there is no body language. It's as if communication has some sort of inherent fidelity where on the high end you have standing in a room with a real person talking to them and on the low end you have something like a text message. People who are connected via flat screen feel a little closer to the text message side of that slider scale. But there are also people who use special hardware to enable full body tracking, or finger tracking. Interacting with them feels a little higher on that fidelity scale.

Full-body tracking is currently done by pieces of hardware clipped to your belt and/or shoes. In the future I suspect it will be done with cameras or something a bit more seamless. But it's not just body tracking. There are haptic suits where you can feel when someone taps you on the shoulder or throws a dodgeball at you. There are very early alpha haptic gloves about which I could write another comment of similar length (this video was incredible to me[0]). There is experimentation with tracking facial expressions and eye movement. Facial expressions will go a long way towards increasing communication fidelity, but eye tracking turns out to offer even more potential than that.

The resolution of the current gen VR headsets just isn't that good. It's better than the old Rift CV1 but is still noticeably less clear than a computer monitor or television. However, with eye tracking you can do foveated rendering. This lets you track where the eye is looking at any given moment and render very high resolution in that spot, and lower resolution on the rest of the image. I have not yet experienced this myself, but people who describe testing it report a remarkable difference. I remember one article in which a guy described getting into a BMW in VR and looking at the logo on the steering wheel; he said it looked shockingly similar to seeing it in real life. That's a special example and not first-hand, so I take it with a grain of salt. But with foveated rendering the effective resolution can be on the order of 10x higher. Really extreme improvements are possible here.
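
A rough pixel-budget sketch of why foveated rendering buys so much; the region sizes and scale factors here are assumptions for illustration, not any headset's real numbers:

    # Pixel-budget sketch for foveated rendering; all numbers are illustrative.
    full_w, full_h = 4000, 4000      # hypothetical target per-eye resolution
    fovea_frac = 0.1                 # fovea covers ~10% of each dimension
    periphery_scale = 0.25           # periphery rendered at quarter resolution

    full_cost = full_w * full_h
    foveated_cost = ((full_w * fovea_frac) * (full_h * fovea_frac)
                     + (full_w * periphery_scale) * (full_h * periphery_scale))

    print(full_cost / foveated_cost)  # ~14x fewer pixels shaded per frame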

I've always been more of a software guy personally. The potential of the hardware is exciting to me most of all because of the crazy possibilities it enables in what could be built in VR tomorrow. It's not quite Moore's law, but there is a sense that the hardware will get better for a while.

I once heard someone call VR/AR "the last interface" because any interface that currently exists or could exist can be built in VR. I don't know if that's entirely true, but it gets at what makes VR so compelling. The limit is really only our imagination. This is true of other digital worlds too, like video games or, to some degree, even websites. But that's just through a screen. Here you can conjure into existence places for people to go, people to be, or adventures to experience.

It's not all adventure games and toaster people either. There may be even more low-hanging fruit in building tools to do work in VR. That's even less explored. A VR IDE would be pretty cool, especially once the resolution gets a little better. Or imagine if you could walk into a virtual datacenter and see all your AWS servers manifested as physical boxes, racked and waiting for you to trip over power cables like it's the 90s again.

VR is a new medium. It doesn't feel like reading a book, playing a video game, or watching a movie. It feels like something else. And like all new mediums, what exists there now is mostly ports of old mediums. I don't think we've even scratched the surface of what's possible in this space. There are probably whole undiscovered classes of ideas that really only work in VR, the same way there are ideas that make great books but don't really cut it as movies. It just has so much potential.

It feels like the early web to me. An untouched frontier, almost completely unexplored and teeming with possibility. What could be more promising than that?

If you get to try (or play every day) one of the more common VR games like Beat Saber, The Lab, or Job Simulator, those are definitely still fun too, and I think they are still a window into what VR can be. Rec Room is another popular one you may find; it even has some user-generated content (though with more limits on things like size). Really, almost anything in VR, as long as it's on one of the PC-based headsets, will show you what I'm talking about.

Thanks for asking this question. Looking at the other comments this is one of my favorite HN threads in a while.

[0] https://youtu.be/OK2y4Z5IkZ0?t=60

Check out this Smarter Every Day introduction to the HaptX force feedback glove from last year.[1] The guy thought VR was stupid until he used the glove, then was instantly immersed. They’ve packed a lot of sensory feedback into it.

Once force feedback hits, translating 2D onscreen control surfaces to 3D models will be both trivial and worthwhile, at least in the niche realm of virtual audio devices. Touchscreens have already made them much more playable, and things like TouchOSC allow for some customization. I’d love to have something like the Reason rack[2] in a space where I could use all 10 fingers instead of 1 mouse pointer to interact with it.

[1] https://youtu.be/OK2y4Z5IkZ0

[2] https://images.app.goo.gl/37FMUZ6kfgTb5rPf6

Mar 02, 2018 · 3 points, 0 comments · submitted by jadeddrag
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.