HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
RiftSketch: Live coding in VR with the Oculus Rift, Firefox WebVR, JavaScript and Three.js

Brian Peiris · YouTube · 725 HN points · 13 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Brian Peiris's video "RiftSketch: Live coding in VR with the Oculus Rift, Firefox WebVR, JavaScript and Three.js".
YouTube Summary
I built a live-coding web app for the Oculus Rift where you code in JavaScript using Three.js and watch the world change around you in real-time.

It's quite a niche application, since you need to have an Oculus Rift, be a JavaScript programmer *and* be sufficiently familiar with Three.js but, if you fit those criteria, it's a surprisingly engaging experience!

If you happen to have a Rift and you want to try it out yourself, download the Windows or OS X dev build of Firefox that includes the WebVR APIs from http://blog.bitops.com/blog/2014/08/20/updated-firefox-vr-builds/ and visit http://brianpeiris.github.io/RiftSketch with your Rift hooked up.

The code's messy but it's up on GitHub: https://github.com/brianpeiris/RiftSketch

Edit: Since people keep asking, the music is actually just one of the royalty-free tracks from YouTube's audio library called "Grass" by "Silent Partner": https://www.youtube.com/audiolibrary_download?vid=2545f834238bcd11

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Part 2:

That was right about when Google Cardboard hit. That was 2014? I saw it during a Google I/O livestream and just so happened to have a few lenses left over from some still-photography experiments involving lasers and... never mind, another thing that went nowhere, I just had some lenses around. I quit watching the Google I/O stream and immediately hacked together a new cardboard box viewer with the lenses.

Saw Versailles, which I have never been to, but I have been to Linderhof Palace in Bavaria, which is modelled after Versailles. Saw the Galapagos, where my wife and I had just spent our honeymoon about a year or so before. I immediately saw the experience was closer on the spectrum to really being somewhere than it was to seeing it on TV.

So I'm sitting there, this cardboard box in my hand, this powerful smartphone in my hand, thinking to myself "I have to do this, I have to make these things". But all I knew was C# and JavaScript. Unity was garbage at the time. It's still garbage, but it was back then, too. I had tried my hand at native app development and just didn't have the patience for Xcode or Android Studio. So I started hacking with Three.js and JavaScript. I ended up making the first JavaScript API for VR, releasing the first version about a month before the first announcements of WebVR.

Then I saw RiftSketch (https://www.youtube.com/watch?v=db-7J5OaSag). It blew my mind. Started chatting with people. Brian Peiris talked about wishing he could get syntax highlighting into the app, and complained about how he had to use CSS 3D transforms to position the box on top of his WebGL view. At the time, I recognized how early everything was and how primitive the tools were. So I thought, if I could make developer-oriented tools that made making VR easier, I could make something out of that. I thought Brian could be my first user. So I made a RiftSketch clone, added it into my WebVR framework, and that became what was eventually called Primrose.

I got a small amount of internet fame out of Primrose. People started recognizing me at conferences. A "startup" hired me to be their head of VR. That turned out to be a different kind of hell. It crashed and burned after about a year. We had a kid and another one due any day, and we were completely out of money. I thought the VR dream was finally over. Started applying to jobs back in web and DB work.

In the meantime, I had just started working in Unity at the startup, I had all this time on my hands, and Unity was offering full access to their learning materials for the first month for new customers. It was clearly designed to take 3 months, but I churned through all of it in a month. Then one of the folks that I had hired on at the startup to work on VR stuff made a connection for me at a gigantic, multinational consultancy, for their Unity dev team. He thought I was pretty good even before I learned Unity properly, so I breezed through the interview. It was also the most money I'd ever been paid. Seemed like a huge win!

Then the reality of giganto-consultingware companies set in. You've not seen office politics until you've worked in an organization that runs under a partnership model. It wasn't exactly the worst job experience I've ever had, but it definitely ranks up there. But it basically got me a ton of Unity development experience. They mismanaged the hell out of the team and eventually had to lay off half of everyone. I also ended up meeting a few folks along the way who got me an interview at my current place.

I'm now the head of VR at a foreign language instruction company. Our main client is the military's foreign language school, the very same place where my parents met and then shortly thereafter had me. Things are going well. The company is stable. It's on its second owner, who brought it back from bankruptcy in the early 2000s. I report directly to the president of the company. He thinks the world of me and lets me do whatever I think is best. We have weekly meetings where we geek out about video games and VR. I just got to hire my first employee to work on the project with me. On a weekly, sometimes even daily basis, the company does something that proves "our employees are our biggest asset" isn't just a platitude for them. It's amazing. I've never worked anywhere this nice before.

yomly
Sorry I never saw this earlier. This was both captivating and inspiring!

Thank you for sharing your story. It was quite the journey, and it is uplifting that you were eventually able to connect the dots of all the things you've worked on out of passion into something that you're now sought after for!

I think the less cynical answer is that HTC often has good ideas for physical devices but then often fails to follow through and iterate well. They're also just struggling as a company in general.

But damn, so I 1000% agree with being bummed about it not being the hackers headset. I preordered the Vive because of a VR video of a guy programming the environment he was in at the moment: https://www.youtube.com/watch?v=db-7J5OaSag

I would love to use a VR IDE some day.

mintplant
Reminds me of Second Life, in which the tools to model, edit, and script the 3D environment were all integrated into the environment itself (and realtime-synchronized over the network, no less).
mncharity
> I would love to use a VR IDE some day.

The LCD Windows MR HMDs seem pretty close to being a sufficient display. The useful visual area is something like 900x900 px. Big, visible pixels that are OK with a 7 pt font. And one can do subpixel rendering, so ~3x the horizontal resolution. If you are OK with working on a small laptop screen, you might be OK with this.

Otherwise, there's Varjo[1] later this year. Similar resolution to looking at your laptop. ~55 px/deg. For "under $10k".

For gloves, that you can still type in... we'll see. I had hope for https://senso.me/ , but they've gone quiet. If you don't mind spending $10k, there are existing trackers.

For software... sigh. Maybe if market size explodes this Xmas, things will improve. There's been a lot of "do something, then abandon it, because the area isn't ripe yet" over the last half decade. And software dev in VR hasn't been where people's attention is focused.

Oh, if one's interest is in-VR creation of VR, instead of in-VR general software dev, then there are a bunch of "authoring environments" being worked on.

[1] https://www.anandtech.com/show/12102/varjo-announces-shippin...

dkudriavtsev
I thought that their products were available to order now...
mncharity
> I thought that their products were available to order now...

Senso.me? https://senso.me/order has said "Estimated shipment date: December 2017" for a long time, and the blog hasn't been updated since September.

Varjo? Last I saw, Alpha version available to partners. Though Beta was expected Q1?

Oct 20, 2017 · 1 point, 0 comments · submitted by timthelion
This video is what convinced me to buy in to VR:

https://www.youtube.com/watch?v=db-7J5OaSag

twiceaday
Reminds me of a non-VR game called Quadrilateral Cowboy [1].

[1] https://youtu.be/g0mi9J5b4Xs?t=24m46s

I posted this some three years ago and it still blows my gourd: https://www.youtube.com/watch?v=db-7J5OaSag
spartanatreyu
If only resolution wasn't an issue
mncharity
The weeks-away(?) new Microsoft-associated 1440x1440/eye headsets will be an improvement over the Vive/Rift 1080x1200 PenTile.

Higher risk, but almost "retina", is foveated displays like Varjo http://www.ubergizmo.com/2017/06/varjo-20-20-vr-headset/ http://www.kguttag.com/2017/07/10/varjo-foveated-display-reg... .

andybak
Whilst more resolution would of course be nice, it's fairly low on my list of important advances I desire in VR. I use VR almost daily, and my top 3 list is probably: 1. a wider FOV, 2. a lighter headset, and 3. wireless.

In terms of encouraging broader adoption, cost and lower required GPU specs are probably top.

Does anyone know of any other threejs talk/walkthrough posts? I love watching the development of things with it, like this video:

https://www.youtube.com/watch?v=db-7J5OaSag

fulafel
Learningthreejs.com has lots.
mrdoob2
DOOM (2016) - Graphics Study http://www.adriancourreges.com/blog/2016/09/09/doom-2016-gra...

GTA V - Graphics Study http://www.adriancourreges.com/blog/2015/11/02/gta-v-graphic...

Does it count if the display is a Gear VR and the input device is a detachable keyboard?

https://www.youtube.com/watch?v=db-7J5OaSag

https://news.ycombinator.com/item?id=8411638

lwhalen
That... is awesome. Having done a Rift demo, that's the closest I can conceive of (right now) as real-life wizardry - a terminal for (virtual) reality.
grogenaut
Isn't there a huge problem with the resolution of the Rift/Vive being that of a monitor, but having to render a virtual monitor at a distance, thus losing pixels and ending up equivalent to an old VGA monitor?
bemmu
Looks like the screen in the video is about 1/3 of the view size in each axis. Rift resolution is 1080 x 1200 per eye, so about 360 x 400. That's even worse than VGA.
dwild
The Gear VR uses your smartphone's screen, not the Rift's. That's a resolution of 1440 x 2560 for a Samsung Galaxy S7, which means 1280 x 1440 per eye. The optics are also made to put more pixels in the center than at the border of the viewport. I have no doubt that you can get a resolution similar to VGA with no issue, and even better than that if you use a bigger screen. I'm pretty sure none of us had any trouble working in VGA back in the day either.
What I want (and I suspect we'll get thanks to gaming) is a super-high-resolution Oculus. Imagine programming in an environment where your screens can be tiled onto a sphere and moved and re-ordered arbitrarily.

Could do something very interesting with that :).

https://www.youtube.com/watch?v=db-7J5OaSag

Imagine what we could do in 10 years with ^ if the technology takes off this time.

Aug 18, 2015 · 3 points, 0 comments · submitted by colinmcd
Jun 12, 2015 · jfoster on Oculus Rift
It doesn't (yet) seem particularly comfortable, but this was an impressive demo of someone coding their environment in the Rift.

https://www.youtube.com/watch?v=db-7J5OaSag

Yes, it will be. Here's a fairly fun demonstration of exactly that.

https://www.youtube.com/watch?v=db-7J5OaSag

myth_buster
This is what excites me. 360deg desktops with depth perception. This will disrupt quite a few areas, not least of which would be multiple-monitor setups.

An interesting implementation would be virtually working from a beach or sitting on top of a Colorado 14er. Just have a table fan sending a randomized breeze your way. :)

corysama
There's a small subreddit devoted to the idea of programming inside VR. Several examples there.

http://www.reddit.com/r/hmdprogramming

dzhiurgis
These ideas do not provide any novelty.

There is potential to re-imagine programming, but I have no clue how.

Also, keyboard input is not very compatible with HMDs. I guess brain interfaces are the most promising, but why can't we just tap into my pinky's nerve and train my brain to use that as output?

Apr 03, 2015 · et1337 on Myself – v1.0.3
There's also my favorite: live-editing a VR environment. https://www.youtube.com/watch?v=db-7J5OaSag
philsnow
I sent this very link to an acquaintance who is looking to get started programming, to give him an idea of how much fun it is to do things interactively in a tight REPL-alike.

I think a lot of people who are just starting out get deterred when the only thing they end up with after a week of effort is some hard-to-see changes happening on the filesystem or in some file.

tomjen3
Python has a webbrowser module that you can use to remote-control your web browser. It is much more fun to see results that way than to mess around with print.
jakejake
A lot of us who have been doing this for years sometimes spend a week working on something only to see a benchmark slightly improve or a unit test pass - but still get a thrill from it!

But I must add that anything which gets new coders excited is a good thing!

chton
That's because you can conceptualize what happens through your experience and knowledge. For a newbie, a slight benchmark increase doesn't mean anything because they can't tell what the impact of it is, why it is important.

Closing the feedback loop between typing and seeing a cool result is one of the best ways to get people excited about coding. If 3D or AR helps with that, it's definitely a good thing :)

jakejake
I'm 100% with you - if it gets new people excited then I'm all for it!
Yep. To glimpse the potential for this, people should check out Peiris's demo vid: https://www.youtube.com/watch?v=db-7J5OaSag

If anyone is interested in working on something like this, there's a very small community germinating in http://www.reddit.com/r/HMDprogramming/

This is what's possible with WebVR: https://www.youtube.com/watch?v=db-7J5OaSag
cagenut
this is amazing thank you
fra
I wonder - has anybody built a VR development environment yet? 180˚ vim sure would beat my current dual monitor setup...
apaprocki
I think you should definitely check this out: http://2014.jsconf.eu/speakers/rik-arends-beyond-html-and-cs...
akurilin
I've been dreaming of something like that ever since the Oculus got big.
jimrandomh
I've done some experiments in this area, by transplanting my existing vim-in-console-in-browser environment to VR. It's not good enough for me to live in or invest dev work into yet.

Steam has a VR browser, which I used for my tests. You have to get keystrokes in through a side channel, though, since it steals some vital key bindings including the spacebar, and it has its own screensaver which I didn't find a way to disable (and it can't sense keystrokes through a side-channel). A purpose-built browser wouldn't have that issue. Also, there's no way to control the positions of shells within 3D space or fill the whole 180deg yet.

The main issue right now is resolution. You get a lot more visual area to work with, but at the expense of a much larger minimum font size. I compared the smallest barely tolerable font size on a DK2 to the actual font size I was using on my main monitor, and it was about 3x the angular dimension; that is, the same visual area would either be a 300-column terminal on my monitor, or a 100-column terminal in VR. The DK2 is 1920x1080, so ignoring optics and subpixel issues, a 2560x1440 Gear or CV1 would make that ~2.25x, and a 4k HMD would make that 1.5x.
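That scaling argument reduces to a couple of lines of arithmetic. Here is an editorial sketch of the numbers above (the 3x angular penalty is the commenter's own DK2 measurement; the rest follows from panel width):

  // Terminal columns scale with panel width if the minimum readable
  // angular font size stays fixed. Baselines are from the comment above.
  function vrColumns(panelWidthPx) {
    var monitorColumns = 300; // usable columns on the fixed monitor
    var dk2Penalty = 3;       // VR font must be ~3x larger (angular size)
    var dk2WidthPx = 1920;    // DK2 panel width in pixels
    var penalty = dk2Penalty / (panelWidthPx / dk2WidthPx);
    return Math.round(monitorColumns / penalty);
  }
  vrColumns(1920); // 100 columns on a DK2 (the full 3x penalty)
  vrColumns(2560); // ~133 columns on a 2560x1440 panel (2.25x)
  vrColumns(3840); // 200 columns on a 4K panel (1.5x)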

corysama
Whatever experiments you've done, share your progress here: http://www.reddit.com/r/hmdprogramming :)
> Basically, I won't be satisfied until I get the holodeck... or a neural jack.

How about a Rift work space?

https://www.youtube.com/watch?v=db-7J5OaSag

Oct 05, 2014 · 719 points, 126 comments · submitted by adamnemecek
tinco
This is why I bought a DK2. I believe VR IDEs are going to be a thing, and they're going to be great. The fonts aren't very crisp in this video, but once we get some nice editors in there it's going to rock. The new display has a high enough resolution to make it look great, and I bet the consumer version is going to have a retina display, which will just make any discomfort vanish.

Note that in the video there's just one editing window; in an IDE there could obviously be editing windows and references all around you. You would never need a mouse, as your head movements convey positional information (though you could use a mouse, of course), and with good keyboard controls I think you could have an editing space ten times as big as traditional dual 27" monitors give you.
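As a rough illustration of that "windows all around you" idea, here is a minimal Three.js sketch (editorial, not from RiftSketch; it assumes a `scene` object already exists) that rings a set of flat editor panels around the viewer's head:

  // Arrange N flat "editor" panels on a cylinder around the camera,
  // so head rotation alone selects which buffer you are looking at.
  function addEditorPanels(scene, count, radius) {
    for (var i = 0; i < count; i++) {
      var angle = (i / count) * Math.PI * 2;
      var panel = new THREE.Mesh(
        new THREE.PlaneGeometry(1.2, 0.9),
        new THREE.MeshBasicMaterial({ color: 0x222222 })
      );
      panel.position.set(Math.sin(angle) * radius, 1.6, Math.cos(angle) * radius);
      panel.lookAt(new THREE.Vector3(0, 1.6, 0)); // face the viewer's head
      scene.add(panel);
    }
  }
  addEditorPanels(scene, 8, 2.5); // eight buffers in a ring, 2.5 m away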

No more hiding buffers in tabs, no more disruptions of flow as you're scrolling through your buffers looking for the right one. You just access the visual memory in your brain to intuitively remember where you left the buffers you use.

Also, the idea that this would somehow be bad for your neck sounds ludicrous to me. How is staring for hours at a stationary rectangle of miniature information more natural and easier on your neck muscles than having information spread evenly around you at a comfortable virtual distance? I think it's going to be a relief for nearly everyone.

Void_
I'm happy with a single 24" monitor with one split screen.

There is a virtual space 10 times bigger than a 27" monitor, but it's only in my head, and I have no need to see it in 3D space.

But that's because I work in this space every day, I know all filenames by memory, I always know exactly where I wanna go, and type cmd+t followed by filename automatically.

This VR idea is like flying through places, but you don't need that if you can teleport.

I can't imagine this being useful in day-to-day programming, but definitely would use it for discovering an unknown project.

tinco
> This VR idea is like flying through places, but you don't need that if you can teleport.

It's not, didn't you watch the video? You don't fly, you sit still. Even if you would want to move your virtual presence, I'll guarantee you that the computer can teleport you faster than your brain can, not that I think that would be comfortable..

Anyway, I'm glad that you're happy, so by all means stay happy.

michaelxia
Lol bro I think you missed the point he's making with flying. It's an analogy.

Once you're familiar with your dev environment, you don't need to have it floating around in VR space. It's all memorized pretty much: instant access, L1 cache. So to look at something in VR, you turn your head and the view pans. To look at it in your brain, it's just there, instant lookup.

One requires a transition in space to access information (flying); the other does not (teleportation).

taeric
I question the assertion that the computer could teleport you somewhere faster than your brain could. I mean, I agree if you want complete sensory teleportation. At least my brain does a poor job of transporting my eyes and ears someplace. However, if I just want to think about one thing while I am doing another, it is hard to beat just thinking about it.

And then there is the fact that your brain can think about things that aren't actually done yet. Computers can certainly help in this regard, but they are more about sharing something that was done, in a different way. Consider walking through a house. Sure, a computer can let you walk through one that hasn't been built physically yet. I'd wager skilled architects can imagine walking through one that hasn't even been built virtually yet.

And, finally, there is the problem of thinking about data structures that are literally gigabytes in size. I have a hard enough time thinking about ones that are megabytes in size. I'm not entirely sure accurately visualizing large instances would really even help.

ibrahima
I was just thinking of something like this the other day. I was kind of lamenting the fact that my phone has the same resolution display as my laptop (1080p), and newer phones have even higher resolution screens, yet I can't really get much work done on one because I can't physically display much text (though I have actually written a decent amount of code from my phone over SSH). Then it occurred to me that if we have VR mounts like Samsung is doing for the Note 4 with the Gear VR we could actually use the super high density screens on our phones for productivity.

Though as the other guy brought up, I guess it may not be the most natural way to read text if you have to actually move your neck, but it may be decent enough that you could adapt to it. And thus, you could write code on the train/bus to work from your phone (at least, until someone mugs you and steals your expensive phone VR setup heh).

bhouston
> This is why I bought a DK2, I believe VR IDE's are going to be a thing, and they're going to be great. The fonts aren't very crisp in this video, but once we get some nice editors in there it's going to rock

To a degree. But when I work I have two WQHD monitors (2560x1440), with one turned on its side so I have one text editor with a vertical resolution of 2560. Whenever I work on a laptop, even an HD laptop, I miss this setup so much that it feels like I am drinking through a straw.

Because of this I am unsure whether VR, which is going to offer much less than what I am working with now, is going to be a serious replacement.

reqres
Since a headset could occupy your entire field of vision I would imagine that any setup could be replicated in virtual reality.

The display technology and software is still a long way away however.

brian_peiris
> The display technology and software is still a long way away however.

OP here. Well, I'm a bit more optimistic. Oculus now has ties with Samsung's display manufacturing, relationships with Nvidia and Facebook's cash. I wouldn't be surprised if they become the driving force behind high-density 5" displays and graphics horsepower in the next few years.

XorNot
John Carmack trying to figure out how to get Samsung to believe him on display technology fills me with confidence. That dude is clearly thinking very hard about how to make everyone happy to encourage technology to push forward.
3rd3
It would be great to have a text input device which allows you to move freely, something like the Keyglove: http://www.keyglove.net
tinco
At the moment, the effective vertical resolution is about 3240p, and the horizontal I'd guess would be around 6000p. And that's just the technology preview, I don't see how you think VR is going to be less than your setup, when it's already more..
brian_peiris
OP here. Your numbers are definitely off. The effective resolution of the DK2 is more like 320p or something. Definitely lots of room for the tech to improve there but I'm hopeful. Despite this limitation, the experience is pretty darn good if I do say so myself.
tinco
Why 320p? I say 3240 because the height res is 1080p, and I think you can comfortably tilt your head to get three times that height resolution. I made the horizontal guesstimate similarly.

edit: ah I see, I hadn't factored in the warping of the optics, if you're correct then the effective resolution would only be around 900p so I'm being too optimistic :)

brian_peiris
Oh, you're factoring in head movement. I see. Well, you still haven't accounted for the warping that the optics do. They stretch those 1080 pixels across a 100 degree vertical field of view, so the pixels directly in your sight (not including peripheral vision) are really only about 320.
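Worked out, the arithmetic in that reply looks like this (the 30-degree figure for "directly in your sight" is an editorial assumption, not Peiris's number):

  // DK2: ~1080 vertical pixels stretched over a ~100 degree FOV.
  var pxPerDegree = 1080 / 100;     // 10.8 pixels per degree
  var centralPx = pxPerDegree * 30; // ~324 px across a 30-degree central view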
fsiefken
Funny coincidence: the Oculus has a similar apparent resolution to what the vanilla Doom game had on CRTs.
bencoder
Those are very strange numbers. Are you getting those figures from the ability to look around? So in terms of the 180 degrees from straight up to straight down, it's 3240p?

I think pixel density is what people want. To make full use of that 3240p you would have to move your head quite considerably; making full use of a large monitor from 75 cm away doesn't need nearly so much head movement.

It's what I want and am hoping for, but the current DK2 is in no way usable for this.

andyidsinga
agreed, vr programming will be the thing. even with the res in the video, several editor windows arrayed vertically or horizontally would be cool.

need some new/better input devices in addition to keyboard to manipulate the objects in the space.

does the oculus have an "axis lock" feature of some sort? for instance, lock X so that any head movements only change the user's camera along x?

mentos
I think we'll definitely see a version of Microsoft Windows that supports HMDs in 15-20 years.
JackC
> how is staring for hours at a stationary rectangle with miniature information more natural and easy for your neck muscles than having information spread around you in even spacing at a comfortable virtual distance?

Low-resolution displays can't replace high-resolution displays (yet). It's all about the speed your head can move, vs. the speed your eyes can move, vs. the speed your display can refresh.

As a simple test, try holding your head still while moving your eyes back and forth ten times, from one extreme of your FOV to the other. Then try moving your head back and forth ten times to see things beyond your FOV on each side. Two observations when I try this: (1) the eye movement was much faster than the head movement; (2) the head movement was much more tiring.

In fact, eye movement is the fastest thing we can do, and even uses special circuits to reduce latency:

"Saccades are the fastest movements produced by the human body. The peak angular speed of the eye during a saccade reaches up to 900°/s in humans; in some monkeys, peak speed can reach 1000°/s. ... Under certain laboratory circumstances, the latency of, or reaction time to, saccade production can be cut nearly in half (express saccades). These saccades are generated by a neuronal mechanism that bypasses time-consuming circuits and activates the eye muscles more directly."[1]

Head movement apparently maxes out at 500°/s or so, and that's not comfortable to keep up for long. So it's nearly twice as fast to access high-resolution information by eye movement as to access the same information at lower res by turning your head. That's the tradeoff between using a fixed monitor at your max comfortable resolution, and using a lower-res head-mounted display.

At the same time, a fixed display (for now) allows you to move your head more rather than less while reading comfortably. A head-mounted display has to update the display in response to your head movement, and a fixed display doesn't.

As a test here, see how fast you can move your head back and forth while reading text on your fixed display. Then see how fast you can move back and forth while reading on your DK2. The second will be slower, because the Oculus has to impersonate a pixel on your virtual display by juggling which physical pixels are used to render it as your head turns. In addition to further throwing away resolution by aliasing your virtual display onto your real display (the virtual text has to look sharp regardless of how it happens to overlap with physical pixels), this means the refresh rate is a limiting factor on how fast you can turn your head and still read. The text which was crisp while your head was still is physically blurred before it even gets to your eye, because the Oculus can't quite keep up with where it should be. That stops being a factor at, say, 1000 Hz, but we're not there yet.[2]

So on a fixed display, you can actually move and stretch your neck and turn your head without affecting the reading experience; on a virtual display, those movements will blur the text you're trying to read.

I'm very excited about all of the things VR can be -- just saying that one thing it can't be, yet, is a replacement for a high-resolution monitor when you're trying to quickly access lots of text.

[1] http://en.wikipedia.org/wiki/Saccade [2] http://blogs.valvesoftware.com/abrash/down-the-vr-rabbit-hol...

daw___
Source code: https://github.com/brianpeiris/RiftSketch
fastball
That is the coolest thing I have seen on this website in a long time. I'm literally giddy with excitement.
vincentleeuwen
I couldn't agree more, very cool. This is not just technological innovation, this is potentially Art innovation too.
hanief
This is brilliant. I imagine remote collaboration will be very exciting with VR technology. Since I haven't tried Oculus yet, how is text rendered on the screen? Is it crisp? Or at least readable?
StavrosK
I tried one yesterday for the first time. It was a DK2, it was very low resolution, and you could clearly see the colored pixels of the display.

Text had to be very large, so only three or four lines fit on the screen at a time. After hearing people praise the DK2, I thought it would be basically seamless, but it was jarringly low-res.

The experience was great, very natural, but the pixel density left a lot to be desired.

hanief
Thanks. So I guess it's just a matter of resolution? I hope it will be improved in several future development iterations.
fsiefken
this is odd, i've got a z800 hmd (800x600) and i could read 16 lines comfortably.
StavrosK
Sorry, I only tried it for half an hour or so and saw that it only displayed three or four lines. 16 lines is probably doable, I just meant it can't display an entire Web page, like a screen can.
bencoder
That hmd has only a 40 degree field of view, as opposed to the rift's 100 degrees.

FOV is normally diagonal, so for 800x600 we have 1000 "diagonal" pixels, and for the rift's 960x1080 (per eye) we have 1444 "diagonal" pixels.

So that works out to:

  1000/40  = 25.0 pixels per degree for the z800 
  1444/100 = 14.4 pixels per degree for the rift
So the z800 would have about 3 pixels (in area) for every 1 pixel of the Rift.

In fact, it's even less for the Rift, because the DK2 doesn't have a standard RGB display but PenTile pixels, which means you only get the full 1920x1080 for the green subpixels; the red and blue are reduced, since humans are less sensitive to those colours.

hauget
I own a DK1 and the low resolution makes text reading incredibly difficult. The DK2 is a step up, but text reading still strains my eyes. I think the 'sweet spot' for us coders will be when the DK4 hits the market late next year (the DK3's supposed to be out early/mid 2015).
StavrosK
I really hope so. I desperately want a VR desktop.
widdershins
Huh? Oculus has said there will be no more development kits before the consumer version. And they haven't announced a date for CV1.
hauget
cv1="dk3"; cv2="consumer version" BUT... it will all depend on the advances others make in the input space.
AJ007
It should be pretty simple to add collaboration with Node and something like socket.io. We are working on a project using three.js, Node, and socket.io, just no Oculus support (yet).
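A minimal sketch of the kind of sync AJ007 describes, with socket.io relaying each client's code buffer (the event name and the `rebuildScene` helper are hypothetical, not from their project):

  // Server (Node): relay each client's buffer to everyone else.
  var io = require('socket.io')(3000);
  io.on('connection', function (socket) {
    socket.on('code', function (text) {
      socket.broadcast.emit('code', text); // send to the other clients
    });
  });

  // Client (browser, with socket.io-client loaded):
  var socket = io('http://localhost:3000');
  socket.on('code', function (text) {
    rebuildScene(text); // hypothetical: eval the buffer into the Three.js scene
  });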
brian_peiris
OP here. The text is comfortable to read. i.e. I made it big enough to be comfortable. I haven't experienced any of the eyestrain that others have mentioned, despite spending up to 2 consecutive hours playing with this app.

However, in addition to the low resolution in the current Oculus dev kits, text rendering is definitely hard to do in VR. How do you do crisp sub-pixel rendering when your face is ever so slightly tilted? Something that will have to be solved in future software or by sheer pixel density.

That said, it's early days for VR and the future is bright!

hypertexthero
Reminds me of [Ready Player One][rp1] by Ernest Cline.

[rp1]: https://en.wikipedia.org/wiki/Ready_Player_One

readerrrr
This is really impressive and I'm interested in trying this.

But... the video shows a very specific example of programming where you are basically designing a map or a level. As soon as you are programming something other than a simple 3D game, it becomes less useful.

The major problem we are trying to solve here is managing windows, but I argue that normal screens already do that and you don't need VR.

A large screen or two, plus alt-tab and virtual desktops, can replace virtual screens in VR completely, and do it better. You have to move in the VR world to access information. On a normal desktop, by contrast, information is represented abstractly (where in VR it is a physical object) and can be accessed instantly with a simple, fast key combination. I simply don't see which programming activity you could do better in VR than on a normal desktop.

simpsond
I don't think this will replace your big screen(s) setup any time soon. That said, think about the mobility of a setup like this. Your pseudo-physical dev environment can move with you. I don't know if that's a good thing or not, but I'm certainly interested to see how it evolves.
readerrrr
It is trivial to implement that functionality in today's desktop mode without separating you from the rest of the system.

I'm trying to think of a good example where VR could be used but just can't.

It will probably be fantastic for stock brokers, whose information relies on quick glances over a lot of realtime graphs and tables; if you look at their monitor setups today (4+ monitors), the extra space will work for them. But when you code you are heavily centered on a single window, and the rest doesn't really matter.

hammerandtongs
"""But when you code you are heavily centered on a single window and the rest doesn't really matter."""

For times like this you might want a completely black/white background, a forest clearing with very gentle soundscape or a quietly psychedelic lightshow/music all centered around your code window.

Some people's VR world will be serene and spare and some people's will be chaotic and full of the energy of a city street. It will be up to the user.

Some people will use the fact that everyone's brain benefits from both of those extremes at different times and use that to improve their cognition.

readerrrr
Your example is very idealistic. As a programmer you have to know how to concentrate. It is true that things like music (especially) and surroundings help, but VR is overkill.

A company would never buy a VR rig for every employee just to maybe boost their productivity by an unknown margin. Also, the learning curve would set them back by an unknown margin, and some might have problems adapting.

Using your example, VR is useful, but only in the edge case where VR measurably helps a person.

deathcakes
Surely the idea that a VR 'rig' will cost about as much as the monitors it replaces would change that though?
readerrrr
Only if you include the learning time and the lost productivity in that cost. In reality it will never happen.

Think of it as switching every keyboard in the company for a Dvorak version. That will not meaningfully affect your average programmer's productivity, as the typing speed gained from a faster layout gives only diminishing returns. Dvorak only makes sense for a programmer that really benefits from >100 words per minute, and most really don't. The bottleneck is the programmer, not the input. The same goes for VR, unless you have a special case, as I have mentioned before.

Let's say you type 300 lines of 80 characters of bug-free(!) code per working day of 8 hours. That is a pessimistic estimate (for my theory); the average is much lower.

Let's assume the average typing speed is 60 words per minute. That gives you 83% of the time not spent typing(!). Doubling the speed to a very fast 120 wpm only gets you to 92%.
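The comment's estimate as code, for anyone who wants to check it (using the usual 5-characters-per-word WPM convention):

  // Fraction of an 8-hour day NOT spent typing 300 lines x 80 chars
  // at a given words-per-minute rate.
  function fractionNotTyping(wpm) {
    var chars = 300 * 80;                  // 24,000 characters per day
    var minutesTyping = chars / (wpm * 5); // 5 characters per word
    return 1 - minutesTyping / (8 * 60);
  }
  fractionNotTyping(60);  // ~0.83, i.e. 83% of the day not typing
  fractionNotTyping(120); // ~0.92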

gazarsgo
Here is one example: eye tracking could completely eliminate the need for alt-tab or switching desktops. Put your keyboard's focus in a window by looking at it, and have it automatically recenter into your natural field of view. Still doable on a 27" monitor, but not nearly as useful.

Just guesstimating, I would need 11 or 12 27" screens to fully utilize the area I can see without moving my head. With gentle neck motion included, I think that number triples or quadruples.

readerrrr
Let's assume VR is much better than alt-tab and clicking on the taskbar: it will reduce the time taken by 95%. Assume 5 seconds average for the non-VR method, which is a conservative estimate; this gives us 0.25 seconds for VR.

If you switch windows every 60 seconds, this gives us an improvement of ~8%.

Table:

  300s -> 1.5%
  120s -> 4%
  60s  -> 8%
  30s  -> 16%
  15s  -> 32%

I have no idea what my average window-switch time is. I should measure it sometime. I doubt it is under 30 seconds, which is where VR actually starts to make a meaningful improvement.

If you instead take an average of 2 seconds for non-VR and a skilled user with keyboard shortcuts and virtual desktops, the VR advantage becomes insignificant.

Table:

  300s -> 0.6%
  120s -> 1.5%
  60s  -> 3%
  30s  -> 6%
  15s  -> 12%

If you compare specialized window-management software with the VR equivalent, then I don't see any benefit to the latter. The latency is the same. The only difference is abstract window management versus the spatial management of VR.

I think programmers can handle the non-VR solution just fine. I really hope it becomes better, though; the resolution is currently just too low.
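The tables above can be reproduced in a few lines, assuming VR cuts switch time by 95% as stipulated (rounding differs slightly from the comment's):

  // Time saved per window switch, divided by the interval between switches.
  function improvement(nonVrSeconds, intervalSeconds) {
    var vrSeconds = nonVrSeconds * 0.05; // the assumed 95% reduction
    return (nonVrSeconds - vrSeconds) / intervalSeconds;
  }
  [300, 120, 60, 30, 15].forEach(function (interval) {
    console.log(interval + 's -> ' + (improvement(5, interval) * 100).toFixed(1) + '%');
  });
  // prints: 300s -> 1.6%, 120s -> 4.0%, 60s -> 7.9%, 30s -> 15.8%, 15s -> 31.7%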

sharemywin
I don't know; I could see a scenario where you have an object moving around in space, you grab a special developer handle, and a code window pops up with its behaviors. Then you modify it and it starts moving around again.
readerrrr
Right, that falls into the very specific case of level/map making for 3D worlds, which usually isn't programming but more design and scripting.

Any suggestions for actual programming? You know, like classes, boring code, loops, I/O, algorithms?

brian_peiris
OP here. Yeah, I was definitely not aiming for a full IDE here (obviously). It's more like a playground environment. Something like CodePen for VR, with a focus on the live-coding experience.

Using this app is comfortable but VR will definitely need higher resolutions for anything more serious. It's not too far in the future though. If things go well, I expect VR will be the driving force behind ultra-high-density 5" screens in the next few years. Or some entirely new display tech (retinal projectors?) will replace it.
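The heart of a live-coding playground like this is a surprisingly small loop. A simplified editorial reconstruction (the actual source is in the GitHub repo linked above): re-evaluate the editor buffer as a function of the Three.js scene on every change, and keep the last working world when the buffer doesn't parse:

  // Sketch: rebuild the world from the user's buffer on every keystroke.
  function onBufferChanged(code, scene) {
    try {
      var build = new Function('scene', code); // compile the buffer
      while (scene.children.length) {          // clear the old world
        scene.remove(scene.children[0]);
      }
      build(scene);                            // run it against the scene
    } catch (e) {
      // Mid-keystroke syntax errors are expected; keep the old world.
    }
  }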

endergen
I agree with Brian on this one. Like I told him recently, IDEs were definitely next on my side-project radar. I tried out the Gear VR, which is only about 1.5 times the DK2's resolution (both vertically and horizontally), and it made a huge difference. At Oculus Connect I felt like a lot of the talks were about exactly this: we finally have a use case for way-higher-res screens, instead of incremental demand slowly cranked up by vested companies like hardware manufacturers and really early adopters.
bsaul
This demo begs for drag-and-drop interfaces.

You'll soon build a Counter-Strike level by walking inside it and sculpting the world around you. How cool!

waldir
Like this? :) https://www.youtube.com/watch?v=VzFpg271sm8
davyjones
I haven't seen this mentioned in the comments here: I think this is huge for CAD/CAE. I can see this disrupting current [Digital Mockup](http://en.wikipedia.org/wiki/Digital_mockup) offerings.

Everything from design to analysis to testing moves into the VR environment. Already CAD/CFD/CAE computational models are saving companies huge costs. Throw in VR and we are that much closer to a tangible, collaborative environment. Very very excited!

tgb
Are there any players in the high-end VR market for that sort of work that look like they'll be better than the consumer Oculus? I'm thinking that a very high resolution screen would be the main thing a high-end device could buy you. Maybe good inertial measurement units as well, but I'm not sure how much that would help over Oculus-style positioning.
justifier
oculus is digital stereoscopy

this demo is cool but i'd argue impractical

be careful getting excited over demos of technology that add a layer of abstraction from the tech direct

ever seen a photo of a painting? ever read a book you loved and your friends hated?

i'd suggest you try the oculus as you would use it: in your home on your box; before spending any money on the device

also, you are blind to the keyboard with the headset on

i feel oculus carries a mythos around it that is very difficult to obviate if you have yet to have worn one.. i'll link to an old comment i posted to carmacks keynote at the first ocucon : https://news.ycombinator.com/item?id=8347450

essentially i have stopped referring to oculus as vr, oculus is digital stereoscopy, and being stereoscopy the oculus is exploiting an optical illusion and as such, i believe, a flawed direction for the touted outcome: virtual reality

this demo: http://www.phoronix.com/scan.php?page=news_item&px=MTcyMTI ; is what encouraged me to shell out for the dk2

the experience failed to deliver on my expectations, though at the time i thought my expectations were practical and rooted in evidence, of course now i realise those expectations were irrational and that evidence was secondary being derived from this video demo

simias
You have commented elsewhere in this thread with this same rant but I simply don't understand the point you're trying to make.

Of course stereoscopy is part of what the oculus does (although there's also head tracking, so talking only about the stereoscopy is misleading). You say it's a flawed direction, so what's the right direction then?

justifier
it is my belief that oculus will sell a lot of devices largely by manipulating the ad copy, resulting in frustrated people out a lot of money, i suppose i am attempting to include another perspective on the device, one that i failed to see before i bought one, and have yet to see since, but feel confident in my understanding of it

i suppose it depends on your definition of virtual reality..

but i think the verbiage of vr is very misleading

my position is that if oculus was called: oculus digital stereoscopy; the tech would lose a lot of its hype

as for the tracking it is certainly the best feature of the headset, again with the goal of calling the thing what it does i refer to the 'head tracking' as 'inferred spherical screen encapsulation'

what the head tracking is doing is windowing your vision to the size of a fisheyed cell phone screen and then inferring your position in a 3d space and displaying the contents of that field of view on the screen, refreshing the content of the image in regard to the movement of your neck, giving you the impression that you are in a sphere of flat displays looking through fisheye glasses with blinders with a forced forward stare

when i put it on the restricted periphery and the inability for me to use my eyes to look around was the most dramatic realisation

but that is just dismantling the vr hype, actual stereoscopy is flawed in its own ways..

people's eyes differ in distance apart, the oculus is solid plastic, certainly the error is measured in mm to fractions of mm but that is still error that your brain is now required to correct,

also the goggles press against your face tightly to seal out distracting ambient light, this hurts after some time, and in the case of my watching a film, completely depleted my eyes of blood leaving white bags under my eyes

that is the hardware

as for the software if you read the comment i linked in the comments you are referring to i talk about an experiment i performed when i first received the oculus that shows that the brain error corrects the inconsistencies between the eye inputs at a pretty substantial distance, talking here of the blue and pink box experiment, in the experiment only the pink box moves, but when i showed it to a friend thae asked why when the two boxes began overlapping did the blue one start moving as well? i watching this all cloned on the screen in front of me explained that it still was the case that only the pink one moved and the blue one is being moved around by thaer brain, so if a simple experiment has my brain working overtime shifting still objects it would seem reasonable that a lot of the 'jitter' that is so often talked about when displaying entire worlds is your brain organising its inputs

as for what i believe is the right direction?

if we are talking about visual input manipulation.. i'd think projection, perhaps onto a dome or disc encapsulating the entire eye range including periphery of the eyes even with the pupil extended to the outer most regions of the eye

or one fantasy device i dreamed up used this recent research: http://newsoffice.mit.edu/2014/magnetic-hair-directs-water-f... ; if you watch the video, at the end, 57s in, you can see these hairs bending optics, what you'd do is have two rows of these hairs, one for each eye and each row bends to follow the movement of a pupil and each hair acts as a pixel

this way simulates two photons bouncing off a single object and reaching the eye, that would be wild

but mind that i referred to the effort as visual input manipulation, if you want to talk vr i think the only possible direction is bypassing our sensors and interacting directly with the processors, a la the suit in Zero Theorem : http://cdn-static.denofgeek.com/sites/denofgeek/files/styles...

cLeEOGPw
I don't understand your argument against stereoscopy. It is how you see the world. Even your suggested "dream device" would be a stereoscopic device. You can't run away from it. Besides, even if a device like the one you suggest is ever developed, there will be those few guys who will complain that it is not real VR and that you need to send signals directly to the optic nerve, as well as the spinal cord and all the other senses, in order to achieve "real VR". These kinds of talks are unproductive. Better to try to develop your version of VR instead of complaining about the current best.
possibilistic
If a first-class, distributed VR world became a thing, I could get lost in such a world forever. I would stay logged in for work, hobbies, friends, adventure, and romance. Perhaps I would log off a few times a day for food, sleep, hygiene, and exercise; I could imagine that being the extent of my physical-world interaction if the VR tech were good enough. (It has a long road ahead to reach that point, though. Ancillary technologies would also need to evolve for closer-to-brain/full-sensory I/O.)

Given tools to continually innovate and improve the digital world, I don't think I could ever grow bored of it. There would be endless permutations of possibility with which to express myself. I could make games, movies, physics, patch the stack. I could play God. Or I could let someone else play God and adventure through the worlds that they create. Then perhaps send a pull request.

An all-enveloping creative world sounds so much more exciting than the life I'm living right now. I could finally be anyone I wanted. Make anything I wanted. You might say I have this freedom now, but none of us really do. In a virtual world there is no central society to tell us how to conform and fit in, no traditional family structure. No tradition. Just feeding your short life's want for expression. I want that so much.

I can see so many people like me preferring a life like this. With this level of freedom. Imagine: Religion, money, and power could have little bearing. Meritocracy reigns, but even that could be ignored if you wanted.

Furthermore, I can see costs becoming nearly negligible. No more consumption of material things beyond bare necessities: food, shelter, hygiene, and healthcare. Nobody needs a physical high-rise view when the digital world can give that to anyone. I don't see why anyone else would need to buy the things we are trained to buy as consumers. (Would that be an economic problem? It very well could...) I don't see this necessarily ruling out a digital economy, but unless things are contracted and paid for on a per-demand basis, digital assets can easily be copied... an economic problem that might need to be considered.

If this were widely adopted (and barring societal collapse stemming from large swaths of society switching to a VR lifestyle), I can see it bringing ungodly amounts of funding to biochem research. We might elevate our concern for solving our biochemical longevity problems above all else.

Sorry for rambling and wasting your cognition with this idle pipe dream. I'm actually pretty pragmatic and don't expect for much of this to happen while I'm still young enough to enjoy it. Or at all.

vlunkr
I enjoyed reading your pipe dream, it's a pretty interesting thought. It would be like living in our own matrix, where everyone can be Neo.

VR is still missing the ability to simulate smells, tastes, and tons of other sensations. I think I would get bored in a world like that. Plus my real life is pretty fun.

moron4hire
I'm building an open source framework for making these sorts of WebGL VR experiences. https://github.com/capnmidnight/VR/

It doesn't yet support "live programming" like this, but the hope is the framework would eventually be used to easily make these sorts of things.

So far, the focus has mostly been on the user input side. There is a constraint-based system for defining input values from different "devices", i.e. keyboard, mouse, gamepad, head mounted display motion tracking, arm motion tracking, etc. There is even a rudimentary gesture system, where things like a nodding gesture or head shaking gesture can be wired up and used to activate commands just as easily as a mouse click or key press.
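For a flavor of what a gesture-as-input-binding could look like, here is an editorial sketch (hypothetical API, not the framework's actual one) that detects a "nod" from head-pitch history and dispatches it like a key press:

  // Keep ~600 ms of head pitch samples; a large pitch swing is a "nod".
  var pitchHistory = [];
  function onHeadPose(pitchRadians, timeMs) {
    pitchHistory.push({ pitch: pitchRadians, t: timeMs });
    pitchHistory = pitchHistory.filter(function (s) { return timeMs - s.t < 600; });
    var pitches = pitchHistory.map(function (s) { return s.pitch; });
    var swing = Math.max.apply(null, pitches) - Math.min.apply(null, pitches);
    if (swing > 0.35) {       // ~20 degrees of pitch within the window
      fireCommand('confirm'); // hypothetical command dispatcher
      pitchHistory = [];      // reset so one nod fires once
    }
  }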

The framework also supports multi-user environments, and I've already started building the text input controls (somewhat similar to what is shown here) to do things like this (right now, it's only useful for live chat). Eventually, I'll have buttons and text boxes and togglers, etc.

If you're interested in this sort of stuff, please consider joining me.

brian_peiris
OP here. I like where you're going with that. If you weren't already aware, you definitely need to integrate WebVR. A couple of devs from Mozilla and Google are adding VR APIs to Firefox and Chrome.

Join the mailing list: https://mail.mozilla.org/listinfo/web-vr-discuss

More info: http://blog.bitops.com/blog/2014/06/26/first-steps-for-vr-on... http://blog.tojicode.com/2014/07/bringing-vr-to-chrome.html
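For reference, the early WebVR API those experimental builds exposed looked roughly like this; the API was revised several times before becoming WebXR, so treat this as a period sketch rather than something that runs in a current browser (it also assumes a Three.js `camera` in scope):

  // Enumerate VR devices and feed head orientation into a camera.
  navigator.getVRDevices().then(function (devices) {
    var sensor = devices.filter(function (d) {
      return d instanceof PositionSensorVRDevice;
    })[0];
    function onFrame() {
      var o = sensor.getState().orientation;     // quaternion-like {x, y, z, w}
      camera.quaternion.set(o.x, o.y, o.z, o.w); // apply the head pose
      requestAnimationFrame(onFrame);
    }
    requestAnimationFrame(onFrame);
  });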

malux85
I like this. I have an Oculus Rift version 1, and I use the terminal (https://github.com/hyperlogic/riftty) to code in occasionally.

This is really good because I have a stomach condition that means being in a seated position is very painful for me, but using this I can stretch out on the bed, with a keyboard resting on me, and still code :)

jasonkostempski
Eye tracking would be a sweet enhancement to an interface like this. If you could just target something by looking at it (without having to turn your head), press a key to use/select it, and then use some other keys to jump through the various pieces of code that created it, I can't even imagine how much time would be saved in a large project.
maga
I wouldn't mind doing ordinary programming in it as well. It would be much easier than messing with multiple real displays.
XorNot
This is absolutely what I wanted a DK2 for. To see exactly how far we are from this application, and to think about improvements for it.

My dream is a world where we eliminate physical displays from our computers altogether.

But we're definitely not there yet - it's going to take 8K, maybe 12K displays in a Rift to get the necessary resolution. But once it's viable, yeah, I hope to never buy another computer monitor again.

fsiefken
yes, but then you have to put up with the lower resolution. I think I might still be more productive with a single HD monitor.
spacefight
I'm not so sure how my neck would feel about the constant tiny movements in every direction.
jfoster
Wouldn't it be the exact same as outside the Rift, assuming the head tracking is implemented correctly? Outside the Rift, movement changes perspective. Similarly so inside the Rift.
goldfeld
Look into the Bates method; it's natural-vision research from an ophthalmologist in the 1920s. The argument is that animals (including us) naturally move neck and eyes in exactly these tiny little movements literally all the time, non-stop. And that is healthy, because constant movement promotes relaxation, whereas keeping your neck (and eyes) fixed leads to tension and, over time, is what actually increases our prescription, which he argues is just like being inflexible (that is, it becomes your new normal), but in the eye and neck muscles.

So I'm actually pretty excited for something that needs me to look around all the time. I've still gotta try making my text editor twitch about all the time. A moving target makes more sense for our predator eyes, so maybe VR will finally allow people who work with computers not to have a disproportionately greater need for glasses.

Dewie
So I guess I'll end up with a standing/treadmill workstation because sitting is unnatural, and now also twitching/slightly evasive windows because staring at a fixed spot is unnatural. I guess the next thing after that is an input device which you have to bounce on and off the screen, in order to mimic those primal hunter-gatherer spear-throwing techniques.
deif
Perhaps when the Oculus HD comes out. At the moment the text would be at such a low resolution that it wouldn't be worth the time. Pre-emptively creating an app in preparation for the Oculus HD might not be a bad idea, however.
noobermin
This is freaking cool by itself, but the video gave me an interesting idea: what if you used this as an IDE for coding in general? You wouldn't need to have multiple monitors, you'd have 360 degrees of stuff to look at.

Then again, the idea of being cut off from the outside world with only your code to look at might be less appealing, so why not put a camera on this thing? Then you could have "projected windows", basically like Tony Stark's holographic interface without the holograms. I personally think that would be pretty cool.

andrewliebchen
I think I would prefer the "solitude" of a 360deg coding environment that would be afforded by an Oculus, NR headphones, and music. No visual distractions!

Besides, I imagine that this setup would be cheaper than two monitors and a desk. I'd need to drastically improve my touch typing skills...sometimes even now I need to look down. Maybe in this set up there's a representation of my hands or better yet, a VR native way to interact with code that isn't a keyboard.

Addendum: I remember seeing something awhile back about building an IDE designed to be used with a gaming controller only...

zacfinger
How has no one mentioned William Gibson or "Neuromancer" yet?

> The matrix is an abstract representation of the relationship between data systems. Legitimate programmers jack into their employers' sector of the matrix and find themselves surrounded by bright geometries representing the corporate data.

> Towers and fields of it ranged in the colorless nonspace of the simulation matrix, the electronic consensus-hallucination that facilitates the handling and exchange of massive quantities of data.

ctdonath
It's a 30-year-old book ... older than many people who are making viable VR a thing now. Neuromancer sparked the otherwise naive notion of what VR would be like, but now we've developed and normalized a good chunk of the technology & language (verbal & visual), in a society which has grown accustomed to pervasive computer interfaces (I sit here with 8 on my desk alone, and totaling 22 in this room). Reality has taken a somewhat different & concrete direction from what Gibson fluidly predicted. The wonder of a hypothetical world of people jacking into their sector of the matrix surrounded by bright geometries has been replaced by the mundanity of a real world of people staring at Facebook et al; we live the electronic consensus-hallucination of massive data, and are not impressed by someone telling us in abstract terms how wonderful it will be.
erikpukinskis
If anyone played Myst, you may be tickled to realize that this is in fact what Myst was about.

The hand-wavy "this might look like a book with a video in it and some neat drawings, but it actually contains a magical language that describes the world you're in" ...

... you're looking at a book written in a magical language that describes the world it's in.

c3d
I really love the "in-world" editing. However, I'm not convinced by the description of the scene. Here is what it should look like: https://www.youtube.com/watch?v=paJG7Fy5Few.
source99
VR is no doubt very cool.

But I'm not sure this is taking advantage of the true benefits that a VR environment brings.

What are the major benefits that a VR environment brings?

Screen real estate would be one benefit, but I would have to see an implementation where this was done well before I could believe it.

What other benefits does VR have?

createuniverses
Has anyone tried "praxis"?

https://www.youtube.com/watch?v=6rB39AXPmQQ

https://www.youtube.com/watch?v=1VRtRazMYSA

radisb
Nice, though I don't know if it's the wisest thing to start coding The Matrix in JavaScript.
CmonDev
The only thing sadder would be programming a strong AI or brain-computer interfaces in JavaScript. Hopefully the disaster will be stopped well before then.
runewell
VR is a medium where I could see voice commands and visual programming actually having an impact. I would love to see a 3D version of UE4 Blueprints or Scratch.
alexkehayias
So cool! Great work. It might be nice to keep the window as more of a HUD with some transparency, so you can move around and still see the results while you code.
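(A rough sketch of that HUD idea in Three.js, for the curious. It's illustrative only: it assumes an existing scene and camera, uses a canvas texture as a stand-in for real editor rendering, and the sizes and opacity are guesses.)

    // A semi-transparent code panel parented to the camera, so it follows
    // your head like a HUD while the world stays visible behind it.
    var canvas = document.createElement('canvas');
    canvas.width = 1024;
    canvas.height = 512;
    var ctx = canvas.getContext('2d');
    ctx.font = '32px monospace';
    ctx.fillStyle = '#9f9';
    ctx.fillText('scene.add(cube);', 20, 60); // stand-in for the editor text

    var texture = new THREE.Texture(canvas);
    texture.needsUpdate = true; // set again whenever the canvas is redrawn

    var hud = new THREE.Mesh(
      new THREE.PlaneGeometry(2, 1),
      new THREE.MeshBasicMaterial({
        map: texture,
        transparent: true,
        opacity: 0.6,    // let the scene show through
        depthTest: false // always draw on top of the world
      })
    );
    hud.position.set(0, 0, -2); // two metres in front of the eyes
    camera.add(hud);            // parented to the camera, so it follows the head
    scene.add(camera);          // the camera itself must be in the scene graph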
mpg33
How easy is it to use a keyboard while wearing a headset, though? Even though 90% of the time you don't need to look down... occasionally you do.
pierrec
To see your keyboard, I can imagine a few hacks, but all of them are a bit awkward (a sketch of the first one follows after the list):

- Have a "keyboard cam" that only shows you the keyboard (and your hands when they're over it) displayed at some natural spot in the VR.

- Have a keyboard with capacitive-sensing keys, reproduced in the VR with keys highlighted as a function of proximity or contact with your fingers.

- Something like this [1]. Typing in thin air or on a table probably wouldn't work very well, but if the Leap Motion senses the keyboard as well as your hands, and the whole thing is reproduced in the VR, that might work.

[1]: https://www.youtube.com/watch?v=TRhoQ6o4mI8#t=142
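(A minimal sketch of the first idea, the keyboard cam. Assumptions: a browser with getUserMedia, a Three.js build with VideoTexture, and an existing scene; the size and placement numbers are guesses.)

    // "Keyboard cam": map a webcam feed, pointed at your keyboard, onto a
    // plane fixed roughly where the physical keyboard sits on the desk.
    var video = document.createElement('video');
    navigator.mediaDevices.getUserMedia({ video: true }).then(function (stream) {
      video.srcObject = stream;
      video.play();
    });

    var keyboardCam = new THREE.Mesh(
      new THREE.PlaneGeometry(0.5, 0.2), // roughly keyboard-sized, in metres
      new THREE.MeshBasicMaterial({ map: new THREE.VideoTexture(video) })
    );
    keyboardCam.position.set(0, -0.4, -0.5); // below and in front of the viewer
    keyboardCam.rotation.x = -Math.PI / 4;   // tilted up like a desk surface
    scene.add(keyboardCam);                  // fixed in the room, not head-locked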

pkroll
On the DK2, you can look down at the bottom edges/corners and see out through the, I'm guessing, air holes.
fsiefken
If it's under 10 minutes you could remove your headset temporarily, or you could leave some space between your face and the Oculus Rift to glance down. You could also mount a fisheye webcam on your head and transparently overlay it on the world when needed.
brian_peiris
OP here. You definitely need to be very familiar with your keyboard for this to be a fun experience. Even switching to a slightly different key layout would throw me off. As long as you're a proficient touch-typist, it's really not a problem.
fsiefken
I wonder if something like this can be achieved on the OpenSim platform; does anyone know? I know a WebKit browser can be accessed in the sim, but that's on a 2D surface. You want to rework the 3D world live and in-game. https://www.youtube.com/watch?v=ubYYVnfLULI
LeicaLatte
Nice effort. Input for VR still sucks though...
melling
Some combination of voice recognition, gestures (e.g. space, tab, return), and a virtual keyboard is needed. I don't think people want to be confined to sitting in front of a keyboard at a desk. It's not going to be easy.
elwell
Now throw an Emotiv headset [0] into the mix.

[0] - http://www.emotiv.com/

ck2
Now do it the other way around: you create virtual objects with hand motions and it writes the code equivalent for you.
XorNot
A VR-enabled FreeCAD would actually do something like this. Everything in it is represented as Python code.
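(A toy sketch of that round trip in Three.js terms. It's entirely illustrative: a hand-tracking input would set the mesh's position and scale, someCube is a made-up stand-in, and a real system, like FreeCAD's Python document model, would be far richer.)

    // Turn a manipulated mesh back into source code that recreates it.
    function meshToCode(mesh, name) {
      var p = mesh.position, s = mesh.scale;
      return [
        'var ' + name + ' = new THREE.Mesh(',
        '  new THREE.BoxGeometry(' + s.x + ', ' + s.y + ', ' + s.z + '),',
        '  new THREE.MeshBasicMaterial({ color: 0x' +
            mesh.material.color.getHexString() + ' })',
        ');',
        name + '.position.set(' + p.x + ', ' + p.y + ', ' + p.z + ');',
        'scene.add(' + name + ');'
      ].join('\n');
    }

    // After the user stretches and drops a cube with their hands:
    console.log(meshToCode(someCube, 'cube'));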
elwell
Every now and then you see a demo that catches a glimpse of the future.
shocks
Someone please build vim in VR...

Then I can legitimately wear an Oculus Rift at work!

vhiremath4
Very rarely do I watch something and find myself instantly consumed by it.
kentf
Brian FTW!
bikamonki
Beautiful!
curiously
I don't think a VR IDE will be practical. You have a screen beaming inches away from your eyes. It's bad enough that we stare at computer monitors all day.

I think what would work is some sort of holographic augmented reality that is able to project 3D images into the environment you are in.

Again, the ergonomic aspect of the Oculus might be improved, but my primary concern is the vision-health aspect of it.

ilaksh
VR HMDs have lenses that make it more like you are staring into the distance.

It's sort of like wondering about the health effects of wearing glasses.

curiously
I don't know, but light being shone into your eyeballs for long periods of time is something I find worrying.
evan_
Sounds like CastAR: http://www.engadget.com/2014/03/20/castar-update-gdc/
IanCal
While there's a lot of talk here about the future of IDEs, difficulties with current hardware and other challenges, I'd like to take a moment to talk about what's actually in the video.

This fills me with almost childish glee. This is a virtual reality where you can pop up a "god window" and create/mess with the world around you. Even just what's in the video, making some cubes move and change colour, is wonderful. It's fun!

Remember the early bits of amazement at programming? Where you could make the computer say your name or draw a square? How much cooler would that be in VR?
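(For anyone who hasn't watched it: the kind of snippet shown in the video really is only a few lines. Something in this spirit, assuming, as RiftSketch does, a ready-made Three.js scene plus a per-frame hook; the update(t) signature here is illustrative, not RiftSketch's actual API.)

    // A ring of cubes that bob up and down and cycle through colours.
    var cubes = [];
    for (var i = 0; i < 12; i++) {
      var cube = new THREE.Mesh(
        new THREE.BoxGeometry(1, 1, 1),
        new THREE.MeshBasicMaterial()
      );
      var angle = (i / 12) * Math.PI * 2;
      cube.position.set(Math.cos(angle) * 5, 0, Math.sin(angle) * 5);
      cubes.push(cube);
      scene.add(cube);
    }

    // Called by the environment every frame with the elapsed time in ms.
    function update(t) {
      cubes.forEach(function (cube, i) {
        cube.position.y = Math.sin(t / 500 + i);                     // bob
        cube.material.color.setHSL((t / 2000 + i / 12) % 1, 1, 0.5); // cycle hue
      });
    }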

rasur
Did you ever catch a look at (or play with) OpenCroquet? It morphed into OpenCobalt and... kinda stalled, from what I can gather.

Croquet was this (but without the Oculus) in 2007. I miss it, but it went nowhere... that's Smalltalk for you - it seems to scare people for some reason.

m_mueller
This is pretty much how I imagined Notch's sadly abandoned space game project, which had a programmable CPU at the heart of your ships. I hope someone picks up the torch on this; I'd love a fully fleshed-out programmer's game. Imagine using such a thing in a programming class as an introduction.
bjt
For a year or so I worked as a contract programmer on the in-world content team for Second Life. I didn't have VR, and the language I programmed in (LSL) was even worse than JavaScript, but it was still incredibly fun to use code to shape the world around me. It's even more fun when you can do that in a shared space and see others interacting with your creations.

Someday there will be something like Second Life with Oculus support and a decent programming environment, and that will be awesome.

jpindar
InWorldz is working on it, or at least talking about eventually adding both Oculus support and another language. They've already made considerable improvements to LSL and to the grid and viewer software.
vlunkr
Second Life looks like a great game that I would never play. Most people on it seem to be interested only in the social aspect, but it's pretty awesome that you can create programmable objects and sell them.
leeoniya
While it's neat that you can type code into a window in the VR space, this is far cooler (just add VR):

https://www.youtube.com/watch?v=FfxG_uB66vI

nightski
The thing is, though, you can do this today without the VR glasses. Sure, the VR glasses offer a coolness factor, and I am excited for that, but I don't see the revolution here yet.

My problem is that it lacks imagination: it just takes our text editor over to the VR world, rather than coming up with a new method of interacting with code that makes use of VR.

IanCal
> The thing is though you can do this today without the VR glasses.

Yes, and I think live coding is pretty cool.

> but I don't see the revolution here yet.

Neither do I, but I don't think it needs to be revolutionary. It certainly didn't seem to be sold as that. That was part of the reason for my post, actually: people are alternately seeing this as the next big thing or as falling short of some incredible goal. It's neither; it's just cool.

> My problem is that it lacks imagination.

Well, feel free to create something else incredible, and I'll be happy to find that awesome as well. This doesn't hold anyone back, it doesn't stop anything, and it isn't being billed as TheNewWave(tm); it's just an interesting experiment combining a few different bits of tech.

It's a real shame to see people poke holes in something just because it isn't world-changing.

nightski
You don't need to downvote if you disagree; I was contributing to the conversation. Either way, in my opinion it's valuable feedback. If I were working on something like this, I'd want to hear what people really thought, not useless cheerleading. I did not poke holes; I gave suggestions: look for alternate methods of input rather than just rehashing the same methods from the desktop.

Also, I am working on something of my own, which is why I thought I'd give feedback; I encountered the same issues myself.

IanCal
> You don't need to down vote if you disagree.

I'm not sure if that was aimed at me, but if so: you can't downvote replies to your own comments (at least I can't; maybe there's a higher karma threshold where you can, but I doubt it).

girvo
You can't straight away, but you can after a certain time threshold has passed. At least, that's how it works on my account!
nightski
If it wasn't you, I apologize. I have no problem being downvoted if I am out of line, but it is annoying when people downvote just because they disagree.
e12e
> ... I don't see the revolution here yet.

I agree with your overall sentiment, but I also have an observation: I have the DK2 -- and while I haven't had a chance to play with this yet, I have been playing a little bit of the Elite: Dangerous beta.

The revelation there is that working head-tracking makes working with a huge "display" feel very natural. You can only look at one (or a few) things at once anyway -- but with head-tracking you don't have to worry about lining up five 30" screens in order to keep track of everything; having it all (sanely) mapped out in 3D actually works just as mundanely and boringly and well as you'd hope.

And that in itself is kind of a revelation. Suddenly you have almost unlimited screen real estate that is even more naturally and intuitively accessible than multiple desktops (and I'm generally a great fan of xmonad-style tiling, no-frills window managers with easily accessible multiple workspaces).

Another "wow" factor for Elite was the fact that what seems like a kind of silly gimmick on a single screen -- that "holograms" pop up with info when you look at them -- actually works in VR.

The one thing they did botch in Elite in the previous beta was font size -- the DK2 gives you "half-HD" per eye, and that's not really enough to read fine print. They sort of fixed it in the new beta just out -- but now the UI for space docks is "too wide": the text is more readable, but the "holo" should've been placed further away from you, so you don't have to move your head from side to side to read a line of text. I've yet to try it in plain 2D -- I suspect it works fine on a regular screen.

Anyway, the point is that even a completely boring and unimaginative use of VR for programming can actually be a minor revolution. Fully working gloves/hand tracking would be better -- maybe with just a "dummy" keyboard for touch input (I'm not convinced a "drum on a tabletop, visual-tracking keyboard" is actually that usable for touch-typing -- at least not when your backtick is at shift-two-places-right-of-zero and your opening curly brace is at meta-7, etc., as they are on a Norwegian layout).

Finally -- immediate feedback certainly is nice, and this is where the "big screen" effect of VR can really help. You can be effortlessly aware of both your evolving 3D model and the text representation of your code, without having to have two screens or switch between virtual workspaces.

nightski
Thanks for this. I am really excited for the DK2 to arrive. I ordered it a bit late, but it should ship soon. I did try one at PAX Prime; it was outstanding.

You make some good points, and I'll have to investigate further once I get the headset. Currently I'm just using the SDK, and it isn't the same.

Skywing
While watching the video, I kept hoping he'd look down at the cubes and "jump down" to them. Like, enable gravity and jump down onto the cubes as a player. That'd have been awesome.
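(In the same live-coding spirit, that would only take a few lines. A sketch, assuming a per-frame hook that supplies a time delta in seconds and a movable camera rig; the numbers are arbitrary.)

    // Crude gravity for the viewer: fall until you land on the cube tops.
    var vy = 0;      // vertical velocity, metres/second
    var groundY = 1; // height of the cubes' top surface
    function update(t, dt) {
      vy -= 9.8 * dt;                    // accelerate downwards
      camera.position.y += vy * dt;      // move the viewer
      if (camera.position.y < groundY) { // landed on a cube
        camera.position.y = groundY;
        vy = 0;
      }
    }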
oneweekwonder
Just for interest's sake, Blender recently added such functionality.
elb0w
I can't wait to make Lawnmower Man a reality.
sitkack
I have been mowing lawns and popping nootropics from Brin; not smart yet. Waiting. Mowing.
justifier
Do you have an Oculus?

I argue against people calling the Oculus virtual reality (i).

That childlike glee is warranted in a VR scenario, but what you are seeing demoed here is digital stereoscopy.

.

(i) https://news.ycombinator.com/item?id=8347450

unclebunkers
You're citing yourself??? Fuck dude, go outside for a bit, get some fresh air.
wildpeaks
Reminds me of Worldbuilder [1], a sci-fi short about a VR interface for editing the world around you.

I kinda like the "grab object & kiss to save" gesture: it wouldn't make sense in 2D but feels natural in VR, although "grab & push object into your brain" might be easier to implement as a representation of a save function.

https://www.youtube.com/watch?v=VzFpg271sm8

IanCal
That looks really interesting, thanks.