
Hacker News Comments on
Light Field Rendering and Streaming for VR and AR

on-demand.gputechconf.com · 73 HN points · 0 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention on-demand.gputechconf.com's video "Light Field Rendering and Streaming for VR and AR".

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Aug 07, 2017 · 73 points, 24 comments · submitted by thebyrd
otoy
OTOY is contributing the ORBX container and render graph system to MPEG-I Part 2 at MPEG 120 under a 'tier 1' license (equivalent to the MIT license). Paid licenses and patents IMO should not be in the baseline minimal schema/scene graph system, or we will never get to a truly open metaverse. Whenever this issue came up at MPEG 119 last month, I made as strong a case as I could that plenty of value and IP can still be implemented as services or modules on top of such an open framework.

Here is one of the two ORBX docs from MPEG 119; I'll post the other (which has the full container schema) shortly.

https://home.otoy.com/wp-content/uploads/2017/08/m41018-An-i...

jayd16
I find everything I read about OTOY's stuff (seemingly purposefully) confusing.

They seem to have a render farm, a light field format and renderer, and a streaming video format, which they always mix together in their demos.

"Look at these amazing renders. It runs on a phone!"*

*It's actually just streaming video to the phone.

I'm pretty excited about this stuff but I keep finding myself frustrated trying to crack through OTOY's marketing to get my hands on something I can try myself.

Can someone please break my incredulity?

otoy
You can start here: https://unity.otoy.com
yoz
At 13:45 he talks about:

1: streaming a Unity/Unreal game into a surface texture

2: packaging an entire Unity project into an ORBX file

So... am I understanding this right: that ORBX can contain not just a light field, but all the assets and logic for a game, compiled to LuaJIT, which the ORBX player (or orbx.js) will then play? And Unity can target this for output?

otoy
1) Yes (you can try this any time from the OTOY.com home page, under the cloud demos menu).

2) Yes, it works as shown in the video, using the standard Unity samples unmodified to prove viability of the system on Gear VR and Samsung Internet. We also have this working now on PC/Rift and Daydream.

ORBX.js is a subset of ORBX.lua on top of LuaJIT. But an ORBX file can be flattened down to a cloud stream if the user agent can only play the ORBX.js feature set.

ORBX is a container holding the render graph of the content in XML/JSON plus assets. Just like a web page, it can be cached to an archive (.orbx file) or streamed from a URL or URI over raw UDP/TCP, WebSockets (wss), or HTTPS.
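
For illustration, here is a minimal TypeScript sketch of a container along the lines described above: a render graph serializable to JSON plus asset references. All type and field names here are hypothetical, not taken from the actual ORBX schema.

    // Hypothetical ORBX-style manifest: a render graph plus asset
    // references, serializable to JSON. Names are illustrative only.
    interface OrbxManifest {
      version: string;
      graph: GraphNode[];   // the render graph described above
      assets: AssetRef[];   // meshes, textures, Lua script nodes, ...
    }

    interface GraphNode {
      id: string;
      type: string;                     // e.g. "camera", "material", "script"
      inputs: Record<string, string>;   // edges to other node ids
    }

    interface AssetRef {
      id: string;
      // like a web page, an asset can be packed in the .orbx archive
      // or fetched from a URL/URI at load time
      uri: string;          // e.g. "pack:meshes/chair.abc" or "https://..."
      mediaType: string;
    }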

laythea
Is this just cool tech wrapped in a proprietary format requiring hefty licensing fees?
FieldSleep
I would hope not; at ~2:45 the presenter mentions working towards open formats, such as their ORBX format, which is open source.
ryandamm
Open source doesn't mean royalty free (though I have no idea if this format would be royalty-bearing).

There are open source implementations of various MPEG and JPEG codecs, but they still carry a licensing fee to use them. Most people who don't use Linux don't realize this (installing Linux and trying to deal with AV codecs is a quick education in the difference).

otoy
The ORBX framework should be like Lua or LuaJIT, i.e., MIT-licensed. What is built or done on top of that, and how it's licensed to others, is up to the creator or implementer to decide, assuming they have such rights already.
TD-Linux
Is the ORBX format specification public? I can't seem to find it.
otoy
I need to download the latest version that was submitted to MPEG for the file and serial data format and get it on OTOY.com like the first doc (intro PDF) linked in my post above. I'll share a link as soon as that's done. The container and stream are just one layer; the full node schema and module system is mapped out on docs.otoy.com for users and developers.

We're working on a flat C API for ORBX script nodes which will (we hope) leverage WASM + WebGPU/Vulkan to add novel functionality within the node graph itself, instead of needing it to live in an arbitrary external renderer. This is how we first implemented ORBX1 video in pure JavaScript back in 2013.
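
As a rough sketch of how a user agent might host such a WASM-backed script node (the export name orbx_node_eval and the import object are invented for illustration; the flat C API above is not yet published):

    // Load a script node compiled to WASM and hand back its entry
    // point. A flat C ABI exposes plain functions over linear memory.
    async function loadScriptNode(url: string) {
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch(url),
        { env: {} }  // host imports the node is allowed to call
      );
      // "orbx_node_eval" is a hypothetical export name
      return instance.exports.orbx_node_eval as
        (inputsPtr: number, outputsPtr: number) => void;
    }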

Note that graphs can further 'render' a scene to valid glTF/mesh and volume formats, not just texture/image output buffers.

Sadly, I don't think glTF will be able to fully replace Alembic/FBX in the schema any time soon, based on the last KHR discussion at SIGGRAPH (I understand the reasoning, and I'm not going to advocate that they change it); glTF is still a useful asset format for proxy volumes or baking to WebGL pipelines.

laythea
The minute they mention MPEG, I already know what's going to happen: licensing. Spoken as a dev who didn't incorporate video due to licensing, not technical, reasons. The open bit is just the draw.
otoy
See my post above. The ORBX schema and formats submitted to MPEG must be usable by anyone, anywhere, with only an MIT license (same as Lua, which we use for script nodes in the graph). I was told this is fine, but that they need it as a tier 1 MPEG license (their equivalent, as I understand it).
laythea
I may sound like a skeptic, but I am fairly sure that if MPEG has an interest in this, it is in order to license it.

If not now, then later. In my opinion, this would be MPEG getting their teeth into a new tech, in order to either license it now, or wait until it has traction and then license it.

It is absolutely amazing tech though. Shame.

Findeton
What about real video recording to generate real VIDEO light fields?

Also: how much processing is required to get video light fields from a video recording (with an array of cameras, I suppose)? I mean: does it scale?

orbital-decay
Well, Lytro is still in business and they are doing precisely that. https://www.lytro.com/immerge

It's a true plenoptic (light-field) camera.

Findeton
Well, does that scale well? How much does it cost? Can a camgirl use it to "stream" (not live) her shows, for example?
otoy
This is the basis of our work with Facebook, which can capture a light field using 6 or 24 cameras. We run a job on the cloud and it comes back as a 3D scene ORBX asset you can load into Octane for Nuke, Unity, etc.

https://twitter.com/otoy/status/894972675679965184
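
As a purely hypothetical sketch of that capture-to-cloud round trip (the endpoint, job fields, and polling flow below are invented for illustration and are not OTOY's actual API):

    // Upload a multi-camera capture, then poll until the cloud job
    // comes back with a URL for the processed ORBX asset.
    async function processCapture(frames: Blob[], apiBase: string): Promise<string> {
      const form = new FormData();
      frames.forEach((f, i) => form.append(`camera_${i}`, f));
      const res = await fetch(`${apiBase}/jobs`, { method: "POST", body: form });
      const { jobId } = await res.json();

      for (;;) {
        const job = await (await fetch(`${apiBase}/jobs/${jobId}`)).json();
        if (job.state === "done") return job.orbxUrl;
        await new Promise((r) => setTimeout(r, 5000));  // wait and retry
      }
    }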

Findeton
Can anyone do it? Is the software to do that open source?
otoy
You need the camera. The processing software is hosted by us on the cloud; I don't know if FB can open-source it. At least the ORBX file is open and loads in Unity, Nuke, and AE.
LyalinDotCom
Wow
erikpukinskis
OTOY is absolutely at the forefront of digital imagery. They're like MPEG in a Kodak world; that's how different their approach is.

How?

1) Light fields are 4-dimensional images, and light field video is a 5-dimensional stream. This is a basic requirement for hand- or head-tracked images, like in headsets or AR devices (see the sketch after this list).

2) We're just getting to the point where real-time ray tracing is truly economical, and OTOY is all-in on it. Until now, rendering has been a "bag of tricks" approach, where you try to paint sophisticated paintings on polygons. Many of these tricks fall apart when you try to do the predictive modeling required for 6DOF streaming: you see the reflections painted onto the countertop. Ray tracing actually simulates light.

3) They've fully embraced the cloud. They're offering everything they do as cloud services, which means it can work on every device, for a minimal cost, with no need for customers or users to be on the latest hardware.

4) Open formats. They're not trying to build a portal the way Oculus or Valve is; they're inventing the content pipeline and getting it integrated everywhere they can. I am skeptical that any of the closed content stores will win. We saw how big the web became; I think a better bet is that the metaverse will be more like the web than the App Store, and that's the bet OTOY is making.
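
A minimal sketch of the two-plane parameterization behind point 1: a ray is indexed by where it crosses a camera plane (s, t) and an image plane (u, v), so a static light field is a 4D function L(s, t, u, v), and adding time gives the 5D video stream. The TypeScript below is illustrative; a real renderer would interpolate across the four nearest cameras for smooth parallax rather than snapping to one.

    // A light field stored as a cams x cams grid of res x res images.
    type LightField = {
      cams: number;        // cameras per axis on the (s, t) plane
      res: number;         // pixels per axis on the (u, v) plane
      data: Float32Array;  // cams * cams * res * res samples
    };

    // Nearest-neighbor lookup of L(s, t, u, v).
    function sample(lf: LightField, s: number, t: number,
                    u: number, v: number): number {
      const clamp = (x: number, hi: number) =>
        Math.min(hi, Math.max(0, Math.round(x)));
      const si = clamp(s, lf.cams - 1);
      const ti = clamp(t, lf.cams - 1);
      const ui = clamp(u, lf.res - 1);
      const vi = clamp(v, lf.res - 1);
      return lf.data[((si * lf.cams + ti) * lf.res + ui) * lf.res + vi];
    }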

Urbach has been working relentlessly on this vision behind the scenes. Not a lot has been coming out of the company, but I've been watching him lay the groundwork for the whole next generation of content distribution for five years now, and he's killing it. Release after release of core building blocks.

lightedman
"Which means it can work on every device"

From OTOY.com's page: "For optimal performance and smooth interactive use, Windows 7 64bit, 8GB system RAM, a modern Quad core CPU, and a GTX brand NVidia graphics card (like GTX 560 / 570 / 580 / 590) with 1536MB VRAM or more is recommended."

Sounds like a minimum hardware spec using fairly recent hardware to me.

otoy
For rendering an ORBX in Octane, then yes (a GTX 480 should be the minimum). For playback of an ORBX package exported from an Octane render job/target, HTML5 with WebGL is enough for basic 360 viewing, even without VR or AR.