Offscreen Colonies: VR Edition
category: general [glöplog]
first of all: good job. this is state of the art.
all we need is some (variations of) choreography aka "Amazing Coordinated Samsung Dance".
just let the other 299 PCs mine some bitcoin meanwhile..
Quote:
...because people won't have the same head parallax, so it'd need 300 independently rendered images.
I did suggest that the next generation of VR should not only have GetHeadPose but SetHeadPose as well, to keep people from looking away from things.
that, or we pretend "360 stereo video" counts as VR, and generate and stream a stereo 360 feed from the main PC in real-time to a load of cheapo mobile headsets..
AR would be better.. otherwise those 300 people wouldn't be able to locate their bottle of beer during the compo!
You think it would be possible to just have a mediocre VHS copy any day soon? - still waiting in my room for this to happen.
I have like several CRT screens capable of showing similar colors, and with some trial and error I can possibly sync two of them to do some kind of stereoscopic presentation, almost like it's VR... or whatever it's called nowadays.
Of course this would require TWO VHS tapes, but I can do the copies myself, so no worries here...
Please fax me the answer - I'm not on this internet thing every month anyway...
I wish for a pair of VR-glasses for Christmas...
Would be nice to watch it... unfortunately the Oculus page keeps pushing me out because of password/email/PIN/Facebook (which I don't have) account bullshit and my patience has run out.
A direct link would be great if possible - I hope it runs on my vive !
Oh awesome, I'll load it up here!
Thanks CNS! I've been trying to get people to make VR demos for the past few @parties but to no avail. Too bad I have to wait until I can talk a friend into grabbing this; I couldn't justify an Oculus Rift purchase in addition to my Vive ;(
Would love to make a Vive version if someone hands us a device for a week or two.
"Vive version" :D
I would have thought it should run on both platforms automatically, being OpenVR and all that? At least my programs run on both (I develop on the Vive) without any code change.
The controllers are very slightly different and maybe that can be a problem. But it *should* run out of the box.
It's running on the Oculus Native SDK, not OpenVR.
FWIW I did a bit of fiddling with OpenVR and about 80% of it works okay so far, it's just that the tracking origin in OpenVR seems to be about 2-3m above where it is in OculusVR, so either 1) the APIs are different or 2) I didn't calibrate SteamVR properly. More on this later.
If the OpenVR version works nice, we'll put it on Steam just so Navis has some more passwords to remember :P
Yay!
That would be great. Sounds like a miscalibration problem to me. In general, I found OpenVR to work without problems.
Would it be possible to also put it on scene.org, being a scene release ? Then we can link it on pouet and get it without passwords.
gargaj: are you using the floor as the origin in both sdks? oculus has "eye level" as an option which would give you the 2m offset vs openvr if you were using it.
we use "floor level" always and our own simple routine to generate a centering position to mimic "eye level".
Quote:
... just so Navis has some more passwords to remember :P
all hail LastPass
also, will there be a version for google cardboard? highly unlikely i assume (Android and shit)
I salute the initiative, the effort, the over-the-top marketing, and while I've never tried to play directly with the Oculus SDK, I can only assume it required some ninja stuff to make it fit within the arbitrary space restriction. It's pretty refreshing to see the "128kB" size on the app description. :D
Now on the experience itself, some things worked, some didn't. In general I was surprised by how well it worked; I don't know how you guys author your demos, but I would assume that a lot of scenes are not at all what they look like, so I would expect them to completely break in VR. But here the result makes it feel like it "just worked". I'm curious to hear how much effort it took for the scenes to look correct in VR though. :)
That being said, I would argue that the magic of the original demo shatters once in VR. Things that looked awesome now appear for what they are: the "Approximate Orbiter" looks like a bunch of crude cylinders, the vehicles in the "Mercury Station" look like opaque billboards, etc. I found the "Cocoon" orientation interesting, but VR broke the ambiguity of the "Rebels Colony": in the 2D version I had interpreted it as a well of epic proportions, while in VR it looks like a tunnel, horizontal and a lot less interesting.
On the other hand, the "Andromeda Prime" scene gained a lot in VR (despite the massive aliasing), with that vertigo-inducing camera move. It was also a nice touch to add things to look at all around. :)
On the post-processing side, have you tried other solutions for the vertical blur, like doing it in world-aligned coordinates rather than screen space? As soon as you tilt your head it gets that glass-door effect, unfortunately. That and the aliasing were the two things that broke immersion for me.
Performance wise, it ran smooth as silk on a 970, but I didn't measure.
Anyway, enough nitpicking, this is a pretty cool experiment!
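The world-aligned blur suggestion above could work roughly like this (a hypothetical sketch, not the demo's actual post-processing, with made-up `Vec2`/`Vec3` types): project the world-space up vector into view space using the camera's basis vectors, and blur along that 2D axis instead of the screen's vertical, so the blur stays gravity-aligned when the head rolls.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Hypothetical sketch: instead of always blurring along the screen's vertical
// axis, blur along the projection of the *world* up vector into view space.
// camRight/camUp are the camera's basis vectors expressed in world space.
Vec2 worldAlignedBlurDir(const Vec3& camRight, const Vec3& camUp) {
    const Vec3 worldUp{0.0f, 1.0f, 0.0f};
    Vec2 dir{ dot(worldUp, camRight), dot(worldUp, camUp) };
    float len = std::sqrt(dir.x * dir.x + dir.y * dir.y);
    if (len > 1e-6f) { dir.x /= len; dir.y /= len; }
    return dir; // feed this as the blur axis into the post-process shader
}
```

With no head roll this returns the plain screen vertical (0, 1); with the head rolled 90 degrees it returns a horizontal screen axis, so the blur keeps following world-space "up".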
Quote:
gargaj: are you using the floor as the origin in both sdks? oculus has "eye level" as an option which would give you the 2m offset vs openvr if you were using it.
we use "floor level" always and our own simple routine to generate a centering position to mimic "eye level".
From what I can tell Oculus defaults to eye level at least, doesn't it?
Quote:
Would it be possible to also put it on scene.org, being a scene release ? Then we can link it on pouet and get it without passwords.
Is it necessarily a scene release though? The original was, sure, but this one wasn't released at a party :)
Quote:
also, will there be a version for google cardboard? highly unlikely i assume (Android and shit)
There's https://www.trinusvirtualreality.com/ which I've been told is doable?
Zavie: I was hoping to answer most of those questions in the shape of a talk, possibly at Demobit; unfortunately it turns out Smash is speaking about pretty much the same things, and I don't wanna trample on his talk by presenting an inferior greenhorn version of it, so we'll see :)
Quote:
Quote:gargaj: are you using the floor as the origin in both sdks? oculus has "eye level" as an option which would give you the 2m offset vs openvr if you were using it.
we use "floor level" always and our own simple routine to generate a centering position to mimic "eye level".
From what I can tell Oculus defaults to eye level at least, doesn't it?
Yup, ovr_GetTrackingOriginType returns ovrTrackingOrigin_EyeLevel.