Using your phone as a virtual camera (aka demotool)?
category: offtopic [glöplog]
Has anyone tried using a phone as a virtual camera (returning world-space coordinates, an up vector, and a view vector), using the accelerometers and compass (maybe gyroscopes, if it has them)? I was thinking about programming such a thing, but I have no idea if those devices have enough resolution for this kind of thing.
I would argue that it's fundamentally not possible to derive correct world-space coordinates from just acceleration + orientation (gyroscope, compass, ...), no matter how much accuracy and time resolution you have (well, unless it's perfect [error = 0 exactly] and the time resolution is "continuous" [Planck time?]).
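To see why: position is the double integral of acceleration, so even zero-mean sensor noise makes the velocity estimate random-walk and the position estimate wander without bound. A minimal simulation sketch of that drift (noise level, sample rate, and duration are all assumed values, and the device is modelled as perfectly stationary):

// Why double-integrating accelerometer data drifts: a stationary device
// (true acceleration = 0) with zero-mean noise still accumulates error.
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> noise(0.0, 0.01); // m/s^2, assumed

    const double dt = 0.01; // 100 Hz sampling, assumed
    double v = 0.0, x = 0.0;

    for (int i = 1; i <= 6000; ++i) {  // one minute of samples
        const double a = noise(rng);   // measured accel; true value is 0
        v += a * dt;                   // first integration: velocity
        x += v * dt;                   // second integration: position
        if (i % 1000 == 0)
            std::printf("t = %2ds  position drift = %+.4f m\n", i / 100, x);
    }
    return 0;
}

The velocity error random-walks, so the position error grows roughly like t^(3/2): better sensors only slow the drift down, they never stop it.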
Yeah, I can confirm that; it sucks. What you can do, however, is use your phone's camera and do feature tracking with something like PTAM. I think the virtual camera has been done like this already, to control Blender.
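(For flavour: PTAM itself builds a map and relocalizes against it, which is much more involved, but the bare tracking step it starts from looks roughly like the sketch below, done here with OpenCV's Lucas-Kanade optical flow. The camera index and all tracker parameters are assumptions.)

// Sparse feature tracking with OpenCV (Lucas-Kanade optical flow).
// Not PTAM, just the bare tracking step; all parameters are assumed.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::VideoCapture cap(0); // default camera, assumed
    if (!cap.isOpened()) return 1;

    cv::Mat frame, gray, prevGray;
    std::vector<cv::Point2f> prevPts, pts;

    cap >> frame;
    cv::cvtColor(frame, prevGray, cv::COLOR_BGR2GRAY);
    cv::goodFeaturesToTrack(prevGray, prevPts, 200, 0.01, 10);

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<uchar> status;
        std::vector<float> err;
        cv::calcOpticalFlowPyrLK(prevGray, gray, prevPts, pts, status, err);

        // Mean motion of surviving points ~ 2D camera pan this frame.
        cv::Point2f flow(0, 0);
        int n = 0;
        for (size_t i = 0; i < pts.size(); ++i)
            if (status[i]) { flow += pts[i] - prevPts[i]; ++n; }
        if (n > 0)
            std::printf("mean flow: %+.2f %+.2f px\n", flow.x / n, flow.y / n);

        prevGray = gray.clone();
        prevPts = pts; // (a real tracker re-detects features as they drop out)
    }
    return 0;
}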
PTAM on iPhone 3G
Wii remote, perhaps?
However, we are talking about demos. Does one need correct world space coordinates, or just coordinates that are close enough?
I did a little experiment with the accelerometer on the iPad, changing the location of the camera around an object, and with a low-pass filter it looked pretty okay :)
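The filter for that sort of thing is tiny: a one-pole exponential smoother per axis. (A sketch; the smoothing factor is an assumed value, not what Preacher used.)

// One-pole low-pass (exponential moving average) per accelerometer axis.
// Smaller alpha = smoother but laggier; 0.1 is an assumed starting point.
struct LowPass3 {
    double alpha = 0.1;          // smoothing factor, assumed
    double x = 0, y = 0, z = 0;  // filter state

    void step(double ax, double ay, double az) {
        x += alpha * (ax - x);
        y += alpha * (ay - y);
        z += alpha * (az - z);
    }
};

Feed it the raw samples every frame and use (x, y, z) as a smoothed gravity vector to orbit the camera from.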
Preacher: you will still end up with a nonzero velocity for a stationary device very quickly. Low-pass filtering can help you process relatively slow movements for some time. I think one could even train themselves (subconsciously) to shake the device just right to cancel the "ghost" velocity when it gets noticeable.
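A common workaround in that spirit, sketched here with assumed, untuned constants: make the velocity integration leaky, so the "ghost" velocity bleeds off by itself instead of needing a well-timed shake.

// Leaky integration: damp the velocity each step so accumulated error
// decays toward zero. dt and the decay factor are assumed values.
struct LeakyIntegrator {
    double dt = 0.01;     // 100 Hz, assumed
    double decay = 0.98;  // per-step velocity damping, assumed
    double v = 0.0, x = 0.0;

    void step(double a) {
        v = (v + a * dt) * decay; // ghost velocity dies out exponentially
        x += v * dt;
    }
};

This trades absolute accuracy for stability: slow real motion gets damped away too, which is fine for a demotool camera but useless for real positioning.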
v3nom: same thing. It has the advantage of hardware point tracking, I guess, if you don't move it that much, but you can also do it using a camera, with the right libraries.