Tomorrow Today Labs is working on an unannounced VR game for the HTC Vive in Unity, and we've spent a lot of design and development time trying to find a method of interacting with objects that feels good to us. Using a mouse to move a box on a screen is a pretty straightforward process: you've only got two axes of input to worry about. But in VR we've got all three positional axes, plus rotation, and that requires a new approach to object interaction.

I got it working in the Unity Editor with ARFoundation 3.0.1 on a simulated iPad! In the example project that uses the PlaceOnPlane script, the script will always return early and never reach the rest of its Update() code, because Input.touchCount does not seem to register mouse clicks in the Editor's "Game" pane. But there is a workaround: if you use the new Device Simulator (preview 2.0.0), Input.touchCount works. There is a catch, though. Since the code now thinks it is running on a real device, it disables the debug plane, so you have to delete or uncheck the "Disable Debug Plane on Device" component on the PlacementPlane in the supplied sample project. After that, clicking the mouse adds the cube to the simulated debug plane.

I may have missed it, but the documentation should explicitly state that we should use the Device Simulator when running in the Unity Editor. It would also be nice if, when Input.Touch code runs in the Editor and detects a click in the "Game" pane, it logged a message telling us to use the new Device Simulator. You can reproduce the UI raycasting issue by going to the example scene with the canvas in it and disabling the box collider on it; this is probably a bit confusing for first-time users too, as the box collider isn't described as necessary in the documentation. How do I file a bug report for this?
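Until the documentation calls this out, one way to avoid the early return without the Device Simulator is an Editor-only mouse fallback in the touch check. This is just a sketch of the idea, modeled on the shape of the sample's tap-detection helper rather than a drop-in replacement; the class and method names here are my own:

```csharp
using UnityEngine;

public class PlaceOnPlaneInputSketch : MonoBehaviour
{
    // Returns true and the screen position of the current tap.
    // In the Editor (without the Device Simulator), Input.touchCount
    // stays 0 for mouse clicks in the Game pane, so we fall back to
    // the mouse there. On a device, or in the Device Simulator, the
    // normal touch path runs.
    bool TryGetTouchPosition(out Vector2 touchPosition)
    {
        if (Input.touchCount > 0)
        {
            touchPosition = Input.GetTouch(0).position;
            return true;
        }
#if UNITY_EDITOR
        if (Input.GetMouseButtonDown(0))
        {
            touchPosition = Input.mousePosition;
            return true;
        }
#endif
        touchPosition = default;
        return false; // caller's Update() early-returns on false
    }
}
```

With a fallback like this, the sample behaves the same on device but still responds to clicks in a plain Editor "Game" pane.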
My experience so far with the new VR framework stuff (2019.3 / 2020.beta) is that the reorg makes sense, but it would be nice if the deprecation messages were EVEN FRIENDLIER and linked to some document that will tell me where certain features have gone. For example: where is the Mock HMD now? I was happily using it in 2019.1 as a built-in on the Player menu; now it is gone (or technically not gone, but not compatible with other SDK implementations of VR, like the Oculus SDK) and I can't find it.

I'll also echo the sentiment that this feels like a really great start. I love Andybak's feedback above, especially #1 & #2. I built this for the Quest, and there was a strange behavior with the pointers and the teleport bends: the teleportation curve beam and the pointer beam were both active at the same time. I think that should function as an either/or. Additionally, there are a few different scenes in the package; it'd be nice if there was a "master" scene and some sort of UI interaction to load from one to another, so the overhead of testing all the scenes is a little lower.

Point 3 only seems to apply to "Kinematic" interactables. Navmesh-based teleporting is easier to work with than custom teleport location objects.

Touchpad to enter teleport mode and trigger to actually teleport? Nobody does that! 6. We need a non-instantaneous teleport mode. 7. Passing an object from hand to hand doesn't work. Hands and controller models would be a nice thing to include. Various missing features: grab and scale objects, scale world, climbing, NewtonVR-esque levers and buttons. I'm sure the community will be happy to add some of these. Do you accept pull requests, or would you prefer everything to be in separate repos? (Actually, that's a viable question for new features, but for fixes and core improvements it has to be PRs.) I'll probably post more observations as they come to me.
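The "non-instantaneous teleport mode" requested above can be as simple as a fade-snap-fade sequence. Here is a minimal Unity C# sketch of that idea, assuming a full-screen black overlay driven by a CanvasGroup; `ScreenFader` and `FadeTeleport` are hypothetical names for illustration, not part of the XR Interaction Toolkit:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical helper: animates the alpha of a full-screen black
// Image sitting under a CanvasGroup in front of the camera.
public class ScreenFader : MonoBehaviour
{
    public CanvasGroup overlay;

    public IEnumerator Fade(float from, float to, float seconds)
    {
        for (float t = 0f; t < seconds; t += Time.deltaTime)
        {
            overlay.alpha = Mathf.Lerp(from, to, t / seconds);
            yield return null;
        }
        overlay.alpha = to;
    }

    public IEnumerator FadeOut(float s) { return Fade(0f, 1f, s); }
    public IEnumerator FadeIn(float s)  { return Fade(1f, 0f, s); }
}

// Non-instantaneous teleport: fade out, move the rig while the
// screen is black, fade back in.
public class FadeTeleport : MonoBehaviour
{
    public Transform rigRoot;     // root transform of your XR rig
    public ScreenFader fader;
    public float fadeSeconds = 0.2f;

    public IEnumerator TeleportTo(Vector3 destination)
    {
        yield return fader.FadeOut(fadeSeconds); // hide the snap
        rigRoot.position = destination;
        yield return fader.FadeIn(fadeSeconds);  // reveal new location
    }
}
```

Fading while the rig moves avoids the disorientation of an instant position change; a dash-style interpolated move is the other common option, but it is more likely to cause motion sickness.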
Just playing with the demo scene. 1. The teleportation demo is very strange on a Vive.

Some documentation feedback:
- On the Toolkit docs page ( ), near the bottom of the page there is a reference to a Known Limitations git repo for issues ( ); this repo is either private or doesn't exist.
- On the Locomotion docs page ( ), you reference a Primary and a Secondary device for the SnapTurnProvider; from what I see in the script, this has been replaced by a "Controllers" array. If so, this needs to be updated in the docs.
- The Locomotion Systems section states that the example value is set to 600 seconds, while the attached image shows a Timeout of 10.
- The Teleportation section has a link that doesn't seem to be formatted correctly: "The XR Interaction system also provides various line rendering options. For more information, see the main documentation for the Interaction Package."