But for now, I’ll just tell you what have been, in my opinion, the most interesting pieces of XR news of the week… Top news of the week (Image by KnightMD) Valve Deckard’s “Roy” controllers allegedly leaked The usual Brad Lynch, with his team of data miners, has found evidence of a controller codenamed Roy in the code of SteamVR.
WebXR is a technology with enormous potential, but at the moment the tools to develop for it are far worse than those for standalone VR, where we all use Unity and Unreal Engine. As a Unity developer, I think that it is a very important enabling solution. How to get started with WebXR in Unity – Video Tutorial. Requirements.
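To make the requirements a bit more concrete, here is a minimal sanity-check sketch you can drop into a Unity scene. It is not taken from the tutorial and makes no assumptions about the WebXR exporter's own API; it only uses Unity's generic XR settings to verify that an XR session has actually started (in a WebXR build, that normally happens only after the user clicks the browser's "Enter VR" button):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative helper (not from the tutorial): logs once an XR device becomes active.
public class XrSessionChecker : MonoBehaviour
{
    void Update()
    {
        if (XRSettings.isDeviceActive)
        {
            Debug.Log($"XR session active on device: {XRSettings.loadedDeviceName}");
            enabled = false; // stop polling once the session is up
        }
    }
}
```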
These days I have finally managed to try it, so I can tell you everything that I have learned about it: what it is, how it works, how to implement it in your Unity application, and its pros and cons. If you are a Unity beginner, I would advise you to watch the video. Please see our Phase Sync blog post for more details.
Oculus MRC in Unity – Video Tutorial. I have made a video tutorial for you where I explain everything I learned about Oculus Mixed Reality Capture in Unity, including the complicated problems with the XR Interaction Toolkit, URP, and object transparency. At this point, import the official Oculus plugin from the Unity Asset Store.
Or that it has multimodal input and can be commanded using controllers, hands, eyes, or voice. Developers can already access a preview of the development environment for native, Unity, and WebXR development. Controllers are to come in 2025, the year the headset will ship. The chipset is the Qualcomm Snapdragon XR2+ Gen 2.
The past week has been a pretty interesting one for the tech world: between Microsoft Ignite, Unity Unite, and the OpenAI drama, there has been a lot to follow. Unity 6 returns to the original way of specifying engine versions and abandons the confusing scheme that tied each new Unity version to the year it was released.
Today’s article is one of the most beautiful published on this blog in 2022. And in fact, it has not been written by me: XR ergonomics expert Rob Cole published last year on this platform an article detailing Project Caliper, that is, his experiments in building fully configurable controllers for SteamVR.
Sony unveils the new controllers of the PSVR2. One month after the reveal of a “next-gen VR headset for the Playstation” (whose name is not known, but I guess it will be PSVR2), Sony has revealed one important detail of its new VR system: the controllers. The controllers will be given to selected developers very soon.
Learn how industrial giant ABB is using Unity and augmented reality to transform field maintenance procedures into a completely paperless process. By Nick Davis. …odarczyk and Rafał Kielar walk us through how they used Unity to develop a new digital field operator system. Make it easy to control what the user sees.
One of the first accessories for AR/VR I had the opportunity to work on is the Leap Motion hand-tracking controller: I made some cool experiments and prototypes with it and the Oculus Rift DK2. Leap Motion has also been the first important company I have interviewed on this blog. UltraLeap Unity SDK.
Ah, and before I forget: in the next couple of days, I’m going to publish a very interesting interview on this blog! The project is pretty cool: the VR game is about humanoid robots fighting each other, and the idea is to take the best players and make them fight using real robots that they control from within the VR headset.
In fact, at launch, it won’t even be possible to create Unity content for it. According to the rumors, in the beginning only Apple’s first-party tools (like RealityKit) will be allowed for creating content, and only later will Unity support arrive. An article in the Varjo blog warns us all about the risks of “negative training”.
Meta to introduce new parental controls on Quest. After much pressure from the community and the media, Meta has finally decided to introduce new parental control tools on Quest 2. More info (Quest 2 new parental controls – Road To VR) More info (Quest 2 new parental controls – UploadVR) Top news of the week.
Mike Nisbet, Unity Engineer at SideQuest and Icosa Foundation Team Member, expressed his enthusiasm to be working on Open Blocks when the news landed that it was being open-sourced last month: “We’re thrilled to see Blocks join Tilt Brush in being released to the community, allowing another fantastic tool to grow and evolve.
Last week we had a first look at the controllers, while this week we had quite a confusing leak about its product line, which, if confirmed, would show an ambitious plan by the Chinese company. of our revenues with Unity. This is fair, and this is why a good chunk of the community answered positively to these new terms.
In February, a post on Google’s official blog recognised the “confusing and time-consuming” battle of working with various audio tools, and described the development of streamlined FMOD and Wwise plugins for multiple platforms on both Unity and Unreal Engine. Image courtesy Google.
Top news of the week (Image by Meta) There are new leaks on Meta Quest 3S. This week we had new leaks on Quest 3S, including many images of the headset and its controllers and some leaked information on the Quest documentation website. The new CEO has finally made the great decision to remove it.
Announced last week via an official update to the Vive developer blog, Vive Pro developers now have full control of the headset’s front-facing stereo cameras to develop their own mixed reality experiences. The SDK also supports native development, with plugins for both Unity and Unreal Engine 4. That is, until now.
That’s why I feel very honored to host on my blog OpenBCI, a famous company building neural interfaces that are not only very high quality, but also open source. Similarly, motor-impaired individuals may be able to gain more control over prosthetics and other pieces of assistive technology.
But there’s even more: Unity has already published a page about how it is possible to build for the Quest 3 using not only the XR Interaction Toolkit, but also AR Foundation. The Unity page also confirms that the headset will provide plane detection, so you will be able, for instance, to detect the walls and the desks in your room.
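As a rough idea of what working with plane detection could look like, here is a minimal sketch assuming AR Foundation's ARPlaneManager API (the class and field names are mine, not taken from Unity's page):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: logs the planes (walls, tables, floor...) that AR Foundation detects.
public class RoomPlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assign the ARPlaneManager on the XR Origin

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (ARPlane plane in args.added)
        {
            // The classification says whether the plane is a wall, a table, the floor, etc.
            Debug.Log($"New plane {plane.trackableId}: {plane.classification}, size {plane.size}");
        }
    }
}
```

Subscribing to planesChanged like this is the standard AR Foundation pattern, so if Quest 3 exposes its scene data through it, telling a wall from a desk should boil down to checking each plane's classification.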
Someone in the communities argued with me that HoloLens 2 still has some advantages over Magic Leap 2: it is well integrated with Azure, it is standalone so it doesn’t need a potentially dangerous cable connecting the headset to the computational unit, and it doesn’t need controllers (it uses hand tracking). See Unity running on Quest.
Different Controller: The Oculus Go Controller and Gear VR Controller share the same inputs: both are 3DOF controllers with clickable trackpads and an index finger trigger. If your app displays a visible controller, you should change the model displayed depending on whether you are running on Gear VR or Oculus Go.
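A minimal sketch of that model swap, assuming the headset can be told apart through Unity's XRDevice.model string (the GameObject fields and the "Oculus Go" check are illustrative assumptions, not an official snippet):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: shows the Go or Gear VR controller model based on the headset we run on.
public class ControllerModelSwitcher : MonoBehaviour
{
    [SerializeField] GameObject gearVrControllerModel;   // assign the Gear VR Controller prefab
    [SerializeField] GameObject oculusGoControllerModel; // assign the Oculus Go Controller prefab

    void Start()
    {
        // Assumption: XRDevice.model contains "Oculus Go" when running on Go hardware.
        bool isOculusGo = XRDevice.model.IndexOf("Oculus Go", System.StringComparison.OrdinalIgnoreCase) >= 0;

        oculusGoControllerModel.SetActive(isOculusGo);
        gearVrControllerModel.SetActive(!isOculusGo);
    }
}
```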
You can imagine that after it was announced through two official blog posts, an enormous backlash from the community started. Facebook already had full control of my Oculus and Facebook accounts, so it already had my XR data. That’s all regarding the announcement by Facebook. The same happened on Twitter.
DISCLAIMER: Before starting, I would like to let you know that a few weeks ago Niels Bogerd became a Patron of this blog. You have to connect your Tracker adapters to your USB ports (so that you can use the Trackers and the Controllers), and then pair your trackers. SenseGlove video review. Applications.
To keep the price this low, the headset will ship without controllers, and it will feature Fresnel lenses and black-and-white passthrough. More info John Riccitiello steps down from his role as Unity CEO The last chapter of the drama about Unity’s pricing change is that John Riccitiello has left his role as CEO of the company.
The puck of the Nimo runs on a Qualcomm Snapdragon XR2 (Gen 1) and is used both as a computation unit and as a controller. It looks like a cool gadget. Meta is investigating the topic.
It is notable that, for the first time, a Varjo headset features integrated audio and integrated controllers. It’s interesting that someone has already started modifying this application: one person has already obtained a version that works just with hand tracking and no controllers. Lone Echo).
Oculus recently revealed in a blog post that in the past months they’ve been working on bringing native mixed reality capture support to Oculus Rift, and it’s available today for developers to start creating mixed reality videos. image courtesy Oculus.
You all know that I love Brain-Computer Interfaces, and so I was very happy when NextMind offered to give me a sample of its brain-reading sensor to review here on my blog. BCIs are still in the early stages, and even if they are evolving at a good pace, the technology for “brain control” is still not here.
They also talk about the current difficulties of using Unity to develop for the Vision Pro. More info (How Puzzling Places was ported to the Apple Vision Pro) New apps get released for the Vision Pro every week: Job Simulator and Vacation Simulator have been announced for it (I teased this on my blog a few weeks ago).
I have made things in Unity ever since, now with 4 published titles: Custom Home Mapper (on SideQuest and Itch), as well as Space Pilot Alliance, Pocket Racer, and RamCastle on App Lab. Unity is the game engine, Photon handles the networking, and Dissonance is amazing for VOIP. Hello Ryan, can you introduce yourself to my readers?
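To give an idea of how that stack fits together, here is a minimal connection sketch assuming Photon PUN 2; the room name and player limit are placeholders, and Dissonance (which rides on top of the same network layer for voice) is only mentioned in a comment:

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Illustrative sketch: Unity renders the scene, Photon PUN 2 handles rooms and state,
// and Dissonance (not shown) would sit on top for VOIP.
public class NetworkLauncher : MonoBehaviourPunCallbacks
{
    void Start()
    {
        // Uses the AppId configured in PhotonServerSettings.
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        // "LobbyRoom" and MaxPlayers are placeholder values.
        PhotonNetwork.JoinOrCreateRoom("LobbyRoom", new RoomOptions { MaxPlayers = 4 }, TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log($"Joined room with {PhotonNetwork.CurrentRoom.PlayerCount} player(s)");
    }
}
```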
The improved headset is pitched as an upgrade for current Vive owners, as it works with the original controllers and base stations. It is still not known exactly when the improved controllers and SteamVR 2.0 Magic Leap has launched the SDK for the device’s Lumin OS , with support for Unity and Unreal engines.
Founded in 2010, Leap Motion released its initial product (today called the ‘Leap Motion Controller’) in 2012, a desktop-focused peripheral which offered markerless hand-tracking. The company details the developer-level changes on their blog here.
Aside from being able to use the objects or characters that you create for other applications, like bringing them into Unity and using them for VR games, we’ve been wondering if we’ll one day also get the chance to use Blocks for 3D animation. Step Two: Controlling the Model. You can read the full process here.
“It was after I tried Tilt Brush that I saw the possibilities of what 6DOF controllers could bring,” said Vermeulen, adding, “It brought a sense of depth and spatial awareness in a way that didn’t exist before.” Having that type of control in a virtual environment opened up so many possibilities for Vermeulen.
Recently Oculus announced via their blog that they’ve added native MR support to their flagship PC VR headset, the Oculus Rift. There are a couple of things the Oculus team recommends you keep in mind if you want to take advantage: you’ll need a way to mount the Touch controller for dynamic camera tracking.
The company says the SDK includes “a simple API used for creating apps inserted into Cardboard viewers, and the more complex API for supporting Daydream-ready phones and the Daydream controller.” Unity writes on their official blog: Unity’s native support for Daydream aims to solve the hard problems for you.
When I attended the Experiential Technology Conference in May 2016, I heard from a number of commercial off-the-shelf brain-control interface manufacturers that their systems would not natively work with VR headsets because there are some critical portions on the head that are occluded by VR headset straps.
Users can tap into Unity software to build new apps and tools. According to a blog post shared by Meta, extended reality and the metaverse open the door to more interactive, collaborative learning, which could be the key to making education more powerful and accessible in the future.
More info Get started with Apple Vision Pro development: Unity has just released the tools to start developing for Apple Vision Pro, and interested developers can already start experimenting with them. Now Microsoft is discontinuing them, putting another nail in the coffin of the HoloLens. (Thanks to Davy Demeyer for the tip!)
Behind the Scenes with Jean-Michel Jarre’s VR Concert From the Unity blog, the story of how AR/VR consultant Antony Vitillo journaled the entire experience of putting on Jean-Michel Jarre’s Welcome to the Other Side (WTTOS), in a VR replica of the Notre-Dame Cathedral to celebrate New Year’s Eve. Remote Control Cars in VRChat?
The company also studied how to use the puck to interact with the AR experiences: they have used it as a controller, but also as a device to make a person you are having a call with appear as a hologram, like in Star Wars. If you are interested in reading a roundup of news dedicated to XR development, have a read of his post.
Side view of Pico 4 controllers.
connector with OTG
3 Physical IPD adjustments
Optical Positional tracking, with 4 tracking cameras
Cameras have 400×400 resolution and run at 120Hz
Maximum 10m x 10m tracking area and mm-precision accuracy
Optical Tracked Controllers, with 32 tracking points
5300 mAh battery, for around 2.5-3h
If we add these features to the other ones introduced in the past, like hand tracking, the Passthrough Shortcut, or the multiple windows in Oculus Browser, we start seeing the first signs of a mixed reality operating system, one that you can interact with in a natural way using the controllers or your hands.