Start AR development with VisionLib. Sign in or sign up now.
The new release comes with strong improvements to combining Model Tracking and SLAM. While the former is the de-facto standard for detecting and localizing objects, the latter tracks the environment and lets you quickly place superimpositions stably in the world.
Combined, the two are a perfect match to get more out of AR: register content precisely to objects and also to the space around them.
The new release is all about better and more stable augmentation results at the intersection of model tracking and SLAM: overall better tracking results, faster re-initialization, and more control over the states in between both tracking techniques.
Unity’s Universal Render Pipeline (URP) has evolved into a powerful graphics solution that combines appeal with speed and performance. Be it for marketing purposes, product visualisation, or after-sales solutions to enhance the buying process: some AR projects simply need a stunning graphical experience.
With the new release, there are fewer boundaries to accomplishing this. While a URP extension has been available for some time, we’re excited that URP gets major, improved support with this release.
With this release, we introduce a revised workflow and make it possible to recover from the “critical” tracking state more quickly when using SLAM-extended Model Tracking.
The tracking state “critical” indicates that tracking is either losing quality or about to be lost altogether. Using only model tracking, the “critical” state quickly causes tracking to stop, enabling users to re-initialize. With SLAM-enhanced tracking enabled, however, the SLAM map sometimes appeared to be valid but eventually drifted or moved away from the model target, causing the augmentation to get stuck in incorrect places.
In the new release, VisionLib decides much faster when to re-initialize the (model) tracking. Overall enhanced handling of invalid or implausible SLAM data improves the tracking quality. And developers get more control over the AR behavior, which makes things easier for end users.
Technically, VisionLib uses a pose predicted by SLAM alongside the model tracking pose. Developers gain control over this functionality with two new options to re-initialize model tracking even if SLAM prediction is still possible. This allows deciding if and when the SLAM pose overrides the model tracking pose and vice versa.
Try the new behavior and test it with your projects and use cases.
The new behavior is most noticeable when objects move while the camera looks away.
In the video, we test the behavior for different tracking settings: simple model tracking, as well as SLAM-enhanced model tracking with and without StaticScene mode enabled.
The latter is a parameter that specifies that tracked objects are not expected to move, which increases tracking performance.
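The settings compared in the video correspond to parameters in the .vl tracking configuration file. Below is a minimal sketch of a SLAM-enhanced model tracking configuration with StaticScene mode enabled. The parameter names `extendibleTracking` and `staticScene` follow VisionLib’s documented model tracker parameters, but please verify them against your SDK version; the model URI is a placeholder:

```json
{
    "type": "VisionLibTrackerConfig",
    "version": 1,
    "tracker": {
        "type": "modelTracker",
        "version": 1,
        "parameters": {
            "modelURI": "project-dir:MyModel.obj",
            "extendibleTracking": true,
            "staticScene": true
        }
    }
}
```

Setting `extendibleTracking` to `false` yields simple model tracking; with it `true`, `staticScene` toggles the mode compared in the video.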
We mentioned URP support at the beginning. Within a few steps, it’s now easy to start developing VisionLib projects based on URP. Read our new article to learn how to get started, or how to upgrade existing projects to URP.
› Read the URP Support article, or watch the tutorial below.
To help assess model tracking results and keep the workflow of integrating VisionLib into Unity projects lean, we introduce a new ModelTrackingSetup scene for standard, mobile, and HoloLens development. It replaces the former AdvancedModelTracking scene.
This scene helps to test models inside Unity directly, with options to tweak parameters, and assess overall tracking quality before starting custom scene development.
For these purposes, it offers a debug-view, along with UI elements enabling direct manipulation of tracking parameters and initialization data at runtime.
Once a suitable setup is found, you can now save a .vl configuration file directly from within the scene. Either run the scene inside Unity or deploy it on mobile devices, make changes and save them there.
When finished, fetch the saved configuration from the mobile device and use it on your desktop for further development. A time-saver, particularly when you need to go to an object with the mobile device because it cannot be reached and tested from the desktop.
Learn more about workflows to create custom tracking configurations in Unity with this scene:
Updates to the Tracking Configuration Component

We’ve added an input source selection to the `TrackingConfiguration` component and expanded the existing input options. You can now choose to use the input specified in the tracking configuration, let users select from the available input sources at runtime, or use the new option of an image sequence as the input source for tracking inside the editor.
The latter makes development much easier. With the new functionality, input sources can be changed without manually editing the tracking configuration file.
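For reference, an image sequence input can also be expressed directly in the tracking configuration; the component simply saves you this manual step. The sketch below is based on VisionLib’s documented image-source structure (`imageSources` list selected via `useImageSource`), but treat key names and the URI as assumptions to check against your SDK version:

```json
{
    "type": "VisionLibTrackerConfig",
    "version": 1,
    "input": {
        "useImageSource": "imageSequence",
        "imageSources": [{
            "name": "imageSequence",
            "type": "imageSequence",
            "data": {
                "uri": "project-dir:recordings/frame_*.jpg"
            }
        }]
    },
    "tracker": {
        "type": "modelTracker",
        "version": 1
    }
}
```

With the new component option, the same effect is achieved from within Unity without touching this file.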
There are many more improvements and changes to the overall SDK. Have a look at the detailed Release Notes to familiarize yourself with them.
We’re preparing for some great events this fall. Join us, meet the team and chat about AR, XR and the Industrial Metaverse with our experts. Save the dates for these events:
Watch out for announcements about upcoming events, details on current shows, and options to save on tickets with exclusive discounts. Stay posted and check our event page for updates.
As always, we’ll keep you updated on social media. Be sure to follow us:
We help you apply Augmented and Mixed Reality in your project, on your platform, or within enterprise solutions. Our online documentation and Video Tutorials grow continuously and deliver comprehensive insights into VisionLib development.
Check out these articles: whether you’re a pro or a starter, our new documentation articles on understanding tracking and on the new workflow for setting up tracking configurations in Unity will help you get a clearer view of VisionLib and Model Tracking.
For troubleshooting, look up the FAQ or write us a message at firstname.lastname@example.org.
Find these and more helpful insights at our Support Page.
VisionLib Release 2.3.0
© 2022 Visometry GmbH. All rights reserved.