VisionLib Release 2.3.0

Start #AR development with VisionLib. Sign in or sign up now.

Object Tracking. Real Smooth. 

The new release comes with substantial improvements when combining Model Tracking and SLAM. While the former is the de-facto standard for detecting and localizing objects, the latter tracks the environment and lets you quickly place superimpositions stably in the world.

Combined, the two are a perfect match to get more out of AR: register content precisely to objects, and also to the space around them.

The new release is all about better and more stable augmentation results at the intersection of model tracking and SLAM: overall better tracking results, faster re-initialization, and more control over the states between the two tracking techniques.

Improved URP Support

Unity’s Universal Render Pipeline has evolved into a powerful graphics solution that combines appeal with speed and performance. Be it for marketing purposes, product visualization, or after-sales solutions to enhance the buying process: some AR projects simply need a stunning graphical experience.

With the new release, there are fewer boundaries in the way. While a URP extension has been available for some time, we’re excited that this release brings significantly improved URP support.

Improved Re-Initialization for SLAM-Extended Model Tracking

With this release, we introduce a revised workflow that recovers from the “critical” tracking state more quickly when using SLAM-extended Model Tracking.

The “critical” tracking state indicates that tracking is either losing quality or about to be lost altogether. With pure model tracking, the “critical” state quickly causes tracking to stop, letting users re-initialize. With SLAM-extended tracking enabled, however, the SLAM map sometimes appeared to be valid but eventually drifted away from the model target, causing the augmentation to get stuck in incorrect places.

In the new release, VisionLib decides much faster when to re-initialize the (model) tracking. Overall enhanced handling of invalid or implausible SLAM data improves tracking quality, and developers get more control over the AR behavior, making apps easier to use.

Technically, VisionLib uses a pose predicted by SLAM alongside the model tracking pose. Developers gain control over this functionality with two new options to re-initialize model tracking even if SLAM prediction is still possible. This allows deciding if and when the SLAM pose overrides the model tracking pose and vice versa:

  • allowedNumberOfFramesSLAMPrediction: Limits the total number of frames that may be predicted via SLAM – a smaller number lets the user re-initialize manually sooner.
  • allowedNumberOfFramesSLAMPredictionObjectVisible: Limits the number of prediction frames in which the camera is looking at the predicted model position – a smaller number enables re-initialization sooner when the model has moved while it wasn’t in view.
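As a sketch, the two options would be set in the tracking configuration file. The surrounding structure below follows the usual .vl layout, and all other parameter names and values are illustrative placeholders, not recommendations:

```json
{
  "type": "VisionLibTrackerConfig",
  "version": 1,
  "tracker": {
    "type": "modelTracker",
    "parameters": {
      "modelURI": "project-dir:MyModel.obj",
      "extendibleTracking": true,
      "allowedNumberOfFramesSLAMPrediction": 30,
      "allowedNumberOfFramesSLAMPredictionObjectVisible": 10
    }
  }
}
```

Lower values make VisionLib fall back to model re-initialization sooner; higher values let SLAM prediction bridge longer interruptions.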

Try the new behavior and test it with your projects and use cases.

See The New Behavior In Action

The new behavior is most noticeable when objects move while the camera looks away.

In the video, we test the behavior for different tracking settings: simple model tracking, as well as SLAM-enhanced model tracking with and without StaticScene mode enabled.

The latter is a parameter to specify that tracked objects are not expected to move in order to increase tracking performance.
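StaticScene mode corresponds to the staticScene parameter in the tracking configuration. A minimal, illustrative fragment (the surrounding structure is a sketch and may differ from your setup):

```json
{
  "tracker": {
    "type": "modelTracker",
    "parameters": {
      "extendibleTracking": true,
      "staticScene": true
    }
  }
}
```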


Updates For Unity

Improved URP Support

We’ve talked about URP support in the beginning. Within a few steps, it’s now easy to get started to develop VisionLib projects based on URP. Read our new article to learn how to get started, or how to upgrade existing projects to URP. 

› Read the URP Support article, or watch the tutorial below.

Tutorial: Integrate URP Into VisionLib Projects


New Tracking Config Scene

In order to help assess model tracking results and keep the workflow of integrating VisionLib into Unity projects lean, we introduce a new ModelTrackingSetup scene for standard and mobile as well as HoloLens development. It replaces the former AdvancedModelTracking scene.

This scene helps to test models inside Unity directly, with options to tweak parameters, and assess overall tracking quality before starting custom scene development.

For these purposes, it offers a debug-view, along with UI elements enabling direct manipulation of tracking parameters and initialization data at runtime.

Once a suitable setup is found, you can now save a .vl configuration file directly from within the scene. Either run the scene inside Unity or deploy it on mobile devices, make changes and save them there.

When finished, fetch the saved configuration from the mobile device and use it on your desktop for further development. A time-saver, particularly when you need to take the mobile device to an object that cannot be reached and tested from the desktop:

  • For development, we’ve added the ModelTrackerParameters component to the VLModelTracker prefab. Here, one can set a URI and call SaveCurrentConfiguration to save the current configuration to a file.
  • For HoloLens, we’ve added the new HoloLens2ModelTrackingSetup scene. It uses MRTK functionality to provide a good and consistent UI to adjust and save tracking parameters directly on the HoloLens 2.
  • There is also a new VLImageSourceParameters prefab variant for HoloLens scenes that contains the FieldOfView parameter.
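As an illustrative sketch of the first point, hooking SaveCurrentConfiguration up to a UI button might look roughly like this. The namespace and exact member names are assumptions based on the description above and may differ in the actual SDK:

```csharp
using UnityEngine;
// Assumed namespace of the VisionLib Unity SDK; adjust to your SDK version.
using Visometry.VisionLib.SDK.Core;

public class SaveTrackingConfig : MonoBehaviour
{
    // Assign the ModelTrackerParameters component from the VLModelTracker
    // prefab in the Inspector; its URI field determines where the file goes.
    [SerializeField]
    private ModelTrackerParameters modelTrackerParameters;

    // Wire this to a UI button to write the currently active tracking
    // parameters to a .vl configuration file at the configured URI.
    public void OnSaveButtonClicked()
    {
        modelTrackerParameters.SaveCurrentConfiguration();
    }
}
```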

Learn more about workflows to create custom tracking configurations in Unity with this scene.

Updates to the Tracking Configuration Component

We’ve added an input source selection to the `TrackingConfiguration` component and expanded the existing input options. You can now choose to use the input specified in the tracking configuration, let users select from the available input sources at runtime, or use the new option to feed an image sequence as the tracking input source inside the editor.

The latter makes development much easier. With the new functionality, input sources can be changed without manually editing the tracking configuration file.

Other Changes Worth Highlighting

  • For HoloLens we’ve added the example package VisionLib.SDK.MRTK.Examples including examples for using VisionLib together with MRTK.
  • We reduced the file size of the libraries by about 6 % to 11 % (depending on the platform)
  • The PosterTracker now reports the critical state when the pose was only predicted via SLAM. Previously, it reported the tracked state in this situation; the two cases can now be distinguished.
  • For image sequences recorded with extendibleTracking active, we now also record timestamps, which are read and used when replaying the image sequence.
  • We’ve improved pose smoothing; it is now much faster and cleaner, especially with staticScene enabled.
  • staticScene, a parameter to indicate that tracked objects are not expected to move in order to increase tracking performance, has changed behavior. If you used it before, learn about the minor changes in the Change Log.
  • Working with licenses under Windows has become more convenient. The new behavior is less sensitive, for example, to changes in network settings.
  • The config file JSON syntax is now more type-sensitive, so you get better hints for errors during development. E.g., if a parameter expects an int but instead gets an int inside a string, this is now reported.
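To illustrate the stricter type checking with a hypothetical parameter (the name is a placeholder, not an actual VisionLib parameter):

```json
{
  "tracker": {
    "parameters": {
      "exampleIntParameter": "5"
    }
  }
}
```

With 2.3.0, a string value like `"5"` where an int such as `5` is expected is now flagged during development instead of passing silently.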

Heads-Up

We’d like to give a heads-up about some changes regarding the current and future releases:
  • While improving usability, we’ve renamed the parameters minInlierRatioInit to minInitQuality and minInlierRatioTracking to minTrackingQuality. The former parameter names are now deprecated – their functionality, however, remains untouched.
  • We’ve cleaned up the parameter naming for transforms: The rotation parameter was sometimes named “q“, and sometimes “r“. Now, only “r” is valid. If you use stored init poses in your projects, check existing config files and rename the parameters accordingly to avoid deprecation warnings.
  • The TrackingManager is a singleton now. To call member functions you no longer have to inherit from TrackingManagerReference. If you’ve done this, check existing code and change it accordingly.
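For example, a stored init pose should now use “r” for the rotation. A minimal, illustrative fragment – the surrounding structure and values are placeholders, and the rotation is assumed here to be a quaternion:

```json
{
  "initPose": {
    "t": [0.0, 0.0, 0.5],
    "r": [0.0, 0.0, 0.0, 1.0]
  }
}
```

Config files that still use “q” will trigger deprecation warnings.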

All Updates In Detail

There are many more improvements and changes to the overall SDK. Make yourself familiar and have a look at the detailed Release Notes.

News & Events

We’re preparing for some great events this fall. Join us, meet the team, and chat about AR, XR, and the Industrial Metaverse with our experts. Save the dates:

  • Sept. 12 – 17, IMTS Chicago, USA
  • Sept. 14 – 16, XR Expo / Week, Stuttgart, Germany
  • Oct. 4 – 6, Vision Trade Show, Stuttgart, Germany

Watch out for announcements about upcoming events, details on current shows, and options to save on tickets with exclusive discounts – stay posted and check our event page for updates.

As always, we’ll keep you updated on social media. Be sure to follow us:

Learn VisionLib – Workflow, FAQ & Support

We help you apply Augmented and Mixed Reality in your project, on your platform, or within enterprise solutions. Our online documentation and Video Tutorials grow continuously and deliver comprehensive insights into VisionLib development.

Check out these articles: Whether you’re a pro or a starter, our new documentation articles on understanding tracking and the new workflow for setting up tracking configurations in Unity will help you get a clearer view of VisionLib and Model Tracking.

For troubleshooting, look up the FAQ or write us a message at request@visionlib.com.

Find these and more helpful insights at our Support Page.


Get VisionLib now!

Start developing Augmented Reality apps with VisionLib. Sign in or sign up now.