
Recap: XR Expo 2020 Online

True object tracking for Mixed and Augmented Reality with VisionLib, and how it powers XR’s most essential business use cases. Our updates and impressions from Germany’s XR Expo 2020.

At a glance:

  1. We showcased Release 20.6.1, which brings more performance to VisionLib Model Tracking on HoloLens 1 + 2, and introduces AutoInit, which frees Model Tracking from its fixed init pose.
  2. We highlighted advances in Model Tracking and why it serves AR use cases better than other core tracking techniques, especially under changing lighting conditions, with moving objects, and in dynamic environments.
  3. We introduced our work on what we call “Assistive Vision”: computer vision tracking that detects when objects change in reality and understands differences between a tracked object’s AS-IS state and its AS-PLANNED state in CAD.

With three sessions and intensive conversations, we contributed to Germany’s XR Expo congress, held virtually this year, of course, given the Corona pandemic.

A big thank you to everyone who joined our sessions, and of course to the XR Expo organisation team.

Our CEO Harald Wuest, our Head of Business Development Uli Bockholt, and our Head of Product Experience Jens Keil gave insights into the current VisionLib SDK, presented the updates that came with Release 20.6.1, and talked about future developments, showing results from the lab behind our “Assistive Computer Vision” technologies.

Within two days, attendees learned why Model Tracking, which uses 3D models to enable reliable object tracking in enterprise #AR + #XR cases, has become almost the standard technique for industrial AR: it overcomes situations that are critical or challenging for computer vision, such as changing light, reflective surfaces, moving objects, or changing environments.

And, most importantly, it makes deployment and usage scalable, because objects can be tracked in real time without preparing the scene (e.g. with markers or feature maps), the objects themselves, or the environments they are in.
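To make this concrete, here is a minimal, purely illustrative sketch of a model-tracking setup, written as a Python dict that mirrors the style of the JSON tracking configurations VisionLib works with. The field names used here (“modelURI”, “metric”, “initPose”) are assumptions for illustration, not the exact SDK schema; the point is that the only asset the tracker needs is the object’s own 3D/CAD model.

```python
# Illustrative only: a model-tracking setup as a Python dict, mirroring the
# style of VisionLib's JSON tracking configurations. Field names are assumed
# for illustration and may differ from the actual SDK schema.
tracking_config = {
    "tracker": {
        "type": "modelTracker",
        "parameters": {
            # The 3D/CAD model of the object to be tracked is the only
            # required asset -- no markers, no pre-scanned feature maps.
            "modelURI": "project-dir:machine_part.obj",
            # Unit the model is authored in.
            "metric": "mm",
            # Initial viewpoint from which tracking starts; this is the
            # "init pose" that AutoInit (see below) makes optional.
            "initPose": {
                "t": [0.0, 0.0, 0.6],       # translation (metres)
                "q": [0.0, 0.0, 0.0, 1.0],  # orientation (quaternion)
            },
        },
    }
}
```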


While it would have been even more of a “wow” to meet in person rather than giving a remote demo, it was nice to see the audience as excited as we were about how fast the new VisionLib release runs on HoloLens. See for yourself:

[Video: VisionLib Release 20.6.1 Model Tracking demo on HoloLens]

The update brought more than just speed. VisionLib Model Tracking is a case-making technology for HoloLens, because it enables the XR glasses to detect and register objects the moment users look at them. That is what makes it possible for almost any frontline worker to benefit from augmented information while working on machines or industrial structures.

Why? Because objects are detected automatically. Without this, you would need to “scan” the environment in advance to place HoloLens anchors. And even then, you would still need to manually align content inside those “scanned maps” in order to have information pointing at the right objects.

Another “wow” was nothing less than showing AutoInit on HoloLens. It removes the need for an initial pose to start tracking. These initial viewpoints are characteristic of Model Tracking: while they are a nice way to guide the user’s viewpoint, they also limit the experience, because users need to match the init pose with their current viewpoint to successfully start tracking and the AR app.

VisionLib’s AutoInit is therefore a key feature: it eases object registration and tracking and lets users detect and track objects from any viewing angle. This makes HoloLens in particular easier to use, and definitely leads to fewer stiff necks.
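Following on from the sketch above, and again with hypothetical field names rather than the actual SDK parameters, the difference AutoInit makes can be expressed like this: the fixed init pose simply disappears from the setup, and the tracker is allowed to find the object from whatever angle the user happens to look at it.

```python
# Conceptual sketch only: with AutoInit, no hand-authored init pose is
# needed. "autoInit" is a hypothetical flag used for illustration; how the
# feature is actually enabled in the SDK may differ.
autoinit_config = {
    "tracker": {
        "type": "modelTracker",
        "parameters": {
            "modelURI": "project-dir:machine_part.obj",
            "metric": "mm",
            # The tracker may (re-)initialize from arbitrary viewpoints
            # instead of requiring the user to match a fixed init pose.
            "autoInit": True,
        },
    }
}
```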

[Video: AutoInit demo on HoloLens]

Usually taking place in Stuttgart, in Germany’s (unofficially assigned) “region of engineering”, XR Expo has always been a great place where XR solutions and technologies meet industry, from SMEs to big automotive players, to talk #digitalization, #challenges and #opportunities for XR in business. As such, it was a good venue for maybe our most interesting announcement: the introduction of “Assistive (Computer) Vision”.

This is tracking technology that not only delivers the foundation for blending in 3D graphics in the first place, but also brings enhancements through computer vision itself.

It adds the ability to understand changes to objects in reality, point out deviations, or assist users with quality checks that tell whether a screw has been unscrewed during an assembly procedure. In doing so, it enhances enterprise AR, such as AR-based maintenance and inspection, and makes it possible to detect differences between a tracked object’s AS-IS state and its AS-PLANNED state in CAD.
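To illustrate the idea (and only the idea) in a few lines: conceptually, Assistive Vision boils down to comparing the observed AS-IS state of a tracked object against its AS-PLANNED state from CAD and flagging the differences. The toy sketch below uses made-up part states to show that comparison; the real technology of course works on camera images and tracked geometry, not on dictionaries.

```python
# Toy sketch of the AS-IS vs. AS-PLANNED comparison described above.
# Part names and states are invented for illustration.
as_planned = {"screw_1": "fastened", "screw_2": "fastened", "cover": "mounted"}
as_is      = {"screw_1": "fastened", "screw_2": "removed",  "cover": "mounted"}

deviations = {
    part: {"planned": planned, "observed": as_is.get(part)}
    for part, planned in as_planned.items()
    if as_is.get(part) != planned
}

print(deviations)
# {'screw_2': {'planned': 'fastened', 'observed': 'removed'}}
# -> exactly the kind of deviation an AR quality check would point out.
```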

At Visometry, we believe this is the key to a whole new level of #AR and #XR applications, because with it, enterprises benefit not only from the AR views but also from the computer vision itself: it combines (traditional) image processing tasks with AR. You think that sounds interesting? So do we.

Stay tuned for upcoming events and don’t forget to follow us on Twitter or LinkedIn.



Missed our talks & demos?

Get an overview of our products and all features:

Want to learn more?

Write to Uli Bockholt, our Head of Sales and Business Development: business@visionlib.com

Stay posted on upcoming events: