October 17, 2021

Snap Lens Studio 3.4 Update Brings Better Hand And Body Tracking, AR Asset Library

Snap is quickly becoming the gold standard for augmented reality. About a year ago, the company released new ground segmentation Lenses. Since then, it has focused increasingly on expanding Lens Studio, its desktop app that lets artists and developers build AR experiences for Snapchatters.

On Thursday, Snap released Lens Studio 3.4, adding improved hand tracking, 3D multi-body tracking, and full-body segmentation to its AR offerings, reports Next Reality. The new iteration also includes an Asset Library to help creators build Lenses. The tool was previously able to recognize a hand in full view; now it can detect 24 points on the hand, allowing developers to target individual finger joints and gestures.

The 3D full-body skeleton tracking lets creators track multiple bodies in the camera view and apply 3D effects, while the full-body segmentation upgrade tracks and segments people's entire bodies, not just their heads, isolating a person from the background for green-screen effects.

The Asset Library, an entirely new addition to Lens Studio, provides a collection of 3D models, materials, ML models, scripts, and presets that creators can drag and drop into their AR experiences, according to Snap. The company says the library will give developers shortcuts on projects, speeding up their workflow by letting them save assets or reuse pre-made ones.


By leveraging a worldwide community of developers and creators, Snap is able to get increasingly creative with its offerings and how they might be used. And Lenses don't stop at entertaining Snaps to send to friends: Snap works with a community of ML developers to create immersive experiences that may well help us do a range of practical things. For example, in a December 2020 interview, Olha Rykhliuk, an immersive engineer at Snap, pointed to developers who integrated ML into Lens Studio to help people learn languages, by pointing the camera at an object to trigger recognition and translation, or by scanning a food barcode to pull up nutritional information.

The previous December 2020 update, Lens Studio 3.3, integrated new templates for animations, face masks and virtual try-on Lenses, as well as visual scripting tools and a web-based Lens management portal.

Lens Studio originally launched in 2017, and since then creators have published more than 1.5 million Lenses.
