
Apple’s augmented reality tool kit can now detect walls and 2D images in beta


Apple is rolling out an upgrade to its augmented reality toolbox, ARKit, to developers in a beta version today. ARKit 1.5 adds a few marquee features including one big one: wall detection.

Up until this point, ARKit has focused on horizontal plane detection, letting developers find the ‘floor’ and use it to orient or place objects in 3D space. That still fell well short of full 3D spatial recognition of a given scene, though, since the detected horizontal plane would essentially go on forever, ignoring walls and other vertical surfaces.

With this update, ARKit can recognize vertical surfaces and place objects on them, and it can map those surfaces in conjunction with horizontal ones. Imagine a game that lets you throw darts at a target mounted on an actual wall, rather than one floating in space.
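
For developers, opting in is a small change. Here is a rough sketch of what that might look like in Swift, using the standard ARKit and SceneKit classes; the view controller name and the outlet wiring are my own placeholder scaffolding, not anything Apple prescribes.

```swift
import ARKit
import SceneKit
import UIKit

// Hypothetical view controller showing ARKit 1.5's horizontal + vertical plane detection.
class WallDetectionViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // ARKit 1.5: plane detection can now include .vertical alongside .horizontal.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it adds an anchor for a newly detected plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        if planeAnchor.alignment == .vertical {
            // A wall: a sensible spot to mount that virtual dart board.
            print("Detected vertical plane, extent: \(planeAnchor.extent)")
        } else {
            print("Detected horizontal plane, extent: \(planeAnchor.extent)")
        }
    }
}
```

The key line is the option set passed to planeDetection; the rest is the usual session boilerplate.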

When ARKit was introduced, I asked around about how hard it would be to get vertical plane detection done without an ‘active’ sensor of some sort. The consensus was that it was theoretically possible, just a lot harder, because vertical planes like walls typically have fewer defining features for a planar detection system to latch onto. Basically, if it’s a smooth white wall, it’s hard for the camera to tell it’s a solid object.

Google’s ARCore has already supported some wall detection. Initially, Apple’s version will detect planes that are vertical or just slightly off vertical, but not heavily angled surfaces.

In addition, ARKit’s horizontal plane detection has been updated to better recognize irregularly shaped objects like circular tables or chairs, and it gains line detection. Overall tracking has also improved in speed and accuracy.
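
The refined plane estimates surface through the plane anchor’s geometry, which describes the detected region as a polygon rather than a simple rectangle. A minimal sketch, continuing the delegate from the example above and assuming the same SceneKit setup:

```swift
// Continuing the ARSCNViewDelegate above: called as ARKit refines a plane anchor over time.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

    // ARKit 1.5 adds ARPlaneGeometry, a polygonal estimate of the surface,
    // so a round table is no longer forced into a bounding rectangle.
    let boundary = planeAnchor.geometry.boundaryVertices
    print("Plane \(planeAnchor.identifier) boundary has \(boundary.count) vertices")

    // Visualize the estimated shape with the matching SceneKit geometry class.
    if let device = renderer.device,
       let planeGeometry = ARSCNPlaneGeometry(device: device) {
        planeGeometry.update(from: planeAnchor.geometry)
        node.geometry = planeGeometry
    }
}
```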

ARKit will now also display the ‘real world’ at a full 1080p, an upgrade over previous versions, which shipped the video component to users at 720p and made the “real” parts of the scene look worse than the “fake” parts. So that’s a nice bump.
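
The camera resolution is also something developers can now inspect and choose through the configuration’s supported video formats. A hedged sketch of picking the highest-resolution format a device offers (the helper function name is mine):

```swift
import ARKit

// Build a configuration that opts into the largest camera resolution available.
func makeHighResConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]

    // ARKit 1.5 exposes the camera formats it can run with; pick the one
    // with the largest image resolution (1080p on supported devices).
    if let best = ARWorldTrackingConfiguration.supportedVideoFormats
        .max(by: { $0.imageResolution.width * $0.imageResolution.height <
                   $1.imageResolution.width * $1.imageResolution.height }) {
        configuration.videoFormat = best
    }
    return configuration
}
```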

Another upgrade that seems fairly minor at first but could get very, very interesting later on is computer vision-based image recognition. If there are 2D images in the scene, say a poster or piece of art on a wall, ARKit can now parse the images and allow developers to map their physical position in a space and on a surface. This would allow for placement of related objects nearby, floating text, audio triggers — you name it.
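
In practice, image detection means handing the session a set of reference images, each with its real-world size, and listening for the image anchors ARKit creates when it spots one. A rough sketch, assuming the reference images live in an asset catalog group I’ve called “AR Resources” (a placeholder name, not something from the article):

```swift
import ARKit
import SceneKit
import UIKit

// Hypothetical view controller showing ARKit 1.5's 2D image detection.
class ImageDetectionViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Reference images are bundled in an asset catalog; each declares its
        // real-world physical size so ARKit can place it accurately.
        if let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil) {
            configuration.detectionImages = referenceImages
        }
        sceneView.session.run(configuration)
    }

    // ARKit adds an ARImageAnchor when it recognizes one of the reference images.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let image = imageAnchor.referenceImage
        print("Found \(image.name ?? "image"), physical size: \(image.physicalSize)")
        // Anchor related content (a model, a video plane, floating text) to `node` here.
    }
}
```

Whatever gets attached to that anchor’s node stays registered to the physical image on the wall.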

The immediate applications are fairly clear. You walk into a museum, point your camera at a painting, and the painter appears in front of it to talk about the artwork or even show you how they painted it. If it’s a poster of a rocket launch, the rocket appears on the floor in front of you and you get a recreation of the launch.

The implications down the road are even more exciting. What happens, for instance, when you can slap a sticker on a wall that acts as a marker ARKit can recognize without external libraries? It can then project objects or scenes based on that marker. And some of the back-end systems that I know other developers are working on rely on computer vision to create a persistent spatial map that can be used to ‘re-place’ objects or scenes very precisely between AR sessions, or between different people at different times. These additions will help with that.

There are lots of possibilities in these new additions, and this isn’t even a full version release, just a point update. I know a lot of fuss has been made about there being only around 2,000 apps using ARKit at the moment, but I’m not convinced there will ever be a valid need for ARKit to be implemented in the majority of apps. Or rather, for the big spatial components of it, as it’s also becoming a nice computer vision toolbox. Frankly, many existing AR-enabled apps are pretty crap.

With AR, instead of ‘many apps’ I think we’re going to see ‘many experiences’. There will be enabling apps or tools, but the experiences themselves will be universal. But I also think that this won’t become apparent to most commentators (or customers) for a while. For now, a nice iteration on the initial release of ARKit.

