
Google brings AR and Lens closer to the future of search

Imagine you're thinking about chairs, and you Google up some to try out in your living room. Or you're looking at a subway map in Tokyo, and suggestions for what routes to take suddenly appear, translated into your own language. Google AR and Google Lens are getting there faster than you think.

I'm in Google's Immersive Lab in Mountain View before Google I/O 2019, looking at a restaurant menu on the table through the lens of the phone in my hand. What looks interesting to order? A sparkle flashes across the display, and suddenly options are highlighted for me.

These aren't the wild dancing dragons or holographic avatars you might associate with AR and VR. There are no Pokemon to follow, or weird new headsets to try on. Instead, Google's augmented reality vision this year is a double dose of attention to utility and assistance, as AR comes to Google Search and Google Lens aims to help people read. While it may seem tamer than in years past, and it's not as showy as recent Google AR projects involving Marvel and Childish Gambino, Google's trying to be legitimately helpful. After dozens of smartglasses and AR headsets have come and gone, we need real reasons to use this tech. Can Google find a way forward?


It's been six years since Google Glass' early take on smartglasses, five years since Google Tango's experimental AR phone and two years since Google Lens offered a way to scan the world with a phone camera. Google's view of AR may be shifting, and maybe that's inevitable. As AR on phones has crossed from novelty to ubiquitous reality, Google's next mission seems to be finding a good use for AR -- at least until future smartglasses, headsets or other devices are ready to use what Google's learned.

"We think, with the technologies coming together in augmented reality in particular, there's this opportunity for Google to be vastly more helpful," says Clay Bavor, vice president of virtual and augmented reality, about the variety of AR updates Google has coming this year. You might also say that Google's laying the groundwork for its next big thing. The company has to become the digital glue for a world of services that don't quite exist yet. For now, that means putting computer vision to work even more, not just on Android, but on iPhones too. Google's got a smattering of new experimental ideas this year: This is what they're like.


NASA's Mars Curiosity rover, as surfaced by a link in AR-enabled Google Search.

James Martin/CNET

Google Search with AR feels like instant holograms

Searching for "tiger" surfaces a clickable 3D link, which launches an animated model complete with roaring sounds. I can then launch it in AR in the room and, hey, realistic AR tiger. I can drop a scale model of NASA's Mars Curiosity rover into the room, or an anatomical model of human arm bones and musculature.

Google is introducing AR to Search this year, and this is how it works: Compatible Android and iOS devices will see 3D object links in Search, which will bring up 3D models that can then be dropped into the real world at proper scale in AR. Google Search will incorporate 3D files using the glTF format, as opposed to Apple's USDZ format used by ARKit in iOS 12. According to Google, developers will need to add just a few lines of code to make 3D assets appear in Google Search.

"Anyone who has the 3D assets, and it turns out a lot of the retail partners do, folks like Wayfair or Lowes, all they have to do is three lines of code," says Aparna Chennapragada, vice president and general manager for camera and AR products. "The content providers don't have to do much else." Google's already working with NASA, New Balance, Samsung, Target, Visible Body, Volvo and Wayfair to incorporate 3D assets into Google Search. The AR effects launch into a new Android feature called Scene Viewer.

What struck me when I tried a few demos was a simple thought: I use Google Search as an extension of the way I think. If AR-enabled Search can eventually be added to a pair of AR glasses, maybe it will mean effortlessly conjuring objects into the real world, without launching any apps at all.


Google Lens can highlight menus with popular choices, which link to Google Maps-related photos and details.

James Martin/CNET

Google Lens keeps evolving, starting with dining help

Using Google Lens, meanwhile, already feels like having a pair of smartglasses without the glasses. The camera-enabled app can already recognize objects, translate text and help with shopping.

But Google's exploring a new wave of Lens features for 2019, ranging from the fine-grained to the far-reaching, that start to more actively superimpose things onto the world in AR. "We're taking Google Lens and taking it from 'oh, it's an identification tool, what's this, show me things like this,' to an AR browser, meaning you can actually superimpose information right on the camera," says Chennapragada.

I looked at Google Lens' newest features firsthand, and they're starting to feel like ways to transform reality as much as interpret it. Lens can now show translations from other languages that map onto signs or objects seamlessly and stick there in space, as if the text is really there. It's an evolution of what's been in Google Translate, but now Google will analyze the context of entire documents, starting with restaurant menus.
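Lens' internals aren't public, but the general recipe -- recognize text with bounding boxes, translate it, then paint the translation back over the same boxes -- can be sketched with the Firebase ML Kit text recognizer of that era. The translateToEnglish function below is a hypothetical stand-in, not a real API:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Hypothetical stand-in for the translation step; wire in a real
// translation service here. This is NOT how Lens itself does it.
fun translateToEnglish(text: String): String = text

// Recognize text in a camera frame, then draw a translated label over
// each detected block, roughly where the original text sat.
fun overlayTranslations(frame: Bitmap) {
    val canvas = Canvas(frame)
    val textPaint = Paint().apply { color = Color.WHITE; textSize = 36f }
    val boxPaint = Paint().apply { color = Color.argb(200, 0, 0, 0) }

    FirebaseVision.getInstance().onDeviceTextRecognizer
        .processImage(FirebaseVisionImage.fromBitmap(frame))
        .addOnSuccessListener { result ->
            for (block in result.textBlocks) {
                val box = block.boundingBox ?: continue
                canvas.drawRect(box, boxPaint) // cover the original text
                canvas.drawText(
                    translateToEnglish(block.text),
                    box.left.toFloat(), box.bottom.toFloat(), textPaint
                )
            }
        }
}
```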

I peek down at a sample menu, and suddenly dishes on the menu are highlighted. They're popular dishes, according to Google Maps restaurant information. Tapping on the menu items brings up photos and review comments on the fly. It almost feels like annotated reality.

A similar idea is being explored for museums. Google's experimenting with the de Young Museum in San Francisco to bring curated pop-up information to recognized works of art when they're viewed through Google Lens. The idea is that curators could designate fenced-in spaces where the objects inside carry coded information.

There are new Shopping, Dining, Translate and Text filters to help Lens know what to do in context, plus the do-it-all "Auto" mode. The shopping filter helps, for instance, to recognize a plant on a table and find places to buy that plant instead of just identifying what kind of plant it is. That's the challenge: if you're holding a magic lens that sees everything, how does the lens interpret what you need?
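As an illustration only, and not Lens' actual logic, here's a toy sketch of how an explicit filter can disambiguate what a lens app should do with a single detection; the types and labels are invented for the example:

```kotlin
// Illustrative only: a toy dispatch showing how a filter narrows which
// interpretation of the same detected object the app surfaces.
enum class LensFilter { AUTO, SHOPPING, DINING, TRANSLATE, TEXT }

data class Detection(val label: String, val text: String?)

fun interpret(d: Detection, filter: LensFilter): String = when (filter) {
    LensFilter.SHOPPING -> "Find places to buy: ${d.label}"
    LensFilter.DINING -> "Look up dishes and reviews for: ${d.label}"
    LensFilter.TRANSLATE -> "Translate: ${d.text ?: "(no text found)"}"
    LensFilter.TEXT -> "Copy or read aloud: ${d.text ?: "(no text found)"}"
    // AUTO has to guess the user's intent from context -- the hard problem
    LensFilter.AUTO -> if (d.text != null) interpret(d, LensFilter.TEXT)
                       else interpret(d, LensFilter.SHOPPING)
}
```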


This poster animates (at least, in AR)

James Martin/CNET

AR gains new magic tricks

There are some funky new AR tweaks, too. A new augmented-images capability can make 2D images suddenly animate. A recipe page from Bon Appetit magazine transforms, animating as it shows cooking instructions. I hold up a phone, and a real poster of Paris animates on the screen, with moving clouds, in Google Lens.

Google's doing these tricks with 2D images for now, not 3D, but pulling these transformations off without specialized marker codes feels like a peek at what a world filled with augmented reality could be: Signs that come alive at a glance. I'm reminded of that animated box of cereal in Minority Report.
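Developers can get at a similar trick through ARCore's Augmented Images API, which registers ordinary pictures, such as a poster or magazine page, and tracks them in space without QR-style codes. A rough sketch, with the image name invented for illustration (this is the public API, not necessarily what Lens uses internally):

```kotlin
import android.graphics.Bitmap
import com.google.ar.core.AugmentedImage
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Register an ordinary 2D picture (say, a poster) so ARCore can find
// and track it in the camera feed -- no QR-style marker needed.
fun enablePosterTracking(session: Session, posterBitmap: Bitmap) {
    val imageDb = AugmentedImageDatabase(session).apply {
        addImage("paris_poster", posterBitmap)
    }
    val config = Config(session).apply { augmentedImageDatabase = imageDb }
    session.configure(config)
}

// Each frame, check whether the poster has been found; once tracked,
// its pose tells you where to anchor the animated overlay.
fun onFrame(frame: Frame) {
    for (image in frame.getUpdatedTrackables(AugmentedImage::class.java)) {
        if (image.trackingState == TrackingState.TRACKING &&
            image.name == "paris_poster") {
            val pose = image.centerPose // anchor the animation here
        }
    }
}
```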

An improved AR lighting engine called Environmental HDR aims to place objects more realistically in 3D with uneven light, even when the light changes on the fly.
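Environmental HDR is exposed to developers as an ARCore light-estimation mode. Here's a minimal sketch of opting a session in and reading the estimated main light each frame -- values a renderer would feed into its shading:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate
import com.google.ar.core.Session

// Opt an ARCore session into Environmental HDR light estimation.
fun enableEnvironmentalHdr(session: Session) {
    val config = Config(session).apply {
        lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    }
    session.configure(config)
}

// Per frame, pull the estimated main light so virtual objects can be lit
// (and shadowed) to match the room, even as the light changes on the fly.
fun readLighting(frame: Frame) {
    val estimate: LightEstimate = frame.lightEstimate
    if (estimate.state == LightEstimate.State.VALID) {
        val direction = estimate.environmentalHdrMainLightDirection
        val intensity = estimate.environmentalHdrMainLightIntensity
        // A renderer would also use the ambient spherical harmonics
        // for soft indirect light:
        val ambientSh = estimate.environmentalHdrAmbientSphericalHarmonics
    }
}
```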


Lens can translate languages and read back text on low-cost Android One phones.

James Martin/CNET

Google Lens translation comes to low-end phones

What Google's Bavor and Chennapragada are most excited about, however, is a use for Google Lens that's coming to low-end phones running Android Go software. Instant translation and reading assistance are running on phones that aren't powerful enough for ARCore, leaning instead on cloud services. I snap a photo of a sign, and the phone reads what it sees back to me, highlighting each word. A tap, and I can translate it into another language.
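Google hasn't detailed the plumbing, but the word-by-word read-back maps naturally onto standard Android APIs: TextToSpeech reports, through onRangeStart, which span of text it's currently speaking, which is exactly what per-word highlighting needs. A sketch of just that piece, assuming the cloud OCR step has already produced the text:

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech
import android.speech.tts.UtteranceProgressListener

// Speak recognized text aloud, highlighting each word as it's spoken.
// onRangeStart (API 26+) reports the character span currently uttered.
// The cloud OCR step that produces `text` is not shown here.
fun readAloud(context: Context, text: String, highlight: (IntRange) -> Unit) {
    lateinit var tts: TextToSpeech
    tts = TextToSpeech(context) { status ->
        if (status != TextToSpeech.SUCCESS) return@TextToSpeech
        tts.setOnUtteranceProgressListener(object : UtteranceProgressListener() {
            override fun onRangeStart(id: String?, start: Int, end: Int, frame: Int) {
                highlight(start until end) // move the visual highlight here
            }
            override fun onStart(id: String?) {}
            override fun onDone(id: String?) {}
            override fun onError(id: String?) {}
        })
        tts.speak(text, TextToSpeech.QUEUE_FLUSH, null, "lens-read-aloud")
    }
}
```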

"One of the questions we had was, if we can teach the camera to read, can we use the camera to help people read?" says Chennapragada. "this is obviously useful in cases where you're in a foreign city and you can't speak the language, but in many parts of the world, people can't speak or read their own language." Chennapragada relates her own experiences growing up in India: "I grew up in India speaking about three languages apart from English, but not the other twenty languages. So if I go into a neighboring state, I'm hopeless, I'm just staring at the store sign and can't do anything."

I immediately wonder if this could, in a way, serve as visual assistance for the blind, reading the world aloud. Chennapragada calls the technology "situational literacy," and down the road it could very well be a seeing eye as well as a universal translator and reader. What strikes me on the Android Go phone I try it on is how it just works... and quickly, too.

The new feature launches in the Google Go app on Android Go, at the tap of a button. Google's move away from relying on higher-end hardware, to run Lens features on even a low-end $50 phone, raises another question: could this point to how Google's going to make assistive AR run easily on future low-power AR headsets?

'Future form factors'

Bavor admits that Google's in a phase of "deep R&D" towards new technologies beyond phones, but for now, the goal is to solve for uses on the phone first. The hints of some other type of hardware on the horizon, however, are there.

"If you think about voice search, the web answers that you get forward translated so well to the Assistant. We're taking the same approach here and saying, what are features and capabilities ... that really are useful in the smartphone context, but then forward translate very well to future form factors," says Chennapragada. Lens, to her, is shifting to becoming an "AR browser" this year: "this goes two or three more Lego bricks towards the other form factors."

With no new VR or AR hardware emerging at Google I/O this year, Google's focus on services and utility could indicate that Lens might evolve into a reality browser for other platforms.

It's a mystery for now. Bavor says there's no commitment yet to platforms such as Microsoft's HoloLens or the Magic Leap One, but he admits, "you don't even really have to squint, there's so much in common with these things, and I think Google has a history of building platforms and services that are widely available and try to deliver helpfulness and utility as broadly as possible to everyone. I think we'll take a similar strategy with this."

In the meantime, all these Google AR features feel like continuing experiments, including Google's upcoming AR navigation features in Maps. At this year's I/O conference, even the I/O app has built-in AR to guide attendees to sessions -- a hint of where AR guidance could evolve next. Or maybe some of these features won't succeed. Maybe it's Darwinian. And maybe that's what's needed to figure out how AR will succeed on phones and beyond.




https://www.cnet.com/news/google-brings-ar-and-lens-closer-to-the-future-of-search/
