Google Search is getting new features for Indian users, including multi-search in Hindi and support for bilingual search results.
Google Search will focus more on visuals, videos and regional-language content for Indian audiences, the company said at its annual event in New Delhi yesterday. For starters, Google is bringing its multi-search feature to India in English, with Hindi support to follow next year. There are also plans to expand multi-search to other Indian languages.
Multi-search is a new feature that lets users add a text query to a visual search made via Google Lens. For instance, if you search on Google using an image of a cloth pattern, you can add text asking it to find a dress in that particular pattern.
“We see multi-search being used in many ways. We talked about searching for a particular flower with Google Lens. But what if I want to know where to buy one, right? Or how to take care of it? So we are working in the US on the ‘multi-search near me’ feature and will bring it to India as well,” Elizabeth Reid, Vice President of Search at Google, told indianexpress.com.
She added that expanding multi-search to other languages is not an easy task, as Google must ensure that the content shown matches the query. “You have to consider not only the quality of understanding the language but also the ability of search to join the images and show the underlying content,” she explained.
For Google, the emphasis on enhancing search results for visual queries makes sense in a market like India, where such queries dominate. Globally, Google sees about 8 billion queries a month on its Lens tool, and India leads in this particular segment.
More importantly, Google sees multi-search as an avenue to answer other kinds of queries in a richer way. According to Reid, users also rely on multi-search to look for new products, plants, and food.
Google is also bringing its ‘search in video’ feature to the main app in India. The feature is currently in a global pilot. Now, if users in India see a video result in Google Search, there will be a button next to it that says ‘Search in Video’. A user just has to press that button and type a few keywords to jump to an exact point in the video.
“You’ll hear stories of people who go and then look at the video transcription to find where something is. Yes, you can move back and forth, but it is painful. Say if you were listening to an earnings call, could you find where they talked about this topic? Or listening to a professor’s video, and you want to dig into one particular part of the material to learn more about it,” Reid explained, adding that this was part of the efforts to make it easier to access information.
According to Puneesh Kumar, General Manager for Search at Google India, the search engine relies on the video’s transcription to help users jump to the exact point they searched for, with artificial intelligence used to ensure accuracy. “Typically, you can index the transcription in Search. But what if it’s not an exact keyword match? It will be a synonym. Search is meant to understand what I was looking for, maybe not the exact word that I was looking for. So that’s where the intelligence comes in and what AI is powering,” Kumar said.
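The idea Kumar describes can be pictured with a toy sketch: index a timestamped transcript, then match a query against it while tolerating synonyms rather than demanding exact keywords. This is purely illustrative and not Google's implementation; the transcript data and synonym table below are invented, and production systems would use learned semantic models rather than a hand-written synonym list.

```python
# Toy sketch (not Google's actual system): find the timestamped segment
# of a video transcript that best matches a query, allowing synonym
# matches rather than exact keyword matches.

# Hand-written synonym table, standing in for a learned semantic model.
SYNONYMS = {
    "revenue": {"sales", "earnings", "income"},
    "profit": {"margin", "earnings"},
}

def expand(word):
    """Return the word plus any synonyms we know for it."""
    return {word} | SYNONYMS.get(word, set())

def search_transcript(transcript, query):
    """Score each segment by overlap between the expanded query terms
    and the segment's words; return the best segment's timestamp."""
    query_terms = set()
    for w in query.lower().split():
        query_terms |= expand(w)
    best_time, best_score = None, 0
    for timestamp, text in transcript:
        score = len(query_terms & set(text.lower().split()))
        if score > best_score:
            best_time, best_score = timestamp, score
    return best_time

# Invented earnings-call transcript for illustration.
transcript = [
    ("00:00", "welcome to the quarterly call"),
    ("04:30", "our sales grew twelve percent this quarter"),
    ("09:15", "questions from analysts"),
]
print(search_transcript(transcript, "revenue growth"))  # 04:30
```

Here the query “revenue growth” contains no word that appears in the transcript, but the synonym expansion maps “revenue” to “sales”, so the search still lands on the right moment in the call, which is the kind of non-exact matching Kumar attributes to AI.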
Another change in search results is the growing emphasis on regional content, which is an India-first feature. Google will now start showing results in the regional language, along with English, when a user searches for any topic. This will begin with Hindi first, with Tamil, Telugu, Marathi, and Bengali support to be added in the coming year.
“The aim is to try and help the user as much as possible. We learn as we go, and we will continue to improve that experience. We are launching it with Hindi and then looking at expanding into other languages over time,” Kumar said. Google says it will look at signals such as geography and a user’s language preference when deciding which results to surface.
“Most people will benefit from it. But we’re always working on giving an opt-out. So people can say I don’t prefer it or it doesn’t apply to me,” he added.
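Taken together, the signal-based approach and the opt-out Kumar mentions amount to a simple decision rule. The sketch below is a made-up illustration of that rule only; the signal names and thresholds are invented, and Google has not described its actual logic.

```python
# Toy illustration (not Google's actual logic): decide whether to show
# bilingual Hindi + English results based on simple signals such as the
# user's region and stated language preferences, honouring an opt-out.

def show_bilingual_results(region, preferred_languages, opted_out=False):
    """Show Hindi alongside English when the signals suggest it would
    help, unless the user has opted out of bilingual results."""
    if opted_out:
        return False
    return region == "IN" and "hi" in preferred_languages

print(show_bilingual_results("IN", ["hi", "en"]))            # True
print(show_bilingual_results("IN", ["hi"], opted_out=True))  # False
```

In practice many more signals would feed such a decision, and as Kumar notes, the experience is expected to improve as Google learns from real usage.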
Google is also looking at helping people with speech impairments who may find it hard to communicate. It announced that Project Relate, which helps people with non-standard speech communicate using their voice with the help of AI and machine learning, will be available as a pilot in Hindi in 2023. There’s no confirmation of when this will roll out to all such users, but the pilot will be used to gather feedback on how to improve the technology.
At its launch event, Google showed how it tested this with one user, a woman named Chandani Kumari, who faces such an issue. In the video, Kumari used the Project Relate app to communicate with others, as the app would translate what she said into standard speech that others could understand.
When asked when the project would be moved out of beta-testing, Reid stressed this would take some time. “We usually learn tremendous amounts from a pilot. When you release it at scale, you want to ensure you adjust it. The pilot may tell us we have a little bit of work to do or we have a lot of work to do before we launch,” she added.
When asked about the future of Search and whether something like a ‘ChatGPT’ is what users could expect, Reid stressed that while AI would make it more natural, Google was also conscious of its responsibility around the issue. “Google has been the pioneer of much of the underlying technology for AI. We want to be bold and innovative. But we also think it’s essential to be responsible,” she said, adding that the latter would be increasingly important as AI advances further.