Google has announced a new AI-powered feature that lets you search within videos using Google Lens. Unveiled at Google I/O 2024, it brings video search into Lens: point your camera at a video, ask a question, and get AI-generated answers based on the scene. This makes Google Search even more powerful and lets you pull specific moments out of videos faster than ever.
The technology behind this is Google’s Gemini AI model, which breaks videos down frame by frame, detecting motion, objects, and even audio cues to answer questions. During the demo, Google’s Senior Director of Product Management, Rose Yao, showed how this works by recording a video of a vinyl record player. Without using any specific terminology, Google Lens identified the issue with the tonearm and suggested solutions.
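Google has not published how Gemini processes video internally, but the "frame by frame" description maps onto a common pattern: sample frames at a fixed interval, detect what each frame contains, and answer the question from the best-matching frame. The sketch below is a minimal, hypothetical illustration of that pipeline; the `answer_question` matcher is a toy stand-in for a multimodal model call, not Google's API.

```python
# Hypothetical sketch of frame-by-frame video Q&A, assuming a simple
# sample-then-match pipeline. Not Google's actual implementation.

def sample_timestamps(duration_s: float, interval_s: float = 1.0) -> list[float]:
    """Return the timestamps (in seconds) at which frames would be sampled."""
    t, stamps = 0.0, []
    while t < duration_s:
        stamps.append(round(t, 3))
        t += interval_s
    return stamps

def answer_question(frames: list[dict], question: str) -> str:
    """Toy stand-in for a multimodal model: pick the frame whose
    detected objects best overlap with words in the question."""
    words = set(question.lower().split())
    best = max(frames, key=lambda f: len(words & set(f["objects"])))
    return f"At {best['t']}s: saw {', '.join(best['objects'])}"

# Toy detections standing in for real decoded video frames
frames = [
    {"t": 0.0, "objects": ["turntable", "vinyl"]},
    {"t": 1.0, "objects": ["tonearm", "stylus"]},
]
print(answer_question(frames, "what is wrong with the tonearm"))
# → At 1.0s: saw tonearm, stylus
```

In a real system the per-frame "objects" would come from a vision model, and the answer would be generated by the language model rather than templated; the sketch only shows the sampling-and-matching shape of the idea.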
Search with Google Lens
Google Lens was originally an image search tool; now it can analyze and extract information from video content. Point your smartphone camera at a video, on screen or in the real world, and search within it. You can identify objects, get links to related content, or ask questions about what’s happening in the video. Imagine watching a documentary and instantly identifying a historical figure or a piece of art simply by pausing the video and pointing your camera at the screen.
This is powered by the same AI that powers Google’s broader search. The AI interprets visual cues, recognizes patterns, and returns search results in real time. The feature will roll out globally to Android and iOS users.
AI in Video Search
The AI behind this doesn’t stop at recognizing objects in videos. It also enhances multisearch, a feature that lets you search with text and images at the same time. This is an evolution of Google’s efforts to make it easier for people to find and interact with information in an increasingly multimedia world. The Lens functionality also works with voice commands, so you can ask questions aloud about the video content you’re looking at.
For instance, suppose you’re watching a wildlife video and ask, “What’s that bird?” Lens will use AI to identify the bird in the frame and give you information on its habitat and behavior, along with links to more resources.
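The core idea of multisearch is fusing what the camera sees with what the user asks into one query. Google's request format isn't public, so the sketch below is purely illustrative: it assumes a list of labels detected in the paused frame and a hypothetical `fuse_query` helper that merges them with the spoken question.

```python
# Hypothetical multisearch sketch: combine visual context from the
# paused frame with the user's question. The labels and the helper
# are illustrative stand-ins, not Google's actual API.

def fuse_query(frame_labels: list[str], question: str) -> str:
    """Merge labels detected in the frame with the user's question
    to form a single, context-enriched search query."""
    context = " ".join(frame_labels)
    return f"{question.rstrip('?')} ({context})"

# A paused wildlife frame might yield labels like these
labels = ["bird", "red plumage", "forest canopy"]
print(fuse_query(labels, "What's that bird?"))
# → What's that bird (bird red plumage forest canopy)
```

The point of the sketch is that neither modality alone is enough: the text disambiguates intent, while the visual labels pin the query to what is actually on screen.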
Implications Across Industries
The potential impact spans many industries. In education, teachers and students can jump to key moments in instructional videos instantly. Businesses can use the tool for product demos, letting customers search for specific features or information in promotional videos. Security professionals can scan hours of surveillance footage and surface key moments in seconds.
The possibilities also extend to retail and shopping experiences. With Lens, you can pause a video of a product (e.g. clothing or furniture) and search for similar items, read reviews, or find where to buy. This kind of video search integration could reshape the future of e-commerce.
AI-Organized Search Results
Along with Lens video search, Google is introducing AI-powered enhancements to its traditional search results.
This new system organizes content more intelligently, grouping text, videos, and images based on what you search for. Results are curated not just for relevance but for diversity, giving you a fuller view of the available content. (This is already happening with recipe and meal-inspiration searches on mobile.)
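Google hasn't described its ranker, but one simple way to get "diversity across media types" rather than pure relevance ordering is to interleave the top results from each type. The round-robin sketch below illustrates that idea only; it is an assumption about the technique, not Google's algorithm.

```python
# Hypothetical sketch of diversity-aware result grouping: round-robin
# across media-type buckets so every type surfaces early in the page.
from itertools import zip_longest

def interleave_by_type(buckets: dict[str, list[str]]) -> list[str]:
    """Interleave results from each media-type bucket, one per round,
    skipping buckets that have run out of results."""
    rounds = zip_longest(*buckets.values())
    return [r for rnd in rounds for r in rnd if r is not None]

results = {
    "text":   ["article-1", "article-2"],
    "video":  ["video-1"],
    "images": ["image-1", "image-2", "image-3"],
}
print(interleave_by_type(results))
# → ['article-1', 'video-1', 'image-1', 'article-2', 'image-2', 'image-3']
```

The trade-off this illustrates: the second-best article may rank below the best video, sacrificing a little raw relevance so that the user sees every content type near the top.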
But the broader implication of AI-organized search results is more efficient discovery across all types of media. For people who consume content in many forms (blog posts, videos, forums), this new search will help ensure no resource goes unseen.