When Google first launched 25 years ago, it was far from the first search engine. But quickly, Google Search became known for its ability to help connect people to the exact information they were looking for, faster than they ever thought possible.
Over the years, we’ve continued to innovate and make Google Search better every day. From creating entirely new ways to search, to helping millions of businesses connect with customers through search listings and ads (starting with a local lobster business advertising via AdWords in 2001), to having some fun with Doodles and Easter eggs, it’s been quite a journey.
For Search’s 25th birthday, we’re looking back at some of the milestones that made Google more helpful in the moments that matter, and that played a big role in where Google is today.
2001: Google Images
When Jennifer Lopez attended the 2000 Grammy Awards, her daring Versace dress became an instant fashion legend — and the most popular query on Google at the time. Back then, search results were just a list of blue links, so people couldn’t easily find the picture they were looking for. This inspired us to create Google Images.
2001: “Did you mean?”
“Did you mean,” with suggested spelling corrections, was one of our first applications of machine learning. Previously, if your search had a misspelling (like “floorescent”), we’d help you find other pages that had the same misspelling, which aren’t usually the best pages on the topic. Over the years we’ve developed new AI-powered techniques to ensure that even if your finger slips on the keyboard, you can find what you need.
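The core idea is easy to sketch: compare a misspelled query against known words and suggest the closest one. Here’s a minimal illustration in Python using edit distance over a hand-picked toy vocabulary; our production systems are far more sophisticated, so treat this as a sketch of the concept, not the real thing.

```python
# A minimal "did you mean?" suggester: find a known word a short edit
# distance away from the misspelling. Vocabulary is a toy example.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution
            ))
        prev = curr
    return prev[-1]

def did_you_mean(query: str, vocabulary: list[str], max_edits: int = 2) -> str | None:
    """Return the closest known word within max_edits, else None."""
    best = min(vocabulary, key=lambda w: edit_distance(query, w))
    return best if edit_distance(query, best) <= max_edits else None

print(did_you_mean("floorescent", ["fluorescent", "florescence", "flour"]))
# -> "fluorescent"
```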
2002: Google News
During the tragic events of September 11, 2001, people struggled to find timely information in Search. To meet the need for real-time news, we launched Google News the following year with links to a diverse set of sources for any given story.
2003: Easter eggs
Googlers have developed many clever Easter eggs hidden in Search over the years. In 2003, one of our first Easter eggs gave the answer to life, the universe and everything, and it’s still available on Search today. Since then, millions of people have turned their pages askew, done a barrel roll, enjoyed a funny recursive loop and celebrated moments in pop culture.
2004: Autocomplete
Wouldn’t it be nice to type as quickly as you think? Cue Autocomplete: a feature first launched as “Google Suggest” that automatically predicts queries in the search bar as you start typing. Today, on average, Autocomplete reduces typing by 25% and saves over an estimated 200 years of typing time per day.
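At its simplest, query prediction is prefix matching against popular past queries, ranked by how often they’re searched. Here’s a toy sketch of that baseline, with a made-up query log; the real system considers far more than raw popularity.

```python
# A minimal sketch of prefix-based query completion over a toy query log.
# Real Autocomplete draws on many more signals (freshness, language,
# location); popularity-ranked prefix matching is just the baseline idea.

from collections import Counter

query_log = Counter({
    "weather today": 120,
    "weather tomorrow": 80,
    "west side story": 45,
    "web browser": 30,
})

def suggest(prefix: str, k: int = 3) -> list[str]:
    """Return the k most popular logged queries that start with prefix."""
    matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [q for q, _ in matches[:k]]

print(suggest("we"))
# ['weather today', 'weather tomorrow', 'west side story']
```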
2004: Local information
People used to rely on traditional phone books for business information. The web paved the way for local discovery, like “pizza in Chicago” or “haircut 75001.” In 2004, Google Local added relevant information to business listings, like maps, directions and reviews. In 2011, we added click-to-call on mobile, making it easy to get in touch with businesses while you’re on the go. On average, local results in Search drive more than 6.5 billion connections for businesses every month, including phone calls, directions, ordering food and making reservations.
2006: Google Translate
Google researchers started developing machine translation technology in 2002 to tackle language barriers online. Four years later, we launched Google Translate with text translations between Arabic and English. Today, Google Translate supports more than 100 languages, with 24 added last year.
2006: Google Trends
Google Trends was built to help us understand trends on Search with aggregated data (and create our annual Year in Search). Today, Google Trends is the world’s largest free dataset of its kind, enabling journalists, researchers, scholars and brands to learn how searches change over time.
2007: Universal Search
Helpful search results should include relevant information across formats, like links, images, videos, and local results. So we redesigned our systems to search all of the content types at once, decide when and where results should blend in, and deliver results in a clear and intuitive way. The result, Universal Search, was our most radical change to Search at the time.
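As a rough sketch, once every vertical’s relevance scores are on a comparable scale, blending reduces to merging ranked lists. The toy Python below assumes that comparability outright; in practice, deciding when and where results should blend was the hard engineering problem.

```python
# A minimal sketch of blended ("universal") results. It assumes each
# vertical's relevance scores are already comparable across verticals;
# making them comparable is simply assumed away here, and all names and
# scores are invented.

from dataclasses import dataclass

@dataclass
class Result:
    vertical: str  # e.g. "web", "images", "news"
    title: str
    score: float   # assumed normalized to a shared scale

def blend(per_vertical: dict[str, list[Result]], k: int = 10) -> list[Result]:
    """Merge every vertical's ranked list into one list, best first."""
    merged = [r for results in per_vertical.values() for r in results]
    return sorted(merged, key=lambda r: r.score, reverse=True)[:k]

page = blend({
    "web":    [Result("web", "How to make pizza dough", 0.94)],
    "images": [Result("images", "Photos of Neapolitan pizza", 0.91)],
    "news":   [Result("news", "Local pizzeria wins award", 0.78)],
})
print([f"{r.vertical}: {r.title}" for r in page])
```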
2008: Google Mobile App
With the arrival of Apple’s App Store, we launched our first Google Mobile App on iPhone. Features like Autocomplete and “My Location” made search easier with fewer key presses, and were especially helpful on smaller screens. Today, there’s so much you can do with the Google app — available on both Android and iOS — from getting help with your math homework with Lens to accessing visual translation tools in just a tap.
2008: Voice Search
In 2008, we introduced Voice Search on the Google Mobile App, letting people search by speaking with just the touch of a button; we expanded it to desktop in 2011. Today, search by voice is particularly popular in India, where the share of people making daily voice queries is nearly twice the global average.
2009: Emergency Hotlines
Following a suggestion from a mother who had a hard time finding poison control information after her daughter swallowed something potentially dangerous, we created a box with the poison control hotline at the top of the search results page. Since then, we’ve elevated emergency hotlines for other critical moments, like suicide prevention.
2011: Search by Image
Sometimes, what you’re searching for can be hard to describe with words. So we launched Search by Image, which lets you upload a picture or an image URL to find out what it is and where else it appears on the web. This update paved the way for Lens later on.
2012: Knowledge Graph
We introduced the Knowledge Graph, a vast collection of people, places and things in the world and how they’re related to one another, to make it easier to get quick answers. Knowledge Panels, the first feature powered by the Knowledge Graph, give you a quick snapshot of information about topics like celebrities, cities and sports teams.
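A common way to model a knowledge graph, and a reasonable mental model here, is as (subject, relation, object) triples. The hand-written sketch below shows that structure and a knowledge-panel-style lookup; the entities and relation names are ours, purely for illustration.

```python
# A tiny sketch of the triple structure behind knowledge graphs:
# (subject, relation, object) facts, hand-written for illustration.
# The real Knowledge Graph holds billions of facts; the lookup below
# just mimics how a knowledge panel gathers what's known about an entity.

triples = {
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "field", "Physics"),
    ("Marie Curie", "award", "Nobel Prize in Chemistry"),
    ("Warsaw", "capital_of", "Poland"),
}

def panel(subject: str) -> dict[str, list[str]]:
    """Collect every known fact about one entity, knowledge-panel style."""
    facts: dict[str, list[str]] = {}
    for s, relation, obj in triples:
        if s == subject:
            facts.setdefault(relation, []).append(obj)
    return facts

print(panel("Marie Curie"))
# e.g. {'born_in': ['Warsaw'], 'field': ['Physics'], 'award': [...]}
# (key order may vary, since the facts are stored in a set)
```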
2015: Popular Times
We launched the Popular Times feature in Search and Maps to help people see the busiest times of the day when they search for places like restaurants, stores and museums.
2016: Discover
By launching a personalized feed (now called Discover), we helped people explore content tailored to their interests right in the Google app, without having to search.
2017: Google Lens
Google Lens turns your camera into a search query: it looks at the objects in a picture, compares them to other images, and ranks those images based on their similarity and relevance to the original picture. Now you can search what you see in the Google app. Today, Lens sees more than 12 billion visual searches per month.
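Conceptually, the ranking step can be pictured as nearest-neighbor search over image embeddings. In this toy sketch, the vision model that would produce the embeddings is assumed away and the vectors are hand-written; only the similarity ranking is shown.

```python
# A toy sketch of the ranking step in visual search: rank an index of
# images by cosine similarity to the query image's embedding vector.

import math

def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def rank_by_similarity(query_vec: list[float],
                       index: dict[str, list[float]], k: int = 3) -> list[str]:
    """Return the k indexed images most similar to the query image."""
    return sorted(index, key=lambda name: cosine(query_vec, index[name]),
                  reverse=True)[:k]

index = {
    "golden_retriever.jpg": [0.9, 0.1, 0.0],
    "labrador.jpg":         [0.8, 0.2, 0.1],
    "tabby_cat.jpg":        [0.1, 0.9, 0.2],
}
print(rank_by_similarity([0.85, 0.15, 0.05], index))
# ['golden_retriever.jpg', 'labrador.jpg', 'tabby_cat.jpg']
```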
2018: Flood forecasting
To help people better prepare for impending floods, we used AI to create forecasting models that predict when and where devastating floods will occur. We started these efforts in India, and today we’ve expanded flood warnings to 80 countries.
2019: BERT
A big part of what makes Search helpful is our ability to understand language. In 2018, we introduced and open-sourced BERT (Bidirectional Encoder Representations from Transformers), a neural network-based technique for training language understanding models. BERT makes Search more helpful by understanding language in context: it considers the words that come before and after a term, not just the term in isolation. After rigorous testing in 2019, we applied BERT to more than 70 languages. Learn more about how BERT works to understand your searches.
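You can try the “full context” idea yourself with the open-sourced BERT checkpoint. This snippet uses the third-party Hugging Face transformers library (our choice for the demo, not part of Search) to have bert-base-uncased fill in a masked word; notice that in the second sentence the decisive clue comes after the blank, which a left-to-right model couldn’t use.

```python
# Illustrating BERT's bidirectional context with the open-sourced
# bert-base-uncased checkpoint, via the Hugging Face `transformers`
# library (pip install transformers torch). The model predicts the
# masked word from the words on BOTH sides of it.

from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

texts = [
    "He deposited the check at the [MASK] before lunch.",
    # Here the disambiguating clue ("of the river") comes AFTER the
    # blank, so only a bidirectional model can use it.
    "They had a picnic on the [MASK] of the river.",
]
for text in texts:
    top = fill_mask(text)[0]  # best prediction: dict with token_str, score
    print(f"{text} -> {top['token_str']} (p={top['score']:.2f})")
```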
2020: Shopping Graph
Online shopping became a whole lot easier and more comprehensive when we made it free for any retailer or brand to show their products on Google. We also introduced the Shopping Graph, an AI-powered, constantly updating dataset of products, sellers, brands, reviews and local inventory that today consists of 35 billion product listings.
2020: Hum to Search
We launched Hum to Search in the Google app, so you’ll no longer be frustrated when you can’t name the tune stuck in your head. This machine learning feature identifies potential song matches after you hum, whistle or sing a melody. You can then explore information on the song and artist.
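One intuition behind melody matching: a tune is identified by the intervals between its notes, not by the key you hum it in. The toy sketch below compares interval sequences directly, with hand-written pitch data; the actual feature relies on machine-learned representations of melodies, so this is purely an illustration.

```python
# A toy sketch of key-invariant melody matching. It assumes the hum has
# already been converted to a pitch sequence (MIDI-style semitone numbers)
# and that melodies are the same length. The core insight: the shape of
# the melody matters, not the key you hum in.

def intervals(pitches: list[int]) -> list[int]:
    """Steps between consecutive notes: the key-invariant melody shape."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def melody_distance(hum: list[int], song: list[int]) -> int:
    """Total difference between two melodic shapes."""
    return sum(abs(a - b) for a, b in zip(intervals(hum), intervals(song)))

songs = {
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
    "Ode to Joy":      [64, 64, 65, 67, 67, 65, 64],
}
hum = [50, 50, 57, 57, 59, 59, 57]  # Twinkle, hummed 10 semitones low
print(min(songs, key=lambda name: melody_distance(hum, songs[name])))
# -> Twinkle Twinkle
```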
2021: About this result
To help people make more informed decisions about which results will be most useful and reliable for them, we added “About this result” next to most search results. It explains why a result is being shown to you and gives more context about the content and its source, based on best practices from information literacy experts. “About this result” is now available in all languages where Search is available.
2022: Multisearch
To help you uncover the information you’re looking for, no matter how tricky, we created an entirely new way to search with text and images simultaneously: Multisearch. Now you can snap a photo of your dining set and add the query “coffee table” to find a matching table. First launched in the U.S., Multisearch is now available globally on mobile, in all languages and countries where Lens is available.
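One way to picture a combined image-and-text query, assuming images and text share an embedding space (a CLIP-style assumption on our part, not a description of Search internals): nudge the photo’s embedding toward the text refinement, then rank candidates as usual.

```python
# A toy sketch of composing an image + text query in a shared embedding
# space. The embeddings below are hypothetical, hand-written vectors.

def combine(image_vec: list[float], text_vec: list[float],
            alpha: float = 0.5) -> list[float]:
    """Blend the photo's embedding with the text modifier's embedding."""
    return [(1 - alpha) * i + alpha * t for i, t in zip(image_vec, text_vec)]

photo_of_dining_set = [0.9, 0.1, 0.0]   # hypothetical image embedding
text_coffee_table   = [0.2, 0.0, 0.9]   # hypothetical text embedding
query_vec = combine(photo_of_dining_set, text_coffee_table)
# query_vec can now be ranked against a product index with the same
# cosine-similarity ranking shown in the Lens sketch above.
```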
2023: Search Labs & Search Generative Experience (SGE)
Every year in Search, we run hundreds of thousands of experiments to figure out how to make Google more helpful for you. With Search Labs, you can test early-stage experiments and share feedback directly with the teams working on them. The first experiment, SGE, brings the power of generative AI directly into Search. You can get the gist of a topic with AI-powered overviews, pointers to explore more, and natural ways to ask follow-up questions. Since launching in the U.S., we’ve rapidly added new capabilities, with more to come.
As someone who’s been following the world of search engines for more than two decades, I find it amazing to reflect on where Google started and how far we’ve come.