Do you really need to submit your website to search engines?
Google and other search engines weren’t built to rely on manual submissions; that’s why they crawl the web.
Not familiar with crawling? It’s when search engines look for new links on websites and subsequently “follow” them. If a newly‐discovered link leads to something useful, that page is then added to the index.
Matt Cutts explains more about crawling and how it works in this video.
It’s also theorized that Google looks to many other data sources, such as Chrome browser usage statistics and domain registration data to aid in their never‐ending pursuit of new websites and webpages.
To summarize, this means that search engines are pretty good at discovering new websites and webpages on their own, provided that they’re linked to from somewhere on the web. (We’ll talk more about the importance of links later!)
Why you should still submit your website to search engines
Here are just a few reasons why manual submissions are still a “thing”:
- It’s better to be safe than sorry. Let’s face it, search engines will probably be able to find your website regardless of whether or not you choose to manually submit it. But is “probably” good enough? Submitting your website only takes a minute or two, so why risk it?
- Search engines can’t figure out everything via crawling. If you submit your website via the methods discussed below, you’ll have the opportunity to supply Google and Bing with some useful information about your website. For example, you can tell them how important you deem each of your pages to be. They couldn’t obtain this information from crawling alone.
- It helps to improve your website. Google and Bing each offer some insights as to how they view your website in their respective “management dashboards” (more on this later!). There are also various tools for testing your web pages, and they’ll alert you if and when potential problems or errors occur on your site.
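As an illustration of the “importance” point above: sitemaps support an optional `<priority>` element, a value from 0.0 to 1.0 that hints at how important you consider a page relative to the rest of your site (search engines treat it as a hint, not a guarantee). The URL below is a placeholder:

```xml
<url>
  <loc>https://yourdomain.com/important-page/</loc>
  <priority>0.9</priority>
</url>
```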
Having said that, be aware that submitting your website to Google and getting it indexed is only part of the battle. The real difficulty often lies in ranking for your desired search terms.
Don’t worry though, we’ve included some advice on how to do that in this very article.
But let’s not get ahead of ourselves.
It’s very easy to submit websites to Google (and other search engines).
Just watch our quick video tutorial or follow the written steps below.
What is a sitemap? It’s a file that lists all the pages on a website, most commonly in XML format. The sitemap on my personal blog, for example, lists all of my blog posts.
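A minimal XML sitemap, using placeholder URLs, looks roughly like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2018-05-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/blog/my-first-post/</loc>
    <lastmod>2018-05-07</lastmod>
  </url>
</urlset>
```

Each `<url>` entry needs a `<loc>` (the page’s full URL); `<lastmod>` and other elements are optional.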
Your sitemap is usually located at yourdomain.com/sitemap.xml. If you don’t see it there, check your robots.txt file by visiting yourdomain.com/robots.txt; this will usually list the sitemap URL.
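If you’d rather check programmatically, here’s a quick Python sketch that pulls `Sitemap:` directives out of robots.txt content (the file contents below are a made-up example; in practice you’d fetch the file from your own domain):

```python
# Extract "Sitemap:" directives from robots.txt content.
# This robots.txt is a hypothetical example, not a real file.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Sitemap: https://yourdomain.com/sitemap.xml
"""

# The directive is case-insensitive, so compare lowercased prefixes.
sitemaps = [
    line.split(":", 1)[1].strip()
    for line in robots_txt.splitlines()
    if line.lower().startswith("sitemap:")
]
print(sitemaps)  # ['https://yourdomain.com/sitemap.xml']
```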
If you can’t find a sitemap in either location, you’ll need to create one.
There are a few ways to do this. If you’re using WordPress (or another CMS), you can use one of many plugins for generating sitemaps automatically. We recommend Yoast SEO. For static websites, this sitemap generator works well. If you prefer a more manual/customized approach, follow this guide by Screaming Frog.
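If you do go the manual route, a basic sitemap is simple enough to generate yourself. Here’s a rough Python sketch (the URLs are placeholders, and a real site would likely also want `lastmod` dates):

```python
# Generate a bare-bones XML sitemap from a list of page URLs.
# These URLs are hypothetical placeholders.
urls = [
    "https://yourdomain.com/",
    "https://yourdomain.com/about/",
    "https://yourdomain.com/blog/first-post/",
]

# One <url><loc>...</loc></url> entry per page.
entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)
print(sitemap)
```

Save the output as sitemap.xml in your site’s root directory so it’s reachable at yourdomain.com/sitemap.xml.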
Found your sitemap? Good. You now need to submit it via Search Console.
Search Console > select your property > Sitemaps > paste in your sitemap URL > hit “submit”
Note that I’m using the new beta version of Search Console. Here’s how to do it in the old one:
Search Console > select your property > Crawl > Sitemaps > Add/test sitemap > paste in your sitemap URL > hit “submit”
For those with multiple sitemaps (this may be the case when using a plugin like Yoast, or if you have a large site), just rinse and repeat this process.
What if I just want to submit an individual webpage to Google?
Google previously provided a way to do this via their URL submission tool, which was discontinued in 2018.
However, if you’re still using the old version of Search Console, you can submit individual URLs using the “Fetch as Google” tool. To do that, go to:
Search Console > Crawl > Fetch as Google > paste in your URL > Fetch
Next, you need to hit the “Request indexing” button, which brings up a modal box like this:
Tick that you’re not a robot (you’re not, are you!?), then choose whether you want Google to crawl just this URL, or this URL and its direct links.
That’s it. Job done.
But what about in the new Search Console? Can you still do this?
Well, it seems that Google has removed the “Fetch as Google” tool and replaced it with the URL inspection tool. The main purpose of this tool is to inspect URLs to uncover issues, but it may also work the same way as the old “Fetch as Google” tool when it comes to (re)crawling.
The problem is that Google’s instructions on the use of this tool don’t make much sense.
To submit a URL to the index:
1. Inspect the URL using the URL Inspection tool.
2. Select Request indexing. The tool will run a live test on the URL to see whether it has any obvious indexing issues, and if not, the page will be queued for indexing. If the tool finds issues with the page you should try to fix them.
Inspecting the URL is easy enough; there’s just no “request indexing” button.
So, right now, we’re not sure how this works. However, this tool is still in beta, so chances are that Google will add this button at a later date.
For now, however, you should probably just stick to the “Fetch as Google” tool in the old version of Search Console.