Google Search Console (formerly Google Webmaster Tools) is a powerful, free product from Google that offers many tools for improving the SEO of your website. Here are a few of the best.
Search Analytics
One of the most powerful SEO tools available in Google Search Console is “Search Analytics”. This report is found under “Search Traffic” on the Search Console menu and shows you how your website is performing in Google search results. The data in this report can be used to show you many useful things. The three we will focus on today are:
- How each of your webpages ranks in Google searches.
- How your website ranks for different keywords.
- The percentage of users who click through to your website.
Find Out How Your Webpages Rank In Google
To find out how your webpages rank in Google search results, first click “Pages” on the top bar and then check off “Impressions” and “Position”.
This will show you a list of your webpages, how often they show up in consumer searches (impressions), and where they rank (position). The number listed in the “Position” column is the average position of that page in search results across all searches in which it appeared.

Filtering your report by page allows you to see how your SEO changes have affected your ranking over time.
Use this report to see how key pages on your website are performing. You can then begin working on improving the SEO of any webpages that aren’t ranking as well as you would hope. For tips on how to improve the SEO of your webpages, download our Ultimate Guide To Small Business SEO.
Going beyond this basic report, Google Search Console also allows you to filter your results by Country, Device and Dates. Use these filters to show you more relevant data such as:
- How well your pages rank in your country vs. the rest of the world. Unless you sell all over the world, you should only really be interested in ranking highly in your area.
- Whether you rank higher on desktop searches than you do on mobile. If you do, your website could have issues with mobile performance that are affecting its SEO.
- How your SEO has changed over time. Use the “Compare date ranges” filter to measure your website’s performance across two periods.

Comparing date ranges allows you to see how your SEO efforts have changed your ranking.
Find Out How Your Website Ranks For Different Keywords
To see how your website ranks for different keywords click on “Queries”, leaving “Impressions” and “Position” checked.
This gives you a list of all of the keywords (aka “search queries”) that at least one of your webpages ranked for. You can use this report to:
- See how you rank for keywords that are essential to the success of your business. You can then begin optimizing your website for any keywords that you’re not ranking well for.
- Find new keyword ideas. Look through your list and see if there are any good keyword phrases that you hadn’t thought of. You can then begin optimizing your pages for these keywords and use them as targets for online ads.
As with the Pages report, you can also filter your results by country, device, and dates. The date filter is especially useful for showing you how your ranking for target keywords has changed over time. This gives you a great indication of whether or not your SEO efforts are actually working.
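If you would rather pull this data programmatically than through the dashboard, the same report is exposed through Google’s Search Console API. Below is a minimal sketch in Python, assuming you have installed google-api-python-client and added a service account as a user on your property; the key file, site URL, and dates are placeholders, not real values:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder: a service-account key file whose account has been
# added as a user on your Search Console property.
creds = service_account.Credentials.from_service_account_file(
    'search-console-key.json',
    scopes=['https://www.googleapis.com/auth/webmasters.readonly'],
)
service = build('webmasters', 'v3', credentials=creds)

# Ask for the same data as the Queries report: one row per search
# query, with impressions, clicks, CTR and average position.
response = service.searchanalytics().query(
    siteUrl='https://www.example.com/',  # placeholder property URL
    body={
        'startDate': '2017-01-01',       # placeholder date range
        'endDate': '2017-01-31',
        'dimensions': ['query'],         # use ['page'] for the Pages report
        'rowLimit': 100,
    },
).execute()

for row in response.get('rows', []):
    print(row['keys'][0], row['impressions'], round(row['position'], 1))
```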
See The Percentage Of People That Click Through To Your Website
You should now have a pretty good idea of how well your website ranks in Google. How well are those rankings translating into traffic for your website, though? The Search Analytics report within Google Search Console can help there too!
To see how efficient you are at turning impressions into website visitors, click on “Pages” and check “Impressions” and “CTR”.

Filter your report by CTR to see which of your webpages are underperforming. Ignore pages with fewer than 100 impressions for now; check back to see if their CTR improves as more people see the page.
This shows you the click-through rate (CTR) of all of your webpages. Look for pages with a low CTR (anything under 1% should definitely be looked at) and adjust their meta information to make them more appealing to searchers.
How To Improve The CTR Of Your Webpages
Meta information is additional information about the content of a webpage that is not displayed on the webpage itself. Instead, it is added to the code of the webpage to give search engines more information about the page. This information is then used for things like building your listing on a search engine results page (SERP). While this may seem complex, most website builders and CMSs (like WordPress, Wix or Shopify) have fields where you can easily add custom meta tags to your webpages.
The two meta tags that you should focus on are the “title tag” and the “meta description” tag. These are what search engines use to make up your listing. Think about these two meta tags as an ad. Try to sell consumers on why they should click through to your site. What will they learn? Why is your site the one to answer their question? What makes your company unique? Answering these types of questions will help improve the CTR of your webpages. Like any good ad, try to include keywords and a strong call to action within your meta tags.
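As a quick illustration, here is roughly how these two tags sit in the head of a page’s HTML; the wording below is invented for the example:

```html
<head>
  <!-- The title tag becomes the headline of your Google listing -->
  <title>Affordable Bookkeeping For Small Businesses | Example Co.</title>

  <!-- The meta description becomes the snippet under the headline.
       Treat it like ad copy: keywords plus a strong call to action. -->
  <meta name="description" content="Example Co. takes bookkeeping off
    your plate so you can focus on growing your business. Book your
    free consultation today.">
</head>
```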
Adding A Sitemap
XML Sitemaps are files that give search engines information about how your site is organized. This information includes details about your page content, images, video, and how often your site is updated.
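For example, a minimal sitemap describing a single page looks something like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's address -->
    <loc>https://www.example.com/services/</loc>
    <!-- When the page was last updated -->
    <lastmod>2017-06-01</lastmod>
    <!-- A hint about how often the page changes -->
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```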
Search engines work by sending out automated robots (known as either “spiders” or “crawlers”) to find and decipher webpages. These crawlers follow links between websites and document each webpage they come across. Information about these webpages is then stored in the search engine’s index alongside similar pages. When a consumer searches for information, the search engine looks to this index to see which webpages best match the consumer’s search.
While submitting a sitemap isn’t mandatory for your website to appear on search engine results pages, it does allow Google to easily crawl and index your website. The easier it is for Google to crawl your website, the quicker it can see when you add or change content on your website.
Having your website quickly indexed has been especially important since Google’s Panda update. With this update, Google began punishing websites that post duplicate content (content that another website has already published). Google displays these websites lower in search results than the original publisher.
At the time of the Panda update, there was a serious problem with websites simply copying other websites’ content and publishing it as their own. Panda was meant to stop the practice by only ranking the original publisher of content.
While Panda did help, there are still websites that are full of duplicate content. If one of them copies content from your site and is indexed first, Google may treat that site as the original publisher and punish your site for having “duplicate content”.
How To Generate and Submit A Sitemap
XML Sitemaps can be generated using free online tools. Once generated, Search Console allows you to submit your sitemap directly to Google.
From your Search Console dashboard, select the site you want to submit a sitemap for. On the left, you’ll see an option called “Crawl.” Under “Crawl,” there will be an option marked “Sitemaps.”
Once there, click “Add/Test Sitemap” on the top right of the page. You will then need to enter the URL of your sitemap.
Click “Test” to ensure that your sitemap is working properly. If everything checks out fine, click “Submit”.
Checking Your Robots.txt File
Robots.txt is a text file placed in the root folder of a website that tells search engines how to crawl and index it. Not every page on a website needs to be indexed. If there are pages on your website you want to keep out of search engines, you can use robots.txt to block search engines from indexing and displaying those pages in their listings.
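For example, a simple robots.txt that keeps crawlers out of a private folder while leaving the rest of the site open might look like this (the folder and sitemap URL are placeholders):

```
# Rules below apply to all crawlers
User-agent: *
# Keep the admin area out of search engine listings
Disallow: /admin/

# Tell crawlers where to find your sitemap
Sitemap: https://www.example.com/sitemap.xml
```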
An improperly set up robots.txt file can seriously hurt your website’s SEO. Blocking the wrong file or not following the correct syntax can inhibit Google’s ability to understand your website, or prevent it from showing in search results altogether!
If you want to check your robots.txt file to see exactly what it is and isn’t allowing, log into Search Console. Under “Crawl”, click on the “Robots.txt Tester”. On this screen, you will see a copy of your robots.txt file. You will also be shown any warnings or errors in it that may be costing you traffic. If you have any warnings or errors, feel free to contact us. We will be happy to help walk you through how to fix it.
Infographic: “Why Every Small Business Needs To Take SEO Seriously”
Fetch As Google
Fetch as Google is a tool that allows you to see how Google sees specific pages on your site and manually request pages be indexed. It is very useful if your site is new, has undergone significant changes, or if it has pages that aren’t linked together well.
If you’ve made significant changes to a website, the fastest way to get the updates indexed by Google is to submit them manually. This will allow any changes to things such as on-page content or title tags to appear in search results as soon as possible.
To see how Google sees your website click “Crawl” on the left side menu of Search Console, then choose “Fetch as Google.”
In the address bar at the top, enter the URL of the page that you want to index. If you want to fetch your homepage, leave the centre box blank. Once you’ve entered the page you want to index, you have two options: you can either “Fetch” the page or “Fetch and Render” it.
“Fetching” a URL will show you the HTTP response that Google gets from the site. “Rendering” it will show you how Google thinks your page looks. For this demo, click the “Fetch and Render” button.

The render tab shows you how Google sees your webpage. If it looks different from what you see on your own screen, your robots.txt file may be blocking a file Google needs to interpret the page properly.
How To Manually Submit Your Webpages For Indexing
After Google has finished fetching the page, you will see a status, which should say either “Complete” or “Partial”. “Partial” indicates that the fetch was successful but that some files were blocked by your robots.txt file. If you received a different status, click here for a full list of what the different statuses mean.
Beside the status, you will notice a “Submit to Index” button. Click this button to request that Google index the page.
You will be given the option to either “Crawl Only This URL,” which is the option you want if you’re only fetching/submitting one specific page, or “Crawl This URL and its Direct Links,” which you’ll want if you need to index entire sections of your website. The latter is useful if you’ve made changes to large chunks of your website.
Click “Go”, wait for the indexing to complete, and you’re done! Google has now begun indexing your new content. The changes should start appearing in Google within the next couple of days.
Crawl Errors
Errors on your website can be a big problem. Missing pages or blocked resources can cost you significantly in terms of SEO. The reality is that most websites have at least a few errors on them, but few owners know about them; they are usually only discovered after a customer points them out. Instead of waiting and losing web traffic, use Google Search Console to surface these errors.
To check for errors on your website, go to “Crawl Errors” in the “Crawl” section.
This report is split into two sections, Site Errors and URL Errors:
- Site errors are serious problems with your website that require immediate attention. They indicate problems with your entire website such as missing files, server problems or connectivity issues. If your website has any site errors, contact your web developer immediately to resolve them.
- URL errors are errors that only affect one page of your website. Typically these errors happen after large changes to your website or if you’re blocking specific pages with your robots.txt file. If you have made significant changes to your site, mark your URL errors as fixed and check back in a couple of days. If they’ve returned, investigate further.
Wrapping Up
Google Search Console is a powerful SEO tool. It allows you to learn a great deal about how Google sees and ranks your website. This free information can be used to optimize your site and rank even higher in search engines.
If you want to connect Google Search Console to your website but aren’t sure how, consult this handy guide.
We Help Small Businesses Grow Online
Tired of guessing how to market your business online? Let us design a customized marketing strategy that will work for your business goals, will speak to your target market, and will fit within your budget.
Start Growing Your Business Today
We are a digital marketing agency that helps small business owners across North America grow their businesses using progressive digital marketing and website design tactics. How can we help you do the same?
Lure Marketing
Lure Marketing is a digital marketing and website design agency with offices in both Kingston and Belleville, Ontario. We help small businesses navigate and thrive in the online world.
Get In Touch