Google Search Console (formerly Google Webmaster Tools) is a powerful (and free) SEO service provided by Google. It is a collection of tools and reports that allow you to properly strategize and optimize how your site ranks in search engines. You can use it to monitor your site’s health and performance to maximize your website investment.
Search Console’s built-in reports allow you to see:
- How people find your site.
- How your web pages rank in Google.
- Who is linking to your site.
- Whether any errors are affecting your SEO.
- How Google sees your website.
How To Set Up Google Search Console
To connect your website to Google Search Console, first sign into your Google account. Once signed in, you will be prompted to add the URL of the website you want to link. Type in your website’s URL and click the red “Add A Property” button to add your website.
You will now have to prove to Google that you own your website. This can be done in a few different ways, the most common being either adding a code snippet to the header of your website or uploading a verification file to your site. If this sounds too complex, it is best to consult your web developer.
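For the code-snippet route, Google provides a verification meta tag to paste into your site’s head section. As a hedged illustration, it looks something like this (the content token below is a placeholder; Google generates a unique value for your property):

```html
<head>
  <!-- Placeholder token: Google generates a unique value for your property -->
  <meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN-HERE" />
</head>
```

The tag stays invisible to your visitors; Google simply checks for it when you click “Verify”.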
Once you complete your preferred verification method, click “Verify”. Your website is now connected to Google Search Console! Now that it is connected, how do you use Search Console to improve your SEO?
Using Google Search Console To Improve Your SEO
Google Search Console offers many tools for improving the SEO of your website. Here are a few of the best.
One of the most powerful tools available in Google Search Console is “Search Analytics”. This report shows you how your website performs in Google search results. The data in this report can be filtered to show you many useful things. The three we will focus on today are:
- How your webpages rank in Google searches.
- How you rank for different keywords.
- The percentage of users who click through to your website.
Find Out How Your Webpages Rank In Google
To find out how your webpages rank in Google search results click “Pages” on the top bar and check off “Impressions” and “Position”.
This will show you a list of your top webpages, how often they show up in consumer searches (impressions), and where they ranked (position). The number listed in the “Position” column is the average position at which that page appeared across all searches. Use this report to see which of your pages perform best. To see how an individual page has performed over time, click the “>>” on the far left of that page’s row.
This will give you a graph of the impressions and average rank over the last 28 days. This is especially useful for checking the results of efforts to improve the SEO of specific pages. Check back 30 days after making SEO improvements to see if they’ve had the desired effect. To see page performance over a longer period of time click “Set Date Range” under “Dates”.
Find Out How Your Website Ranks For Different Keywords
To see how your website ranks for different keywords click on “Queries” and leave “Impressions” and “Position” checked.
This gives you a list of all of the keywords (aka “search queries”) that at least one of your webpages ranked for. Use this list to find desirable keywords that you’re not ranking for. Find pages on your website that are relevant to those keywords and optimize them to rank better. If you don’t have any relevant pages, make a new one!
Check back in a month to see if your ranking for that keyword has increased.
See The Percentage Of People That Click Through To Your Website
OK, so now we have a pretty good idea of how we rank in Google; how is that translating into website traffic? For that, most people would turn to Google Analytics and see how much organic traffic they’re getting. While there is nothing wrong with this approach, it is missing something: impressions. Sure, GA can tell you that you’re getting 1,000 people a month from Google search results, but is that good or bad?
To demonstrate what I mean picture two companies; Company A and Company B. Both companies got 100 website visitors from search engines last month. Company A had 2,000 impressions and Company B had 10,000 impressions. Which performed better?
As you can see, Company A converted 5% of their impressions into website visitors (100/2,000) while Company B only converted 1% (100/10,000). This means that if Company B had been as efficient as Company A, they would have generated 400 more website visitors last month! That’s a lot of missed opportunity.
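The math above can be sketched in a few lines (a hedged example: the impression and visitor figures are the hypothetical Company A and Company B numbers, not real data):

```python
def ctr(visitors, impressions):
    """Click-through rate: the fraction of impressions that became visitors."""
    return visitors / impressions

# Hypothetical figures from the Company A / Company B example
company_a = ctr(100, 2_000)   # 0.05, i.e. 5%
company_b = ctr(100, 10_000)  # 0.01, i.e. 1%

# Visitors Company B missed by converting at 1% instead of Company A's 5%
missed = company_a * 10_000 - 100
```

Running this puts `missed` at 400 visitors: the gap between what Company B got and what Company A’s efficiency would have produced on the same impressions.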
To see how efficient you are at turning impressions into website visitors click on “Pages” and check “Impressions” and “CTR”.
Aim for 4-5% as your benchmark. Any page converting less than 4% of its impressions is in need of some changes. How can you change a webpage to convert more people in Google? You have to change its meta tags.
How To Improve Your CTR
Meta Information is additional information about the content of a webpage. It is not displayed on the webpage itself but instead is added to the code of the webpage to give search engines more information about the page. This information is then used for things like building your search engine results page (SERP) listing on Google. While this may seem complex, most website builders and CMSs (like WordPress or Shopify) have fields where you can easily add in meta tags.
The two meta tags that make up your SERP listing are the “title tag” and the “meta description” tag. These are the two tags you want to change to improve your CTR. Think about these two meta tags as an ad. Try to sell consumers on why they should click through to your site. What will they learn? Why is your site the one to answer their question? What makes your company unique? Answering these types of questions will help improve the CTR of your webpages.
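For illustration, here is how those two tags might look for a hypothetical bakery’s homepage (the business and the copy are invented, and written like a short ad: benefit first, unique selling point second):

```html
<title>Fresh Sourdough Baked Daily | Example Bakery Kingston</title>
<meta name="description"
      content="Hand-made sourdough baked fresh every morning. Order online for same-day pickup in downtown Kingston." />
```

Most CMSs let you edit these per page without touching the HTML directly; the fields are usually labelled “SEO title” and “meta description”.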
Adding A Sitemap
XML Sitemaps are files that give search engines information about how your site is organized. This information includes details about your page content, images, video, and how often your site is updated.
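A minimal sitemap might look like this (a sketch only; the URL, date, and frequency values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- One entry per page; most sitemap generators produce this for you -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

You won’t normally write this by hand; plugins and online generators build and update it automatically.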
Search engines work by sending out automated robots (known as “spiders” or “crawlers”) to find and decipher webpages. These crawlers follow links between websites and document each web page they come across. Information about these webpages is then indexed and stored in the search engine’s archive with similar pages. When a consumer searches for information, the search engine looks to its archive of webpages to see which ones best match the consumer’s search.
While submitting a sitemap isn’t mandatory for your website to appear on search engine results pages, it does allow Google to easily crawl and index your website. The easier it is for Google to crawl your website, the quicker it can see when you add or change content on your website.
Having your website quickly indexed has been especially important since Google’s Panda update. In this update, Google began punishing websites that post duplicate content: content that another website has already published. Google displays these websites lower in search results than the original publisher.
At the time of the Panda update there was a serious problem with websites simply copying other website’s content and publishing it to their own website. Panda was meant to stop the practice by only ranking the original publisher of content.
While Panda did help, there are still websites full of duplicate content. If they copy content from your site and are indexed first, Google may treat that site as the original publisher and punish your site for having “duplicate content”.
How To Generate and Submit A Sitemap
XML Sitemaps can be generated using plugins or free online tools. Once generated, Search Console allows you to submit your sitemap directly to Google.
From your Search Console dashboard, select the site you want to submit a sitemap for. On the left, you’ll see an option called “Crawl.” Under “Crawl,” there will be an option marked “Sitemaps.”
Once there, click “Add/Test Sitemap” on the top right of the page. You will then need to enter the URL of your sitemap.
Click “Test” to ensure that your sitemap is working properly. If everything checks out, click “Submit”. Your sitemap is now submitted, and Google will start using it to index your site better.
Checking Your Robots.txt File
Robots.txt is a text file placed in the root folder of a website that tells search engines how to crawl and index it. Not every page on a website needs to be indexed. If there are pages on your website you want to keep out of search engines, you can use robots.txt to block search engines from crawling them and displaying them in their listings.
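As a sketch, a simple robots.txt might look like this (the blocked paths are invented examples, not a recommendation for your site):

```text
# Rules below apply to all crawlers
User-agent: *
# Example paths to keep out of search listings
Disallow: /admin/
Disallow: /thank-you/

# Optional: point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

A single wrong line here (for example, `Disallow: /`) can block your entire site, which is why the tester below is worth running.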
While it can keep web pages from becoming discoverable, an improperly set up robots.txt file can seriously hurt your SEO. Blocking the wrong file or not following the correct syntax can inhibit Google’s ability to understand your website, or prevent it from showing in search results altogether!
If you want to check your robots.txt file to see exactly what it is and isn’t allowing, log into Search Console. Under “Crawl”, click on the “Robots.txt Tester”. On this screen you will see a copy of your robots.txt file. You will also be shown any warnings or errors in it that may be costing you traffic. If you have any warnings or errors, feel free to contact us. We will be happy to help walk you through how to fix it.
Fetch As Google
Fetch as Google is a tool that allows you to see how Google sees specific pages on your site and manually request pages be indexed. It is very useful if your site is new, has undergone significant changes, or if it has pages that aren’t linked together well.
If you’ve made significant changes to a website, the fastest way to get the updates indexed by Google is to submit them manually. This will allow any changes done to things such as on-page content or title tags to appear in search results as soon as possible.
To see how Google sees your website click “Crawl” on the left side menu of Search Console, then choose “Fetch as Google.”
In the address bar at the top, enter the URL of the page that you want to index. If you want to fetch your homepage, leave the centre box blank. Once you’ve entered the page you want to index, you have 2 options. You can either “Fetch” the page or “Fetch and Render” it.
“Fetching” a URL will show you the HTTP response that Google gets from the site. “Rendering” it will show you how Google thinks your page looks.
For this demo click the “Fetch and Render” button.
How To Manually Submit Your Webpages For Indexing
After Google has finished fetching the page you will see a status, which should say either “Complete” or “Partial”. “Partial” indicates that the fetch was successful but that some resources were blocked by your robots.txt file. If you received a different status, consult Google’s documentation for a full list of what the different fetch statuses mean.
Beside the status you will notice a “Submit to Index” button. Click it to request that Google index the page.
You will be given the option to either “Crawl Only This URL,” which is the option you want if you’re only fetching/submitting one specific page, or “Crawl This URL and its Direct Links,” if you need to index entire sections of your website. This is useful if you’ve made changes to large chunks of your website.
Click “Go”, wait for the indexing to complete, and you’re done! Google now has begun indexing your new content. The changes should start appearing in Google within the next couple of days.
Checking For Crawl Errors
Errors on your website can be a big problem. Missing pages or blocked resources can cost you big in terms of SEO. The reality is that most websites have at least a few errors, but few owners know about them; they are usually only discovered after a customer points them out. Instead of waiting and losing web traffic, use Google Search Console to find the errors for you.
To check for errors on your website go to “Crawl Errors” in the “Crawl” section.
This report is split into two sections, Site Errors and URL Errors:
- Site errors are serious problems that require immediate attention. They indicate issues with your entire website, such as missing files, server problems, or connectivity issues. If your website has any site errors, contact your web developer immediately to resolve them.
- URL errors only affect one page of your website. Typically these errors happen after large changes to your website or if you’re blocking specific pages with your robots.txt file. If you have made significant changes to your site, mark your URL errors as fixed and check back in a couple of days. If they’ve returned, investigate further.
Google Search Console is a powerful tool for any SEO, web developer, marketer or small business owner. It allows you to learn a great deal about how Google sees and ranks your website. This free information can be used to further optimize your site and rank even higher in search results.
The higher you rank in search results, the more website traffic you will get; growing brand awareness and increasing your sales.
If you want to connect Google Search Console to your website but need help, feel free to contact us anytime.
We Help Small Businesses Grow Online
Tired of guessing how to market your business online? Let us design a customized marketing strategy that will work for your business goals, will speak to your target market, and will fit within your budget.
Start Growing Your Business Today
We help small business owners across Ontario grow their businesses using progressive digital marketing and website design tactics. How can we help you do the same?
Lure Marketing is a digital marketing agency based out of Kingston, Ontario. We help small businesses navigate and thrive in the online world.