How To Improve Your Drupal Site's Traffic Using Google Search Console
As the webmaster or owner of a Drupal website, you are always on the lookout for new customers. It is also crucial to understand how prospective customers behave when they visit your website, and that is possible with the help of Google Analytics. Unfortunately, Drupal site owners often fail to see how important it is for their websites to be properly crawled and indexed by Google. Google has made this easier with Google Search Console (formerly Webmaster Tools), which communicates directly with website owners to give them the information that matters.
Google Search Console may not come with the trendiest user interface, but there is more raw data in there than colorful charts and graphs. If you take some time to learn how it works, it will help you improve your Drupal site's ranking in the Search Engine Results Pages (SERPs).
Let us now try to understand how GSC can help grow your business.
Why Use Google Search Console (GSC)?
Google Search Console is free if you have a Google account. It opens a two-way communication channel between your website and Google. Search Console helps you monitor, maintain, and improve your website's presence in Google's search results. More specifically, once you log in to Google Search Console, you can see which search queries are tied to your business and where you could improve your ranking. You can also manage your sitemaps and the indexing of your links, and determine which content Googlebot should crawl and index on your website.
How to Set Up GSC?
Setting up Google Search Console for your website is simple. If you already have a Google account, sign in and open Search Console. Then follow three simple steps:
- Click the red “Add A Property” button
- Enter your website URL
- Select the verification method and then click “Verify”
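For example, if you choose the HTML-tag verification method, Google generates a snippet for you to paste into the head section of your home page. The tag below is only a sketch with a placeholder token; use the exact tag Google gives you. In Drupal you can add it through your theme's page template or a contributed verification/metatag module.

```html
<!-- Placeholder: replace the content value with the token Google generates for your property -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```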
That’s all you need to get started. Please note that you won’t see data for analysis immediately; it takes a couple of days for data to start populating your Search Console. Meanwhile, you can still make use of some of the Console features mentioned below.
How to Optimize Your Business Website With GSC?
As the owner of a business that operates through a website, you must ensure it is properly indexed and well positioned in search engines. However, note that any attempt to cheat Google Search by artificially boosting your rankings will result in a penalty that is extremely difficult, and sometimes impossible, to recover from. This fear of a penalty should not deter you from taking legitimate action to promote your website. Believe me when I say this: contrary to popular perception, SEO is not dead, but very much alive. Google actually wants webmasters to ensure their websites are optimized not just for desktop, but also for mobile. So don’t think Google is standing ready with a hammer to punish you for doing exactly what it stands for, which is promoting relevant businesses and making them visible. You can do this by using Google Search Console.
Once you sign in to GSC, you will see a lot of data and even more functionality to sort through. This raises the important questions you must answer: how often should you review this data, and which sections deserve your focus to make your website most visible to search engines?
Depending on how big your website is and how frequently you publish fresh content, checking in one to three times per week or month should be enough. You need to understand most of the features of Google Search Console, but at the same time know where to devote your time and energy:
- Understanding Structured Data
- Understanding HTML Improvements
- Understanding Search Analytics
- Understanding Index Status
- Understanding Content Keywords
- Understanding Crawl Errors
- Understanding Fetch as Google
- Understanding the Robots.txt Tester, and
- Understanding Sitemaps
Let us now look at each of these areas in detail:
Understanding Structured Data
Google collects this data, which is unique information specific to your business. It could be a recipe, ratings, or reviews; information about a film, a product, or an event; or, for a local business, details such as location and contact numbers.
Structured data, in simple terms, is a standard method of annotating specific details on a page so that search engines understand what that information really means or represents. If your website has enhanced information that can be displayed in the SERPs, it is good practice to keep track of any structured data errors; ignoring them can have a negative impact on how your website is displayed to people looking for your service. If your website theme or design does not produce this markup on its own, GSC has a feature called “Data Highlighter.” This tool allows you to highlight data on your pages and apply specific, relevant tags. Once the data on a page is appropriately tagged, Google will automatically tag similar pages whenever you publish fresh content.
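To give a concrete idea of what this markup looks like, structured data is commonly added as JSON-LD using the schema.org vocabulary. The snippet below is a minimal sketch for a hypothetical local business; every name and value is a placeholder, not output from any specific Drupal module:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Drupal Agency",
  "url": "https://www.example.com",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Example City",
    "postalCode": "12345",
    "addressCountry": "US"
  }
}
</script>
```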
Understanding HTML Improvements
Whenever Googlebot crawls your website, it keeps a record of any HTML changes that may be required to improve the user experience and your overall performance in search. The specific improvements you may want to focus on in this section are listed below, with a small example after the list:
- Duplicate meta descriptions;
- Long / short meta descriptions;
- Issues with title tags, and
- Non-indexable content
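For reference, here is a sketch of what a clean page head looks like for the title and description items; all text here is placeholder copy. A unique title of roughly 50 to 60 characters and a unique meta description of roughly 150 to 160 characters per page are commonly recommended:

```html
<head>
  <!-- Unique per page; roughly 50-60 characters is a common recommendation -->
  <title>Drupal SEO Basics | Example Drupal Agency</title>
  <!-- Unique per page; roughly 150-160 characters is a common recommendation -->
  <meta name="description" content="Learn how to use Google Search Console to monitor crawling, fix HTML issues, and improve your Drupal site's visibility in search results.">
</head>
```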
Understanding Search Analytics
Search Analytics sits right under the “Search Traffic” tab. It provides insights into how well your website performs for specific queries. Please note that just because your site shows data for a particular query does not mean the traffic generated by that query is relevant to your business. You have to give some thought to which queries matter most to you and then work on your content accordingly. Google recommends that you start by sorting queries by clicks rather than by page impressions; this gives a much more accurate picture of which queries are actually bringing traffic to your website. You can start by checking the four boxes shown below:
[[{"fid":"787","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"1":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false}},"attributes":{"class":"media-element file-default","data-delta":"1"}}]]
- Clicks: How many times a visitor clicked through to your site from a particular query
- Impressions: How many times a link to your site appeared in the search results for that query
- CTR: The click-through rate, i.e. clicks divided by impressions, expressed as a percentage
- Position: The average position at which your site appeared in the search results for that query
Now, to sort the data by clicks, click on the appropriate column header:
[[{"fid":"790","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"2":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false}},"attributes":{"class":"media-element file-default","data-delta":"2"}}]]
For this screenshot I filtered for queries containing the word "bootstrap". You can use this keyword filter to look at specific areas of search interest.
I avoided using the country filter because this website is international. If your business is country-specific, however, you should always use the appropriate country filter.
With this data in hand, your first job is to analyze it and decide what is needed to improve your position and CTR for each query. If you do this right, your website will attract more qualified clicks, which will give you a better return on your investment.
If you click an actual query and then select the Pages view, you will see which of your pages rank for that particular query. After that, examine the individual pages for on-page SEO (on your site, not in GSC). You can also do a quick Google search and compare how your competitors' pages fare; you will be able to see why they are doing better and how to compete with them in a healthy manner.
[[{"fid":"791","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"3":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false}},"attributes":{"class":"media-element file-default","data-delta":"3"}}]]
You can work on relevant queries or pages to gain insights into how each one can be improved.
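If you prefer to work with the raw numbers outside the GSC interface, you can download the Search Analytics report as a CSV file and analyze it yourself. Below is a minimal Python sketch; the file name and column headers ("Queries", "Clicks", "Impressions", "Position") are assumptions about the export format, so adjust them to match your actual download:

```python
import csv

# Assumed file name and column headers; adjust to match your actual GSC export.
with open("search-analytics-queries.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    clicks = int(row["Clicks"])
    impressions = int(row["Impressions"])
    # Recompute CTR as clicks / impressions to double-check the exported value.
    row["computed_ctr"] = clicks / impressions if impressions else 0.0

# Sort by clicks (descending), as recommended above, rather than by impressions.
rows.sort(key=lambda r: int(r["Clicks"]), reverse=True)

# Print the ten queries that bring the most clicks.
for row in rows[:10]:
    print(f'{row["Queries"]}: {row["Clicks"]} clicks, '
          f'{row["computed_ctr"]:.1%} CTR, average position {row["Position"]}')
```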
Understanding Index Status
This is located right under the “Google Index” section. Index Status gives you the total number of web pages that Google has crawled and indexed over a period of time. There may not be much information here, but this simple chart says a lot about the nature and quality of your indexed pages.
Now, if you are publishing fresh content regularly, you will see an upward trend in the chart. If, on the other hand, you see a drop in indexed pages, you should be concerned: it could mean that Google has detected malware on your website. And if you are publishing more content or pages than Google is indexing, something could be wrong with your internal linking structure, so you have to do some fine-tuning. There is also an Advanced tab that, when clicked, shows blocked pages along with the ones that have been removed from Google.
Understanding Content Keywords
As Google crawls your website, it automatically takes note of specific keywords. The significance of each keyword depends on how many times Google found it on your website. This report gives you a clear idea of how the search engine interprets your content. What does Googlebot think your website is about? The actual subject matter, the niche of your business, should be reflected in the report.
[[{"fid":"793","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"4":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false}},"attributes":{"class":"media-element file-default","data-delta":"4"}}]]
Finally, if you find keywords that look out of place or are totally inappropriate for your business, it is an indication that your website has been hacked. For example, if keywords related to “Viagra” start appearing in the Content Keywords list, your website has likely become the victim of a pharma hack.
Understanding Crawl Errors
You should also check your site's Crawl Errors report from time to time to find indexing issues and fix the ones that are listed. A 404 error page might not invite a penalty from Google, but it still helps to clean up dead links, and your visitors will appreciate it a lot.
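How you fix each error depends on the cause: restore the page, redirect it, or let it return 404 if it is genuinely gone. As one illustration only, on an Apache-hosted Drupal site a moved page can be redirected with a 301 rule in the site's .htaccess file (both paths below are placeholders); the contributed Redirect module is another common way to manage this from within Drupal:

```
# Example 301 redirect for a page that has moved; both paths are placeholders.
Redirect 301 /old-page https://www.example.com/new-page
```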
Understanding Fetch as Google
‘Fetch as Google’ is a manual version of Googlebot. You can use this feature any time after an important content update, or whenever you feel the need to check an existing page. It provides the most accurate answer to how Google's search bots will “see” and render a specific page, and it is a great tool for making sure that your web pages and all the content on them are accessible to Google.
[[{"fid":"794","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"5":{"format":"default","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false}},"attributes":{"class":"media-element file-default","data-delta":"5"}}]]
After entering a URL in the bar, click either the “Fetch” or the “Fetch and Render” option for Google to start crawling the page. After a short wait, it will return a result that indicates the Googlebot type and the status, which can be complete, partial, redirected, or a specific error. If you click on the result, you get access to additional information showing what was fetched, along with the time taken to download it. The rendering tab presents a side-by-side comparison of how Google sees your page versus how your visitors see it. Any resources that could not be reached will also be displayed; these may include images, scripts, or stylesheets.
Understanding Robots.txt Tester
The robots.txt file gives instructions to robots on how to crawl your website; its main job is to tell search engines what to crawl and what to ignore. To make sure your website is indexed accurately, use the robots.txt Tester to check whether Googlebot is able to crawl a particular link and whether the links you don't want crawled are actually blocked.
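For reference, robots.txt is just a plain-text file at the root of your site. The sketch below is loosely modeled on the robots.txt that Drupal core ships with, heavily trimmed and with a placeholder sitemap URL added:

```
# Rules apply to all crawlers
User-agent: *
# Keep administrative and account pages out of the crawl
Disallow: /admin/
Disallow: /user/login
Disallow: /user/register
Disallow: /user/password
# Point crawlers at your sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```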
Understanding Sitemaps
Google says that having a sitemap is not compulsory for a website to be indexed, but it is still good practice to have one. Google recommends that you submit your sitemap through GSC, especially if your website is new or has few external links. It also helps when your site does not have a proper internal linking structure.
Adding a sitemap is simple. After creating one with the Drupal XML Sitemap module, just copy and paste the sitemap URL into GSC and click “Test Sitemap.” After a few seconds you will see the results, and if there are no errors, you can submit it: go back to the main dashboard, click “Add/Test Sitemap,” enter the sitemap URL, and click “Submit Sitemap.”
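For context, the file that the XML Sitemap module generates follows the standard sitemaps.org protocol and looks roughly like the sketch below; the URLs, dates, and priorities are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/drupal-seo</loc>
    <lastmod>2017-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```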
You should try to check your sitemaps on a regular basis for possible errors. A monthly checkup would be just fine.
Using a sitemap is optional and not everyone agrees on the importance of sitemaps for small websites. At Sooperthemes.com we don't have a sitemap installed because the public-facing site is simple in structure. Google should have no problem figuring out how to crawl our blog and other pages.
Conclusion
I must admit that I have not been able to cover all the Google Search Console features, but the ones mentioned here are enough to give your business a kick-start. Unless you decide to go bigger or expand your business, the features mentioned here should be just fine for your Drupal website.
It is strongly suggested that you enable the email notifications feature. This allows Google to send you notices about critical issues that require your attention. From my own experience, I often get notices about malware on my site via email, which gives me ample time to resolve the problem before the issue affects my search rankings.
Another important thing to bear in mind: if you have a website with lots of great content, Google loves it and will crawl and index your links. Google Search Console may not look that great, but the data you get from its various features is worth thousands of dollars.
Are you using Google Search Console? If yes, share how it has helped you in making your website visible to search engines.