Crawl Budget: What It Is and How to Increase It to Improve Your Ranking
If you have a website, one of your main goals will be to rank at the top of the search engine results pages (SERPs). This is only possible if Google "likes" your page. In this article, we will explain what the crawl budget is and what it has to do with increasing your visibility and improving your ranking.
The goal, when working with SEO, is to have a good position in the search engine results, preferably on the first page and towards the top.
There are many things you can do and optimise to reach this goal: you can research keywords, create good content, use external and internal links... and make the most of your crawl budget (and possibly increase it).
The concept behind the crawl budget is very technical, so we will try to explain it in a simple way and let you know how you can optimise it.
Before we begin, we need to briefly explain three key terms. These are often used as synonyms, but actually have different meanings.
What is meant by scanning, indexing, and positioning a page?
First of all, the Googlebot scans the page. This can be a new page with new content or an existing page with updated content.
After the scan, the page can be indexed, which means that it is added to the search engine's database and can appear in the SERPs.
Then, Google decides where to position (or rank) the page in response to users’ search queries.
Having explained that, we can start…
A crawl budget is the amount of time Google decides to invest in scanning and indexing the pages of a website.
Let’s go a little deeper into that definition:
The Google spider, which is also called Googlebot, is responsible for scanning every page on the web. It performs a deep-crawl once a month and a fresh-crawl almost every day.
The Googlebot visits as many URLs as possible, as Google is hungry for content. The purpose of the scans is to keep up to date with the new content on the web and to provide users with a relevant answer to their search queries.
The time the Googlebot spends scanning and indexing pages is not unlimited. On the contrary, not all URLs are scanned regularly. That’s why optimising your crawl budget is very important. The better your pages are, the easier they are to read and the more time the Googlebot invests in reading them.
Therefore, the more pages the Googlebot scans and indexes, the better your chances of appearing among the first results in the SERPs.
The questions you will most likely ask yourself after reading this section are:
Does Google like my website?
How much time does the Googlebot spend on my pages?
Can I increase my crawl budget?
These questions will be answered in the following section.
To keep track of the Googlebot activity on your page, a useful and free tool is the Google Search Console. This tool allows you to find out how many requests per second the search engine dedicates to your website and how much time passes between the individual scans.
Discovering Googlebot activity over the last 90 days is easy. You can view the statistics of the scan like this:
Go to Google Search Console
Click on "Previous Tools and Reports"
Click on "Scan Statistics"
Three charts will be displayed, each with three values (high, medium, and low). The first chart shows the pages scanned per day by the Googlebot, the second shows the kilobytes downloaded daily, and the third shows the time spent downloading a page.
For the first chart, the higher the value, the better. For the second and third charts, however, the opposite is true.
In addition, a faster download means that more pages can be scanned with the available crawl budget.
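If you want to spot-check the download time of a single page yourself, outside of Search Console, a small script is enough. Here is a minimal sketch in Python using the requests library; the URL is a placeholder that you would replace with a page from your own site:

```python
import time
import requests

# Placeholder URL: replace with a page from your own website
url = "https://www.example.com/"

start = time.perf_counter()
response = requests.get(url, timeout=10)
elapsed = time.perf_counter() - start

print(f"Status code:   {response.status_code}")
print(f"Download time: {elapsed:.2f} s")          # total time to fetch the HTML
print(f"Page size:     {len(response.content) / 1024:.1f} kB")
```

Keep in mind that this only measures the raw HTML download, not images or scripts, but it is a quick way to see whether a page is unusually slow.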
If you’ve noticed that your crawl budget isn’t very high, you don’t need to panic. There are some simple tricks you can use to convince the Googlebot to consider scanning your page more:
Create high-quality content regularly
Update your content (new and old) regularly
Vary your content: text should not be the only thing dominating your page; also incorporate images, videos, PDFs, etc.
Make sure that the organisation of your sitemap is clear and readable for Google, especially if you are running a large site. Also, do not forget to submit your sitemap in Google Search Console (a minimal sitemap sketch follows after these tips).
Use internal links on all of your pages. Otherwise, the Googlebot might end up in a “dead end” if there are pages that do not have any internal links in them.
Also, think of backlinks. The more you have, the more likely it is that Google will consider scanning your pages. They are also essential to increase the popularity of your page.
The popularity of your website is essential. According to Google, “The most popular URLs on the Internet tend to be scanned more often to keep them up to date in our index.”
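To give you an idea of what a clear sitemap looks like, here is a minimal sketch following the sitemaps.org protocol; the example.com URLs and dates are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the Googlebot to find -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-28</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/crawl-budget/</loc>
    <lastmod>2020-01-20</lastmod>
  </url>
</urlset>
```

Once the file is in place (usually at /sitemap.xml), submit it under "Sitemaps" in Google Search Console.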
If you’ve followed the tips above and managed to capture Google’s attention, it’s time to focus on SEO. "Optimising" in this sense means to maximise the “little” time the Googlebot dedicates to your page.
If you have pages on your website that have no added value for users or that present duplicate content, you are risking the Googlebot losing valuable time. This is, of course, something you want to avoid.
Always keep in mind that:
A popular website with recent or updated content will increase the crawl budget.
In contrast, duplicate, low-quality, or outdated content and 404 pages will decrease the crawl budget.
Although it might seem obvious, creating low-quality content is counterproductive. There is a risk that the Googlebot will waste time scanning low-quality pages instead of the high-quality ones that are popular and up to date.
Google is already good at its job. However, you can help it by keeping your content up to date and of high quality.
So there are two sides of the crawl budget you can optimise: you can increase the crawl budget itself, and you can decrease the time between scans.
Speed is essential because the time the Googlebot devotes to pages is not unlimited.
Optimising the download time of a page means giving the Googlebot more time to spend on others. Here are some tips you can consider:
Invest in a quality server
Optimise the source code to make it more "readable" for the Googlebot
Compress your images without sacrificing quality. TinyPNG is an example of a free website that reduces the file sizes of images.
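If you prefer to compress images locally instead of using a website, a few lines of code are enough. A minimal sketch in Python with the Pillow library; the file names are placeholders:

```python
from PIL import Image  # pip install Pillow

# Placeholder file names: replace with your own images
img = Image.open("hero-image.jpg")

# Re-save as JPEG with a lower quality setting; 70 is usually a good
# compromise between file size and visible quality
img.save("hero-image-compressed.jpg", optimize=True, quality=70)
```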
A site should ideally have a clear and linear page structure. This allows you to take full advantage of your crawl budget, as the Googlebot can easily scan and index more pages.
Pay attention to the following points:
Follow the famous three-click rule, which states that a user should be able to get from any one page on your site to any other page on your site with a maximum of three clicks.
Avoid dead-end pages. This refers to pages that don’t have any internal or external links.
Use canonical URLs for your site in the sitemap. Even if it doesn’t help boost your crawl budget, it allows Google to understand which version of the page it should consider scanning and indexing.
Internal links, which basically serve to add depth to a certain topic, should point to the pages you want the Googlebot to take into account.
If you point out your preferred pages, you have a greater chance that the crawler will end up scanning them. Google focuses more on pages that contain a lot of links (both internal and external).
Broken links not only penalise you in terms of ranking, but also waste the precious crawl budget that you have.
No-follow links save your crawl budget as they tell the Googlebot not to waste time scanning them. These pages could be the ones you determine as the least important on your site or the ones that are already linked within a certain topic or category and shouldn’t be scanned several times.
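Two of the tips above, canonical URLs and no-follow links, come down to a single HTML attribute each. A minimal sketch, assuming hypothetical example.com URLs:

```html
<!-- In the <head>: tell Google which version of the page is the preferred one -->
<link rel="canonical" href="https://www.example.com/blog/crawl-budget/" />

<!-- In the body: a link the Googlebot should not follow, e.g. a login page -->
<a href="https://www.example.com/account/login" rel="nofollow">Log in</a>
```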
The robots.txt, a simple text file, is useful for blocking uninteresting pages. It tells Google which parts of your site it may scan and how.
The robots.txt is fundamental if you want to avoid wasting the crawler’s time on pages that don’t need to be scanned, such as private pages or admin pages. It is also helpful for indicating which pages should be scanned.
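As an illustration, here is a minimal robots.txt sketch; the blocked paths (/admin/ and /account/) are hypothetical examples of areas that don’t need to be scanned, and the sitemap URL is a placeholder:

```
User-agent: *
Disallow: /admin/
Disallow: /account/

Sitemap: https://www.example.com/sitemap.xml
```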
You can never say it enough: updating your content leads to an increase in the time Google dedicates to it.
Getting rid of all the pages that are no longer relevant or don’t have a lot of traffic will prevent you from wasting your crawl budget. If you don’t want to completely lose the content of these pages, you can move it to (or combine it with) similar or new pages.
In this sense, do not forget the backlinks. If other pages refer to a page you are about to delete, you have two options: you can either do a 301 redirect, which tells the search engine that the page is accessible from another URL, or you can change the backlink URL directly.
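How you set up a 301 redirect depends on your server or shop system. As a sketch, assuming an Apache server and hypothetical URLs, a single line in the .htaccess file is enough:

```apache
# Permanently redirect a deleted page to its replacement (HTTP 301)
Redirect 301 /old-guide/ https://www.example.com/new-guide/
```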
By regularly updating your content, you will notice an increase in the Googlebot scans. Every time the Googlebot visits a page and finds revised content, you get an increase in the crawl budget.
Managing and increasing your crawl budget is the secret to success. If your content is good and your pages are easy to read, more frequent crawling will almost certainly lead to an increase in visibility (i.e. higher ranking in SERPs).
When talking about the crawl budget, always keep the optimisation of your pages in mind.
This article was originally published on our Italian blog: Crawl budget: cosa è e come aumentarlo per migliorare il tuo ranking