Posted: 17 May 2019 07:31 AM PDT
SEO goes far beyond keyword research and building backlinks. There is also a technical side of SEO that can largely impact your search ranking.
This is an area where your robots.txt file will become a factor.
In my experience, most people aren’t too familiar with robots.txt files and don’t know where to begin. That’s what inspired me to create this guide.
Let’s start with the basics. What exactly is a robots.txt file?
When a search engine bot is crawling a website, it uses the robots.txt file to determine what parts of the site need to be indexed.
Sitemaps are hosted in your root folder and referenced in your robots.txt file. You create a sitemap to make it easier for search engines to index your content.
Think of your robots.txt file like a guide or instruction manual for bots. It’s a guide that has rules that they need to follow. These rules will tell crawlers what they’re allowed to view (like the pages on your sitemap) and what parts of your site are restricted.
If your robots.txt file isn’t optimized properly, it can cause major SEO problems for your website.
That’s why it’s important for you to understand exactly how this works and what you need to do to ensure that this technical component of your website is helping you, as opposed to hurting you.
Find your robots.txt file
Before you do anything, the first step is verifying that you have a robots.txt file to begin with. Some of you have probably never looked at this file before.
The easiest way to see if your site already has one is by putting your website’s URL into a web browser, followed by /robots.txt.
Here’s what it looks like for Quick Sprout.
[Screenshot: the Quick Sprout robots.txt file]
When you do this, one of three things will happen: you’ll see a robots.txt file with rules in it, you’ll see an empty robots.txt file, or you’ll get a 404 error.
Most of you will likely fall into the top two scenarios. You shouldn’t get a 404 error because the majority of websites have a robots.txt file set up by default when the site is created. Those default settings should still be there if you’ve never made any changes.
To create or edit this file, just navigate to the root folder of your website.
Modify your robots.txt content
For the most part, you normally don’t want to mess around with this too much. It’s not something that you’re going to be altering on a frequent basis.
The only reason why you would want to add something to your robots.txt file is if there are certain pages on your website that you don’t want bots to crawl and index.
You need to get familiar with the syntax used for commands. So open up a plain text editor to write the syntax.
I’ll cover the syntax that’s most commonly used.
First, you need to identify the crawlers. This is referred to as the User-agent.
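To speak to every crawler at once, use an asterisk as a wildcard:

```
User-agent: *
```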
This syntax above refers to all search engine crawlers (Google, Yahoo, Bing, etc.)
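To address a single search engine instead, name its crawler. For example:

```
User-agent: Googlebot
```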
As the name implies, this value is speaking directly to Google’s crawlers.
After you identify the crawler, you can allow or disallow content on your site. Here’s an example that we saw earlier in the Quick Sprout robots.txt file.
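It looks like this:

```
User-agent: *
Disallow: /wp-admin/
```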
This page is used for our administrative backend for WordPress. So this command tells all crawlers (User-agent: *) not to crawl that page. There’s no reason for the bots to waste time crawling that.
So let’s say you want to tell all bots not to crawl this specific page on your website. www.yourwebsite.com/samplepage1/
The syntax would look like this:
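```
User-agent: *
Disallow: /samplepage1/
```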
Here’s another example:
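```
User-agent: *
Disallow: /*.gif$
```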
This would block a specific file type (in this case .gif). You can refer to this chart from Google for more common rules and examples.
[Screenshot: common robots.txt rules and examples from Google]
The concept is very straightforward.
If you want to disallow pages, files, or content on your site from all crawlers (or specific crawlers) then you just need to find the proper syntax command and add it to your plain text editor.
Once you’ve finished writing the commands, simply copy and paste that into your robots.txt file.
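Putting it all together, a simple robots.txt file built from the examples in this guide might look like this:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /*.gif$

Sitemap: https://www.yourwebsite.com/sitemap.xml
```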
Why the robots.txt file needs to be optimized
I know what some of you are thinking. Why in the world would I want to mess around with any of this?
Here’s what you need to understand. The purpose of your robots.txt file isn’t to completely block pages or site content from a search engine.
Instead, you’re just trying to maximize the efficiency of their crawl budgets. All you’re doing is telling the bots that they don’t need to crawl pages that aren’t made for the public.
Here’s a summary of how Google’s crawl budget works.
It’s broken down into two parts: the crawl rate limit and crawl demand.
The crawl rate limit represents how many connections a crawler can make to any given site. This also includes the amount of time between fetches.
Websites that respond quickly have a higher crawl rate limit, which means they can have more connections with the bot. On the other hand, sites that slow down as the result of crawling will not be crawled as frequently.
Sites are also crawled based on demand. This means that popular websites are crawled on a more frequent basis. On the flip side, sites that aren’t popular or updated frequently won’t be crawled as often, even if the crawl rate limit has not been met.
By optimizing your robots.txt file, you’re making the job of the crawlers much easier. According to Google, some examples of elements that affect crawl budgets are faceted navigation and session identifiers, on-site duplicate content, soft error pages, hacked pages, infinite spaces and proxies, and low-quality or spam content.
By using the robots.txt file to disallow this type of content from crawlers, it ensures that they spend more time discovering and indexing the top content on your website.
Here’s a visual comparison of sites with and without an optimized robots.txt file.
[Screenshot: visual comparison of crawl efficiency with and without an optimized robots.txt file]
A search engine crawler will spend more time, and therefore more of the crawl budget, on the left website. But the site on the right ensures that only the top content is being crawled.
Here’s a scenario where you’d want to take advantage of the robots.txt file.
As I’m sure you know, duplicate content is harmful to SEO. But there are certain times when it’s necessary to have on your website. For example, some of you might have printer-friendly versions of specific pages. That’s duplicate content. So you can tell bots not to crawl that printer-friendly page by optimizing your robots.txt syntax.
Testing your robots.txt file
Once you’ve found, modified, and optimized your robots.txt file, it’s time to test everything to make sure that it’s working properly.
In order to do this, you’ll need to sign into your Google Webmasters account. Navigate to “crawl” from your dashboard.
[Screenshot: the “Crawl” menu in Google Webmasters]
This will expand the menu.
Once expanded, you’re going to look for the “robots.txt Tester” option.
[Screenshot: the “robots.txt Tester” option]
Then simply click the “test” button in the bottom right corner of the screen.
[Screenshot: the “test” button in the robots.txt Tester]
If there are any problems, you can just edit the syntax directly in the tester. Continue running the tests until everything is smooth.
Be aware that changes made in the tester do not get saved to your website. So you’ll need to make sure you copy and paste any changes into your actual robots.txt file.
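If you’d like a quick offline sanity check as well, Python’s standard library includes a robots.txt parser that evaluates rules the same way a compliant crawler would. This is just a sketch; the rules and URLs below are placeholder examples from this guide:

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules matching the examples in this guide
rules = """User-agent: *
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The admin area is blocked; a normal page is not
print(parser.can_fetch("*", "https://www.yourwebsite.com/wp-admin/"))  # False
print(parser.can_fetch("*", "https://www.yourwebsite.com/blog/"))      # True
```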
It’s also worth noting that this tool is only for testing Google bots and crawlers. It won’t be able to predict how other search engines will read your robots.txt file.
Considering that Google controls 89.95% of the global search engine market share, I don’t think you need to run these tests using any other tools. But I’ll leave that decision up to you.
Robots.txt best practices
Your robots.txt file needs to be named “robots.txt” in order to be found. It’s case-sensitive, meaning Robots.txt or robots.TXT would not be acceptable.
The robots.txt file must always be located in the root folder of your website (the top-level directory of the host).
Anyone can see your robots.txt file. All they need to do is type in the name of your website URL with /robots.txt after the root domain to view it. So don’t use this to be sneaky or deceptive, since it’s essentially public information.
For the most part, I wouldn’t recommend making specific rules for different search engine crawlers. I can’t see the benefit of having a certain set of rules for Google, and another set of rules for Bing. It’s much less confusing if your rules apply to all user-agents.
Adding a disallow rule to your robots.txt file won’t necessarily prevent that page from being indexed. Instead, you’d have to use a noindex tag.
Search engine crawlers are extremely advanced. They essentially view your website content the same way that a real person would. So if your website uses CSS and JS to function, you should not block those folders in your robots.txt file. It will be a major SEO mistake if crawlers can’t see a functioning version of your website.
If you want your robots.txt file to be recognized immediately after it’s been updated, submit it directly to Google, rather than waiting for your website to get crawled.
Link equity cannot be passed from blocked pages to link destinations. This means that links on pages that are disallowed will be considered nofollow. So some links won’t be indexed unless they’re on other pages that are accessible by search engines.
The robots.txt file is not a substitute for blocking private user data and other sensitive information from showing up in your SERPs. As I said before, disallowed pages can still be indexed. So you’ll still need to make sure that these pages are password protected and use a noindex meta directive.
Sitemaps should be placed at the bottom of your robots.txt file.
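Using the placeholder domain from earlier, that line looks like this:

```
Sitemap: https://www.yourwebsite.com/sitemap.xml
```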
That was your crash-course on everything you need to know about robots.txt files.
I know that lots of this information was a little technical, but don’t let that intimidate you. The basic concepts and applications of your robots.txt are fairly easy to understand.
Remember, this isn’t something that you’ll want to modify too frequently. It’s also extremely important that you test everything out before you save the changes. Make sure that you double and triple-check everything.
One error could cause a search engine to stop crawling your site altogether. This would be devastating to your SEO position. So only make changes that are absolutely necessary.
When optimized correctly, your website will make efficient use of Google’s crawl budget. This increases the chances that your top content will be noticed, indexed, and ranked accordingly.
Posted: 16 May 2019 01:01 PM PDT
There are so many different elements of SEO.
For the most part, all of these various aspects can be broken down into two main categories: on-page SEO and off-page SEO.
The biggest factor in off-page SEO is backlinks, while on-page SEO consists of elements like content copy, title tags, meta descriptions, internal linking, and site architecture.
It’s nearly impossible (unless you get extremely lucky) to have a successful on-site SEO strategy without conducting keyword research.
There are tons of tools on the web to help you find keywords related to your business, but Google Keyword Planner is arguably the most powerful.
The best part about this tool is that it’s completely free for anyone to use. All you need is a Google Ads account.
It’s also worth noting that the primary purpose of the Keyword Planner is for PPC advertising.
But with that said, you don’t need to spend any money on ads to do your keyword research with this tool. The only thing you won’t be able to access is the exact monthly search volumes for specific keywords. As you’ll learn shortly, you’ll still be able to see an average range, but Google will only show exact volumes when you run an ad campaign.
I’m assuming that most of you already have a Google Ads account. If not, it’s very easy to set up. So go ahead and do that as soon as you’re ready to proceed.
Then just follow along with this guide to learn how you can take full advantage of the Keyword Planner to bring your SEO strategy to an elevated level.
Google Keyword Planner features
Before we dive too deep into the specifics, it’s important for you to understand exactly what the Keyword Planner can be used for.
As I said before, this tool is designed with PPC ads in mind. So about half of what you’ll see is going to be geared toward running a successful paid search campaign. Some of the top benefits of Google Keyword Planner include discovering new keywords, seeing average monthly search volumes, gauging competition levels, viewing bid estimates, and forecasting campaign performance.
It’s worth noting that some of these features are only available if you sync your Google Ads account with your Google Analytics account.
For our purposes today, we’re going to stick to the features that focus on finding keywords that you can use to improve your on-page SEO strategy.
Discover new keywords
The first thing you should use the Keyword Planner for is finding new keywords. This is very straightforward.
Once you know what keywords are related to your site, brand, niche, or a specific campaign, then you’ll be able to use those keywords to improve the content and on-page SEO of your website.
So log into your Google Ads account and navigate to the Keyword Planner.
[Screenshot: navigating to the Keyword Planner in Google Ads]
From your Google Ads dashboard, click on the “tools” icon in the top right corner menu bar. This will expand the menu, showcasing an additional five categories.
Now select “Keyword Planner” from the planning list on the left side of the expanded menu.
Next, you’ll be presented with two options: “Find new keywords” and “Get search volume and forecasts.”
For now, just select “find new keywords.”
[Screenshot: the “find new keywords” option]
Next, it’s as simple as entering keywords into a search bar and letting Google take care of the rest for you. Although it seems simple, this is probably the most important step of the entire process.
The Keyword Planner tool is extremely advanced, but it can’t provide you with valuable keywords unless your initial search terms lead it in the right direction.
A great benefit of this search bar is that it allows you to enter words, phrases, and a URL that’s relevant to your business. To get the most out of your searches, I recommend taking full advantage of the search options at your disposal.
Here’s a look at an example of what a search would look like if I was conducting keyword research for content here at Quick Sprout.
[Screenshot: an example keyword search for Quick Sprout]
As you can see, I used some single words like “SEO,” two-word phrases like “content marketing” or “ecommerce conversions,” and even some three-word phrases like “small business marketing.”
I also included a link to the Quick Sprout homepage to give the tool a better understanding of the content related to our site.
This is much better than just adding “marketing” to the search bar without adding anything else.
Analyze the search results
Once you begin your initial search, you’re going to get lots of information thrown at you. Do not be overwhelmed or intimidated by this. We’ll eventually narrow down the results.
Again, if you’re not planning to run any PPC campaigns, you can ignore some of this data.
First, let me show you how to read and interpret the results.
[Screenshot: keyword search results in the Keyword Planner]
There are a couple of things I want to point out right away.
The Keyword Planner generated 4,403 keyword ideas based on my initial search. By default, the results that you’re going to see are based on the last twelve months of search data. But you can play around with that to see how the data changes if you view those keywords over a longer or shorter period of time.
Before you get new keyword ideas, the Keyword Planner shows you results for what you’ve already searched for.
The only columns you’re going to want to look at are average monthly searches and competition.
Ad impression share, top of page bids, and account status are all for paid ad campaigns.
As you can see, the average monthly search ranges are pretty broad. For example, it says that the search range for “SEO” is from 100,000 to 1 million.
There is a big difference between 150,000 searches and 950,000 searches, which both fall into that range. But the only way to get the exact data is by running an ad.
The competition data is crucial.
High competition keywords are going to be more challenging to rank for since more people are running paid ads for these words and phrases. But maybe you can try to gain an advantage over your competitors by taking steps to outrank them organically.
Some of you might have more success with low competition keywords. It all depends on your priority and the keywords in question.
By looking at the search results above, the term “social media marketing” has a high competition level, while “link building” is low. Let’s keep this information in mind as we continue.
Now it’s time to analyze the keyword ideas based on the keywords that we searched for.
[Screenshot: keyword ideas sorted by relevance, with several suggestions highlighted]
Here are the top 12 keyword ideas, sorted by keyword relevance.
I highlighted some of the suggestions to give you an idea of how you should be approaching this process.
All of the keywords on the list are useful and worth incorporating into your content. But you need to find ways to prioritize them.
Low competition keywords with high search volumes might seem like the easiest for you to rank for. But it doesn’t always work out that way.
For example, look at the data for “social media.”
It has a high search volume and low competition, so it must be a home run—right? Not necessarily.
Since that topic is so broad, it will be tough to rank for. That’s probably why people aren’t spending money on PPC campaigns to rank for that term.
On the flip side, “social media manager” at the bottom of the screenshot has a high search volume and high competition level, making it a challenging keyword to rank for.
Now let’s take a look at the keyword ideas that I boxed.
Both of these have high search volumes and medium competition levels. Ranking organically for these keywords won’t necessarily be easy, but it’s definitely not impossible.
Even though a term like “web marketing” has a lower search volume, it’s still in that 1,000 – 10,000 range, and has low competition. I boxed that as well because it’s related to the two other terms we’re discussing.
You could potentially use these three terms to conduct a new search that’s more specific. But we’ll get into that shortly.
Before you get into anything more complex, you should experiment with filtering the results.
Organize the keyword ideas by low competition, high competition, low search volume, and high search volume.
Narrow your search
Now that you’ve taken some time to sort your list of keywords, you’ve probably realized that 4,400+ keywords are too many to work through. You won’t end up using most of them.
So you’ll want to narrow the results to make sure that you’re only seeing ones that are the most relevant, and will actually benefit your SEO strategy.
The easiest way to do this right away is by changing one of the filters from “broadly related ideas” to “closely related ideas.”
[Screenshot: the “closely related ideas” filter]
As you can see, this filter alone cut the search results in half.
So scroll through and get more keyword ideas using the new results. Use the keywords on this list to help you create new searches that are highly relevant.
Refer back to what I did earlier.
I took SEO company, digital marketing agency, and web marketing from that initial list of ideas. Here’s what the search results look like for those keywords combined with the Quick Sprout URL.
[Screenshot: refined keyword search results]
Those new terms combined with the closely related filter yielded 296 keyword ideas.
This list is much more reasonable for you to manage.
As I mentioned earlier, you can also narrow your results by locations, language, and search networks.
For example, let’s say you have a local business that has retail locations scattered across New England. You don’t need to get data on the entire United States.
[Screenshot: the location targeting map]
Instead, you can just focus on those six states in New England.
With that said, this feature is definitely more beneficial for those of you who will ultimately run PPC campaigns. In this case, you can choose to only target users who are searching in that region.
But it’s still worth seeing how the competition and search volume changes if you adjust the location.
With each list of ideas, you can download the information as a spreadsheet as well.
In my opinion, this makes it easier for you to keep notes and organize the data in a way that aligns with your SEO plan and content strategy.
[Screenshot: the “download keyword ideas” button]
Just look for the “download keyword ideas” button at the top right corner of each page.
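Once you’ve downloaded the file, you can also slice the data programmatically. Here’s a sketch in Python using only the standard library; the sample rows and column names are assumptions, so adjust them to match your actual export:

```python
import csv
from io import StringIO

# A small inline sample standing in for the downloaded file
# (column names are assumptions; check your actual export)
data = """Keyword,Avg. monthly searches,Competition
seo company,10000,Medium
digital marketing agency,10000,Medium
web marketing,1000,Low
social media,100000,Low
"""

rows = list(csv.DictReader(StringIO(data)))

# Sort by search volume (descending), then keep only low-competition terms
low_competition = [
    row["Keyword"]
    for row in sorted(rows, key=lambda r: int(r["Avg. monthly searches"]), reverse=True)
    if row["Competition"] == "Low"
]
print(low_competition)  # ['social media', 'web marketing']
```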
View keyword forecasts
Head back to the main keyword planner page that we landed on earlier.
Only this time, we’re going to select the other option: “Get search volume and forecasts.”
[Screenshot: the “get search volume and forecasts” option]
We previously saw the search volume when we were discovering new keywords. It showed us data from the past 12 months.
Maybe you changed around the date range and saw something different.
While the Keyword Planner tool won’t show you projected search volumes for the future, it will show you a forecast for your keywords if you decide to run a PPC campaign.
Based on those three keywords that we looked at most recently, Google projects that a PPC campaign would get 20,000 impressions and 280 clicks for $580 per month. The average search position would be #3.
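To put those numbers in context, here’s the quick arithmetic behind the forecast (a sketch using the figures above):

```python
impressions = 20_000
clicks = 280
monthly_cost = 580  # USD, from the forecast above

ctr = clicks / impressions   # click-through rate
cpc = monthly_cost / clicks  # effective cost per click

print(f"CTR: {ctr:.1%}")   # CTR: 1.4%
print(f"CPC: ${cpc:.2f}")  # CPC: $2.07
```

So you’d be paying a little over $2 for each visitor from these keywords, which is the benchmark to beat organically.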
This is not an ideal forecast. But it’s not awful either.
Personally, I wouldn’t proceed with it. But this decision is completely up to you.
You can use this tool to give you a better idea of how certain keywords will perform. If you can find a way to get 20,000 monthly impressions organically using these keywords, it will be better than paying for it.
But you might see forecasts that are worth pursuing based on the keywords, projections, and the budget that you’re willing to allocate for paid keywords.
Now that you have your list of keywords, it’s time to enhance your website’s on-page SEO.
Decide which keywords you want to prioritize, and then produce content that will help you rank for those terms. Write blog posts and guides. Create images, videos, and infographics. Produce content that’s a combination of these.
Focus on your title tags, header tags, and internal linking with exact-match keywords.
You can refer to my complete guide on SEO for more information on how to do this. There’s a section in here for on-site SEO that will help you out tremendously.
Start experimenting with Google Keyword Planner. Since it’s free to use, it can’t hurt to try.
Once you get familiar with navigating and searching, you can use this guide as a reference to help you find keywords that will be easier to rank for.