How to Use Google Search Console to Improve Your SEO

Google Search Console, previously known as Google Webmaster Tools, is one of the best free SEO tools out there. But most people use it for pure vanity metrics, like checking whether impressions or clicks increased.

Others use it to look at average ranking position. But as standalone metrics, these numbers provide very little value. So today I'm going to show you how to use Google Search Console to actually improve your website's SEO. Stay tuned.

This lesson will help you grow your search traffic, research your competitors, and dominate your niche.

Before we dig into the tutorial, I'm going to assume that you've already handled the basics, like verifying your site and submitting your sitemap to Google. If you haven't done that yet, do it first, because you'll get a lot more value from this video. Let's get to it.

The first step is to improve the click-through rate (CTR) for underperforming keywords. Let's define an underperforming keyword as one where your page doesn't rank in position one or two. The reason I'm defining it this way is that if you look at this CTR curve, you'll notice that anything not in positions one or two gets significantly less search traffic.

For example, if your page is ranking in position seven, your CTR will be around 2.16%. If you were able to bump that keyword ranking up to position two, your click-through rate would be around 15.5%. To put this into perspective: if the keyword your page ranks for has a search volume of 1,000, then being in position seven would get you around 21 search visits per month, whereas in position two you'd get around 155 monthly search visits, which is more than seven times the search traffic.
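To make that math concrete, here's a quick back-of-the-envelope sketch in Python. The CTR figures are the rough curve averages quoted above, not exact numbers, so treat the output as an estimate:

```python
# Rough average CTRs by ranking position, taken from the CTR curve
# discussed above. These are illustrative averages, not exact figures.
avg_ctr = {1: 0.30, 2: 0.155, 7: 0.0216}

def estimated_monthly_visits(search_volume, position):
    """Estimate monthly organic visits for a keyword at a given position."""
    return search_volume * avg_ctr[position]

volume = 1000  # monthly search volume for the keyword
at_7 = estimated_monthly_visits(volume, 7)  # ~21.6 visits
at_2 = estimated_monthly_visits(volume, 2)  # ~155 visits
print(f"Position 7: ~{at_7:.0f} visits/month")
print(f"Position 2: ~{at_2:.0f} visits/month ({at_2 / at_7:.1f}x more)")
```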

To find these underperforming keywords, go to the Search results report under "Performance" and make sure that you've ticked the "Average CTR" and "Average position" boxes so that they show up in the table below.

Next, scroll down to the table and set a position filter to show pages that have an average ranking position smaller than 8.1. Reason being, it's easier to move from position eight to the top two, since you'll get more exposure than at the absolute bottom of page one. Now, one limitation of Search Console is that it doesn't let you set a position range, so we'll just sort the table by position in descending order and start skimming through the keywords for queries we may want to optimize our pages for. We'll also want to look at the number of impressions too, because there's likely no point in optimizing for a query with twenty impressions.

One that stands out to me is this one: "how to create backlinks." So I'll click on that keyword, then go to the Pages tab. From here, you'd want to do an individual analysis of this page and see if you can further optimize it for the keyword, since it's virtually on the same topic. Whether that's on-page work, adding internal links, or something else, you'd have to assess the best course of action and experiment. But be sure to use some common sense. For example, you'll see that we rank for a query that doesn't make sense grammatically, and we wouldn't throw typos into the page for the sake of quote-unquote "optimizing."

Since we're on the topic of click-through rates, the next step is to find and analyze pages with high keyword rankings but subpar CTRs. Now, the CTR curve that I showed you before just shows averages, which means not every keyword that ranks in position one is going to get a 30% click-through rate. So what we need to do is find out which keywords have subpar CTRs, analyze the cause, and see if there's a way to get more clicks and traffic to our pages.

While we're still in the Search results report, let's change the position filter to show pages that have a ranking position smaller than three, meaning a top-two ranking. Next, I'll sort the table by CTR from lowest to highest. Here's an interesting one: we've got around 7,000 impressions for the keyword "DIY SEO" with a CTR of only 2%, while ranking in the top three. CTR should be somewhere in the ballpark of 9-15%, since I know we're definitely matching search intent here. So let's go to Google and search for this keyword. It makes sense now: the featured snippet takes up a ton of real estate, then it's followed by videos, the "People also ask" box, and then our page, which is actually the number one organic ranking. In this case, we could definitely work on trying to own that featured snippet, and we could also create a video tutorial around the topic to try and claim a spot in the video carousel. If we were able to successfully execute on both, we'd own the entire first fold of the SERP.

Our page on white hat link building also gets around a 2% CTR, even though on average we've been ranking in the top two for this keyword for the past three months. Looking at the SERP, you'll see that the entire top section is plastered with ads, then a featured snippet, a "People also ask" box, and then the organic results, where ours is actually the first blue-link result. In this case, it'll come down to your priorities. The ads tell me that there's commercial intent behind this keyword, so if we were to optimize and try to own the featured snippet, it may be worth the effort. But on the other hand, since there are a ton of ads for white hat link building services, which we don't sell, owning the featured snippet may not result in a crazy boost in clicks. So you'll have some tough decisions to make, some of which will be super profitable and others that may not be worth the time.
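By the way, if you'd rather pull this data programmatically than click through the UI, Search Console's Search Analytics API returns the same clicks, impressions, CTR, and position metrics. Here's a minimal sketch using the google-api-python-client and google-auth libraries, assuming you have a service account with access to your verified property (the credentials file, dates, property URL, and thresholds are all placeholders). Note that the API can't filter on metrics like position or CTR, so that part happens client-side:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file for a service account added to the property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://yourdomain.com/",  # your verified property
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["query"],
        "rowLimit": 5000,
    },
).execute()
rows = response.get("rows", [])

# Underperformers: positions 3-8.1 with enough impressions to matter.
underperforming = [
    r for r in rows if 3 <= r["position"] < 8.1 and r["impressions"] >= 100
]

# Top-three rankings with a subpar CTR, well below the ~9-15% you'd expect.
low_ctr = [r for r in rows if r["position"] < 3 and r["ctr"] < 0.05]

for r in sorted(low_ctr, key=lambda r: r["ctr"]):
    print(f'{r["keys"][0]}: pos {r["position"]:.1f}, CTR {r["ctr"]:.1%}')
```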
All right, the next step is to check for sitemap errors, warnings, and exclusions. Sitemaps are files that tell search engines which pages are important on your site. They also help crawlers crawl your site more efficiently. Now, if you have issues with your sitemap, then you might have a problem, since you could be confusing crawlers, leading to wasted time and resources on their end.

To see if you have any issues, go to the Sitemaps report, then click on the icon beside the sitemap you want to investigate. You'll see a few tabs showing the number of errors, warnings, valid URLs, and excluded ones. Since we don't have any errors for our blog, let's look at the one issue under "Excluded." You can see that one page has been excluded because of a "Duplicate, submitted URL not selected as canonical" issue. If you click on the error, you'll see the URL of an old guest blogging post. Looking at the status code for this URL, you'll see that there's a 301 redirect to our newer post on guest blogging. The reason this happened is that the old article is still set as published in WordPress, which means that Yoast, the SEO plugin we're using, is including it in our sitemap. So I'll delete this post, and the issue should resolve itself the next time Google checks our sitemap.
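You can catch redirect problems like this one yourself by fetching your sitemap and checking each URL's status code. Here's a rough sketch using the requests library (the sitemap URL is a placeholder); it flags any sitemap URL that doesn't return a 200 and prints the redirect target when there is one. Some servers refuse HEAD requests, in which case you'd swap in GET:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://yourdomain.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    # Don't follow redirects, so we see the raw 301/302 status.
    r = requests.head(url, allow_redirects=False, timeout=10)
    if r.status_code != 200:
        target = r.headers.get("Location", "")
        print(f"{r.status_code} {url} -> {target}")
```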
The next step is to find pages that need internal links, or ones that need to be pruned. Let's say that you're publishing a new post on the best dog treats. If you already have relevant pages on, say, dog food and another one on puppy nutrition, it would make sense to add internal links from those pages pointing at your new post. Assuming you've built some link authority to those pages, there's a chance your new page will get indexed faster and rank higher. Conversely, if a post has few or no internal links, there's a good chance it's a forgotten post, meaning it probably doesn't get much search traffic or provide much value to your site.

So let's go to the Links report in Google Search Console, where you'll see a summary of various categories for both external links and your top linked pages via internal links. I'll click on the "More" link under internal links. Now, let's sort the table by the number of internal links pointing at our target pages to find the forgotten ones. One of the pages that pops up is this one with the slug "hire-me," which only has one internal link, and you'll see that it was published back in October 2015. This page isn't exactly in line with what we publish today, so I'm sure we'll be deleting it or redirecting it to another relevant page soon.

Now, if you find pages that are worth keeping, then it'd be advantageous to either (a) add more internal links pointing at them, or (b) update the content and add more internal links where appropriate. For example, if we had an old post on keyword research that was out of date, we'd first update the content to make it relevant today. Then we could go to Google and search for something like site:yourdomain.com "keyword research". This will show you all pages on your site that include your target keyword. Just visit those pages and add internal links where it makes sense.

Now, the main downside to Search Console's internal links report is that it doesn't show you pages that have zero internal links. These are called orphan pages. Assuming there are no external backlinks pointing at these orphan pages, this poses a problem, because they can't be crawled by Google, meaning they won't be indexed and will never be discovered through search.
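Search Console won't list orphan pages for you, but you can approximate the check yourself: crawl your site from the homepage, collect every internally linked URL, and compare that set against your sitemap. Here's a simplified sketch using requests and BeautifulSoup, with the same placeholder domain as before; anything in the sitemap that the crawl never reaches is a candidate orphan:

```python
from urllib.parse import urljoin, urldefrag, urlparse
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup

DOMAIN = "https://yourdomain.com"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return the set of URLs listed in an XML sitemap."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    return {loc.text for loc in root.findall(".//sm:loc", NS)}

def crawl_internal_urls(start_url, limit=500):
    """Breadth-first crawl of internal links; returns every URL reached."""
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urldefrag(urljoin(url, a["href"]))[0]  # strip #fragments
            if urlparse(link).netloc == urlparse(DOMAIN).netloc:
                queue.append(link)
    return seen

# Sitemap URLs that the crawl never reached are candidate orphans.
orphans = sitemap_urls(DOMAIN + "/sitemap.xml") - crawl_internal_urls(DOMAIN + "/")
for url in sorted(orphans):
    print("possible orphan:", url)
```

This naive crawler ignores robots.txt, nofollow, and JavaScript-rendered links, and it doesn't normalize trailing slashes, so treat its output as a list of candidates to verify rather than a definitive report.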

Now, Search Console is a super powerful tool, and when it comes to accuracy for your own site, I strongly recommend using it. But there are three huge limiting factors to your success.

The first is that it's extremely limited when it comes to discovering deeper technical issues on your site. And even if you're a whiz with Google Sheets, they only let you export up to a thousand rows of data in places like the links reports.

The second is that there's no keyword volume data. Yes, Google has Keyword Planner, but that's only somewhat useful if you're paying for ads. Otherwise, you end up with range values like this. And even if you do have search volume numbers, they're rounded annual averages, which gives you super broad estimations.

The third and final downside is the biggest when it comes to doing SEO: you only have data for your own site. SEO isn't a one-player game. You're competing against other websites and pages for the top spot, and it's extremely difficult to do anything meaningful without understanding the competitive landscape for the keywords you're targeting.

So here's my advice. If you're new to SEO, Search Console is going to be the best place to start, and you'll get a ton of insights on your website for free. But if you want to run any kind of competitive analysis on your competitors, then you'll need third-party tools like Ahrefs to help you achieve that.

For example, if we enter backlinko.com, which isn't our domain, into Site Explorer and go to the Organic keywords report, you'll see that it's ranking for "YouTube tags" and around 44,000 other keywords. Now, I know we don't rank for this keyword, and since Ahrefs' Keywords Explorer provides YouTube keyword data, this might be a topic we could potentially target in the future.
