SEO Audit – How to Fix Your Website’s Technical SEO Issues (Tutorial)

I’m going to help you find technical issues on any website, focusing on a workflow that uses a site audit tool. If you’re already a user, you can follow along step by step, pausing and resuming the routine as needed.

So first, you’ll need to go to a site audit tool. If this is your first project, you’ll see an option to create a new project right in the middle of the screen by entering your domain. For our example, I’ll be running an audit on problogger.com. You’ll need to set your seeds and scope.

First is scope, which is basically the boundaries of what you want to crawl on your site. Since we’re focusing on a basic audit, we’ll set our scope as ProBlogger’s entire domain, which includes their subdomains too. But you can run an audit on just a subdomain, a folder, or even an exact URL if you want to. You’ll see at the bottom of the screen that the tool validates the URL, so make sure you get a 200 response code before moving on to the next step.

The section down here is where the seeds are. Seeds are the URLs where the crawler will begin its crawl. There are a few options you can choose from here, like the specified URL (in this case, ProBlogger’s homepage). You can also choose to have your crawl start from URLs that have backlinks, from sitemaps, or from your own custom list of URLs. Since we’re keeping things simple, we’ll start from their homepage.

It’s important to note that your seeds must be within your scope. A common example: say you have a blog on your main domain and you run a Shopify store on a subdomain like store.yourdomain.com. If you wanted to isolate your audit to the store only and set your scope to store.yourdomain.com, but then set your seed to your main domain’s homepage or sitemap, your seeds would be out of scope and the crawl would never start.

All right, so click the next button and you’ll have the option to verify your website. Verifying your website is similar to how you would do it with Google Search Console.
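The seeds-within-scope rule described above can be sketched as a small check. This is my own illustration with hypothetical URLs and a hypothetical helper name, not code from any audit tool:

```python
from urllib.parse import urlparse

def seed_in_scope(seed_url: str, scope_domain: str, include_subdomains: bool = True) -> bool:
    """Return True if the seed URL's host falls within the crawl scope."""
    host = urlparse(seed_url).netloc.lower()
    scope = scope_domain.lower()
    if host == scope:
        return True
    # With subdomains in scope, store.yourdomain.com sits inside yourdomain.com.
    return include_subdomains and host.endswith("." + scope)

# The out-of-scope example from the text: the scope is the store subdomain,
# but the seed is the main domain's homepage, so the crawl would never start.
print(seed_in_scope("https://yourdomain.com/", "store.yourdomain.com"))        # False
print(seed_in_scope("https://store.yourdomain.com/", "store.yourdomain.com"))  # True
```

If a seed fails a check like this, fix the seed or widen the scope before starting the crawl.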
In short, the benefit of verifying is that your website gets crawled faster and you get access to some advanced features, but you don’t have to verify to run a site audit. For now, we’ll just click next, which takes us to the crawl settings. A lot of these settings are self-explanatory. The one I do want to recommend and touch on is the Execute JavaScript option. Setting this to on allows the tool to analyze pages and links that depend on JavaScript, which will give you the most accurate website audit.

So if you use JavaScript frameworks like Angular or React, then you’ll definitely want to set this to on. The last two things to set are the maximum number of internal pages and the maximum crawl duration. If you know you have a small website, you can leave these at the default settings of 10,000 pages and a max crawl duration of 48 hours, which should be sufficient. But if you’ve been blogging every day for the past 10 years, or you have some kind of user-generated platform like a forum, then you’ll want to set these higher. Since ProBlogger has been around for a while, I’m going to set the maximum number of pages to 50,000 and set the maximum allowed duration, to make sure we catch everything. There are some advanced features here if you really want to laser in on subsections of your audit, but I won’t cover those in this video. If you want to see more advanced tutorials on using a site audit tool, just let me know in the comments, or answer the poll in the top right corner of your screen. All right, last step: click next, and you’ll have the option to run a scheduled crawl on a daily, weekly, or monthly basis. This is super useful because as you keep adding pages, deleting them, and restructuring things on your website, the audit will continue to find issues on complete autopilot.
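The settings chosen above can be summed up in one place. This is a hypothetical configuration sketch; the key names are illustrative and don’t correspond to any tool’s real options or API:

```python
# Illustrative crawl configuration mirroring the choices made in this walkthrough.
crawl_settings = {
    "execute_javascript": True,     # needed for sites built on Angular/React
    "max_internal_pages": 50_000,   # raised from the 10,000 default for a large blog
    "max_crawl_duration_hours": 48, # the maximum allowed duration
    "schedule": "weekly",           # one of: daily | weekly | monthly
}
```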

And if you want to run just a one-off audit, you can turn the scheduled crawl off. Finally, if you want the audit to run immediately, leave this switch in the on position and click Create Project. Right away, you’ll be able to see the live crawl happening on your website and get real-time data on the Overview page, which is what we’re moving on to next. I already ran the full audit on ProBlogger, and you can see this fancy dashboard here with an overview of ProBlogger’s technical issues. The first thing you probably noticed is the health score. The health score represents the proportion of URLs on a site that are free of critical issues, and since many websites have thousands of pages, it’s expressed as a score out of 100. To simplify the concept: if we crawl 100 pages and 30 of them each have at least one critical issue, then your health score will be 70. On the Overview page, you’ll see a few graphs that cover the basics, like content types of internal URLs and HTTP status codes. It’s worth noting that everything you see on this page has clickable links, which will give you deeper insights in the Data Explorer. Here you can see that there are 1,184 4XX errors.
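Back to the health score for a moment: the arithmetic in that example can be written out as follows. This is my own sketch of the stated formula, not the tool’s actual implementation:

```python
def health_score(pages_crawled: int, pages_with_critical_issues: int) -> int:
    """Percentage of crawled URLs that are free of critical issues (rounded)."""
    if pages_crawled == 0:
        return 0
    healthy = pages_crawled - pages_with_critical_issues
    return round(100 * healthy / pages_crawled)

# 100 pages crawled, 30 with at least one critical issue -> a score of 70.
print(health_score(100, 30))  # 70
```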

Those 1,184 errors are 4.63% of their internal URLs, and they’re most likely broken 404 pages. If we click the link on this graph, it opens up the Data Explorer, where we can see all of the pages affected by this error. The Data Explorer is basically the heart of the site audit tool; this is where you can access literally all of the raw data and customize it however you want. You’ll notice that by clicking on one of the links from the Overview page, preset filters are set up for you, which you can expand by clicking here. If you’re an absolute beginner to technical SEO, I’d recommend sticking with the preset filters provided on the Overview page, like the 4XX errors we’re looking at right now, and moving on to your own custom configurations later.

Now, obviously, fixing over a thousand broken pages isn’t going to be at the top of your priority list, right? So what I’d recommend is prioritizing this workflow by adding one custom column: click Manage Columns, then in the search bar type “dofollow” and choose the number of dofollow backlinks under the metrics category. Click the Apply button, and right away you’ll see the new column, which you can then sort in descending order to see which 404 pages are wasting the most link equity.
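Once you’ve exported a list like that, the same prioritization is easy to reproduce offline. Here’s a minimal sketch using Python’s standard csv module; the column names and URLs are made up to stand in for a real export:

```python
import csv
import io

# A stand-in for an exported report of 404 pages (column names are illustrative).
export = """url,dofollow_backlinks
https://example.com/old-post,2
https://example.com/popular-guide,48
https://example.com/typo-page,0
"""

rows = list(csv.DictReader(io.StringIO(export)))
# Sort descending, so the 404s wasting the most link equity come first.
rows.sort(key=lambda r: int(r["dofollow_backlinks"]), reverse=True)
for r in rows:
    print(r["dofollow_backlinks"], r["url"])
```

The page with 48 dofollow backlinks surfaces first, which is exactly the page you’d want a freelancer to fix before anything else.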

This is one of the awesome features within the audit tool: you get access to a ton of metrics which you can include in virtually any audit report. You can then export the list to CSV and start picking away at each 404 error, or, with a massive list like this, you could outsource it to a freelancer and have them tackle each issue in the priority order you want them fixed.

OK, so back to the Overview page. If we scroll down a bit, you’ll see this graph of HTML tags and content where we can get some quick wins. The two things you should focus on are the bad duplicates and the ones that are not set, as indicated in red and yellow. The one that stands out here is obviously the meta descriptions. A good meta description is crucial for attracting clicks to your website, and more clicks means more visitors, right? So are these worth fixing? Most likely. Again, all of these sections are clickable. This particular site has 165 bad duplicates of the content itself, so basically duplicate content issues. We’ll click here to see the affected pages in the table.

The first result that comes up is this page on creating content. You might have noticed that the columns changed from the last time we were in here assessing 404 errors. That’s because each report in the Data Explorer is set up to provide you with the resources you need to actually analyze and fix these issues. Under the number of URLs having the same content, we can see that this one has two different pages. If we click on this, you can see the two pages here: one has a slash at the end and the other doesn’t. I’ll open up both of these pages in a new window, and sure enough, both are the exact same page without a proper redirect. Then I’ll open up the source code for each of these pages, and if I do a quick search for the word “canonical”, you’ll see that neither has one set. So it is indeed a bad duplicate.
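That canonical check I just did by hand, viewing source and searching for “canonical”, can be automated. Here’s a minimal sketch with Python’s built-in HTML parser; the snippets being checked are made-up examples, not ProBlogger’s actual markup:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def find_canonical(html: str):
    """Return the canonical URL declared in a page, or None if there isn't one."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Neither duplicate page declares a canonical URL -> a bad duplicate.
print(find_canonical("<html><head><title>Post</title></head></html>"))  # None
print(find_canonical('<head><link rel="canonical" href="https://example.com/post/"></head>'))
```

Run over both versions of a page, a `None` result on each confirms the same conclusion we reached by reading the source manually.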
So jumping back to the previous page, you’ll see that the reason we found this page in the first place is because of this column here, number of links. The correct URL has nearly 12,000 internal links pointing to it, while the one without the slash has just one internal link pointing to it. If we click on the number under number of links, we can see that the page with the improper hyperlink is their “Start Here” page. To correct this issue, there are potentially two things you could do here.

The first is to set the rel="canonical" tag inside the head section of the page. The second is to simply change the URL in the “Start Here” page to the correct one. Or you could just do both, since they’re pretty quick and easy to do. Clearly, you can see that this page is an important one, considering nearly half of the pages on the entire domain are linking to it.

OK, so let’s jump back to the Overview page so I can give you a bit more of a structured workflow. If you continue scrolling down the page, you’ll see this table, which shows all of the actual issues found during the crawl. There are three types of issues: errors, warnings, and notices, and you can choose a value in this dropdown to see each category. In terms of a workflow, what I’d recommend is filtering for errors and tackling those issues first, since they’re likely the most pressing. The cool thing about this table is that it doesn’t just tell you that your website has errors; it gives you actionable advice on how to fix them, too. So you might look here, see that your website has 219 redirect chains, and have no idea what those are. No problem: just click on the info icon and it’ll bring up the issue details as well as SEO best-practice advice on how to fix it. Next, you can click on the number under Total URLs to see the affected pages. If you’re a pen-and-paper kind of person, you can just export this list, print it out, and pick away at each issue, finishing off by adding a satisfying checkmark to your list. Or, if you have a team of SEOs on your side, you can export each issue, send over the CSV file, and assign it to the appropriate person. Then you can go back to the Overview page, continue working through the different issues, and move on to the warnings as well as the notices.

As your scheduled crawl continues to run at your set interval, you should see your health score go up, and hopefully that will result in more organic traffic for your website. So that’s it for this SEO tutorial. Technical SEO issues are one of those rare things that you have complete control over in search engine optimization.

So I highly, highly, highly recommend going in and fixing these issues, or at least running an audit to get a top-level view of your website’s SEO health. Plus, you’ll be improving the user experience for all of your wonderful visitors.
