
How to Perform a Complete SEO Site Audit: A Step-by-Step Guide (25+ Useful Tools)

The main tasks facing an SEO specialist are a thorough study of the site’s content, structure, technical characteristics and the external factors that affect it; identification of weaknesses; and planning and implementation of the necessary improvements, taking into account the principles of search engines.

Most importantly, such analysis and optimization should be performed regularly, because search algorithms are constantly being improved, and many SEO tactics become obsolete and stop working.

While you will get the best results from an SEO audit by working with a professional, you can learn a lot on your own using the guide below, the links to external content, and the SEO tools you will find in the second half of this article. For your convenience, all of the tools mentioned in the article are clickable. Throughout the article, Yandex and Google tools are cited as the most important and practical tools for search engine promotion.

Step 1. Preparing for an SEO Audit

The best place to start is to explore your site using a crawler (spider) such as Screaming Frog SEO Spider. This tool analyzes the code, content, internal and outgoing links, images, and other elements of the site from an SEO point of view and gives an overview of its overall state.

We must not forget about the capabilities of the search engines’ own services – Yandex.Webmaster and Google Webmaster Tools – they also provide a large amount of valuable information.

Step 2. Internal SEO Audit

Technical audit

Robots.txt

The robots.txt file is placed in the root directory of the site (its presence is optional) and contains indexing instructions for search engine robots.

Using various robots.txt directives, you can:

  • prohibit or allow robots to index certain sections and pages of the site, or the whole site;
  • specify the path to the sitemap.xml file, which facilitates proper indexing;
  • let the robot know which site mirror, if there are several copies, is the main one, and which mirrors do not need to be indexed;
  • reduce the load that search bots place on the site, if you need to conserve resources.

At the same time, different rules can be created for individual search engines and even for different bots of the same system.
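For example, a minimal robots.txt combining these possibilities might look roughly like this (the domain, paths, and values are purely illustrative):

  # Rules for all robots
  User-agent: *
  Disallow: /admin/        # service area that should not be indexed
  Disallow: /search/       # low-value internal search results
  Crawl-delay: 5           # supported by some bots (e.g. Yandex), ignored by Google

  # Stricter rules for one specific bot
  User-agent: Googlebot-Image
  Disallow: /private-images/

  # Path to the sitemap
  Sitemap: https://www.example.com/sitemap.xml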

Use all the features of the robots.txt file. Make sure that indexing of “sensitive” areas of the site, pages with poor-quality content, and duplicate pages is prohibited, and check that access is allowed to all areas of the site that should be indexed by search engines.

Yandex.Webmaster, Google Webmaster Tools, and other services will help you analyze your robots.txt file.

Directives for search bots in meta tags

For even more flexible control over the indexing of the site and its individual pages, place directives for search bots in the robots meta tag. This lets you allow or prevent bots from indexing a specific page and from following the links placed on it.
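A hypothetical example of such directives in the head of a page that should not be indexed but whose links may still be followed, plus a rule aimed at one specific bot:

  <!-- This page: do not index, but follow the links on it -->
  <meta name="robots" content="noindex, follow">

  <!-- The same kind of rule can target a specific robot -->
  <meta name="googlebot" content="noindex, nofollow">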

XML Sitemap

A sitemap file is added to the root directory of the site and tells search engines which pages of the site to index, which of them should be indexed first, and how often they are updated.
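For reference, a minimal sitemap.xml with a single entry looks roughly like this (the URL, date, and values are invented for illustration):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/seo/site-audit-guide/</loc>
      <lastmod>2016-05-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
    </url>
  </urlset>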

If the site already has a Sitemap file (preferably in XML format), check the accuracy of its code using a validator (such a tool is available, in particular, in the services for webmasters of Yandex and Google). Also, make sure that your sitemap contains no more than 50,000 URLs and weighs no more than 10MB. If these limits are exceeded, you will need to create multiple sitemaps and a Sitemap Index file listing all the maps.

If you don’t have a sitemap yet, create it manually or by using one of the many tools (e.g. XML Sitemap and its analogs, plugins for WordPress, and other common engines; a large list of tools can be found on Google Resources).

After creation, analyze the map in the validator and notify search engines of its existence through their webmaster services, as well as by adding the sitemap path to the robots.txt file.

Markup Validator is an HTML code validator that helps eliminate inaccuracies and code errors which can reduce the site’s position in search results. It is maintained by the W3C, the community that develops and approves international web standards.

Site Indexing Quality

The site: operator is usually entered in the search bar to narrow a search to a specific domain. But this command also lets you find out the approximate number of pages of the site that have been indexed. To do this, simply enter site: followed by the domain name, without any search words.
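For example, to estimate how many pages of your site are in the index, you could type a query like this into Google or Yandex (example.com here stands in for your own domain):

  site:example.com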

Compare the number of indexed pages with the total number of pages on the site, which you learned at the stage of creating the sitemap.xml and crawling the site with Screaming Frog, Xenu’s Link Sleuth, or other tools.

If the two numbers are almost identical, the site is well indexed. If not all pages are indexed, find out why: perhaps the site is not optimized for search engines, has many sections and pages closed from indexing, or has been penalized. If the number of indexed pages exceeds their actual number, you probably have a lot of duplicate content on the site, which will be covered later in this article.

Let’s take a closer look at how this works in Google Webmaster Tools and what it shows.

When you open it, you see a graph showing the number of pages currently indexed. The graph shows data for the last year.

If the graph shows that the number of indexed pages is constantly increasing, that’s great: it means that new content on the site is being found, crawled, and added to the Google index.

But sometimes more detailed information is needed. For this, there is an “Extended data” button. Clicking it opens a new graph that displays not only the number of pages indexed at any given time but also the total number of pages crawled over the entire specified period. It also shows the number of pages blocked by the robots.txt file and the number of unindexed pages.

HTTP Status Codes

The HTTP status code is part of the first line of the server’s response to an HTTP request. It is a three-digit integer found in the first line of the response when a web page is requested, and it indicates the page’s current state. You need to determine which URLs on the site return an error – usually a 4xx or 5xx code. For example, code 404 means the page was not found, and code 503 means the server is temporarily unavailable. Code 200 means everything is working fine.
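For illustration, the status line is the very first line of the raw server response; typical examples look like this:

  HTTP/1.1 200 OK                   (page is available)
  HTTP/1.1 301 Moved Permanently    (permanent redirect)
  HTTP/1.1 404 Not Found            (page does not exist)
  HTTP/1.1 503 Service Unavailable  (server temporarily unavailable)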

If the site uses redirects from one URL to another, make sure they are 301 redirects, not 302 redirects, and not redirects implemented with meta refresh tags or JavaScript. If a 302 (temporary) redirect is used instead of a 301, the original URL will remain in Google’s index and retain its position as if the page were still available. However, users who click on the link will be redirected to your new URL – exactly where you intend to direct them.
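As a sketch, on an Apache server a permanent redirect can be configured in the .htaccess file along these lines (the paths and domain are hypothetical):

  # .htaccess: 301 (permanent) redirect of a single URL
  Redirect 301 /old-page/ https://www.example.com/new-page/

  # Redirecting a whole section with mod_rewrite
  RewriteEngine On
  RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]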

To check HTTP status codes, you can use various services – for example, specialized tools such as Monitor Backlinks, or the built-in tools of Yandex.Webmaster and Google Webmaster Tools.

Page URLs

A well-formed page URL is no more than 100-120 characters long, consists mainly of easy-to-read words, and contains keywords describing the page (for example: https://myacademy.ru/kursi/seo-poiskovaya-optimizatsiya/seo-dlya-nachinayuschih).

All of this not only helps better search indexing but also increases convenience for site visitors.

Try to avoid complex URLs with parameters and favor static links; use directories rather than subdomains for sections of the site structure; and separate words in the URL with hyphens or underscores – this spelling is easier for site visitors to read.
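For example (both addresses are invented):

  Hard to read:   https://www.example.com/index.php?cat=12&id=345&sid=a1b2c3
  Easier to read: https://www.example.com/seo/site-audit-guide/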

Site loading speed

Internet users are impatient and quickly leave slow sites. Search engines, likewise, allocate a limited amount of time to processing each site, so fast sites are indexed more thoroughly and in a shorter period of time.

How do you analyze site loading speed?

Use the tools of web analytics systems (for example, page load time reports are available in Google Analytics and Yandex.Metrica). For the most comprehensive speed analysis, you can use specialized services such as Google PageSpeed Insights or YSlow.

If the site needs to be sped up, optimize the images in a graphics editor using its function for preparing graphics for publication on the web, reduce the amount of HTML and CSS code, remove unnecessary JavaScript, and make sensible use of compression, browser and server caching, and other techniques.
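As a rough sketch, on an Apache server compression and browser caching can be switched on in .htaccess roughly like this (the exact content types and lifetimes depend on the site):

  # Compress text resources (mod_deflate)
  <IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
  </IfModule>

  # Let browsers cache static files (mod_expires)
  <IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
  </IfModule>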

Step 3. Site Structure Audit

Site architecture

The site should have a clear, logical structure, with pages sorted into categories and closely linked to each other by internal links.

Avoid deep nesting: keep all important pages within one click of the main page, and other pages no more than 3-4 clicks away.

Such an easy-to-use site architecture will allow search engines to index all pages of the site faster, and will help visitors avoid getting lost and quickly find the information they need, which in the end also has a positive effect on SEO.

Try not to use navigation menus created with Flash and JavaScript on your site. This is undesirable, although search engines are much smarter today than in the past.

If, however, JavaScript navigation is present on the site, crawl the site twice using Screaming Frog, Xenu’s Link Sleuth, or another specialized service (we talked about these at the start of this guide): once with JavaScript enabled and once with it disabled. This will reveal whether certain sections and pages of the site are inaccessible for indexing because of the JavaScript menu.

Internal Links 

Internal links contribute to better indexing of the site and a sensible distribution of page weight.

Page Rank Decoder, a tool for predicting how page weight is distributed under various linking schemes, will help you with this difficult task.

Create plenty of links between the pages of the site, following a few simple rules (a short example follows the list):

  • use not only keywords as anchors, but also various neutral texts – for example, calls to action such as “check it out”, “download”, etc. (this makes the overall link mass look more natural to search engines, while an abundance of keywords looks suspicious);
  • anchor texts and keywords should be relevant to the content of the landing pages;
  • direct more links to the pages that should occupy higher positions;
  • link to these pages from the main page;
  • don’t place too many internal links on one page.
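A hypothetical illustration of mixing keyword anchors with neutral ones (all paths and texts are invented):

  <!-- Keyword anchor pointing at a page you want to rank higher -->
  <a href="/seo/site-audit-guide/">SEO site audit guide</a>

  <!-- Neutral, call-to-action anchors pointing at the same section -->
  <a href="/seo/site-audit-guide/">read the full guide</a>
  <a href="/seo/audit-checklist/">download the checklist</a>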

Step 4. Content Audit

Page titles

Page titles are the first thing your audience sees in search results and on social media, and based on them people decide whether to visit your site. Therefore, it is important to pay special attention to optimizing them.

Keep your titles brief: try not to exceed 70-90 characters; otherwise the title may be cut off in search results and on social media, and Twitter users will not be able to add a comment to it.

The titles of the site’s service pages and various informational pages (with the exception of articles and other similar content) must accurately describe their content.

Remember to add keywords to your page titles – preferably near the beginning. But don’t go overboard: as with any page content, write for people, not machines.
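For instance, a title tag that follows these recommendations might look like this (the wording is invented):

  <head>
    <!-- Keyword near the beginning, under ~70 characters, readable for humans -->
    <title>SEO Site Audit: A Step-by-Step Guide for Beginners</title>
  </head>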

Make sure all the pages on your site have unique titles. Google Webmaster Tools, for example, will help you here: it has a tool for finding pages with identical titles.

Page descriptions in meta tags

A page description from the description meta tag can be included in the snippet shown in search results, so it is worth taking a responsible approach to the meta descriptions of important pages.

A snippet is the block of information about a found document that is displayed in search results. It consists of the document’s title and a description or annotation, and may also include additional information about the site.

Write descriptions of several words, with a total length of 150-160 characters. A description should be coherent text that describes the specific page, not the entire site, and it should not be overloaded with keywords. Keep descriptions up to date: if the information on the page has changed and the description is outdated, update it.
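A hypothetical example of such a description in the page code:

  <head>
    <!-- Roughly 150-160 characters, describes this particular page, no keyword stuffing -->
    <meta name="description"
          content="A step-by-step SEO site audit: technical checks, site structure, content and backlinks, plus more than 25 useful tools for doing it yourself.">
  </head>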

Let each page have a unique description. You can use Google Webmaster Tools to identify all pages with identical meta descriptions.

Keywords in meta tags

For some time now, most search engines have ignored the keywords meta tag, so it makes sense not to add this information to the site’s code at all, if only so as not to give competitors extra data about your SEO strategy.

Content

It is a commonplace by now, but you need to create content for people, not search engines. Don’t get carried away with search engine optimization – it makes the text almost unsuitable for comfortable reading and ultimately has a negative effect on SEO results.

Make sure that your site’s pages contain valuable, unique content rather than over-optimized text and copies of other sites’ pages, and that the amount of content on each page exceeds 300-400 words (it has been shown that, other things being equal, pages with 2,000 words or more tend to rank higher in search results). That said, modern algorithms effectively determine the optimal amount of content for different topics: in some topics 200-400 words per page will suffice, while in others you will need long texts. If you take the issue really seriously, the right amount of content for each page is determined by analyzing the search results for specific queries in a specific region.

Here, the following services will come to your aid:

ContentMonster

Content exchange. Checks text for uniqueness, monitors the quality of the authors’ work, and returns 120% of the cost of the text if you consider it low quality.

Webex

Combines an article exchange and an article promotion service. Editors will write articles for your project, and webmasters will post articles with permanent links on reliable sites.

Content-Watch

Tool to check the uniqueness of text content. Allows you to check the uniqueness of the text entered by the user and the uniqueness of the content of your site.

Advego Plagiatus

Free software to check the uniqueness of texts. Shows the percentage of matching text and its sources. Checks both pasted text and web pages.

Pay attention to the presence of keywords in the page text – above all in the first paragraphs. Edit the text so that the use of keywords does not lead to repetition and meaningless phrases written for the sole purpose of mentioning the keyword once more. Pleasant, easy-to-read, and useful text, with keywords invisible to the reader – that is what you should strive for.

It goes without saying that the site’s text should be free of grammatical errors: sloppy writing signals an unprofessional author, and such content will occupy low positions in search results.

The following services will help:

Orfograf – a spell checker, one of the very first services that Artemy Lebedev offered to the Runet.

 

Yandex.Webmaster spell checker – processes an entire web page. Checking spelling with Yandex is important when it comes to neologisms: for contested words such as transliterations of “Twitter”, Yandex will tell you in which form its robot is more likely to recognize the word.

Duplicate Content

If anything on your site, or outside it, duplicates other content, search engines have to determine which version to index and show in search results. The problem only gets worse when other sites link to different copies of the page, or when the same page opens at different URLs.

Here are some of the reasons duplicate content appears:

  • the site’s CMS can make the same page available at different links;
  • the second part of the URL on many sites is generated dynamically and contains additional parameters;
  • content is often stolen and published on other resources without a backlink, leaving the search engine to work out which copy is the original;
  • when a user visits the site, a session with a unique identifier may be created and used in dynamic URLs (this is necessary, for example, to temporarily store information about items added to the cart until the order is placed);
  • print-optimized versions of web pages may be considered duplicates.

Duplicate content on the site can be detected, for example, using Google Webmaster Tools (the service can find pages with identical titles and descriptions) and search operators added to queries in the search bar.

You can solve the problem of duplicate content by simply deleting it, by creating 301 redirects, by disallowing indexing of duplicates in the robots.txt file or in individual pages’ meta tags, by using the rel="canonical" directive, and by other means.
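For instance, every duplicate or parameterized version of a page can point search engines to the preferred URL with a link element like this (the address is hypothetical):

  <!-- Placed in the <head> of each duplicate version of the page -->
  <link rel="canonical" href="https://www.example.com/seo/site-audit-guide/">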

Headings in the text

Organize the text of your publications clearly. Use headings (the h1 tag for the most important of them) and multi-level subheadings, and highlight individual fragments of text in bold or italics.

Ideally, there should be only one h1 on the page, containing two or three second-level headings, which in turn have lower-level headings “nested” inside them, and so on. The structure of the document should look something like this:
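Here is a minimal sketch in HTML (the headings are invented, and the indentation is only for readability):

  <h1>How to Perform an SEO Site Audit</h1>
    <h2>Technical audit</h2>
      <h3>Robots.txt</h3>
      <h3>XML sitemap</h3>
    <h2>Content audit</h2>
      <h3>Page titles</h3>
      <h3>Meta descriptions</h3>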

It is important to observe moderation: too many bold headings and keywords will alienate readers just as much as unstructured, monotonous text. Thoughtful formatting helps the reader take in the text, and headings containing important words are also a positive SEO signal. And the best way to structure content on the site is, once again, to think primarily about the people who read texts online.

Images

Many visitors search for information using images, and there are a lot of such people. Google and Yandex both offer image search, where the use of keywords is just as important as everywhere else.

And where there is search, there is traffic you can put to your own use. Proper optimization of images for search engines brings results.

The most important image attributes are alt and title. When uploading an image for a post, all you need to do is fill in these attributes, and your images will start bringing in visits.

To do this, analyze all the important images on your site that are thematically related to the content of the page. For full indexing, the names of the image files, as well as the value of the alt attribute, must contain words describing the image and, preferably, the page’s keyword. If they do not, change the file name and the alt attribute value accordingly.

Separate words in a file name with hyphens, not underscores. The alt attribute should be used for the description, not the title, and it should be short – no longer than 150 characters, with all important words at the beginning of the text.
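Putting these recommendations together, the markup for an image might look like this (the file name and texts are invented):

  <img src="/images/seo-site-audit-checklist.jpg"
       alt="SEO site audit checklist lying on a desk"
       title="SEO site audit checklist"
       width="800" height="533">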

Of course, it is best if the images are unique, rather than simply the first pictures found on the internet.

Ensure that all images published on the site load quickly: crop unnecessary areas, keep image dimensions no larger than necessary, and optimize the files in a graphics editor using its built-in tools for preparing images for the web.

Step 5. Improving Analytics

To make it easier to track your metrics in search, use the recommended services:

Seolib.ru – automatically monitors changes in the site’s positions in search results. It tracks site visibility, analyzes link quality, and helps find and correct errors.

Seopult.ru is an automated website promotion service. It provides recommendations for promotion and a convenient system of presets, buys links, and analyzes how effective the work is.

MegaIndex is another website promotion automation service: express audit, traffic counter, compilation of a semantic core and work with key queries, competitor analysis, and social and behavioral factors.

BotScanner – analyzes site visitor data and draws conclusions about traffic quality. It allows you to separate bots and casual visitors from “useful” users, evaluates the effectiveness of different traffic channels, and presents the information visually.

Step 6. External Links Audit

Outbound links

Site owners often pay attention only to the links pointing at their site, but you also need to take care of outbound links.

Avoid linking to low-quality sites and sites with a bad reputation: this will reflect negatively on your own rankings. If placing such a link is unavoidable, use the rel attribute with the value nofollow in the link tag so that search engines do not pass weight through it.

Link to external content that is thematically related to your page, and make sure the link text contains words describing the target page.
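A hypothetical example of an outbound link marked so that it passes no weight (the address is invented):

  <!-- rel="nofollow" asks search engines not to pass weight through this link -->
  <a href="https://untrusted-site.example.com/" rel="nofollow">partner offer</a>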

Backlinks

Backlinks are still taken into account by search engines, and both their number and quality have to be considered, although trends point to a decline in their influence in the future.

Use services such as MajesticSEO, Open Site Explorer, Ahrefs, or the Yandex and Google webmaster tools to find out how many sites link to yours, whether they are thematically related to your site, and so on.

If many quality sites in your industry link to you, you benefit. If those sites have a bad reputation and poor-quality content, search engines will begin to treat your site the same way.

Try to buy backlinks whose text contains the keywords you need. At the same time, observe moderation: a large number of inbound links from other sites with the same or similar anchor text can lower your site’s position for the keywords contained in the link text. The overall mass of backlinks to your site should look natural to search engines: let some links have keywords in their anchors, while others use calls to action such as “click here” or “read more”, or other neutral text.

RotaPost – buying links and reviews, advertising in blogs. The platform connects advertisers with owners of blogs, microblogs, and news sites willing to contribute their resources to post links.

B2Blogger is a service for distributing press releases in the Runet. It helps distribute press releases to thematic resources and push posts to the Yandex and Google news aggregators. It is well indexed by search engines and monitoring services.

Pr.Sape is an exchange for buying eternal links. The service offers varying degrees of customization, from manual mode to full automation. It allows you to set budgets, filter sites, control link indexing, and buy links on social networks.

Miralinks is an article marketing tool. It places articles on manually verified sites, and there is a bank of ready-made articles written by the company’s editors.

GoGetLinks is another exchange for buying perpetual links. It offers placement of links in notes, contextual links, and links in images, and automates the process of creating and placing notes.

SEO Analysis of Competitor Sites

An important part of the audit is looking at the SEO strategies and mistakes of your competitors.

Defending against negative SEO

Negative SEO combines a set of black-hat and unethical methods and technologies used by competitors to lower your site’s ranking in search results or even get it banned by search engines. Here are some of the ways these goals are pursued:

  • hack and compromise the site;
  • spread thousands of spam links on your site through comments;
  • create clones of your site and copy content without your permission;
  • create fake social media profiles, through which negativity spreads against your business and your website;
  • remove trusted links from other sites to your site.

Conclusion

A good SEO audit will not only make your site more search-engine friendly but will also increase its usability and value to visitors.

Provided it is performed correctly and regularly (at least twice a year), an audit significantly increases traffic – both from search engines and from other sources – because users love convenient sites and come back to them again and again.
