White hat SEO techniques list

If we want to identify the main pillars of SEO knowledge, we must mention content production, link building, on-page SEO and technical SEO. White hat SEO, on the technical side, covers any work on the website's code and its technical preparation that helps Google's bots better understand the site's content and link structure.

Simply put, in SEO we tell Google and its crawlers what the content of our web pages is and how the pages link to one another.

White hat SEO techniques list

1- Every page of the website should be unique and valuable

To put it briefly and without padding, every page of your website should have two important features:

  • Be unique. This means that it is not the same as other pages on your website.
  • Have its own value. It means providing users and searchers with the answers they need.

But sometimes it is hard to pin down the specific value of each page. To understand the problem better, let’s look at an example. Suppose you have three dedicated pages for topics X, Y and Z. At the same time, you notice that there are pages on your site that are very close to these three in terms of content. Each of the following conditions could be an example of this seemingly duplicate content:

  • A page with a theme consisting of X and Y themes
  • A page with a theme consisting of Y and Z themes
  • Another page with topics X and Y with a slight difference in the title and topics covered in it
  • A page with a compound theme of Z and X that has no content value but for some reason you have to keep it

What is the best solution when you encounter such pages? What can be done to keep visitor traffic and search engine value focused on the main X, Y and Z pages?

You should use one of these three steps to deal with such pages:

Using a canonical tag

In various cases, your website may have pages with the same content but different URLs. For example, a mobile version (m.aaa.com) and a normal version (aaa.com) of the website, or the http and https versions of the site, may both be indexed by Google. This situation, and the conditions we described for pages X, Y and Z, make it difficult for Google to choose the reference page. What will Google do in this situation?

Google considers these pages to be separate pages (due to different URLs) but with duplicate content. This calls into question the basic principle of white hat SEO, which is to have unique and valuable pages.

The canonical tag has a simple but very important function. If several pages on your website are close in content or, as we said, the same content is published at different URLs, you can use this tag to point Google to the reference page.

Dealing with duplicate content in this way saves your site from being penalized by Google. Using the rel="canonical" tag lets you specify the priority and importance among duplicate pages, so that in Google’s view there are practically no duplicate pages on your site. In this way, in addition to staying clear of the Panda algorithm, you direct the focus and the inbound search traffic for your target keyword to your main page.
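As a minimal sketch (the URL https://example.com/x/ is only a placeholder for your main X page), the tag is placed in the <head> of the near-duplicate page and names the reference version:

  <!-- in the <head> of the duplicate or secondary page -->
  <link rel="canonical" href="https://example.com/x/" />

Google then consolidates the ranking signals of the duplicates onto the page named in href instead of splitting them.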

Remove duplicate pages

The second option is to delete pages with duplicate content. Some pages with duplicate or near-duplicate content may be of little value to users and, of course, to you. In this case, removing them from the website makes sense and will ultimately benefit your site. (If you would like to know how Google deals with copied content, we suggest reading the Google Panda Vancouver SEO article.)

Block access to Google bots

The third solution is to restrict the access of Google’s bots to certain parts of the site. There may be pages on your site that are useful for users searching from within your website, but indexing them on Google is unnecessary. In this case, restricting Google’s bots with the robots.txt file or a robots meta tag is one of the options available to you.
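As a rough sketch of both options (the /internal-search/ path is only an illustration), you can block crawling with robots.txt or block indexing with a robots meta tag:

  # robots.txt -- ask all crawlers not to crawl internal search result pages
  User-agent: *
  Disallow: /internal-search/

  <!-- or, inside the <head> of an individual page: let it be crawled but not indexed -->
  <meta name="robots" content="noindex, follow">

Note the difference: robots.txt stops a page from being crawled, while the noindex meta tag keeps a crawlable page out of the index; a page blocked in robots.txt cannot show Google its meta tags at all.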

2- Easy access of bots to website pages and high loading speed

Never underestimate the importance of your web pages’ loading speed. The second item on the technical SEO checklist, which you should be fully aware of, is comprehensive code optimization to speed up page loading. The most important optimizations here are reducing image sizes and improving how quickly the site receives a response from the server.
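A minimal sketch of image markup that supports this (file names and dimensions are placeholders): a compressed format, explicit width and height so the layout does not jump while loading, and lazy loading for images further down the page:

  <img src="product-480.webp"
       srcset="product-480.webp 480w, product-1200.webp 1200w"
       sizes="(max-width: 600px) 480px, 1200px"
       width="1200" height="800"
       loading="lazy"
       alt="Product photo">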

By doing these optimizations, two important things happen on the website. First, Google and its bots will crawl your website faster, so the crawl budget Google allocates to reviewing your site is used more efficiently. Second, users will be more satisfied with the faster performance and will have a better experience on your website, which ultimately benefits you.

Greater satisfaction and a pleasant experience on the website bring good consequences such as earning links, being shared by users and repeat visits. User satisfaction is an interesting and important topic that deserves a discussion of its own.

Another point on the technical SEO checklist in this regard is correct display in simple, text-based browsers. Sometimes JavaScript code, Flash files and other scripts get in the way of rendering in some browsers. Even in such cases, your website should have no trouble showing its main content to Google’s bots and to users.

Simply put, Google does not need the details of all your scripts, images or videos to rank your pages. Your website should therefore be able to serve all of its pages as clean HTML and text, quickly and reliably.

3- Examining low-value, duplicate content and content loops

Before offering a solution for such pages, it is good to know these three types of content better:

  • Low-value or thin content: content that offers users no significant or unique value. In other words, it does not cure the users’ pain.
  • Duplicate content: content that has been copied to more than one page of your website for any reason. Consider, for example, the “mobile” and “printable” versions of an article, which are copies of each other.
  • Crawler traps and content loops: some content traps Google’s bots and crawlers in an infinite loop for a variety of reasons. For example, content tied to a calendar whose pages keep linking on to the coming days and years.

In the case of low-value content and crawler traps, the most common solution is to delete the pages. But another solution, which we mentioned in the first item of this checklist, is to use a canonical tag to set priorities between pages.

With the help of this tag, you can tell Google that the printable version of the article is not your main page and that its value and incoming Google traffic should be transferred to the main (mobile) version of that content.
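A minimal sketch of that idea, assuming the main version of the article lives at the hypothetical URL https://example.com/article/ and the printable copy at /article/print/:

  <!-- in the <head> of the printable copy -->
  <link rel="canonical" href="https://example.com/article/" />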

4- Shallow linking to access the best site articles with the least clicks

The structure of website linking should be such that the user is able to access your most valuable articles quickly and with a minimum number of clicks.

We guess you are not very interested in discussing math and statistics, but a math example will give you a better understanding of this item on the checklist. Suppose the main page of your website contains 100 links to your category and internal pages. Within each of these pages there are 100 more links that lead users to deeper pages. So far, users can reach any one of ten thousand pages on your site within two clicks.

If you do the math, one more level of 100 links brings this number to 1 million pages, which is significant for a website of any size. Although this exact form of linking does not happen in reality, even getting a little closer to this link structure creates ideal conditions: you can take the user from your homepage to your important pages and valuable content with the fewest possible clicks.

A good target is for users to reach the important pages of your site within three to four clicks, a structure you should also try to reflect in your sitemap.

If your site has a complex structure, using a sitemap will prevent a potential problem for Google with that structure. Search engines have no trouble with HTML sitemaps that lay out the structure and hierarchy of your site’s pages.
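A minimal sketch of such an HTML sitemap (section and article names are placeholders): a plain nested list that puts every important page within a few clicks of the homepage:

  <nav>
    <ul>
      <li><a href="/glossary/">Glossary</a>
        <ul>
          <li><a href="/glossary/301-redirect/">301 redirect</a></li>
          <li><a href="/glossary/google-algorithms/">Google algorithms</a></li>
        </ul>
      </li>
      <li><a href="/blog/">Blog</a></li>
    </ul>
  </nav>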

5- Make the web pages responsive and optimize for any internet speed

This item from the technical SEO checklist has fortunately received more and more attention in recent years. A responsive version of the website is one that displays correctly and to standard on every device, from the desktop to a small mobile screen. When designing and implementing the responsive version, paying attention to the device’s connection type and optimizing for different internet speeds, from 2G to 5G, is another important requirement. Google’s mobile-first indexing is its main mechanism for evaluating mobile-friendly sites.
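A minimal sketch of the two ingredients most responsive pages start from (the 600px breakpoint and the .sidebar class are only examples): the viewport meta tag and a CSS media query:

  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    .sidebar { width: 30%; }
    @media (max-width: 600px) {
      .sidebar { width: 100%; } /* stack the sidebar on small screens */
    }
  </style>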

6- Correct use of 404, 503 and 301 redirect codes

HTTP status codes tell Google the state and condition of each page of your website. The most common status codes on websites are 404, 503, 301, 302 and 200.

What conditions does each of the HTTP status codes report to Google?

If your server is temporarily down or you want to make changes to your site or server for a few days, code 503 will notify Google of these temporary conditions.

Code 404 indicates that a page no longer exists. If you have deleted an article or page, a link to it may still appear in Google’s results or in your site’s internal links, and the page may still be indexed by Google. In this case, a user following that link will encounter a 404 error, and Google’s bots will also notice that the page is gone when they visit such links.

HTTP status code 301 is used to redirect, that is, permanently move, a page to a new address. A 302 redirect is also used to change the address of a page, but only temporarily.

The rest of your website’s pages, which are working normally and have no special status, should report status code 200 to Google. Code 200 indicates that a page is stable and serving without problems.
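As a sketch of how these codes are typically put in place on an Apache server via the .htaccess file (paths and addresses are hypothetical; other servers have equivalent directives):

  # 301: permanently move a deleted or merged page to its new address
  Redirect 301 /old-article/ https://example.com/new-article/

  # 404: the custom page shown when a requested page no longer exists
  ErrorDocument 404 /404.html

  # 503: the maintenance page shown while the server is temporarily unavailable
  ErrorDocument 503 /maintenance.html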

You will rarely come across other status codes, such as the rest of the 2xx, 3xx, 4xx and 5xx series, which apply to very specific conditions. If you use SEO tools such as Moz or Screaming Frog and encounter HTTP status codes other than those described in this section, it is better to follow the description of the status, and how to solve the possible problem, through those same tools.

7- Using SSL and launching the HTTPS and secure version of the website

Introducing the secure https version of your site to Google is smarter than introducing the insecure http version. And to settle the matter for the search engine, it is best to declare the original version of your website (naturally the secure https version) to Google as the first priority, using the widely used and valuable canonical tag.

Until recently, the secure https version did not seem to make much more difference than the http version. But day by day, as algorithms change and Google pays more attention to security and user satisfaction, using this version becomes more important; under these circumstances, it is likely that this point is no longer negligible for a site’s SEO.

8- Basic use of subdomain and subfolder in the site

Multiple domains or one domain? That is the question! A general rule in SEO strongly recommends using a subfolder instead of a subdomain. Although there are exceptions, and some Google representatives have even stated that the two models make no difference, subfolders are much more common, and our experience across different sites has shown that they are more useful for SEO.

Let’s see what the difference between a subdomain and a subfolder is.

With a subdomain, you create a new address such as blog.vancouver-seo to separate a section of the website, such as the blog. With a subfolder, instead of creating a new subdomain, you use a folder to separate that section; for example, this is what we have done on the Vancouver SEO site for the glossary section:

  • https://bc-tech.ca/glossarys/google-algorithms/

In addition, unlike using a subdomain, there will never be a serious problem using subfolders. Using this method of organizing the site has been recommended by many experts with SEO experience.

What should be considered in building subfolders?

Perhaps the only things to consider when creating subfolders are keeping the names short and keeping them well organized.

Note the following two types of URLs:

  1. https://bc-tech.ca/glossary/301-redirect/
  2. https://bc-tech.ca/301-redirect-glossary/

Using the first method has several major advantages over the second. First, Google is very good at understanding directory structures and analyzing how web pages are organized. In addition, shorter URLs are preferred by Google, and Google’s bots do not get along well with long URLs built by stringing words together with dashes.

Last but not least, creating a content folder and building URLs in this way, especially when several other pages live in the same subfolder, helps build a useful breadcrumb for your site. That is the last, and of course one of the most important, items in the technical SEO checklist.

Does choosing between www or without www addresses affect SEO?

No. The most important thing is that you must choose only one of these two addresses.

Our suggestion in this case:

Based on our experience with websites, addresses that use www behave much like a subdomain, which can be particularly confusing; with addresses that drop the www, no such redirection comes into play.

So the main thing is to select one of the two and 301-redirect the other to the selected address.

The same applies to HTTP and HTTPS, and you should handle it the same way.

Four addresses of which only one should be selected:

  • HTTP
  • HTTPS
  • WWW
  • Without WWW

Whichever of these addresses you select, 301-redirect the rest of the addresses to the one you selected.
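As a rough sketch with Apache mod_rewrite, assuming the chosen address is https://example.com without www (nginx and other servers have their own equivalents):

  RewriteEngine On
  # send every http:// or www. request to the single chosen https, non-www address
  RewriteCond %{HTTPS} off [OR]
  RewriteCond %{HTTP_HOST} ^www\. [NC]
  RewriteRule ^ https://example.com%{REQUEST_URI} [L,R=301]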

9- Using breadcrumb in websites with complex structure

The cool thing about this technique is that Google learns your site by examining the breadcrumb! The last item on the white hat technical SEO checklist is organizing the site structure with the help of breadcrumbs, creating better conditions for Google’s bots to learn your website.

One of the most important benefits of creating a breadcrumb is the proper display of your links on the Google results page. Especially on mobile, where there is little space to display search results, this is very important and will not escape the attention of Google or users.
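A minimal sketch of breadcrumb markup using schema.org structured data (the names and URLs are placeholders); this is the kind of markup that lets Google show the trail in its results:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
      { "@type": "ListItem", "position": 1, "name": "Home",
        "item": "https://example.com/" },
      { "@type": "ListItem", "position": 2, "name": "Glossary",
        "item": "https://example.com/glossary/" },
      { "@type": "ListItem", "position": 3, "name": "301 redirect" }
    ]
  }
  </script>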

Lastly, a non-technical tip for SEO

So far, we have reviewed nine of the most important things to consider in technical SEO. Many of these items can be handled with a little coding knowledge. Others, which require more specialized knowledge, can easily be assigned to and implemented by experts. We therefore recommend that you do not ignore this powerful white hat SEO pillar on the pretext of limited coding knowledge.

In the next part, we will move on to link building to improve SEO. With link building, you can tame the SEO monster and greatly outperform your competitors. The question is not whether to build links, but rather how to build them.
