
Google Webmaster Guidelines: A Summary of Supplementary Information on All 34 Items

The SEO industry has been changing radically since 2012. Recently, the Panda and Penguin updates have rolled out in quick succession.
With these updates, Google scrutinizes violations known as spam more strictly, and its search results have moved even closer to its goal of being useful to users.

Even when the operator had no malicious intent, there are many cases in which a site was penalized and dropped in the rankings because the operator was unknowingly doing something that counts as spam.

Therefore, when doing SEO for Google, it is of course important to keep an eye on trends, but Google also officially publishes the rules a site must observe in order to operate (the Webmaster Guidelines), so every site operator should be sure to check them.

This article adds commentary and supplementary information to the Webmaster Guidelines published by Google.

What are the Webmaster Guidelines?

The Webmaster Guidelines are the rules to be observed for pages displayed in Google's search results (including pages that are not yet displayed).
By creating and operating pages in line with these rules, your site can be recognized correctly by Google.

The items in the Webmaster Guidelines are not instructions of the form "do this and your ranking will rise"; they are the "manners" that should be observed in site operation before a page can even be evaluated for SEO.
In other words, if they are not followed, it is difficult for a page to be evaluated properly.

Operators can receive information about Google's evaluation through Webmaster Tools (Google Search Console), so be sure to register your site with Webmaster Tools (Google Search Console).

There are three main guidelines for webmasters

The Webmaster Guidelines divide the rules to be observed in site operation into three categories.
Supplementary information for the items in each category follows below.

・ Design and content guidelines
・ Technical guidelines
・ Quality guidelines

Design and content guidelines

The design and content guidelines are rules about how to present information optimally so that users searching from the results can find it, and about how to make pages beneficial to the users who browse them.

1. Create a site structure with easy-to-understand hierarchies and text links. Make each page accessible through at least one static text link.

▼ Supplementary information on user benefits ▼

・ By organizing the hierarchical structure, users can grasp their current position within the site, making it easy to find the information they want.

・ Explicit text links that anyone can easily understand (such as breadcrumbs or menus) make the site easy to navigate.

▼ Supplementary information on the benefits of spiders ▼

・ Text links explicitly convey information about the link destination, so the spider is said to be more likely to acquire that information (see the sketch after this list).
For example, a link to a page that provides web marketing know-how should be a text link whose anchor text describes that content.

・ By organizing the hierarchical structure, it is easy for the spider to judge and evaluate what theme each directory of pages specializes in.

・ Since dynamic links are not always recognized, including at least one static link as insurance reduces the risk of a page not being evaluated.
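As a minimal sketch (the URLs and labels are hypothetical), a breadcrumb built from static text links looks like this:

<!-- Breadcrumb: static text links that also show the user's current position -->
<a href="/">Home</a> &gt;
<a href="/web-marketing/">Web marketing</a> &gt;
<a href="/web-marketing/seo/">SEO</a>

Each anchor text names its destination, so both users and spiders can tell where a link leads.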

2. Have a site map with links to the main pages of your site. If you have a large number of links in your site map, we recommend splitting your site map into multiple pages.

▼ Supplementary information on user benefits ▼

・ It is easy to find the desired information because there is a page that allows you to get a bird’s-eye view of the entire site.
* The sitemap referred to here is an HTML sitemap page for users, not sitemap.xml.

▼ Supplementary information on the benefits of spiders ▼

・ A sitemap page acts as a table of contents for the whole site, making it easy for the spider to judge and evaluate what theme each directory of pages specializes in.
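For illustration (the paths are hypothetical), an HTML sitemap is simply a page of organized text links:

<!-- HTML sitemap: a table of contents for the whole site -->
<ul>
  <li><a href="/web-marketing/">Web marketing</a>
    <ul>
      <li><a href="/web-marketing/seo/">SEO</a></li>
      <li><a href="/web-marketing/listing-ads/">Listing ads</a></li>
    </ul>
  </li>
</ul>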

3. Keep the number of links on one page reasonable.

* It used to be said that about 100 links per page was appropriate, but Google has since stated officially that exceeding 100 is not a problem.

▼ Supplementary information on user benefits ▼

・ The aim is an appropriate number of links: with too few, the site is hard to move around; with too many, the desired link is hard to find.

▼ Supplementary information on the benefits of spiders ▼

・ If there are too many links, it may hinder the spider's information gathering, so adjust to an appropriate number so that crawling proceeds smoothly.

4. Create an informative and useful site to describe your content clearly and accurately.

▼ Supplementary information on user benefits ▼

・ Write information that is useful to the users arriving from search results: specialized, specific content organized in an easy-to-understand structure.

▼ Supplementary information on the benefits of spiders ▼

・ The larger the amount of information, the more the spider can collect, and the easier the page is to understand (marked up with headings, etc.), the more efficiently it can collect it.

5. Make sure your site includes the keywords users are likely to enter when searching for your pages.

▼ Supplementary information on user benefits ▼

・Easy to find from search results

▼ Supplementary information on the benefits of spiders ▼

・ It becomes easier for the spider to understand what theme the page covers.

6. Use text instead of images when displaying important names, content, and links. Google crawlers do not recognize the text contained in the image. If you need to use images instead of textual content, use the alt attribute to include brief descriptive text.

▼ Supplementary information on user benefits ▼

* Not listed, because this item mainly benefits the spider.

▼ Supplementary information on the benefits of spiders ▼

・ Since the spider cannot recognize images, writing descriptive text in the alt attribute makes the content easier to recognize.
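A minimal example (the file name and wording are hypothetical):

<!-- The alt attribute gives the spider a text description of the image -->
<img src="/images/search-traffic.png" alt="Graph of search traffic before and after the Panda update">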

7. Make the description of the title tag element and the ALT attribute clear and accurate.

▼ Supplementary information on user benefits ▼

・ It will be easier to understand what kind of page it is from the search results.

▼ Supplementary information on the benefits of spiders ▼

・ It becomes easier for the spider to understand what theme the page covers.
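For example (the wording is hypothetical), a clear and accurate title states the page's theme concisely:

<!-- A title that plainly describes the page content -->
<title>Google Webmaster Guidelines: supplementary notes on all 34 items</title>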

8. Check for broken links and correct HTML.

▼ Supplementary information on user benefits ▼

・ The site is easier for users to use if there are no problems with the links.

▼ Supplementary information on the benefits of spiders ▼

・ Pages with no broken links and written in correct HTML can be crawled smoothly.

9. When using dynamic pages (such as pages whose URL contains a "?"), keep in mind that some search engine spiders may not crawl them the same way as static pages. Shortening parameters or reducing their number makes such pages easier for the crawler to find.

▼ Supplementary information on user benefits ▼

* Not listed, because this item mainly benefits the spider.

▼ Supplementary information on the benefits of spiders ▼

・ Dynamic URLs are not always recognized, so URLs that place less burden on the spider carry less risk of going unevaluated.
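As a rough illustration (the paths and parameter names are hypothetical), shortening or removing parameters, or rewriting to a static-looking URL, looks like this:

Before: http://example.com/item.php?category=5&id=123&sessionid=abc123
After:  http://example.com/item.php?id=123
Or:     http://example.com/items/123/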

10. See the notes on images, videos, and rich snippets.

* The points to note when using images, videos, and rich snippets are summarized at the links below.

Image
https://support.google.com/webmasters/answer/114016

Video
https://support.google.com/webmasters/answer/156442

Rich snippet
https://support.google.com/webmasters/answer/2722261#2

Technical guidelines

The technical guidelines are the rules to be observed on the technical side of site operation, such as the systems and browser support involved in serving pages to users.
Each item is explained with supplementary information below.

11. Make sure that all of your site's assets (CSS and JavaScript files) can be crawled so that Google fully understands your site's content. The Google indexing system renders a web page using the page's HTML and its assets (images, CSS, and JavaScript files). To see page assets that Googlebot cannot crawl and to debug directives in your robots.txt file, use the Fetch as Google and robots.txt Tester tools in Webmaster Tools.

▼ Supplement ▼ Let’s allow spiders to crawl
CS S files and JavaScript files as well . If the spider cannot crawl (or control) using robots.txt etc., it may affect the index .
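A minimal robots.txt sketch (the directory names are hypothetical):

User-agent: *
# Rules like these would hide styling and scripts from the spider:
# Disallow: /css/
# Disallow: /js/
# An empty Disallow permits crawling of everything:
Disallow: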

12. Allow search robots to crawl your site without session IDs or other arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but robots access sites in a completely different pattern. If they are used, the robot may be unable to recognize that differing URLs actually link to the same page, which can result in an incomplete index of the site.

▼ Supplement ▼
Query parameters may be used to record user behavior in access-analysis tools such as Google Analytics. If the same page is displayed even when unnecessary query parameters are added, it may be treated as a duplicate page. Therefore, normalize the URL (with rel="canonical" or a 301 redirect).
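A minimal sketch of URL normalization (the URL is hypothetical): each parameterized variant of the page declares the clean URL as canonical.

<!-- Placed in the <head> of pages like /seo/?utm_source=mail -->
<link rel="canonical" href="http://example.com/seo/">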

13. Make sure your web server supports the If-Modified-Since HTTP header. This feature lets your server tell Google whether content has changed since the site was last crawled, reducing bandwidth and load.

▼ Supplement ▼
The If-Modified-Since HTTP header is part of the caching mechanism: a visitor that has saved a copy of a page can ask, on a later visit, whether the page has changed since then, and if not, the saved copy is displayed.

Google's spider also uses this caching mechanism to collect information, so if the server you use supports it, information can be collected smoothly.
* Most common servers support it, so you usually do not need to worry about this.
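As a rough sketch of the exchange (the URL and date are hypothetical), the crawler sends the header and the server answers 304 if nothing has changed:

GET /seo/ HTTP/1.1
Host: example.com
If-Modified-Since: Tue, 14 Oct 2014 08:00:00 GMT

HTTP/1.1 304 Not Modified

A 304 response carries no body, so the crawler reuses its saved copy and the server saves bandwidth.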

14. Take advantage of the robots.txt file on your web server. This file lets you specify which directories may be crawled and which may not. Make sure the file reflects the current state of your site so that you do not accidentally block the Googlebot crawler.

▼ Supplement ▼
By using the robots.txt file, you can stop spiders from visiting pages that you do not want displayed in the search results. However, be careful not to restrict spiders by mistake (for example, blocking the whole site so that they never come at all).
15. Take appropriate action to ensure that your ads do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from crawling by a robots.txt file.

▼ Supplement ▼
If AdSense ads or DoubleClick links are installed, this means controlling them with robots.txt so that ratings are not passed and spiders do not crawl them unnecessarily.

16. If you deploy a content management system, make sure that it creates pages and links that allow search engines to crawl your site.

▼ Supplement ▼
When using a content management system (CMS), dynamic URLs that include parameters such as "&id=" may prevent the spider from collecting information smoothly, so simplify the URLs with URL rewriting or similar techniques.

17. Use robots.txt to keep search engines away from search results pages and other auto-generated pages that are of little value to users arriving from search engines.

▼ Supplement ▼
For example, a results page produced by a site's internal search function has value only for a user who searches with exactly the same conditions.
Moreover, because such pages are generated automatically, the more of them there are, the more unnecessary information the spider ends up collecting.

18. Test your site to make sure it looks correct in each browser.

▼ Supplement ▼
Check how the page displays in each browser (a browser check) to make sure it looks the same to users in any environment.

19. Monitor site performance and optimize load times. Google's goal is to provide users with the most relevant search results and the greatest convenience. Fast-loading sites increase user satisfaction and improve the quality of the web as a whole (especially for users on slow internet connections). That is why Google expects webmasters to improve their sites and speed up the entire web.

Webmasters are strongly encouraged to monitor site performance regularly using Page Speed, YSlow, WebPagetest, and other tools. For more information on tools and resources, see Web Acceleration.

▼ Supplement ▼
If a page takes a long time to load, it lacks convenience for the user. For this reason, optimize so that load times are shorter and user satisfaction with the page increases.

Quality guidelines

The quality guidelines are rules about camouflage and fraud, collectively called spam, aimed primarily at raising rankings.

As Google officially notes, these guidelines describe the basic policies and specifics concerning quality in detail, so they are an especially important section for site operators.
Each item is explained with supplementary information below.

[Quality Guidelines: Basic Policies]

◆ Create pages with the highest priority given to user convenience, not search engines. Do not deceive users.

◆ Do not use tricks to raise your position in search engine rankings. A useful rule of thumb is whether you could explain what you have done, without hesitation, to a competing site or to a Google employee. Other useful tests are whether it helps users, and whether you would do the same thing if search engines did not exist.

◆ Think about what makes your website unique, valuable, and attractive, and differentiate it from other sites in the same field.

Based on the above basic policies, check whether your site falls under any of the following spam practices.

20. Automatic content generation

▼ Supplement ▼
This spam technique exploits the fact that the crawler recognizes text: meaningless sentences or lists of keywords you want to rank for are generated in an attempt to raise the ranking intentionally. It is an old method that has no value to users and was mostly eliminated after the Panda update.

21. Participation in the link program

▼ Supplement ▼
This is the spam practice most prevalent in Japan. It refers to intentionally increasing the number of links, and dates from the time when links (external links) were thought to have the greatest effect on search rankings. Examples include buying links for money, excessive reciprocal linking done just to gain links, stuffing anchor text with the keywords you want to rank for, and registering with communities just to gain links: link-building activity of this kind in general falls under it.

22. Creating a page with little or no original content

▼ Supplement ▼
This refers to copy pages that duplicate existing content, and to pages that merely funnel the user toward a product (doorway pages). Copy pages are especially common among affiliate sites and summary content (2ch summaries, etc.), where the content itself tends to be duplicated.

23. Cloaking

▼ Supplement ▼
Cloaking is the act of changing the content that one and the same page displays to spiders and to users.
For example, while the user is shown a page with an elaborate design (Flash, etc.), the spider is shown a page built only for SEO (stuffed with target keywords) in an attempt to raise the ranking. Users and spiders must therefore see the same content.

24. Illegal redirects

▼ Supplement ▼
The original use of a redirect is to move a user who tried to access one page to another page. For example, when a site is moved, a redirect is used to send users who visit the old page on to the new page.

On the other hand, a spider that does not follow the redirect collects information from the original page.
Exploiting this to show users and spiders different content, in the same way as cloaking, is the impersonation called an illegal redirect.
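For the legitimate case, a server-side 301 is the standard way to move a page permanently; a minimal sketch of the HTTP response (the URL is hypothetical):

HTTP/1.1 301 Moved Permanently
Location: http://example.com/new-page/

The same visible content awaits at the destination, which is what separates a normal redirect from an illegal one.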

25. Hidden text and hidden links

▼ Supplement ▼
This is the act of placing text or links stuffed with target keywords in a way that is invisible to the user, in order to raise the search ranking. The following practices apply:

・ Using white text on a white background
・ Placing text behind an image
・ Using CSS to position text off-screen
・ Setting the font size to 0
・ Hiding a link by linking only a single small character (such as a hyphen in a paragraph)

26. Guidance page (doorway page)

▼ Supplement ▼
Like automatically generated content, a doorway page is a page, or an entire site, created solely to funnel users to a specific page.
For example, creating a separate page for each prefecture while the content itself does not change falls under this.

27. Unauthorized duplication of content

▼ Supplement ▼
This applies to content copied in full from someone else's page and reprinted without permission. Even if the wording is changed slightly, it still counts as unauthorized duplication of the content.

28. Affiliate Program

▼ Supplement ▼
This refers to low-quality affiliate sites: specifically, affiliate sites that contain only the information published by the merchants running the affiliate program, with nothing unique of their own.

As a side note, not all affiliate sites fall under this. A page that posts comparison information or original impressions is valuable to users and does not apply.

29. Keyword abuse

▼ Supplement ▼
This is the act of stuffing the keywords you want to rank for into the content, the title tag, and alt attributes. Many pages that were overly focused on SEO in this way dropped in the rankings with the Penguin update.

30. Creating pages with malicious behavior

▼ Supplement ▼
This refers to behavior the user does not intend: for example, starting downloads without the user's consent, installing a computer virus, or changing the settings of the browser the user uses.

31. Abuse of rich snippet markup

▼ Supplement ▼
When setting the additional information displayed in search results as rich snippets (author name, product sales information, etc.), the following are prohibited:

・ Marking up content that is not visible to the user at all.
・ Marking up irrelevant or misleading content (fake reviews, content unrelated to the page content, etc.).
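As a sketch of legitimate markup (the product name and rating values are hypothetical), schema.org microdata marks up content that is actually visible on the page:

<!-- The marked-up name and rating are also shown to the user -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example SEO Handbook</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5
    by <span itemprop="reviewCount">37</span> users
  </div>
</div>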

32. Sending automated queries to Google

▼ Supplement ▼
This item does not concern the page itself. It mainly prohibits automatically checking search rankings with rank-checking tools.
If you keep sending automated queries to Google through such a tool, you may become unable to use the search results.

Not many people are aware of this item, so be careful when using tools that automatically check your ranking.

Other notes

33. Monitor your site for hacks and remove hacked content as soon as you find it

▼ Supplement ▼
A page infected through hacking or a virus can suddenly become dangerous for users and can also drop in the rankings. If such a situation arises, deal with it immediately.

34. Prevent user-generated spam from appearing on your site and remove it when found

▼ Supplement ▼
A page can end up committing spam in ways the publisher did not intend. For example, a blog comment section can be abused by techniques such as comment spam, so if such comments or posts appear, deal with them immediately.

Summary

Unfortunately, the Japanese translation of the Webmaster Guidelines published by Google is often inaccurate and hard to decipher.
For that reason, some items include opinion-based supplements.

Also note that the guidelines themselves may be updated without notice; the content above is current as of the time of writing, October 15, 2014.
