What help does a website's IIS log give to optimization

The IIS log is a very important thing. From it you can look up which search engine robots have crawled the site, and you can also get to know your own site better: some of the visitor behaviour can be analysed straight from the log, without necessarily relying on a traffic-statistics script. Be aware that on some commercial hosting the IIS log space is limited or the log is not turned on at all, so you may have to ask the host to enable it; the space set aside for logs is usually not large and can suddenly fill up, which is one more reason to find good hosting for the site. Below I will talk about what help web log analysis gives to optimization.
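
As a concrete example, here is a rough Python sketch of pulling the crawler information out of an IIS log. It assumes the default W3C extended format, where a "#Fields:" line names the space-separated columns; the file name ex_example.log is only a placeholder.

```python
# Minimal sketch: count requests per user agent in an IIS W3C log file.
# Assumes the default W3C extended format, where a "#Fields:" comment line
# names the space-separated columns; the file name "ex_example.log" is made up.
from collections import Counter

def count_user_agents(path):
    fields = []
    agents = Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            line = line.strip()
            if line.startswith("#Fields:"):
                # e.g. "#Fields: date time c-ip cs(User-Agent) sc-status ..."
                fields = line.split()[1:]
                continue
            if not line or line.startswith("#"):
                continue
            values = line.split(" ")
            if len(values) != len(fields):
                continue  # skip malformed rows
            row = dict(zip(fields, values))
            agents[row.get("cs(User-Agent)", "-")] += 1
    return agents

if __name__ == "__main__":
    for agent, hits in count_user_agents("ex_example.log").most_common(20):
        print(hits, agent)
```

Sorting the user agents by hit count like this is usually enough to see at a glance which robots are visiting and how heavily.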

One, understand how many times the search engine spiders crawl the site

The search engine spider is the crawling robot that fetches your content. Knowing how many times the spiders crawl our site tells us whether the search engines like it and which parts of the site they have stopped crawling. By comparing the counts from before and after an operation, we can tell whether a change in spider visits was caused by the site's backlinks or by content updates, and adjust accordingly. Spiders love original content; if the content is just pasted or reprinted from somewhere else, the spider may well not come again next time, feeling that the site is a mirror or scraper station.
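
To compare the counts before and after a change, something like this sketch can tally hits per day for a few well-known crawler user-agent substrings (the log path is again a placeholder, and the substring match is deliberately crude):

```python
# A rough sketch for comparing spider activity over time: count hits per day
# for a few well-known crawler user-agent substrings. Assumes an IIS W3C log
# whose first column is the date; the log path is hypothetical.
from collections import defaultdict

SPIDERS = ("Baiduspider", "Googlebot", "bingbot", "Sogou")

def spider_hits_per_day(path):
    daily = defaultdict(lambda: defaultdict(int))  # date -> spider -> hits
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if line.startswith("#") or not line.strip():
                continue
            date = line.split(" ", 1)[0]
            for spider in SPIDERS:
                if spider in line:
                    daily[date][spider] += 1
    return daily

if __name__ == "__main__":
    for date, counts in sorted(spider_hits_per_day("ex_example.log").items()):
        print(date, dict(counts))
```

Lining these daily numbers up with the dates of your link-building or content updates makes it easier to say which one actually moved the spiders.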

If the spiders we like are visiting a lot, that is a good sign and we can let them have more access to the site.

For harmful spiders, we can shield them by blocking the spider's IP. If one day we find an unknown spider IP hitting the site many times and the site then gets demoted or K'd (dropped from the index), we have to ban that spider IP from accessing the site.
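
Before banning anything, it helps to pull a list of candidates out of the log first. This sketch flags client IPs that make a lot of requests with a user agent that is not one of the crawlers we recognise; the threshold and the known-spider list are arbitrary and should be tuned to your own site:

```python
# Sketch: flag client IPs that hit the site unusually often with a user agent
# that is not one of the crawlers we recognise. Field positions come from the
# "#Fields:" line, as in the earlier example; the threshold is arbitrary.
from collections import Counter

KNOWN = ("Baiduspider", "Googlebot", "bingbot", "Sogou")

def suspicious_ips(path, min_hits=500):
    fields, hits = [], Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            line = line.strip()
            if line.startswith("#Fields:"):
                fields = line.split()[1:]
                continue
            if not line or line.startswith("#") or not fields:
                continue
            row = dict(zip(fields, line.split(" ")))
            agent = row.get("cs(User-Agent)", "")
            if not any(k in agent for k in KNOWN):
                hits[row.get("c-ip", "-")] += 1
    return [(ip, n) for ip, n in hits.most_common() if n >= min_hits]

if __name__ == "__main__":
    for ip, n in suspicious_ips("ex_example.log"):
        print(ip, n)
```

The actual block is then done on the server itself, for example with IIS's IP address and domain restrictions, rather than in robots.txt, which bad bots simply ignore.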

Two, which pages get grabbed

Check whether the home page is being grabbed and whether its snapshot updates the next day, and whether the inner pages are fetched just as frequently. If some pages are never grabbed, check whether spider crawling has been forbidden for them. When building links, most people point them at the home page; if the inner pages get none, their weight stays very low. From the log we can tell where the site has a problem: mainly which pages the spider climbs to and which pages it frequently fails to reach, and by comparison which directories get included more and which less (find hosting that provides the IIS log so you can check). The crawl pattern also differs from one period to another, for example because the content has been reprinted elsewhere.
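
A quick way to see which sections are crawled more and which less is to group one spider's requests by top-level directory, as in this sketch (cs-uri-stem is the default W3C field for the requested path; the spider name and log file are placeholders):

```python
# Sketch: break one spider's requests down by top-level directory, to see
# which sections it reaches often and which it barely touches. Assumes the
# W3C "cs-uri-stem" column holds the requested path; the log name is made up.
from collections import Counter

def spider_hits_per_directory(path, spider="Baiduspider"):
    fields, dirs = [], Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            line = line.strip()
            if line.startswith("#Fields:"):
                fields = line.split()[1:]
                continue
            if not line or line.startswith("#") or not fields or spider not in line:
                continue
            row = dict(zip(fields, line.split(" ")))
            uri = row.get("cs-uri-stem", "/")
            top = "/" + uri.strip("/").split("/")[0] if uri != "/" else "/"
            dirs[top] += 1
    return dirs

if __name__ == "__main__":
    for directory, n in spider_hits_per_directory("ex_example.log").most_common():
        print(n, directory)
```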

Server resources: frequent spider visits are helpful for the website, but they also consume a lot of resources, so find good hosting to put the site on; otherwise, if the server goes down the host may not even inform you, never mind if your site gets deleted outright.

A note from the Baidu club: learn to identify fake spiders (they come to scrape other sites' data). The most important point is that the real Baidu spiders come from Beijing IPs; if a spider comes from anywhere else, it is not the genuine one. There are also spiders whose visits go hand in hand with the site being demoted or K'd, so check carefully.
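
Beyond looking at where the IP is registered, a check that both Baidu and Google document for verifying their crawlers is a reverse DNS lookup followed by a forward confirmation. This sketch shows the idea; the sample IP and the suffix list are only placeholders to adjust for the engines you care about:

```python
# Sketch of the reverse-DNS check for telling real crawler IPs from fakes:
# resolve the IP to a host name, confirm the name belongs to the engine's
# domain, then resolve it forward again and make sure the IP is in the answer.
import socket

def is_real_spider(ip, suffixes=(".baidu.com", ".baidu.jp", ".googlebot.com")):
    try:
        host = socket.gethostbyaddr(ip)[0]             # reverse lookup
        if not host.endswith(suffixes):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward confirmation
    except OSError:
        return False  # lookup failed, treat as not verified

if __name__ == "__main__":
    print(is_real_spider("180.76.15.1"))  # placeholder IP taken from a log
```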
