How Google Indexes our site

tecktalk

A "bot" is a piece of software from a search engine that is built to go through every page of your site, categorize it, and place it into a database.
Google has three well-known bots: the AdSense bot, the Freshbot, and the DeepCrawl.
The AdSense bot is used for publishers who run AdSense on their sites. As soon as a new page is created, the JavaScript within the AdSense code sends a message to the AdSense bot, which will come within about 15 minutes to index the page so that it can serve up the most relevant ads.

The Freshbot crawls the most popular pages on your website. Sites like Amazon.com and CNN.com have pages that are crawled every ten minutes, because Google has learned that those pages change that frequently. A typical site can expect a Freshbot visit every 1 to 14 days, depending on how popular its pages are. Another result of a Freshbot visit is that it finds the deeper links in your site and places them into a database, so that when the DeepCrawl occurs it has a reference.

Once a month, the DeepCrawl bot visits your site and goes over all the links found by the Freshbot. This is why it can take up to a month for your entire site to be indexed in Google, even with the addition of a Google Sitemap.
 
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses software known as "spiders" to crawl the web on a regular basis and find sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when our spiders crawl the web.
To determine whether your site is currently included in Google's index, just perform a search for your site's URL. For example, a search for [ site:www.google.com ] returns the following results: http://www.google.com/search?hl=en&q=site:www.google.com+
 
Kind of dumb how you copied it, but aren't these bots called spiders? The spiders that crawl the web to find new websites?
 
Consider creating and submitting a detailed Sitemap of your pages. Sitemaps are an easy way for you to submit all your URLs to the Google index and get detailed reports about the visibility of your pages on Google. With Sitemaps, you can automatically keep us informed of all of your current pages and any updates you make to those pages. Please note that submitting a Sitemap doesn't guarantee that all pages of your site will be crawled or included in our search results.
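For anyone who hasn't seen one, a Sitemap is just an XML file that lists your pages. A minimal sketch of the format (the URL, date, and frequency below are placeholders, not from the article):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The `<loc>` entry is the only required field per URL; `<lastmod>` and `<changefreq>` are optional hints to the crawler.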
 
Very good info.. He should have referenced where he got it from though...

Anyway, creating a sitemap is pretty easy.. especially if your website consists of fewer than 500 pages... You can create a free sitemap at xml-sitemaps.com ...
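If you'd rather not use a generator site, you can build one yourself in a few lines. A rough sketch in Python (the page URLs here are hypothetical placeholders, swap in your own):

```python
# Build a minimal sitemap.xml from a list of page URLs
# using only the standard library.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        # <loc> is the only required element per URL
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages for illustration
pages = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
]
xml = build_sitemap(pages)
print(xml)
```

Write the output to a `sitemap.xml` file at your site root and submit it to Google; for a large site you'd generate the URL list from your database or by walking your own pages.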
 