<META NAME="ROBOTS" CONTENT="NOINDEX">
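A tag like that goes in the page's <head>. As a minimal sketch (the filename and title are just illustrative, not from this thread), a page carrying it might look like:

<!-- members.html: asks crawlers not to index this page -->
<html>
<head>
  <meta name="robots" content="noindex">
  <title>Members only</title>
</head>
<body>
  ... page content ...
</body>
</html>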
Artashes, the easiest and most reliable way to protect certain URLs from search engines is to make those pages password protected, so that only authorized people can view them. A simple login page can be coded in ASP or PHP.
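For the PHP route, a minimal sketch might look something like this (the filename, the hard-coded password, and the bare-bones form are placeholders for illustration, not anything from this thread; a real site would check credentials against stored, hashed passwords):

<?php
// protected.php - minimal sketch of password-protecting a page with a PHP session.
session_start();

if (empty($_SESSION['authorized'])) {
    // If the visitor just submitted the form, check the password.
    if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_POST['password'])
        && $_POST['password'] === 'change-me') {        // placeholder credential check
        $_SESSION['authorized'] = true;
    } else {
        // Otherwise show a simple login form and stop.
        echo '<form method="post">';
        echo 'Password: <input type="password" name="password"> ';
        echo '<input type="submit" value="Log in">';
        echo '</form>';
        exit;
    }
}

// Only authorized visitors reach this point.
echo 'Members-only content goes here.';
?>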
i didn't know that could be done, thanks

Xcel_Hosting said:
Definitely. In your robots.txt file, add a line (or a few lines) with the "Disallow:" directive; a sample file is sketched below.
Here's a good site that shows you how it's structured.
http://www.searchengineworld.com/robots/robots_tutorial.htm
You can even restrict certain pages from certain bots.
This is used all the time to protect scripts, images, and a variety of info that people don't necessarily want so easily available.
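For example, a robots.txt placed in the site root might look something like this (the directory names and bot name are just placeholders):

# Ask all crawlers to stay out of these directories
User-agent: *
Disallow: /scripts/
Disallow: /images/

# Ask one particular bot to stay away from the whole site
User-agent: ExampleBot
Disallow: /

Keep in mind that robots.txt and the meta robots tag are only requests that well-behaved crawlers honor; they don't actually restrict access the way the password-protection approach above does.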
storagedump.com said:
i didn't know that could be done, thanks

Given the age of this thread, it's closed.