Ok guys, I own the site www.top200.org. Right now multiple websites are using bots to spam my lists with hits, which burns so much CPU time that my provider temporarily shuts down my server each time. I currently use the Aardvark Top Sites script, and I've heard of others having the same problem. A solution was posted: create a .htaccess file that prevents anyone, including bots, arriving from certain URLs from accessing your site. So I created a .htaccess file, and yet the CPU time is apparently still being overloaded because my lists are always down.
Please read the following code and tell me whether this is the correct format for a file named .htaccess to block visitors referred from these certain sites. Once again, the file is named .htaccess, and a copy sits in every folder on my FTP:
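For comparison, a referrer-blocking .htaccess typically looks something like the sketch below. The domain names are placeholders (swap in the actual referring sites from your logs), and it assumes your host has mod_rewrite enabled:

```apache
# Block requests whose Referer header matches the spamming sites.
# spamsite1.com / spamsite2.com are placeholders, not the real domains.
RewriteEngine On
RewriteCond %{HTTP_REFERER} spamsite1\.com [NC,OR]
RewriteCond %{HTTP_REFERER} spamsite2\.com [NC]
RewriteRule .* - [F]
```

The `[F]` flag returns a 403 Forbidden instead of serving the page. Note this only works against bots that actually send a Referer header; many spam bots fake or omit it, which may be why your CPU load hasn't dropped.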
What you need is a robots.txt file, located only in the root directory. Once you have a proper robots.txt uploaded, you can have it validated here: Robots.txt Checker
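A minimal robots.txt for this situation might look like the following. The Disallow paths are hypothetical examples, not the actual Aardvark script filenames, so adjust them to whatever pages are being hammered:

```
# Example robots.txt -- applies to all user agents.
# The paths below are placeholders for the list pages being hit.
User-agent: *
Disallow: /in.php
Disallow: /out.php
```

Keep in mind robots.txt is purely advisory: well-behaved crawlers honor it, but the spam bots causing your CPU problem will likely ignore it, so treat it as one layer, not the whole fix.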
Also, if your site comes with a control panel such as cPanel or vDeck, look through it for the Domain Blocker utility (or whatever it is called in your CP) and enter the IPs there. You can block an entire section of an IP range or just a single IP address.
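If your control panel lacks such a tool, the same IP blocking can be done in .htaccess. A sketch using the Apache 2.2-era `Order`/`Deny` directives follows; the addresses are placeholders (they come from documentation-reserved ranges), so substitute the IPs you actually see flooding your logs:

```apache
# Deny specific addresses and ranges; everyone else is allowed.
# 192.0.2.15 and 198.51.100.0/24 are placeholder examples.
Order Allow,Deny
Allow from all
Deny from 192.0.2.15
Deny from 198.51.100.0/24
```

Unlike the referrer check, this works regardless of what headers the bot sends, but it is a losing game against bots that rotate IPs, so combine it with the other measures.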
Keep in mind you will only be blocking the 'bots that obey the rules; the bad 'bots will plow on through your site regardless. You may then have to try different methods depending on the type of 'bot, such as visiting the HoneyPot project for dealing with spam 'bots.
Last edited by Major Payne; 01-18-2008 at 11:30 PM.