Google 403 error
The hosting company said they are not rejecting Googlebot. The robots.txt file was only added yesterday, and so was the sitemap. The .htaccess file content is as follows:
# Use PHP5 as default
AddHandler application/x-httpd-php5 .php
RewriteRule ^News-([a-zA-Z0-9-|%]+)_([0-9]+)\.html$ news_show.php?fp=$1&id=$2
RewriteRule ^Product-([\.a-zA-Z0-9-|%]+)_([0-9]+)\.html$ product-detail.php?fp=$1&pid=$2
ErrorDocument 400 /400.shtml
ErrorDocument 401 /401.shtml
ErrorDocument 403 /403.shtml
ErrorDocument 404 /404.shtml
ErrorDocument 500 /500.shtml
Can someone please shed some light on this?
It can be difficult to get hosting companies to look into Googlebot issues. Customer support staff will often dismiss complaints about Googlebot because they know their company has no explicit policy of blocking it, yet their systems can still be configured in a way that causes problems. A common case is when the hosting company's methods for preventing denial-of-service attacks inadvertently block Googlebot.

You might try setting a slower crawl rate in Google Webmaster Tools under Configuration -> Settings by selecting "Limit Google's maximum crawl rate". It will probably take several days to have any effect, if it helps at all, but it's about the only thing you can do on your own. If that fails and Googlebot keeps getting 403 errors, I'd suggest you contact your hosting service again or change hosts.
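For what it's worth, here is a hypothetical example (Apache 2.2 syntax) of the kind of user-agent filtering that produces exactly this symptom. This is not taken from your configuration; it just illustrates what a host-level DoS or bot filter might effectively be doing:

```apache
# Hypothetical example only: a rule like this, anywhere in the server
# config or an .htaccess file, makes Apache return 403 to any client
# whose User-Agent header contains "Googlebot".
SetEnvIfNoCase User-Agent "Googlebot" block_this_bot
Order Allow,Deny
Allow from all
Deny from env=block_this_bot
```

Your posted .htaccess contains nothing like this, which points toward the block happening at the host level (firewall, mod_security rules, or rate limiting) rather than in your own files.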