It can be difficult to get hosting companies to check on issues with Googlebot. Customer support staff will often dismiss complaints about Googlebot because they know their company doesn't have an explicit policy to block it, yet their systems can still be configured in a way that causes problems anyway. A common example is a hosting company's Denial of Service protection inadvertently blocking Googlebot.

You might try setting a slower crawl rate in Google Webmaster Tools under Configuration -> Settings by selecting "Limit Google's maximum crawl rate". It will probably take several days for this to have any effect, if it helps at all, but it's probably the only thing you can do on your own. If that fails and you keep getting 403 errors for Googlebot, I'd suggest contacting your hosting service again or changing hosts.
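Before you go back to support, it can help to confirm what your server actually returns to a Googlebot-style request. Here's a minimal sketch (Python with the `requests` library, using a placeholder URL you'd swap for a page on your own site) that compares the status code for an ordinary browser User-Agent against Googlebot's. Note that spoofing the User-Agent only approximates a real Googlebot request, which is verified by reverse DNS, so this is a rough diagnostic rather than a definitive test:

```python
import requests

# Placeholder URL -- replace with a page from your own site.
URL = "https://www.example.com/"

# Spoofing the User-Agent only approximates Googlebot; real Googlebot
# traffic also comes from Google's IP ranges, which some firewalls key on.
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"),
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{name}: HTTP {resp.status_code}")
```

If the Googlebot-style request comes back 403 while the browser request gets 200, that's concrete evidence to show support that something server-side is filtering on the User-Agent. A 200 for both doesn't rule out a block, though, since DoS protection often triggers on request rate or IP range rather than the User-Agent string.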