****! Sorry about forgetting to post my solution here - it actually stems from the idea you gave me.
Essentially what I did was create a sitemap and a robots.txt file and upload them to the website. Both files point to the actual PHP pages.
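For reference, the two files can be as minimal as this - just a sketch, with illustrative paths (your own page URLs would go in the sitemap):

    # robots.txt
    User-agent: *
    Allow: /
    Sitemap: http://mywebsite/sitemap.xml

    <!-- sitemap.xml -->
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://mywebsite/pages/contact.php</loc></url>
      <url><loc>http://mywebsite/pages/about.php</loc></url>
    </urlset>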
At the top of each PHP page is a small script that detects whether the visitor is a regular user or a search engine. If it's a search engine, the script adds a navigation element to the top of the page to link all the pages together.
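The detection itself is just a user-agent check - roughly along these lines (simplified; the function name and bot patterns here are for illustration, not my exact code):

    <?php
    // Crude user-agent sniffing: enough to spot the big crawlers.
    function is_search_engine() {
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        return (bool) preg_match('/googlebot|bingbot|slurp|duckduckbot/i', $ua);
    }

    if (is_search_engine()) {
        // Plain links so the crawler can reach every page.
        echo '<nav><a href="/pages/home.php">Home</a> '
           . '<a href="/pages/contact.php">Contact</a></nav>';
    }
    ?>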
So now when a search engine bot crawls my website it sees the PHP pages as if they were the actual site (well, they are the actual site, they're just not normally viewed from there).
So, for example, the contact part of my website is actually:
"mywebsite/#contact"
But in the Google listing it's:
"mywebsite/pages/contact.php"
Now this same script that detects search engines also works in reverse. If you visit one of the PHP pages without a certain GET variable set, the script sends a redirect header and forwards you to the nice (but un-search-engine-friendly) AJAX site.
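That reverse direction looks something like this (again simplified, reusing the is_search_engine() check from above; I'm using "ajax" as the GET variable name purely for illustration):

    <?php
    // Real browser, no AJAX flag: bounce to the pretty hash-based site.
    if (!is_search_engine() && !isset($_GET['ajax'])) {
        header('Location: /#contact');
        exit; // stop here; the proper site will AJAX-load this page itself
    }
    // Otherwise fall through and output the page content as normal.
    ?>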
So it kind of works like this:
Google -> pages/contact.php -> user agent is a real user -> GET variable not set -> redirect to the proper site
proper site -> AJAX load: pages/contact.php -> GET variable set -> give AJAX the response it wants
This approach also lets me easily display a mobile site under the exact same URL.
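The mobile part works the same way - sniff the user agent at the top of the page and pull in a different template. A rough sketch (the patterns and file names here are just placeholders):

    <?php
    // Same URL, different template depending on the device.
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if (preg_match('/iphone|android|mobile/i', $ua)) {
        include 'templates/mobile.php';
    } else {
        include 'templates/desktop.php';
    }
    ?>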
It's also obviously working too - the site hasn't been published long and I'm already on page 2 of Google for a search on just my name!