The huge corporate site that I manage has very low site rankings but very high keyword results, as in over one hundred major industry keywords that show up on page 1 of Google search. I must be doing something right.

Here is a theory I would like to throw out for commentary. The major search algorithms must be searching for ways to match what the user would want to find; I'm sure Google has hundreds of researchers just looking for that 0.01% improvement. A big commercial site is likely to accumulate a lot of old pages that you don't want to delete, since that would make your customers unhappy when they can't link to a manual or spec sheet for some older product they depend on. Yet, as a general rule, those pages will predictably become less valuable over time.

What I've been doing of late is going back to older pages and refining the HTML/CSS markup, along with structural cleanup and occasional content rephrasing. Even though the actual value added may be small, I theorize that the search engines will see this as an update and therefore add some incremental boost to the keyword searches, on the basis that the page is modern rather than legacy. That is, it won't get screened out if the user has limited the search to the past year.
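One way to make sure crawlers actually register those refreshes is to keep the sitemap's `<lastmod>` dates in sync with the edits. A minimal sketch of that idea (the sitemap contents, URL, and helper name here are hypothetical, not from my actual site):

```python
# Sketch: bump <lastmod> in a sitemap entry whenever a legacy page is
# refreshed, so crawlers can see the page as recently updated.
# The sitemap content below is made-up illustrative data.
import xml.etree.ElementTree as ET
from datetime import date

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize without an "ns0:" prefix

sitemap_xml = f"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="{NS}">
  <url>
    <loc>https://example.com/legacy/spec-sheet.html</loc>
    <lastmod>2015-03-01</lastmod>
  </url>
</urlset>"""

def touch_lastmod(xml_text, url, new_date):
    """Set the <lastmod> of the entry whose <loc> matches url."""
    root = ET.fromstring(xml_text)
    for entry in root.findall(f"{{{NS}}}url"):
        if entry.findtext(f"{{{NS}}}loc") == url:
            entry.find(f"{{{NS}}}lastmod").text = new_date
    return ET.tostring(root, encoding="unicode")

updated = touch_lastmod(sitemap_xml,
                        "https://example.com/legacy/spec-sheet.html",
                        date.today().isoformat())
```

Whether the engines weight `<lastmod>` heavily is debatable, but it costs nothing to keep it accurate.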

Similarly, I look for ways to improve internal linking, adding structure on the basis that this will be taken as evidence that the site owner is doing due diligence and therefore ought to be rewarded with higher keyword rankings.
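For that internal-linking pass, a quick way to find pages that nothing else links to is to extract the internal hrefs from every page and diff against the full page list. A minimal sketch with a made-up page inventory (not my real site structure):

```python
# Sketch: find "orphan" pages that no other page links to internally.
# The page inventory below is hypothetical illustrative data.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

# page path -> its HTML body (a tiny stand-in for a crawled site)
pages = {
    "/index.html": '<a href="/products.html">Products</a>',
    "/products.html": '<a href="/index.html">Home</a>',
    "/old-manual.html": '<a href="/index.html">Home</a>',  # nothing links here
}

linked = set()
for html in pages.values():
    parser = LinkExtractor()
    parser.feed(html)
    linked.update(parser.links)

orphans = sorted(set(pages) - linked)
# orphans == ["/old-manual.html"]
```

Old manuals and spec sheets are exactly the pages that tend to drop off the internal link graph, so they are the first candidates for re-linking.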

Comments?