I can't think of a simple way to do this, to be brutally honest.
I would write a "robot"-style script that pulls the page's HTML as text, parses it, and logs all of the relevant information. If you wrote this in an object-oriented way, you could spawn new "robots" to follow each link on the page, retrieve the data, and post it all back to a database for analysis later.
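As a rough sketch of that idea (names and the sample HTML are my own, not from any particular library beyond Python's standard `html.parser`), a minimal "robot" might parse a page's HTML and collect the links it would later follow:

```python
from html.parser import HTMLParser

class LinkRobot(HTMLParser):
    """A minimal 'robot' that collects href links from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the target of every anchor tag we encounter.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In practice you would fetch the HTML first, e.g. with
# urllib.request.urlopen(url).read().decode(); a static string is
# used here just to keep the sketch self-contained.
html = '<html><body><a href="/page1">One</a> <a href="/page2">Two</a></body></html>'
robot = LinkRobot()
robot.feed(html)
print(robot.links)
```

Each collected link could then seed a new `LinkRobot` instance, and the extracted data could be written to a database as it comes in. You'd also want to track visited URLs to avoid re-crawling the same page.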
Is this the answer you were after? Let me know if I've missed the target and I'll try to help you further.