Ok, I've tried the script:
The problem is that I can't get it to crawl with a browser user agent; it keeps identifying itself as a robot, honoring the robots.txt file and failing to index the pages marked as Disallow… or at least that's what the error message suggests: "File checking forbidden by required/disallowed string rule".
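(To show what I mean by "browse as a browser's agent": in plain PHP a fetch can present a browser-style User-Agent header, for example with curl. This is only a rough illustration of the idea, not Sphider's actual fetching code, and the URL and UA string are just placeholders.)

<?php
// Illustration only: fetch a page while presenting a browser-like identity.
$url = 'http://example.com/some-disallowed-page.html';

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    // Browser-style User-Agent instead of a crawler name.
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0',
]);

$html = curl_exec($ch);
curl_close($ch);

if ($html !== false) {
    echo substr($html, 0, 200); // print the first 200 bytes just to confirm the fetch worked
}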
I tried changing some if conditions so it would NOT find the robots file, or would ignore it, but that didn't work. I also tried a mod I found online to "ignore robots", but it did the same thing, except there was no error; it just ended. Sphider-plus (1.6) did the same. (Roughly the kind of change I was attempting is sketched below.)
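For context, a crawler's robots.txt check usually fetches /robots.txt, collects the Disallow paths, and refuses URLs that match one of them; the "ignore robots" edits I attempted amount to forcing that check to always say "allowed". The sketch below uses a made-up function name (robots_allowed) and a simplified parser, so it is not Sphider's real code, just the general shape of what I was trying to change.

<?php
// Hypothetical sketch -- made-up name, not Sphider's actual function.
function robots_allowed($url)
{
    $parts  = parse_url($url);
    $path   = isset($parts['path']) ? $parts['path'] : '/';
    $robots = @file_get_contents($parts['scheme'] . '://' . $parts['host'] . '/robots.txt');

    if ($robots === false) {
        return true;  // no robots.txt at all: everything is allowed
    }

    foreach (explode("\n", $robots) as $line) {
        $line = trim($line);
        if (stripos($line, 'Disallow:') === 0) {
            $rule = trim(substr($line, 9));
            if ($rule !== '' && strpos($path, $rule) === 0) {
                return false;  // URL falls under a Disallow rule
            }
        }
    }
    return true;
}

// Making this return true unconditionally (or never calling it) is the kind
// of hack I was trying to apply to the if conditions mentioned above.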
If anyone knows how to hack it, I'd appreciate the tip.
Thanks.