For ages now, I've been developing a tool / jQuery plugin with various features not worth detailing in this thread.
I'm now contemplating trashing vast parts of it, except for one feature that a webmaster who uses it really cherishes:
- Increasing site PageRank by altering selected outgoing links (only some of them - not all of them).
(We've measured that the technique really works.)
Most notably, links to high-PR sites needn't forward PageRank.
Quite a few webmasters would prefer to keep their PR on the site, rather than throw it at e.g. Twitter / Facebook.
Question 2: I would prefer to use a generic JS technique, and maybe simplify my existing tool into a pure jQuery plugin or plain JS that does nothing but mask these links.
What syntax do you reckon would be the best?
Regarding question 2 - please let me state some requirements:
- Search engines must have no way of parsing the JS links; otherwise, this could result either in PR being forwarded (making the solution useless) or even in a penalty.
(That can be partially achieved by disallowing spiders from crawling the JS in robots.txt.)
- Arbitrarily complex links can be substituted generically, for example the following standard "Twitter Button" link:
<a href='http://twitter.com/share' class='twitter-share-button' data-url='...' data-count='horizontal' data-via='...' data-related='pubcon' data-text='...'>Tweet</a>
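The robots.txt exclusion mentioned in the first requirement could look like this (the script path is hypothetical; note that robots.txt only discourages crawling, it doesn't guarantee the JS stays unread):

```
User-agent: *
Disallow: /js/linkmask.js
```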
A solution to question 2 that comes to mind is simply substituting '<' with '[' and '>' with ']', delivered as text nodes - similar to what "BBCode" looks like prior to server-side substitution.
That could be parsed rapidly and with a simple algorithm (e.g. using a simple RegEx for the substitution) in pure JS, allowing for complex links to be replaced.
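A minimal sketch of that bracket substitution, assuming the server delivers the masked markup as plain text (the function name and example markup are mine, not from the plugin):

```javascript
// Convert BBCode-style masked markup back into real HTML.
// The server would emit e.g. "[a href='...']Tweet[/a]" as a plain text
// node, so a crawler that skips JS never sees an actual <a> tag.
function unmaskLinks(maskedText) {
  // Swap the substituted brackets back with a simple regex pair;
  // this assumes '[' and ']' only occur where '<' and '>' were replaced.
  return maskedText.replace(/\[/g, '<').replace(/\]/g, '>');
}

// Example with a masked "Tweet" button link:
var masked = "[a href='http://twitter.com/share' class='twitter-share-button']Tweet[/a]";
var html = unmaskLinks(masked);
// html === "<a href='http://twitter.com/share' class='twitter-share-button'>Tweet</a>"
```

With jQuery, one could run this on load over elements carrying a marker class, e.g. `$('.masked-link').each(function () { $(this).replaceWith(unmaskLinks($(this).text())); });`. The obvious caveat: any legitimate square brackets in the masked text would also be converted.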
What do you think?
Thanks in advance and kind regards
Can you clarify what you're asking?
uhh, search engines DO understand JS, as they have for years...
Thanks very much to all of you!
So the bottom line is that search engines don't encourage such practices, and it's probably against Google's guidelines, right?
I was surprised that, while googling for something similar, I didn't find many significant hits.
Is demand for such a technique really that low?
Just wondering: when thinking about the perfect search-engine bot, I would have it render the page after load in a suitable virtual machine.
(You know what I mean: comparing the raw HTML to the post-load HTML.)
They might not be doing it yet, but do you reckon they'll do that in the future?
I personally don't think parsing JS is reliable from a search engine's point of view, because you could e.g. defer some of the work to a central server, with the JS merely initiating and delegating the job.
That would be completely invisible to any spider.
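A sketch of that server-assisted variant (the endpoint name, response format, and placeholder markup are all hypothetical): the delivered HTML contains only empty placeholder elements, and the real targets are fetched from the server after load.

```javascript
// Pure helper: render an anchor tag from a server-supplied URL and label.
function renderLink(url, label) {
  return "<a href='" + url + "'>" + label + "</a>";
}

// Browser-only wiring, guarded so the helper above stays usable anywhere.
// The page ships with e.g. <span id="share-slot"></span> and no hrefs;
// a crawler reading only the raw HTML never sees an <a> tag.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/links.json'); // hypothetical endpoint, e.g.
    // {"share-slot": {"url": "http://twitter.com/share", "label": "Tweet"}}
    xhr.onload = function () {
      var links = JSON.parse(xhr.responseText);
      Object.keys(links).forEach(function (id) {
        var slot = document.getElementById(id);
        if (slot) slot.innerHTML = renderLink(links[id].url, links[id].label);
      });
    };
    xhr.send();
  });
}
```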
So iFrames are an option, too, but I suppose that if someone has JS disabled, things get messy (or what does the user see?).
As far as the solution I was thinking of is concerned, the alternate links would be visible to users who have JS disabled - at least one disadvantage.
I'm just amazed that there is no official way to link to content without forwarding PageRank.
(that used to be the case with the old "nofollow")