www.webdeveloper.com
Page 2 of 5 (Results 16 to 30 of 68)

Thread: "nofollow alternative"

  1. #16
    Join Date
    Jan 2011
    Location
    Munich, Germany
    Posts
    237
    "They've each got a myriad of engineers working on algorithms to sift through and categorize the web in the most meaningful way possible."

    I would like to back up how difficult it is for a search engine to figure that out:

    The tag <an... is replaced with <a... by PHP, not JavaScript, i.e. server-side!

    It's virtually impossible for a search engine to figure out...

  2. #17
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    Quote Originally Posted by arvgta View Post
    .. it is of course very important that the search engine doesn't have a hope in figuring out whether this is a link or not.
    It's silly and absurd to think that Internet Explorer can figure out whether something on a page triggers navigation and that Google cannot.

  3. #18
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    Quote Originally Posted by arvgta View Post
    "They've each got a myriad of engineers working on algorithms to sift through and categorize the web in the most meaningful way possible."

    I would like to back up how difficult it is for a search engine to figure that out:

    The tag <an... is replaced with <a... by PHP, not JavaScript, i.e. server-side!

    It's virtually impossible for a search engine to figure out...
    Wait ... server-side replacement? How the devil do you expect a server-side replacement is going to trick a search engine? They'll see the replaced (proper) version of the link straight-up!

  4. #19
    Join Date
    Feb 2012
    Posts
    218
    Arvgta, at this point you should stop, as you don't know what you are talking about.
    Ok, with this topic you got over 100 posts. Happy now?
    It started out interesting, but you don't listen to others.
    Nofollow does exactly what you need and must be used in that way. Stop trying to work around it, because you will end up on a blacklist.
    Use the web as it is and focus on good content.

  5. #20
    Join Date
    Jan 2011
    Location
    Munich, Germany
    Posts
    237
    Well that's excellent (technically)

    It leads us to the discussion of whether a spider is capable of viewing the rendered page after load, like "FireBug" does.

    I don't know at the end of the day, but they certainly would have to - if they wanted to get on top of these things!

    If you go by what is in Google Cache, then certainly not, or by the "Spider View Tools" around...

    Do you know? (Interesting)

  6. #21
    Join Date
    Jan 2011
    Location
    Munich, Germany
    Posts
    237
    Hi hyperionXS,

    I think our posts crossed.

    No worries - I'll stop posting soon :-)

    But I do find it a very interesting topic whether a spider of Google's is capable of seeing the situation like in "FireBug".

    (They really should, and compare the result to the pre-JavaScript page)

    Ok, I'll shut up for now - sorry ;-)

    (glad you found it interesting in the beginning)


    Kind regards

  7. #22
    Join Date
    Jan 2011
    Location
    Munich, Germany
    Posts
    237
    The tag <an... is replaced with <a... by PHP, not JavaScript, i.e. server-side!

    ...and the result is then fed to JavaScript via an Ajax request, from which it is pulled if JavaScript is enabled.
    (I think that was irritating you - of course it's not a simple server-side replacement of the page itself, which any search engine would notice straight away)

    Anybody else got any opinions on the approach?

    Code:
    <an href="http://www.facebook.com">Facebook</an>
    It's a pity the thread died due to that small detail...
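For what it's worth, the server-side rewrite being described boils down to a plain string substitution over the page markup. A minimal sketch (the function name is mine, and it is shown in JavaScript rather than the poster's PHP, purely for illustration):

```javascript
// Rewrites the non-standard <an ...>...</an> markup into real anchors.
// The thread describes doing this server-side in PHP; the substitution
// itself is just two string replacements.
function enableLinks(html) {
  return html
    .replace(/<an\b/g, '<a')      // opening tags: <an href=...> -> <a href=...>
    .replace(/<\/an>/g, '</a>');  // closing tags: </an> -> </a>
}
```

A PHP equivalent would be a str_replace (or preg_replace) over the output buffer before it is handed to the Ajax response.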

  8. #23
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    98% sure it didn't die on account of any particular detail. No serious developer is going to spend their time trying to control link juice to the extent you wish to control it. And most of us think it's actually a terrible idea even to try.

    The thread is dead because it had nowhere to go from the get-go -- aside from some of us trying to convince you this is an utterly useless and silly endeavor.

  9. #24
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    To be clear, hiding links from search engines, should you wish to do so, is so absurdly easy that there doesn't NEED to be a thread in a forum to bounce ideas around. You haven't stumbled across a silver bullet; you've stumbled across a weird, horrendous scheme that does no one any good, despite how marvelous you might think it is.

    To be even more clear, there are literally hundreds, if not thousands, of ways you can hide links from search engines, many of which are infinitely more clever than yours. No serious business is interested in this. No serious developer or designer is interested. The ONLY interest you'll find on the matter is from sketchy SEO groups -- most of which will already employ more clever and successful techniques.

  10. #25
    Join Date
    Jan 2011
    Location
    Munich, Germany
    Posts
    237
    Alright, no need to get that emotional.

    If you could nominate only one, what's your favorite way of hiding a link that is simpler and more elegant for the user when done on a large scale?

  11. #26
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    Just trying to stress the extent to which you should NOT be attempting this -- doesn't seem to be sinking in. BUT, if you're really that interested in getting yourself blacklisted ...

    ... I'd say the simplest (2nd simplest), most fool-proof mechanism would be to structure each link like so:

    <a href="#http://facebook.com/whatever">facebook</a>

    Create a JavaScript method that performs a search-and-replace on all links on the first onmousemove or onkeypress event. This goes in a script tag at the bottom of your page:

    Code:
    var fixLinks = function() {
      var links = document.getElementsByTagName('a');
      for (var i = 0; i < links.length; i++) {
        // Use getAttribute so we see the raw '#http://...' value,
        // not the browser-resolved absolute URL.
        var href = links[i].getAttribute('href');
        if (href && href.substr(0, 8).toLowerCase() == '#http://') {
          links[i].setAttribute('href', href.substr(1));
        }
      }
      document.body.onmousemove = null;
      document.body.onkeypress = null;
    };

    document.body.onmousemove = fixLinks;
    document.body.onkeypress = fixLinks;
    Search engines will see the links as though they're just standard, local page anchors which have no associated link juice, since it's not likely that a search engine will bother to trigger the mouse-move or keypress events. But, for most of your visitors, the first step to accessing a link is either a key press (tab) or a mouse move, which immediately triggers the link correction. And, to avoid processing all links repeatedly, our link-fix method removes the onmousemove and onkeypress handlers from the document body.

    Have fun getting yourself blacklisted ..

  12. #27
    Join Date
    Jan 2011
    Location
    Munich, Germany
    Posts
    237
    I don't think my approach gets anyone blacklisted.
    It's been around for a while now in different forms and I've never seen a penalty.
    On the contrary, it's good for the site's PR and hence the SERPs.

    Thanks for your suggestion!

    I would argue that this increases the number of links counted on the page.
    The search engine will have no problem recognising these as links, and not being recognised was, I thought, the first prerequisite of any "nofollow alternative"...

  13. #28
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    Quote Originally Posted by arvgta View Post
    I don't think my approach gets anyone blacklisted.
    It's been around for a while now in different forms and I've never seen a penalty.
    On the contrary, it's good for the site's PR and hence the SERPs.

    Thanks for your suggestion!

    I would argue that that increases the number of links counted on the page.
    The search engine will have no problem in recognising these as links, which I thought was the first pre-requisite of any "nofollow alternative"...
    You can think all you want. Doesn't make you right. You'd do well, like the rest of us, not to pretend Google (and the other engines) are bluffing. Hell, a quick search for "google blacklist" yields a documented incident on the 1st page, wherein an entire domain was blacklisted, seemingly for something as trivial as having a domain alias: http://answers.google.com/answers/th...id/310942.html


    On the 2nd matter: sure, they could count them toward the number of page links, but if that's the case, they're all inward-pointing anyway. My guess, however, is that Google doesn't count page anchors -- no significant reason to do so.

