How the nofollow tag makes search results worse
A few years ago, Google used its muscle in the search engine world, and on the Internet in general, to push through the acceptance of a new concept regarding links to webpages.
Google implemented, and the other major search engines followed, a system where any link tagged with the rel="nofollow" attribute would not count as an endorsement of the linked site.
To understand how this perverts the very search rankings that are the lifeblood of Google, a little background is required.
Google Power Juice and Links
In its crudest form, the concept behind Google’s search engine rankings is that a site with more links pointing to it is considered more important, or authoritative, than one with fewer. The idea is that the more people who think a webpage or site is worth reading, the more likely it is to be worthwhile.
To prevent an automated army of websites from linking indiscriminately to certain web pages just to boost their rankings, Google built in a bit of a feedback loop. While the raw number of links matters, links from other important or authoritative sites carry more weight than those from less important sites.
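The feedback loop described above can be sketched as a simplified PageRank-style iteration: a page's score is divided among the pages it links to, so links from already-important pages carry more weight. The link graph, damping factor, and iteration count below are illustrative assumptions, not Google's actual algorithm or parameters.

```python
# Toy link graph: each key links out to the pages in its list.
# Names are invented for illustration.
links = {
    "blog": ["fund_co", "spammer"],
    "news": ["fund_co"],
    "fund_co": ["news"],
    "spammer": [],
}

def rank(links, damping=0.85, iterations=50):
    """Simplified PageRank-style scoring: each page passes its score
    to the pages it links to, weighted by its own score."""
    pages = list(links)
    n = len(pages)
    scores = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its score evenly across all pages.
                for p in pages:
                    new[p] += damping * scores[page] / n
            else:
                share = damping * scores[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        scores = new
    return scores

scores = rank(links)
# Pages linked to by already-important pages end up with higher scores,
# so "fund_co" (linked by both "blog" and "news") outranks "spammer".
```

The key property is the one the article relies on: a link's value depends on who gives it, not just that it exists.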
Unfortunately, this loop of authority apparently wasn’t strong enough to stop those who were willing to collect huge numbers of raw links, regardless of value, from rising in Google’s search rankings. After all, you can’t assign too much weight to any one authority, or else that site becomes the one that determines the rankings. But if you can’t give good sites enough weight to outweigh the efforts of bad sites, then the rankings suffer.
Blogs are Google’s Achilles Heel
While the number of incoming links to a webpage or site is a large factor in determining its authority, there are many other factors as well. Things like how often a site is updated, or how certain words are used on that site, or which and how many pages of a site are heavily linked all play a major role as well.
This leads to blogs being overly weighted in the Google math that determines search rankings. Blogs tend to be updated frequently, and each post gets its own page that can be directly linked to, while those same tactics are often impractical for potentially more authoritative sites.
For example, consider a business or entity in a regulated industry like securities trading. A mutual fund company, for example, has strict regulations on what it can and cannot say in a forum like its website, so content must be reviewed by its legal department and perhaps even approved by a governing body like the SEC. And for those who do visit the website, the home page is the best place to start. After all, topics like investing can be complicated enough without landing on a webpage that isn’t designed to be a first read.
And so, a blog about mutual funds has many more of the qualities that Google likes to see than an actual mutual fund company does. While this may or may not be a good thing, the truth is that blogs have a heavy influence on any non-obvious search.
The problem for Google is that most blogs allow readers to comment on individual posts or on the site in general. These comments are usually displayed at the bottom of a page of content and often ask the commenter to leave their name and their web address, which, in the original spirit of the Internet, is linked automatically by the blogging software. And therein lay the foundation for Google’s destruction.
By commenting on an enormous number of blogs, a web spammer could get a huge number of links to any site they wanted. And, unlike links from phony sites, the sites these links came from were legitimate websites, often with great authority on their topics.
Of course, no one who cares about their website wants to see it spammed, and so various tools were put in place to stop or slow such spam links. But not everyone keeps up with their blogs, even if they don’t take them down, and the systems in place weren’t enough to overcome the spammers’ aggressiveness. So while good and legitimate sites kept comment spammers at bay, those that didn’t let too much ranking power flow to spammer sites, putting Google’s rankings at risk.
The sledgehammer Google pushed through to kill the fly was not to fix its algorithm to somehow better account for comments, but rather to implement the nofollow tag. Basically, any link that was tagged by the website as nofollow would work as a regular link for human beings, but would not “count” for search engine ranking purposes. Shortly thereafter, Google’s will was done when the major blogging platforms made comment links nofollow by default, thus ensuring that even the least maintained of blogs would not pass their ranking power on to the comment spammers’ sites.
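Concretely, nofollow is a token in a link's rel attribute. A crawler honoring it renders the link as usual but excludes it when tallying endorsements, roughly as in this sketch (the HTML and URLs are invented for illustration):

```python
from html.parser import HTMLParser

class EndorsementCounter(HTMLParser):
    """Collects outbound links, separating followed links (endorsements)
    from those marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.endorsed = []
        self.ignored = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel can hold several space-separated tokens, e.g. "external nofollow".
        rel = (attrs.get("rel") or "").split()
        if "nofollow" in rel:
            self.ignored.append(href)
        else:
            self.endorsed.append(href)

page = """
<p>Great post! Visit <a href="https://example.com/spam" rel="nofollow">my site</a>
and see <a href="https://example.org/source">the original source</a>.</p>
"""

counter = EndorsementCounter()
counter.feed(page)
# counter.endorsed -> ["https://example.org/source"]
# counter.ignored  -> ["https://example.com/spam"]
```

Both links work identically for a human reader; only the ranking calculation sees a difference.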
To mix some metaphors, the fly in the ointment is coming home to roost. While the nofollow tag was supposed to keep bad sites from receiving the endorsement of other websites, it has given so called SEO experts a new tool to use to manipulate Google’s rankings from within.
Now, a website can publish an authoritative and highly informative article about credit card rewards programs, for example. In doing so, the author can provide links to all the right places, like Citibank, CapitalOne, and even competing resources like creditcards.com. But then just one little link on that page could point to another site, perhaps the author’s own credit card rewards ranking site, or perhaps a site that pays well for referrals.
Under the pre-nofollow system, the power of that page to influence the authority of the sites it linked to would have been split among all of those links. That significantly lessens the power to manipulate where the so-called Google juice goes, because the author must link to important sources or risk looking uninformed or devious. So while the slipped-in link still benefits from the page, so too do the real resources, and since they will also get numerous similar links from other places, they will continue to earn their rightful place in the rankings.
But now, thanks to the nofollow tag, the author can provide the links that readers expect while funneling all of the ranking power to his desired link, all without the reader knowing unless they install a special tool or look at the underlying code.
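The arithmetic behind this “sculpting” is simple. Under the behavior the article describes, a page's outbound ranking power is split among its followed links only, so nofollowing every link but one concentrates the whole share. The numbers and link names below are invented for illustration:

```python
page_equity = 1.0  # ranking power this page can pass on (illustrative unit)

# Pre-nofollow: equity is split across every outbound link.
links = ["citibank", "capitalone", "creditcards.com", "authors_own_site"]
pre = {link: page_equity / len(links) for link in links}

# Post-nofollow sculpting: tag everything except the author's chosen link
# as nofollow, so only one link remains followed and receives the full share.
followed = [l for l in links if l == "authors_own_site"]
post = {link: page_equity / len(followed) for link in followed}

# pre["authors_own_site"]  -> 0.25 (a quarter share, like every other link)
# post["authors_own_site"] -> 1.0  (the entire share)
```

Four links share the equity evenly before sculpting; afterward, the single followed link quadruples its take while the legitimate resources get nothing.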
Already, SEO consultants, websites, and others are recommending that all links be nofollowed by default, with only the links the author chooses manually set to be followed. So Google has taken the ranking power out of its own hands and put it where it doesn’t belong.
Another fix is, no doubt, already in the making. But will it be any better than the last fix, or is it time to admit the core algorithm has seen its best days?
Oh, and by the way, the comment spammers? Do a Google search for “blogs that do follow” and guess where they post their comments now?