
Julia Logan: Are link intelligence tools becoming obsolete?

Ever since Yahoo! stopped supporting the linkdomain: search operator back in 2010, link intelligence tools have become a necessity for most online marketers. With links remaining an integral part of any ranking algorithm (to this day, despite all the changes and updates), knowing what links to a site offers a clue to ranking successfully.

Traditionally, link intelligence tools have been used for competitor research – if you know what a successfully ranking site is doing (and doing = building links), you know what you can do to achieve the same results. More recently, since links became not only a ranking factor but also a major penalty factor, a different need has arisen for these tools: controlling the quality of your own site’s link profile.

However, when we look at a site’s backlink profile, are we really seeing the complete picture?

Take a look at some highly competitive SERPs and then check the backlink profiles of the top sites ranking on the first page – you will likely see thousands, if not hundreds of thousands, of backlinks. But every once in a while, you can spot a link profile like this:

[Screenshot: backlink profile overview – 620 links from 36 referring domains]

Seriously? 620 links from 36 domains – and this site is ranking for a really competitive keyword? How did they do it?

We are missing one very important point here. Look further, and you will likely see a breakdown like this:

[Screenshot: link type breakdown showing two redirects among the backlinks]

Redirects – here is the clue. And never mind that you are only seeing two of them here; in all likelihood there are more redirects in this site’s inventory (I would not even call it a link profile) than actual links. The problem is, no existing link intelligence tool is good at detecting them. Why?

Backlink tools are just that – tools designed to discover a site’s links. Redirects are a completely different animal. There are various ways to redirect URL A to URL B, including but not limited to: 301 and 302 HTTP redirects (often implemented via .htaccess or server configuration), meta refresh, JavaScript redirects, all sorts of conditional redirects and so on. To add to the hassle, there can be multiple redirects connected into a chain, where URL A redirects to URL B, which then redirects to URL C (this can go on for any number of steps). Link intelligence tools have not been built for the task of discovering, dissecting and tracking down all the different kinds of redirects, and although some of them are trying to catch up, this is not a trivial task: a lot of conceptual as well as algorithmic questions have to be answered to build a workable solution.
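To illustrate why this is harder than plain link discovery, here is a minimal Python sketch – my own illustration, not any particular tool’s implementation – that follows a chain of HTTP and meta-refresh redirects for a hypothetical starting URL. Note that conditional and JavaScript redirects would still slip past it, which is exactly the problem described above.

```python
import re
from urllib.parse import urljoin

import requests

# Rough pattern for <meta http-equiv="refresh" content="0;url=..."> tags.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*url=([^"\'>\s]+)',
    re.IGNORECASE,
)

def trace_redirects(url, max_hops=10):
    """Follow HTTP (301/302/etc.) and meta-refresh redirects, hop by hop."""
    chain = []
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        chain.append((url, resp.status_code))
        if resp.status_code in (301, 302, 303, 307, 308):
            # Server-side redirect: follow the Location header.
            url = urljoin(url, resp.headers.get("Location", ""))
        else:
            match = META_REFRESH.search(resp.text)
            if not match:
                break  # no further redirect found; JS/conditional ones slip past
            # Meta refresh: follow the URL embedded in the tag.
            url = urljoin(url, match.group(1))
    return chain

# Hypothetical example:
# for hop_url, status in trace_redirects("http://some-aged-domain.example"):
#     print(status, hop_url)
```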

The linking landscape has changed a lot in the last few years, since Google started its war on links. Ironically, the idea of punishing sites for low quality links has brought about an unprecedented number of workarounds – not the cleaner SERPs Google was hoping for.

Various redirects are just one of them. They were not invented recently, of course – but never before have we witnessed such widespread use of redirects. To avoid penalties, some website owners have started using secondary throwaway domains, building links to them and then redirecting these domains to the main domain. If a site got caught by Google and penalized, all its owner had to do was remove the redirect. When buying aged domains with strong link profiles became cheaper, easier and less risky than trying to do clean link building to your own site, more online marketers started doing just that – buying aged domains and redirecting them to their money sites. And when redirecting a penalized domain to a new one came to be perceived as a way to get rid of a penalty, this gave yet another reason to use redirects and further spurred their popularity.

But it is not just barely detectable redirects that dim the picture as we look at sites’ backlink profiles. The ability to disavow links, introduced by Google in October 2012, has been an enormous game changer. Initially promoted as a “webmaster’s best friend” for cases when link removal (e.g. when trying to remove a penalty) was not possible, it soon started being used and abused in ways unimaginable before. Good sites are getting disavowed along with bad sites, whole domains as well as individual URLs, and it is all based purely on each site owner’s subjective judgement – and often complete lack of understanding. The impact of the disavow tool is likely much larger than most people are willing to admit. Essentially, we are dealing with a giant black box: we do not know what goes into it, we don’t know what rules are applied inside, we don’t know if or when our disavow requests are granted, and we don’t know how many disavow submissions make a site “bad” in Google’s perception or how that can influence all the other sites it links to, not just the site on whose behalf the disavow file was submitted. By looking at a site – any site – there is no way to tell whether or not it has ever been disavowed. And that’s where it becomes scary.
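For reference, the disavow file format itself is simple and documented by Google: a plain text file with one URL or `domain:` entry per line, plus `#` comments. Here is a minimal sketch of generating one from a list of flagged domains and URLs – the names are made up, and deciding what to flag is exactly the subjective judgement described above:

```python
# Hypothetical output of a link audit: domains and URLs flagged for disavowal.
flagged_domains = ["spammy-directory.example", "cheap-links.example"]
flagged_urls = ["http://old-network-blog.example/guest-post-123"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Disavow file generated from a link audit\n")
    for domain in flagged_domains:
        f.write(f"domain:{domain}\n")  # disavows every link from this domain
    for url in flagged_urls:
        f.write(f"{url}\n")            # disavows a single linking page
```

Once a file like this is uploaded, no outside observer – and no link tool – can see that it exists, which is precisely what makes the black box so opaque.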

The World Wide Web has been called the World Wide Web for a reason – essentially, it IS a giant web where all things are connected to each other. It is built with links; links between URLs are its living matter.


Long before links became a ranking factor, they became the building material of the web. Long after Google is no longer around, links will still be there. EVERYTHING is connected to EVERYTHING and depends on EVERYTHING. Destroy links, and the web will be destroyed. Come up with as many alternative ranking factors as you like; links will still be there, and they will still be important, because they make up the World Wide Web.

Policing links is the worst thing anybody could ever have come up with: first assigning links a meaning they were never meant to carry (links as votes), then distorting that meaning and making it impossible to evaluate it even approximately. Unless Google decides to open its black box and share the data, we will never know what is really going on inside – and no link tool can help us here. Google itself could probably provide an adequate link tool presenting all this data clearly – but Google has historically been notorious for hiding link data rather than sharing it.

Does all of this mean link intelligence tools have now become obsolete? Absolutely not.

As I mentioned above, one of the important uses of link tools is controlling one’s own link profile. Getting penalized is easy, removing a penalty is a tedious job that is never guaranteed to bring the desired result, and it is solely the site owner’s responsibility to keep a close eye on what is going on with their link profile.

Case in point: negative SEO attempts. Just to be clear, negative SEO is a very real threat, although it is not only about links, and not every negative SEO attempt is successful. But because of a few articles that have been making the rounds, many people believe negative SEO is as easy as pointing a few spammy links at a site – and that type of attempt accounts for the largest share of all negative SEO that ever becomes discoverable.

Link intelligence tools can be very helpful in detecting such link-based attempts. A sudden spike in link acquisition, coupled with anchor texts unrelated to the site’s topic, is usually a good indicator of one of the more blatant attempts (a rough sketch of such a check follows the screenshots below):

[Screenshots: a sudden spike in newly acquired links, and off-topic anchor texts appearing in the link profile]
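As a rough illustration of the kind of check a vigilant site owner (or a link tool) can automate, here is a minimal Python sketch over a hypothetical export of newly discovered backlinks with discovery dates and anchor texts. The field layout, topic terms and spike threshold are all assumptions, not any specific tool’s format:

```python
from collections import Counter
from statistics import mean, pstdev

# Hypothetical backlink export: (date_discovered, anchor_text) per new link.
new_links = [
    ("2015-03-01", "brand name"),
    ("2015-03-02", "buy cheap pills"),
    ("2015-03-02", "payday loans"),
    # ...
]

# Terms the site actually targets; anchors sharing none of them are "off topic".
expected_terms = {"brand", "name", "casino", "poker"}

daily_counts = Counter(date for date, _ in new_links)
counts = list(daily_counts.values())
threshold = mean(counts) + 2 * pstdev(counts)  # crude spike threshold

spike_days = [day for day, count in daily_counts.items() if count > threshold]
off_topic = [anchor for _, anchor in new_links
             if not expected_terms & set(anchor.lower().split())]

if spike_days and off_topic:
    print("Possible link-based negative SEO attempt:")
    print("  spike days:", spike_days)
    print("  off-topic anchors:", off_topic[:5])
```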

Moreover, these signals, coupled with a specific URL being linked to by the attacker, often serve as a “signature” that allows the attack to be traced back to its originator.

Link audits are a vital procedure that should be done not once or twice but regularly, as a preventive measure. Most site owners only decide to audit their links after a penalty has happened or a site’s rankings have dropped for some other reason, when in fact this is already too late. Timely detection of an issue helps solve it faster and more easily.

But any tool is only as good as the person using it. Link audits should never be run by inexperienced interns or outsourced on the cheap. I have seen link removal requests resulting from poorly done link audits rid a site of its best links – not at the hands of some evil competitor but of the site’s own SEO team. Hiring an experienced consultant, ideally somebody specializing in link audits, is bound to be expensive, but there is no other way to do it properly. No automated tool kits, no amount of articles written on the topic of link audits on SEO sites can replace the experience, sharp eye and often even intuition of a true professional. We are dealing with fragmented information scattered all over the place in no logical order, and unless you can put it all together and read a site’s link profile like a book, don’t even attempt it.

The purpose of a good link intelligence tool is much more than just scraping the web and parsing HTML for links. A good link intelligence tool should also be capable of detecting and analyzing connections, dependencies and relationships between sites, evaluating links and other factors influencing a site, and coming up with adequate metrics – connected to existing search realities – that formally describe the results of these evaluations. The best link intelligence tools – such as MajesticSEO – are already evolving in this direction, but they still have a long and complicated way to go to bring us closer to understanding the web. The scientific foundation for this ongoing development is just beginning to be created and will no doubt involve the brightest minds.
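To make that direction a little more concrete, here is a tiny sketch – my own illustration, not how MajesticSEO or any other tool actually works – of treating sites as a directed link graph and deriving a simple link-based metric from it, using Python’s networkx library:

```python
import networkx as nx

# Hypothetical site-to-site links: (linking site, linked site).
edges = [
    ("blog.example", "shop.example"),
    ("news.example", "shop.example"),
    ("blog.example", "news.example"),
    ("aged-domain.example", "shop.example"),
]

graph = nx.DiGraph(edges)

# A simple link-equity style score; real tools use far richer models
# that also account for redirects, disavows and link quality.
scores = nx.pagerank(graph, alpha=0.85)

for site, score in sorted(scores.items(), key=lambda item: -item[1]):
    print(f"{site}: {score:.3f}")
```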

 

This article by Julia Logan, aka the Irish Wonder, was first published on SiGMAgazine, Issue 1. Cover image credits: Jackie Hole. Cover image source: www.stateofdigital.com
