
Beware of New Negative SEO Methods

by Philip Armstrong, 17th December 2014

When you think of Negative SEO, you generally think of some black hat marketer buying loads of spammy links and directing them towards your website.

But with some of the most recent updates to Google’s algorithm (which take into account on-site factors, such as website speed, and perceived user experience), there has also been a development in the methods being used for Negative SEO.


No Spammy Backlinking Required


These newer Negative SEO techniques do not take advantage of the part of Google’s algorithm which analyses a website’s link profile, and therefore do not work by building poor quality links to a website.

They take advantage of:

  • the part of Google’s algorithm that measures a site’s speed/loading time, and
  • the part that measures a site’s click-through rate in the search results.

So, what you’re probably wondering right now is:

How are they doing this?

And secondly, how do I defend against it?

Let’s break down the process, and answer both these questions:

Overloading your server by heavy crawling & Hotlinking


Overloading your server is achieved firstly by heavy crawling of your website from many different IPs. This is quite often done using a crawling bot together with a virtual private network or proxies, and is similar to a low-level DDoS attack which does not crash the server, but slows it down and clogs it up with constant requests.

Secondly, by hotlinking the images, videos and/or other files on your website, the attacker can steal your bandwidth. Hotlinking occurs when someone links directly to a file on your website’s server, for example by using an <img> tag to load a JPEG from your site on one of their pages. Most of the time when someone links to and loads an image from your website on their own site, there is no malicious intent behind it (they simply like your image and would like it on their website too), and there won’t be a negative effect. The problem arises when many web pages use your images or other files this way: every visit to those pages makes a request to your server to load the file, and this uses up your bandwidth. By doing this on a continual basis, a Negative SEOer can make your website load slowly.
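To make this concrete, a hotlink is nothing more than ordinary markup pointing at someone else’s server. In this sketch (both domains are placeholders, not real sites), the attacker’s page makes your server do the work of serving the image:

```html
<!-- A page hosted on attacker-site.example, loading an image
     directly from YOUR server - every page view costs you bandwidth -->
<img src="https://www.your-site.example/images/photo.jpg" alt="hotlinked image">
```

One such tag on a low-traffic page is harmless; thousands of them, or one on a heavily-botted page, is what turns hotlinking into a Negative SEO weapon.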

For more information about hotlinking, and how to stop or prevent it, click here.

How can we defend against it?

By running a crawl on your own site at different times of the day, you can measure the response time of your server. If you notice an unusual spike in response times, this can indicate that there is a problem.
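A minimal sketch of this kind of self-monitoring, in Python. The URL, baseline and threshold factor here are assumptions you would tune to your own site; the idea is simply to time your own pages on a schedule and flag anything well outside your normal baseline:

```python
import time
import urllib.request


def fetch_time(url, timeout=10):
    """Return the number of seconds taken to fetch `url` (body is read and discarded)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start


def is_slowdown(sample_seconds, baseline_seconds, factor=2.0):
    """Flag a response time more than `factor` times slower than your normal baseline."""
    return sample_seconds > baseline_seconds * factor


# Example usage (not run here): schedule this a few times a day, e.g. via cron.
# elapsed = fetch_time("https://www.example.com/")
# if is_slowdown(elapsed, baseline_seconds=0.4):
#     print("possible crawl/hotlink overload - investigate server logs")
```

Establish the baseline from your own historical measurements; a single slow sample can be noise, so in practice you would alert only on a sustained run of flagged samples.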

You can prevent your images and other files from being hotlinked by adding custom rules to your .htaccess file. Any files which are already being hotlinked can also be renamed or deleted to stop the attack.
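On an Apache server, the usual .htaccess approach uses mod_rewrite to refuse image requests whose referer is not your own site. A typical sketch (replace example.com with your own domain; the empty-referer condition lets through browsers that send no referer at all):

```apache
RewriteEngine On
# Allow requests with no referer (direct visits, some privacy tools)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests coming from your own pages
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Refuse everything else for common media file types
RewriteRule \.(jpe?g|png|gif|mp4)$ - [F,NC]
```

Note that referers can be spoofed, so this stops casual hotlinking rather than a determined attacker, but it is usually enough to reclaim the bulk of the stolen bandwidth.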


Lowering Your Click-Through Rate

By creating a bot which is programmed to search for your main keywords in Google, and then click all of the sites on the page except yours, attackers can artificially lower your click-through rate (CTR) as compared to your competitors. This can lower your position in the search results as website CTR is a ranking factor, and the lower your CTR, the less likely you are to appear in the top results.

How can we defend against it?

This attack can be very difficult to detect, and even once detected it can be very difficult to deal with. If the attacker is using multiple, constantly changing IP addresses, then preventing future attacks becomes very hard. Some marketers have discussed the potential for using a CTR bot of your own to counterbalance the effects of the attacker’s bot (like Bartosz Goralewicz, who has conducted a case study on the different Negative SEO strategies). However, artificially manufacturing a high CTR for your site is against Google’s guidelines, and could wind up getting your site deindexed!

So what does this all mean?

With the release of the Penguin update, it became essential for webmasters to protect their websites from malicious link spam from Negative SEOers, using Google’s disavow tool. But with the prevalence of linkless Negative SEO, protecting your site is going to require far more technical knowledge and vigilance.

If the use of such tactics becomes widespread, Google may need to stop using user experience as a ranking factor, or perhaps come up with a way of detecting whether the activity has come from real users or bots.

About The Author
Philip Armstrong
Philip Armstrong is a content writer for the Just SEO newsroom. Having served as an AdWords manager for a number of paid search campaigns, he is an expert in spending money to make money, and regularly contributes in-depth articles on the latest news and updates on pay-per-click (PPC) matters.
