A few months ago, around the beginning of May, there was a lot of chatter on webmaster forums about a major Google update. The truth is that May was a very difficult month for SEO professionals, small business owners, bloggers and online entrepreneurs, because it was marked by two major algorithm updates. Nevertheless, Google did not announce anything, and even denied any change in its algorithmic parameters. First there was an algorithm refresh on the 8th of May, which resulted in significant traffic drops for major websites and gains for content-oriented pages.

The first thought that came to people’s minds was that Penguin had something to do with it, but Matt Cutts denied it. The second major update was the release of the eagerly awaited Penguin 2.0, which was actually the fourth Penguin update, but because it was announced that it would use second-generation technology, it received the nickname 2.0. As a matter of fact, we soon learned that Penguin 2.0 went a lot deeper than its 1.0 predecessor.
As you have probably already guessed, the second update meant that webmasters would have to focus more than ever on removing unnatural links and promoting ethical SEO practices. But what did the first update mean? On the 8th of May, thousands of emails started pouring in, and in almost every one of them, webmasters stated that their organic traffic and unique visitor count had dropped dramatically and they did not know why. Because Google did not confirm anything, this update was dubbed the Phantom update.
Search Engine Watch analyzed our ghostly new friend in depth and discovered that it focused on content, not links. This means that now, more than ever, content has become important for SEO. You can imagine what a catastrophe this meant for thin websites with low-quality content and unnatural links, especially considering that the unbeatable duo, the Phanteguin, struck almost simultaneously.

The Phanteguin – A Scary Duo You Wouldn’t Want to Mess With
At the moment, the world of SEO specialists is in an uproar. Many of them are thinking of abandoning their campaigns for the simple reason that there is no consistency in what they are doing: with every update that Google rolls out, all of their efforts are rendered futile. We partly agree with this sentiment, but there are also a few considerations to keep in mind.
First of all, Google has declared that it encourages ethical SEO practices which enhance the overall usability of a site and add value for average internet users. In other words, as long as your efforts are channeled into creating high-quality content, optimizing for mobile usage, and other techniques that make it easier for visitors to enjoy your site and for crawlers to index your pages, Google will approve. Nevertheless, it can be very frustrating to find out that strategies which worked two months ago are no longer useful today, so we can relate to the anger and fear voiced by so many webmasters.
We could talk only about the Phantom update, but that would make for an incomplete article: because the two updates rolled out so close to one another, their combined effects were devastating. This is why we would like to dig deeper and attempt to understand how exactly the Phanteguin combo works.

You are probably already aware that Penguin 2.0 is extremely strict when it comes to unnatural links, web spam and thin pages. Furthermore, the fourth update (which was soon followed by an even more severe fifth update) goes a lot deeper than Penguin 1.0. As Matt Cutts explained, Penguin 1.0 only analyzed the links pointing to your homepage, whereas now you can expect every page on your site to be analyzed by our ferocious winged buddy. Nevertheless, the major patterns are still the same: unnatural links and exact-match anchor text from low-quality sites will not be forgiven.
Main Characteristics of the Penguin 2.0 Update:
- a deeper update: it slaps blogrolls, splogs, private networks, comment spam, and spammy directories
- targets more than the homepage
- focuses mainly on links
- does not take additional webspam factors into consideration
- types of link sources remain consistent
If we were to analyze the Phantom “algorithm”, we would soon realize that it is quite different from Penguin; it actually looks more like the Panda algorithm, because it is heavily focused on content. After its release on the 8th of May, although nothing was officially announced, even Pete Meyers from Moz ended up adding Phantom to the Google Algorithm Change History. In a way, Panda greased the skids for Phantom: sites that were already down due to content problems received the final kick, while sites that had somehow escaped punishment from Panda finally got slapped.
How Can You Tell if You’ve Been Slapped by the Phanteguin?
There are so many algorithms at the moment (especially considering that Hummingbird was released a few months back) that it can be difficult to determine what exactly went wrong. Furthermore, most business owners have nothing to do with SEO, so they usually have no idea what to make of unnatural drops in traffic or a PageRank decrease. Sometimes it is easy to find out which algorithm slapped you; sometimes it’s harder, and that means you must dig a little deeper.
Tip: Whenever you are analyzing drops in traffic, note the dates when they occurred and relate them to recent algorithmic changes (which can be found in Moz’s Google Algorithm Change History).
Another brilliant tool that can significantly help you, though it has not been used as much as it should, is Google Webmaster Tools, where you can analyze the Search Queries report to display your impressions and clicks over the past 90 days. Chances are that you registered drops on both dates (8 May and 22 May), which means that you were hit by the Phanteguin.
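To make that check concrete, here is a minimal Python sketch of the idea: scan a daily clicks export for week-over-week drops around the two update dates. The CSV file name, its "date" and "clicks" columns, and the 30% threshold are all assumptions for illustration; adapt them to whatever your Search Queries export actually contains.

```python
import csv
from datetime import date, timedelta

UPDATE_DATES = [date(2013, 5, 8), date(2013, 5, 22)]  # Phantom, Penguin 2.0
DROP_THRESHOLD = 0.30  # flag a 30%+ fall in average daily clicks

def avg_clicks(rows, start, end):
    """Average daily clicks between start and end, inclusive."""
    vals = [int(r["clicks"]) for r in rows
            if start <= date.fromisoformat(r["date"]) <= end]
    return sum(vals) / len(vals) if vals else 0.0

with open("search_queries_export.csv", newline="") as f:  # hypothetical export
    rows = list(csv.DictReader(f))

for update in UPDATE_DATES:
    week = timedelta(days=7)
    before = avg_clicks(rows, update - week, update - timedelta(days=1))
    after = avg_clicks(rows, update, update + week - timedelta(days=1))
    if before and (before - after) / before >= DROP_THRESHOLD:
        pct = 100 * (before - after) / before
        print(f"{update}: average clicks fell {pct:.0f}% -- possible hit")
    else:
        print(f"{update}: no significant drop detected")
```

If both dates are flagged, the Phanteguin is the most likely culprit; if only one is, you at least know which of the two updates to investigate first.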
Google Phantom Update Findings
Just because the Phantom update was not officially announced by Google doesn’t mean that it didn’t happen, nor does it mean that we cannot analyze data from different sites in order to understand what it actually does, and what it punishes. We have compiled the key findings from Hmtweb and Search Engine Land in order to give you the bigger picture.
#1 Cross-Linking (Network-Like)
Have you ever heard of sister websites? Certain webmasters invest in multiple websites, which they then link to one another in order to improve PageRank and Domain Authority. As you can imagine, this is something that Google does not like, and it has left it to the Phantom algorithm to punish such practices. To make things worse, if you are doing this with the exact same anchors, you will definitely be punished; a quick way to audit your own network for this pattern is sketched below.
“So if you own a lot of domains, and you are cross-linking the heck out of them using exact match anchor text, you should absolutely revisit your strategy. The Phantom update proves this point.”
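For illustration only, here is a rough Python sketch of how such a cross-linking audit might look. The domains, the keyword and the link data are all made up, and how you collect your outbound links (from a crawl or an export) is left to you.

```python
from urllib.parse import urlparse

MY_DOMAINS = {"site-a.example", "site-b.example", "site-c.example"}
TARGET_KEYWORD = "cheap blue widgets"  # the money phrase being pushed

# (source_domain, target_url, anchor_text) triples gathered from a crawl
# of your own sites -- the rows below are purely illustrative.
outbound_links = [
    ("site-a.example", "http://site-b.example/widgets", "cheap blue widgets"),
    ("site-b.example", "http://site-c.example/", "our partner site"),
]

for source, target_url, anchor in outbound_links:
    target = urlparse(target_url).netloc
    if (target in MY_DOMAINS and target != source
            and anchor.lower() == TARGET_KEYWORD):
        print(f"RISKY: {source} -> {target} via exact-match anchor '{anchor}'")
```

Anything the script flags is worth a manual look: cross-links between sister sites are not forbidden per se, but exact-match anchors repeated across a network are exactly the pattern described above.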
#2 Scraped Content
As we have already mentioned, Phantom is focused on content. It wants high-quality content, and lots of it. So if you are using duplicate, spun or scraped content, you are in for the surprise of your life. We are not talking about large amounts of such content, just enough to raise suspicion. It seems that if you publish excerpts from certain sites but also provide a source link for visitors to check out, you will not have problems; but if you take large chunks of text without crediting the source, you might be punished.
#3 Phantom and Location Filters
If you want to solve the dire SEO issues on your site, you should make sure that you are looking in the right direction. Remember, Penguin is focused on links, while Phantom and Panda are more interested in content. When reviewing your analytics reports, make sure that you add location filters, because it seems that sites receiving search queries from the US were more susceptible to this update than sites receiving search queries from other countries; a quick way to run that comparison is sketched below. I strongly recommend that you read this article for more information regarding location filters and Phanteguin behaviour.
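Here is one hedged way to run that US versus non-US comparison in Python, assuming you have exported organic visits by date and country to a CSV; the file name and column names are hypothetical placeholders.

```python
import csv
from collections import defaultdict
from datetime import date

UPDATE = date(2013, 5, 8)  # the Phantom refresh

# bucket -> [visits before the update, visits on/after the update]
totals = defaultdict(lambda: [0, 0])
with open("organic_visits_by_country.csv", newline="") as f:  # hypothetical
    for row in csv.DictReader(f):
        bucket = "US" if row["country"] == "United States" else "non-US"
        idx = 0 if date.fromisoformat(row["date"]) < UPDATE else 1
        totals[bucket][idx] += int(row["visits"])

for bucket, (before, after) in totals.items():
    change = (after - before) / before * 100 if before else 0.0
    print(f"{bucket}: {before} -> {after} visits ({change:+.0f}%)")
```

A markedly steeper drop in the US bucket than in the non-US one is consistent with the Phantom behaviour described above.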
#4 Link Source vs. Destination
Glenn Gabe analyzed a highly authoritative site that registered unnatural amounts of external links going to various sites. Naturally, he wondered how such an important site had managed to maintain its traffic for so long while violating one of Google’s guidelines with its external linking. (Although it is fine to credit sources and link to other sites, it is recommended not to give the dofollow property to all of them.) He suspected that the traffic drop it registered on the 8th of May was due to its large amount of unnatural outbound links.
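As a minimal sketch of such an audit (using only Python’s standard library, with a placeholder domain and file name), the following script prints every external link on a page that lacks rel="nofollow":

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

MY_DOMAIN = "mysite.example"  # your own host name (placeholder)

class OutboundLinkAuditor(HTMLParser):
    """Print every external link that is missing rel="nofollow"."""
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        host = urlparse(href).netloc
        rel = (attrs.get("rel") or "").lower()
        if host and host != MY_DOMAIN and "nofollow" not in rel:
            print(f"dofollow external link: {href}")

auditor = OutboundLinkAuditor()
with open("page.html", encoding="utf-8") as f:  # placeholder page
    auditor.feed(f.read())
```

The links it flags are not automatically bad; the point is simply to know which external links pass PageRank, so you can decide deliberately which ones deserve the dofollow property.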

Are You at Risk of Being Slapped by the Phantom?
With so many parameters and algorithms to consider, it may be very difficult to find out what exactly went wrong and what the reason for your penalty was. However, that isn’t the most important thing. The most important thing is to understand what exactly you have to fix, and to try to fix it as soon as possible. Glenn Gabe discovered another interesting thing after analyzing several sites: the websites that were slapped by the Phanteguin had often been slapped by Panda in the past.
This means that Google clearly had a problem at one point with the content on their pages. In other words, if you are carrying some Panda baggage, you should expect a slap sooner or later (assuming you have not made the necessary changes in the meantime). The bottom line is that a combination of unnatural links, thin articles and potentially spammy content can cause your page to suffer at the hands of this ghostly foe.
What Exactly Is the Phantom? Can We Call It an Algorithm?
The fact of the matter is that Google did not announce anything in regard to the data refresh on the 8th of May, so it may as well be a new algorithm, or maybe, just maybe, an update to the Panda algorithm. But because officials have refrained from giving any statements, we can only assume that it is a dangerous new “foe”, and we must prepare ourselves for it. The best approach is to analyze any strange occurrences around that date and fix any possible issues. There is also talk that this data refresh may have been a pre-Penguin change, and some people even think that we will soon have to deal with a two-headed monster. If that’s the case, the results may be disastrous for people who do not play by Google’s rules.
But before you start running for the hills, consider this prospect: none of the sites that were hit were 100% clean. They each had their own set of problems that required solving. In other words, as long as you have invested in long-term, ethical strategies, you are probably in the clear. Furthermore, in order to better understand the effects of the Phantom “algorithm”, we must always put it in relation to the Penguin algorithm.

Key Characteristics of Penguin 2.0:
Since Penguin 2.0 rolled out (and recently, on the 4th of October, the fifth Penguin was also released), various sites have been struggling with serious drops in traffic. This is also because their pages had been hit by two algorithms at the same time. Here are a few things about Penguin 2.0 that you should be aware of:
Deeper Yes, Broader No
Matt Cutts announced that the fourth Penguin would be bigger and more advanced than its predecessors, which is also the reason why it was dubbed Penguin 2.0. You probably already know that Penguin targets suspicious links, but this update promised to go beyond the landing page and analyze every subcategory on a website. In other words, if you had a spammy link profile, you were susceptible to being slapped, and a slap can sometimes mean an organic traffic drop of more than 80%. In order to better detect the gaming of links, Google engineers decided to analyze each and every page on a website, which seems perfectly fair.
Unnatural Links are Still a Problem
Unnatural link sources such as comment spam, article sites, spammy directories, link networks, and blogroll links seem to have remained the same. Curiously, sites hosting malware have not been taken care of yet, and nobody in their right mind would knowingly let such a problem slip; this can only mean that there is something suspicious going on that Google hasn’t addressed yet. Nevertheless, the Webmaster Guidelines warn about certain link schemes that are prohibited, and you should definitely take them into consideration.
Wrapping Up
The two major updates from May are definitely a reason for concern. However, as we stated earlier, as long as you regularly monitor your data and traffic and correlate sudden drops or spikes with algorithm releases, you can definitely figure out what exactly hit your site. Naturally, there are also cases in which pages become collateral damage, but that did not seem to be the case with these two updates.
Now more than ever, webmasters and bloggers must concentrate on creating high-quality, insightful content that will truly benefit the average internet user. Combine that with social media signals, mobile site optimization and other practices that enhance usability, and you will be well on your way to an amazing page that will not be hit by any future data refreshes.