The world of Search Engine Optimization could be facing another website-shaking tremor, as news begins to surface that Google's link-spam-fighting algorithm, Penguin, is finally approaching a long-awaited update.
When Penguin 1.0 arrived in April 2012, it had a big impact. The algorithm was aimed at blocking the artificial enhancement of search rankings by websites that flouted Google’s Webmaster Guidelines through “link schemes” (i.e., manipulative “black-hat” SEO methods that effectively amount to spamdexing), but some have complained that it also hit legitimate targets: sites owned by small businesses, for example, that had used legitimate SEO methods but were simply a little clumsy in their use of anchor text.
Writing on SearchEngineLand.com, online marketing expert Eric Linge castigated Penguin for its increasingly infrequent updates: there were five between April 2012 and October 2013, followed by radio silence ever since. That means a huge number of websites that have fallen foul of the algorithm haven’t been given the chance to “clean up their act” and comply with Google’s standards for the best part of a year (the faults themselves can often be rectified in a matter of weeks, so being taken out of action for ten whole months may seem a tad punitive). Linge also holds bad SEO agencies, which he claims built dubious links for unwitting webmasters, responsible for many of these Penguin casualties.
But Google’s war on dodgy links is continuing: last weekend, it sent out link penalty notifications to a German linking network and a separate European one, advising webmasters on its official Google+ page that they should consult its Webmaster Guidelines on link schemes (http://goo.gl/wul1gs) to make sure their sites “are not engaged in deceptive linking practices!”
And now Google webmaster trends analyst John Mueller has dropped a big hint that a new Penguin update is coming soon.
Asked by webmaster Josh Bachynski this week about algorithm updates, Mueller revealed that Google engineers were working on Penguin, and that the soon-to-be-released update would be more than a “tweak” because the algorithm needs “a complete rerun” of its data. That means making sure the right kind of data is being used, he added.
But when? In Mueller’s words, “[I]t’s been quite a while now, so I imagine it’s not going to be that far away, but it’s also not happening this morning.”