Eric Blair & Michael Edwards
Just over a month ago, Google announced that they were changing their algorithm in order to weaken the search engine rankings of sites they deem to be “content farmers.”
Whereas most of Google’s algorithm changes are barely noticeable, this change, which they have been working on since last January, will affect 12% of U.S. searches.
There has been much debate about what “content farming” is, and Google has done little to offer a clear explanation, stating simply that “low quality” or “shallow” sites would be affected. This is similar to the vague definition of pornography: you’ll know it when you see it.
The problem with applying such a vague standard to a strictly defined algorithm is that it leaves too much room for human interpretation. And as we have seen, Google has been exposed as having connections to U.S. intelligence agencies, which doesn’t bode well for alternative news sites that aggregate anti-establishment stories from around the web. Given the other censorship threats facing the Internet, it seems those who are critical of Internet control and real-time surveillance of average Americans are being targeted.
One definition of content farming sites comes from Danny Sullivan at Search Engine Land:
- Looks to see what are popular searches in a particular category (news, help topics)
- Generates content specifically tailored to those searches
- Usually spends very little time and/or money, perhaps as little as possible, to generate that content
The first point is particularly troubling for alternative news, since these are the sites that often scour mainstream news to discover which topics are of popular interest so that competing commentary can be offered on a given issue. Even in the area of “help topics” there are many alternative news sites, such as our own, that focus on tips for survival, protection from economic crisis, advice for privacy protection and personal security, and the like.
Again, given the vague definition of “shallow” content, who is deciding this? Furthermore, point two addresses tailoring content for specific searches, which sounds a lot like the “Google Bombs” introduced by Alex Jones and implemented by others as an effective way to compete with the mainstream media pablum, which focuses heavily on celebrities, sports, and other truly shallow and low-quality content.
And, finally, point three seems to penalize blogs and other low-cost means of sharing opinions, as if not having a mainstream media budget automatically implies low quality, when provably the reverse is often true. Those who research information and present their own opinions as to the significance of what they have studied generally are doing so out of a passion to expose lies and direct their fellow man to the truth.
Re-posting material is an essential tool for sharing information, yet Google’s algorithm now appears to punish it, reducing news aggregators to the status of plagiarists. There are many alternative news sites and blogs with original material that they freely share, in part or in full, purely to support one another in disseminating the truth. We all know what plagiarism looks like, and a link back to the original source should not, for instance, be grounds for labeling a site as shallow.
Google needs to address the vital practice of sharing information, and to more clearly define its algorithm in upcoming press releases, or we can only conclude that it has begun to wage war on news sites that aggregate information to present an alternative to establishment media.