[This article was originally published on Excerptz on July 29, 2011, but I have been notified that the site is closing down, so I am republishing here.]
On February 24, 2011, quite a few article directories saw their traffic cut by the Google Panda update. Many sites were hit, and there was suddenly a lot of talk about “content farms” and how they needed to be eliminated from search results. What are content farms? It wasn’t immediately clear to all.
One definition of a content farm that was quite prevalent at first was this: a site whose authors edited their own work, followed no editorial guidelines, and were paid through revenue sharing. This impression was reinforced when sites with up-front payment and stringent editorial supervision of writers were left standing, while those that relied on revenue sharing and self-editing were dropped from the rankings. (For a comparison of two such sites on PubWages, click here.)
What should an article directory do to avoid the stigma of being labeled a content farm? Should lots of new rules be implemented, to make sure that content is fresh, unique, correct, authoritative and grammatical? Should revenue sharing be removed from the site? Should only native English speakers be allowed to write there? Should the subject matter be limited to a single niche in order to ensure high keyword density?
Or should there be few rules, and better behind-the-scenes manipulation by the webmaster of such invisible, arcane coding as do-follow and no-follow? Watch this video interview of Dani Horowitz in order to make up your own mind:
Here is what I learned from the above WebProNews interview with Dani Horowitz that I did not know before:
- Many of the sites hit by Google Panda did not have low quality content.
- Several technology forums were hit.
- RSS feeds triggered the anti-duplicate content bias of the Panda update.
- When syndicators back-link to the source of an RSS feed, that used to be considered a legitimate back-link. Under Google Panda, all those back-links were de-valued, reducing the rankings of otherwise authoritative sites.
- Changing the URL structure of a site can be helpful. (Subdomains.)
- Google now frowns on tag clouds that run to more than 75 words. (Keyword stuffing.)
- Internal search results pages are also not okay with Google anymore, so you have to block them with robots.txt.
- Use the noindex and nofollow meta-tags for anything you think Google won’t like. You get fewer pages indexed, but your ranking improves. (Forum posts with no replies, for instance, should be de-indexed; see the sketch after this list.)
- Google gives lots of credit now for social bookmarking, so those buttons are really important.
- Forums are important in the online landscape. They are not pretty, they are user generated, and the grammar is not always right. But sometimes the answer to a crucial question can be found there, and not in a well-polished article. This means streamlined content does not always rule.
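
To make the robots.txt and meta-tag points above a bit more concrete, here is a minimal sketch. The /search path is only a placeholder for wherever a given site serves its internal search results; substitute whatever URL pattern your own site actually uses.

```
# robots.txt — ask crawlers not to fetch internal search results pages
# (the /search path is a placeholder; use your site's own pattern)
User-agent: *
Disallow: /search
```

```html
<!-- In the <head> of a thin page, such as a forum post with no replies -->
<meta name="robots" content="noindex, nofollow">
```

The robots.txt rule keeps crawlers away from the pages entirely, while the meta-tag lets the page be crawled but asks that it not be indexed or have its links followed, which is the trade-off Dani Horowitz describes: fewer pages indexed, better ranking for the rest.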
From this list, we can see that rules about grammar and presentation in an article directory are much less helpful than having a webmaster who is knowledgeable in SEO lore. Google’s algorithm has no way to tell what is quality content. It’s not just that marketers are gaming the Google system. Google itself is gaming the community of searchers, by using superficial indicators to determine the quality of content. A good webmaster can find out what superficial indicator is all the rage with Google today and do some invisible tweaking to help a site’s ranking. There is no need to gang up on writers with a new editorial policy while all this is going on.
© 2011 Aya Katz