Nobody likes to see their original content published on other sites without permission, wouldn’t you agree? And nobody likes to see those sites ranking higher than their own site, the one with the original content. Well, Google doesn’t really like that either. And it looks like they may have started another campaign to tackle the scraping problem.
Why do people “steal” content?
First, for most people who do it, it’s an easy way to “create their own content”. Second, it’s far too easy to get your hands on other people’s work. With a few clicks you can copy text and images, make some small changes so they fit your needs, and insert everything into your own blog post or article. Done.
But is it ethical? And is it legal? You decide.
Some experts say yes, it is. They even sell software to do just that. According to them, it’s a fast and convenient way to save time and publish good content.
Hmmm… Perhaps that’s why there are so many sites filled with scraped content…
However, watch Google’s Matt Cutts talking about scraper sites:
Why scraping can be a serious issue
Google is looking to gather more first-hand information about sites using this “strategy”. They provide a form called the Scraper Report, asking people to report scraper sites.
To make one thing clear: the scraping problem has nothing to do with curating content, where you take a post or an article and republish parts of it, adding your own unique content and mentioning and linking back to the original source.
As of now, it’s not clear what Google plans to do with the data they collect through the Scraper Report. Whether they plan to ban scraper sites or roll out another algorithm change, no one knows. So please read the Google Webmaster Guidelines carefully and make sure your site follows them before taking any action. Don’t let the scraping problem become a serious problem for you.