It is a well-known fact that more than 50% of the content on the Internet is just crap: when you start searching for meaningful information, you will often stumble upon half-baked, poorly written, spun-off articles that have been posted to misguide superficial or ignorant users. It's indeed a fact.
As online marketers, we often see this trend. A reputed site publishes some well-researched, meaningful information, and many other publications start presenting the same information in other forms by rewriting it. Some versions will be more interesting than the original.
Mashable has this wonderful ability to turn common news into interesting topics by writing compelling headlines. Sometimes I have seen Mashable go overzealous and write misguided headlines, but we all know that "to err is human." Just wait a while longer for software from Google that will write articles by gathering information from the real world.
Please don't think I am daydreaming. If Google can index the whole of the Internet and successfully test driverless cars and Google Glass, I don't think the day is far off when software will act as a reporter and write articles without any help.
By the way, coming back to the topic: Matt Cutts, head of Google's web-spam team, says that 25-30% of the content on the Internet is duplicate. But don't worry, that is fine. I was wondering one day: since Google stores all information, there might come a day when all information will be duplicate, because there might be a limit to the number of possible sentence structures.
But I think that day is far off. I will leave the calculation of the remaining days to the computer science and linguistics experts at the Ivy League colleges, Oxford, and Cambridge.
Matt Cutts says, "We don't treat duplicate content as spam." Good, you should not. SEO experts everywhere will feel relieved; as for spammers, I don't think they ever care what you say. 🙂
But spammers beware: according to Matt, Google can penalize a site if it uses duplicate content in a manipulative manner.
Now, no more blabbering. Watch the video.