Posted By Nirav Dave
In a 2013 video, Matt Cutts, then a software engineer at Google, answered a popular question submitted by one Gary Taylor: how does Google handle duplicate content?
At the time, Matt Cutts noted that around 25-30% of the content online appeared to be duplicate. He gave the example of bloggers and websites quoting other content on their own sites and, in turn, linking back to the original.
He elaborated on the term duplicate content, making it clear that we should not confuse all duplicate content with spam.
He also said that treating all such content as spam would negatively affect search quality.
How does Google deal with this form of duplicate content that is not spam?
It groups such duplicate content together, treating it as one piece of content. Suppose a user enters a search query that matches two pieces of content that are similar but not exact copies, each offering its own unique perspective.
Does Google show both results?
No, it picks the content with the most to offer the search user.
Keep in mind that so far we have been talking about duplicate content that is not deceptive or manipulative. When it comes to websites and blogs that do nothing but churn out duplicate content, Google reserves the right to treat it as spam.
Seven years later, Bill Hartzer, an SEO consultant, took to Twitter to ask John Mueller, one of Google’s Search Advocates, a question.
In his tweet, he asked John Mueller whether there was a percentage figure for how much of the internet’s content is duplicate, and whether Google even computes such a percentage.
To which John Mueller replied:
For more detail, refer to Google’s podcast about how technical Search content is written and published at Google, and more.
Gary Illyes, John Mueller, and Martin Splitt discuss the topic in detail there. And as you can see from the following screenshots of the provided PDF, the discussion is quite similar to what we have seen so far.
Gary Illyes also goes on to explain how Google detects duplicate content: it is done through checksums, so duplication is detected per page rather than computed as a percentage metric.
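Google does not publish its implementation, but the basic idea of checksum-based duplicate detection can be sketched in a few lines. The sketch below is purely illustrative: it normalizes a page’s text, hashes it, and clusters URLs whose hashes collide. The function names and the example URLs are hypothetical, and real systems also strip boilerplate and handle near-duplicates, which this toy version does not.

```python
import hashlib
from collections import defaultdict

def content_checksum(text: str) -> str:
    """Normalize the text, then return its SHA-256 checksum.

    Collapsing whitespace and lowercasing keeps trivially different
    copies of the same content from producing different hashes.
    """
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def group_duplicates(pages: dict[str, str]) -> list[list[str]]:
    """Cluster URLs whose content hashes to the same checksum."""
    clusters = defaultdict(list)
    for url, text in pages.items():
        clusters[content_checksum(text)].append(url)
    # Only clusters with more than one URL are duplicates.
    return [urls for urls in clusters.values() if len(urls) > 1]

# Hypothetical example pages.
pages = {
    "https://example.com/a": "Hello,   World!",
    "https://example.net/b": "hello, world!",
    "https://example.org/c": "Something else entirely.",
}
print(group_duplicates(pages))
# The first two URLs land in one cluster; the third stands alone.
```

This also shows why the answer to Hartzer’s question is not a percentage: a checksum comparison is binary per cluster of pages, so the system knows *which* pages duplicate each other without ever computing how much of the web is duplicate overall.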
Capsicum Mediaworks LLP
46 Siddhachal Bldg, Office No. 2, Next to Cosmos Bank, Hanuman Road, Vile Parle (East), Mumbai - 400 057. Maharashtra. India.
9.30 am - 6.30 pm IST (Mon-Fri)
© 2009 - 2023 Capsicum Mediaworks LLP.