Duplicate website got indexed: did it cause our rank drop?
-
Hi all,
We have a replica of our website with exactly the same pages and content. That website got indexed by mistake and was open to bots for more than 10 days. Our ranking has now dropped, and we moved from the 2nd page to the 5th page. This happened before, though, and it didn't hurt much. Have we been penalized this time?
Thanks
-
I think it is not necessary to redirect the duplicate domain; it is just a faster way to show that the content lives on the original only.
Depending on the topic and the competitors, changing the title can make a huge difference, and quite quickly...
-
Thanks for the answer. We would prefer not to redirect, as we stopped it from being crawled by bots once we realised it had been indexed by mistake. We have changed the page title of one of our top-ranking pages. Could that have caused the drop in the homepage's ranking for our primary keyword?
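For reference, the usual way to stop well-behaved bots from crawling the duplicate domain is a robots.txt file at its root; a minimal sketch would be:

```
User-agent: *
Disallow: /
```

One caveat: robots.txt blocks crawling, not indexing. URLs that are already in the index can stay there until Google is told otherwise, for example via a redirect or a removal request.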
Thanks
-
I've never had that problem (nor has a client), so I can't say whether you were punished. I thought Google wouldn't punish the original, just the copy, but if the two sites link to each other it may (and probably should) happen.
So: take the replica domain, 301-redirect everything to the original, and ask Google to reindex the duplicate pages (the whole domain) via Google Search Console. That should work faster than just waiting.
If nothing happens, the cause was something different.
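To make the "redirect everything with a 301" step concrete: assuming the duplicate domain is served by Apache with mod_rewrite available (an assumption; nginx or a hosting control panel would need a different but equivalent rule), a minimal .htaccess sketch with hypothetical domain names could be:

```apache
RewriteEngine On
# Match requests for the duplicate domain (with or without www)...
RewriteCond %{HTTP_HOST} ^(www\.)?duplicate-example\.com$ [NC]
# ...and permanently (301) redirect every path to the original domain.
RewriteRule ^(.*)$ https://www.original-example.com/$1 [R=301,L]
```

A 301 tells Google the move is permanent and consolidates the duplicate's signals onto the original URLs.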
Related Questions
-
Google Search Console Not Indexing Pages
Hi there! I have a problem that I was hoping someone could help me with. In Google Search Console, my website does not seem to be indexed well. In fact, even after rectifying the problems that Moz's on-demand crawl pointed out, the pages still do not become "valid". Google has flagged a number of excluded pages; I have rectified some of the issues, but it doesn't seem to be helping. However, when I submitted the sitemap, it said the URLs were discoverable, so I am not sure why they can be discovered but are not deemed "valid". I would sincerely appreciate any suggestions or insights as to how I can go about solving this issue. Thanks!
Algorithm Updates | Chowsey0
-
Sizable decrease in amount of pages indexed, however no drop in clicks, impressions, or ranking.
Hi everyone, I've run into a worrying phenomenon in GSC and I'm wondering if anyone has come across something similar. Since August, I have seen a steady decline in the number of pages indexed from my site, from 1.3 million down to about 800,000 in two months. Interestingly, my clicks/impressions continue to increase gradually (at the same pace they have for months) and I see no other negative side effects resulting from this drop in coverage. In total I have 1.2 million URLs that fall into one of three categories: "Crawled - currently not indexed", "Crawl anomaly", and "Discovered - currently not indexed". Some other notes: all of my valid, error, and excluded pages are https://www. , so I don't believe there is an issue with different versions of the same site being submitted. Also, my rankings have not changed, so I tentatively believe this is unrelated to the Medic update. If anyone else has experienced this or has any insight into the problem, I would love to know. Thanks!
Algorithm Updates | Jason-Reid0
-
Does anyone know what causes the long meta description snippet?
You know the ones I mean... Google have been infrequently displaying some meta descriptions as 3-4 lines long for some time now. But recently, I've been noticing them more. Not sure whether it's just a coincidence that I've been seeing more for my searches, or whether Google are displaying more in this format. Does anybody know what causes Google to prefer the longer meta description or extended meta description for some results?
Algorithm Updates | Ria_0
-
My Website No Longer Appears in Mobile Google Search but Does in Desktop...Why Is This?
For a long time my website has appeared in both desktop and mobile search in Google. Yet recently it has stopped appearing in mobile search while still appearing on desktop. Any ideas why this is happening and how to rectify it, please? Many thanks.
Algorithm Updates | WSIDW0
-
What's the best method to tackle a website traffic drop?
On 14th November, following a DNS error (which seems to have been a Google error, as local servers were not affected and it happened to many people everywhere), my traffic started dropping. Within 5 days it was down 50%. In a panic to resolve the situation I thought it was because I was on a shared host and moved to a VPS, which was a disaster; I've had server errors since. After desperately looking everywhere for news on the DNS error and whether there was an algorithm change, etc., I decided I should have a look at my site @ www.mutantspace.com and see if there were any internal issues. In the Google Webmaster forum a helpful moderator suggested that I do 2 things: 1. Deal with the fact that I had keyword-stuffed my alt tags. Basically I run an arts blog, so I have 8-10 images per post, and I was putting the same text in each one, i.e. [artists name][artform][name of artwork]. I stupidly didn't realise what I had been doing and have since been deleting the alt tags for every image except one per post. However, I have 17,000 images, so it's going to take a while. 2. She also linked me to https://ahrefs.com/site-explorer/overview/subdomains/http%253A%252F%252Fwww.mutantspace.com%252F and wondered why I had such volatile inbound links. I don't know why, and I can't figure it out. As far as everything else goes, I don't know what I could be doing wrong to deserve a penalty - if it is a penalty. I don't build backlinks, so all my links are natural (from artists, galleries, art blogs, tumblrs, etc.). I don't sell advertising (yet, anyway). Having said that, I've been told I have too many links on each page (I run a WordPress site and so have categories, etc.), so I'm wondering if I should nofollow my categories? In short, I'm wondering what advice anyone has on doing a systematic shake-up of my site. I'm currently doing the following: 1. Deleting most of the alt tags on my posts; I've got back as far as 2012 and will keep going till they're all done. 2. Redirecting all crawl errors. 3. Nofollowing more outbound links and links to social networks. 4. Checking all inbound links to see if there are any suspicious domains. 5. Sorting out the fact that I've had numerous server errors for the last 2 weeks (would that affect SERPs?). Is there anything else I can do? Should do? Much appreciated.
Algorithm Updates | mutant20080
-
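As an aside on step 1 in the question above: blanking repeated alts by hand across 17,000 images is slow, and the "keep one alt per post" rule can be scripted. A rough Python sketch (regex-based for illustration only; a real pass over WordPress content would be safer with a proper HTML parser such as BeautifulSoup, and the function name and rule here are my own assumptions, not anything Moz or Google prescribes):

```python
import re

def keep_first_alt(html: str) -> str:
    """Keep the alt text on the first <img> in a post's HTML and blank
    the alts on the rest, mirroring the 'one alt per post' clean-up
    described above. Regex-based sketch, not production-safe parsing."""
    seen = False

    def blank_later_alts(match):
        nonlocal seen
        tag = match.group(0)
        if seen:
            # Replace any existing alt value with an empty one.
            return re.sub(r'alt="[^"]*"', 'alt=""', tag)
        seen = True
        return tag

    return re.sub(r'<img\b[^>]*>', blank_later_alts, html)
```

Run against each post body, this leaves the first image's alt intact and empties the rest, which matches the clean-up the moderator suggested.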
Recent Rank drop after Penguin 2.1?
Recently, a lot of pages from our website have moved from page one, or the number one ranking, to page ten or beyond. We got a manual penalty message from the Google team; we removed a lot of unnatural links pointing to our pages and disavowed the rest. This got the penalty removed, and we got a message from Google confirming it. Before the manual penalty we were getting about 140,000 visits per day; after the penalty, about 80,000. However, after Hummingbird or Penguin 2.1, all our rankings have vanished. We are nowhere in Google for our primary keywords and are getting about 40,000 visits per day, most of them direct or from sources other than Google. We had another look at the links we disavowed, a list of about 11,000 domains, and found about 3,000 domains to be good. We fixed the disavow file about one week back, but there has been no change in traffic since. We are checking the domains again to see if we missed more good domains in there; yes, we have. There are still a very few good domains in there, but we are not touching the disavow list; we are waiting to see the effect of the last submission. We have a dedicated user base, good likes on Facebook, and all the stats in Analytics look good: about 40% repeat visits, about 30% direct. About 3,000 people search for the site using our brand name, as reported in Analytics. I suspect the on-page optimization; the pages could be over-optimized. But the on-page factors for other pages ranking for the keywords are similar: the keyword density is similar, as is the usage of headings and such. We have not made any recent changes to these on-page patterns. Our team is not able to figure out what could have gone wrong.
Algorithm Updates | Develop410
-
Rankings fluctuating by around 10 pages between night and day
Hi all, I'm experiencing something very odd with my website's ranking at the moment. My homepage's rank for my main keyword fluctuates by 10 pages between day and night. During the day I am on page 14, 15 or 16 for my main keyword, yet by night I am on page 5 or 6. This trend has continued for the past 7 days now and I can't quite understand why. I'm using pagewash dot net to carry out manual searches, as well as a ranking tool, both of which produce exactly the same result. Does anyone have any experience of this, or know why it is happening? My domain is around 8 years old and has around 50,000 pages. Any pointers would be greatly appreciated.
Algorithm Updates | MarkHincks0
-
Is all duplicate content bad?
We were badly hit by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover.
CASE 1: We develop software products. We send out a 500-1000 word description of each product to various download sites so that they can add it to their product listing, so there are several hundred download sites with the same content. How does Google view this? Did Google penalize us for this reason?
CASE 2: In the above case, the product description does not match any content on our website. However, there are several software download sites that copy and paste the content from our website as the product description, so in this case the duplicate content matches our website. How does Google view this? Did Google penalize us for this reason?
Along with all the download sites, there are also software piracy and crack sites that carry the duplicate content. So, should I remove duplicate content only from the piracy and crack sites, or also from the genuine download sites? Does Google reject all kinds of duplicate content, or does it depend on who hosts the duplicate content? Confused 😞 Please help.
Algorithm Updates | Gautam.Jain0