Getting rid of duplicate content remaining from an old misconfiguration
-
Hi friends,
We recently (about a month ago) launched a new website, and during the review of that site we spotted a serious misconfiguration of our old, terrible WP site. This misconfiguration, which may have come from sitemaps, internal links, or both, led to our French, German, and English sites displaying on each other's domains. This should be solved now, but the pages still show in the SERPs. The big question is: what's the best way to safely remove them from the SERPs?
We haven't performed as well as we wanted for a while, and we believe this could be one of the causes. Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain; each link returns either a 301 or a 404.
This was cleaned up to return 301 or 404 when we launched our new site four weeks ago, but I can still see the results in the SERPs, so I assume they still count against us?
Cheers!
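A quick way to sanity-check the claim that every stray URL now returns a 301 or 404 is to request them without following redirects. A minimal Python sketch, assuming the requests library is available; the URL list is illustrative, not the full set:

```python
import requests

# Illustrative sample of English URLs surfaced on the German domain by the
# site: search above -- substitute the real list exported from the SERPs
urls = [
    "https://www.pissup.de/package/basic-survival-4/",
]

for url in urls:
    # allow_redirects=False shows the raw status (301/404), not the target's 200
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url, "->", resp.headers.get("Location", ""))
```

Anything that prints 200 here is a page still being served in the wrong language and still missing a redirect.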
-
Yep, I fixed this one just now as you sent it.
I think the issue with the wrong redirects is mostly that I haven't spotted them all, rather than the redirects I've already set up not working correctly.
I expect there are a thousand-plus wrong pages, but when I use site:domain.tld plus a word in the wrong language, for instance "evg" (the French term for bachelor party), Google surfaces only up to 300 results (suspiciously, the same maximum for all sites).
-
Hey Rasmus - I honestly think it's an issue with the redirects. I would double-check them.
I did just visit https://www.pissup.de/package/basic-survival-4/ and it looks like it's redirecting now. Were you able to get those shored up? If you're still having trouble, I would contact your web host to make sure the redirects are in place.
-
Hi John
Yes, the idea is that https://www.pissup.de/package/basic-survival-4/ should redirect to a German equivalent where we have one.
It's strange that it wasn't, since I uploaded all the redirects about a week ago. Perhaps this comes down to the site: search not returning all results; if it caps the number of results, then as some URLs drop out it may start showing others that weren't visible before.
-
Hey Rasmus,
Just so I understand: a URL like https://www.pissup.de/package/basic-survival-4/ should not be displaying on the German site. The German site should only have German content, right?
I found that page using the site: search listed in your initial question.
What's interesting is that this page isn't redirecting. Let me know your thoughts. I have feedback, but I want to make sure of a few things before I share it.
Thanks!
John
-
Hi John
Thanks for taking the time to answer!
The URLs were already returning 301 or 404 when we discovered them after launching the new site.
What we've done so far:
- set up 301 redirects from pissup.com/german-url to pissup.com/english-equivalent where available, or to the closest similar page (see the verification sketch after this list)
- added a sitemap with these URLs in the hope they'd be crawled faster
- waited
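Since the main risk seems to be missed mappings rather than redirects that misbehave, a bulk check of the whole redirect map is worth automating. A minimal sketch in Python, assuming the old-to-new mapping lives in a hypothetical CSV with old_url,new_url columns:

```python
import csv
import requests

# Hypothetical mapping file: one "old_url,new_url" pair per row
with open("redirect_map.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    old, expected = row["old_url"], row["new_url"]
    try:
        # Follow the whole redirect chain, then compare the final URL
        # against the target the mapping says it should reach
        resp = requests.get(old, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"ERROR     {old}  ({exc})")
        continue
    verdict = "OK" if resp.url.rstrip("/") == expected.rstrip("/") else "MISMATCH"
    print(f"{verdict}  {old}  ->  {resp.url}")
```

Anything flagged MISMATCH (including an old URL that still serves a 200 itself) is a redirect that still needs to be set.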
We were advised that it was better to redirect than to ask for removal. Do you disagree with this advice, and if so, why?
We're not really seeing a decrease yet for these issues in the SERPs. Some drop by 5-10%, but some don't. Could it be because we aren't seeing all of them in the SERPs? If so, is there anywhere else we could find them (i.e., all the URLs Google has indexed on our domain)?
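On finding everything Google has indexed for the domain: the site: operator is known to return only a sample, so one alternative is to pull the page list Google itself reports through the Search Console API (any URL with impressions is necessarily indexed, though pages with zero impressions won't show up). A rough sketch using google-api-python-client; the key file name and date range are hypothetical:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Hypothetical service-account key; the account must first be added
# as a user on the Search Console property
creds = service_account.Credentials.from_service_account_file(
    "search-console-key.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)

body = {
    "startDate": "2017-01-01",  # illustrative date range
    "endDate": "2017-03-31",
    "dimensions": ["page"],
    "rowLimit": 5000,           # page through with "startRow" for more
}
resp = service.searchanalytics().query(
    siteUrl="https://www.pissup.de/", body=body
).execute()

for row in resp.get("rows", []):
    print(row["keys"][0], int(row["impressions"]))
```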
-
Hey Rasmus,
In finding these indexed pages, I'm assuming that you did the following:
1. noindexed the pages on the domain you are concerned about
2. disallowed them in robots.txt (just another step to help speed things up)
3. used the URL removal tool in Google Search Console
Unfortunately, it does take time for Google to process these URLs out of the SERPs. Hopefully you are seeing a decrease in the URLs shown in the SERPs; a quick audit sketch follows below.
Also, don't forget to do this via Bing Webmaster Tools too!
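To audit which of the three steps above are actually in place for a given URL, here is a minimal sketch using Python's standard library plus requests; the meta-tag check is deliberately crude, and the removal-tool step has to remain a manual check in Search Console. One caveat worth flagging: if robots.txt blocks a page before Google recrawls it, the noindex tag may never be seen, so some practitioners add the disallow only after the page has dropped out.

```python
import urllib.robotparser
import requests

def audit(url: str, user_agent: str = "Googlebot") -> None:
    # Step 1: crude check for a noindex robots meta tag in the HTML
    html = requests.get(url, timeout=10).text.lower()
    has_noindex = 'name="robots"' in html and "noindex" in html

    # Step 2: is the URL disallowed for this user agent in robots.txt?
    scheme, _, host, _ = url.split("/", 3)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{scheme}//{host}/robots.txt")
    rp.read()
    blocked = not rp.can_fetch(user_agent, url)

    print(f"{url}\n  noindex meta tag: {has_noindex}\n  blocked by robots.txt: {blocked}")

audit("https://www.pissup.de/package/basic-survival-4/")
```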
-
Related Questions
-
How to solve this issue and avoid duplicate content?
My marketing team would like to serve up 3 pages of similar content: www.example.com/one, www.example.com/two and www.example.com/three. However, the challenge is that they'd like to have only one page with three different titles and images based on the user's entry point (one, two, or three). To avoid duplicate pages, how would you suggest this best be handled?
Intermediate & Advanced SEO | JoelHer0
-
Complicated Duplicate Content Question...but it's fun, so please help.
Quick background: I have a page that is absolutely terrible, but it has links and it's a category page, so it ranks. I have a landing page which is significantly (a bizillion times) better, but it is omitted in the search results for the most important query we need. I'm considering switching the content of the two pages, but I have no idea what that will do; I'm not sure if it will cause duplicate content issues or what will happen. Here are the two URLs. The terrible page that ranks (not well, but it's what comes up eventually): https://kemprugegreen.com/personal-injury/ The far better page that keeps getting omitted: https://kemprugegreen.com/location/tampa/tampa-personal-injury-attorney/ Any suggestions (other than just waiting on Google to stop omitting the page, because that's just not going to happen) would be greatly appreciated. Thanks, Ruben
Intermediate & Advanced SEO | KempRugeLawGroup0
-
Multiple 301 redirects and old site content appearing in Google results
I have found that for some Google searches, the old version of the site, on a completely different domain, is appearing on page one of the results, while the newer site is only on page 3. The old site redirects to the new site with a 301; however, there is also an additional redirect on the new site to force SSL. Despite this, when you view the Google cache of the result that appears, the content of the page is still the old site. Is this normal, or is Google not following the chain of 301 redirects? Edit: I just found out that downloading the page (by right-clicking a link and clicking download, rather than viewing it in a browser) leads to the old site appearing and the 301 redirect not being followed.
Intermediate & Advanced SEO | freshleafmedia0
-
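On the chained-redirects question above: a quick way to see exactly what chain a crawler encounters is to walk it one hop at a time. A minimal sketch; the starting URL is hypothetical:

```python
import requests

def trace(url: str, max_hops: int = 10) -> None:
    # Print each hop in the redirect chain until a non-redirect response
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(resp.status_code, url)
        if resp.status_code not in (301, 302, 307, 308):
            return
        # urljoin handles relative Location headers
        url = requests.compat.urljoin(url, resp.headers["Location"])
    print("stopped: too many hops")

trace("http://old-domain.example/some-page/")  # hypothetical old URL
```

A chain of old-domain to new-domain to HTTPS is two hops, which Google generally follows; this just makes each hop visible.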
Best strategy for duplicate content?
Hi everyone, We have a site where all product pages have more or less similar text (same printing techniques, etc.). The main differences are prices and images; the text is highly similar. We have around 150 products in every language. Moz's algorithm tells me to do something about duplicate content, but I don't really know what we could do, since the descriptions can't be made very different. We essentially have paper bags in different colors and from different materials.
Intermediate & Advanced SEO | JaanMSonberg0
-
International SEO - cannibalisation and duplicate content
Hello all, I look after (in house) 3 domains for one niche travel business across three TLDs: .com, .com.au and .co.uk, plus a fourth domain on a .co.nz TLD which was recently removed from Google's index.
Symptoms: For the past 12 months we have been experiencing cannibalisation in the SERPs (namely .com.au being rendered in .com) and Panda-related ranking devaluations between our .com site and .com.au site. Around 12 months ago the .com TLD was hit hard (80% drop in target KWs) by Panda (probably), and we began to action the changes below. Around 6 weeks ago our .com TLD saw big overnight increases in rankings (to date a 70% averaged increase). However, by almost the same percentage we gained on the .com TLD, we suffered significant drops in our .com.au rankings; basically, Google seemed to switch its attention from the .com TLD to the .com.au TLD. Note: each TLD is over 6 years old, we've never proactively gone after links (Penguin), and we have always aimed for quality in an often spammy industry.
Have done:
- Added hreflang markup to all pages on all domains
- Each TLD uses local vernacular, e.g. the .com site is American
- Each TLD has pricing in the regional currency
- Each TLD has details of the respective local offices; the copy references the location, and we have significant press coverage in each country, such as The Guardian for our .co.uk site and the Sydney Morning Herald for our Australian site
- Targeted each site to its respective market in WMT
- Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique
- We're continuing to rewrite and publish unique content to each TLD on a weekly basis
- As the .co.nz site drove so little traffic, rather than re-writing we added noindex, and that TLD has almost completely disappeared from the SERPs (16% of pages remain)
- XML sitemaps
- A Google+ profile for each TLD
Have not done:
- Hosted each TLD on a local server
- Around 600 pages per TLD are duplicated across all TLDs (roughly 50% of all content); these are way down the IA but still duplicated
- Sourced images/video from local servers
- Added address and contact details using schema markup
Any help, advice or just validation on this subject would be appreciated! Kian
Intermediate & Advanced SEO | team_tic1
-
Duplicate content across international URLs
We have a large site with 1,000+ pages of content to launch in the UK. Much of this content is already being used on a .nz URL which is going to stay. Do you see this as an issue, or do you think Google will take localisation factors into consideration? We could add a link from the NZ pages to the UK. We can't noindex the pages, as this is not an option. Thanks
Intermediate & Advanced SEO | jazavide0
-
Adding a huge new product range to an eCommerce site and worried about duplicate content
Hey all, We currently run a large eCommerce site that has around 5,000 pages of content and ranks quite strongly for a lot of key search terms. We have just finalised a business agreement to incorporate a new product line that complements our existing catalogue, but I am concerned about dumping this huge amount of content (which is sourced via an API) onto our site and the effect it might have in dragging us down for our existing type of product. As to the best way to handle it, we are tossing around a few ideas and wondered what SEOMoz thought was best:
- making each page point to the original API the data comes from as the canonical source (not ideal, as I don't want to pass link juice from our site to theirs)
- adding "noindex" to all the new pages so Google simply ignores them, and hoping for side sales onto our existing products instead of trying to rank, as the new range is highly competitive (again not ideal, as we would like whatever organic traffic we can get)
- manually rewriting each and every new product page's descriptions, tags, etc. (a huge undertaking in terms of working hours, given around 4,400 new items will be added to our catalogue)
Currently the industry standard seems to be to pull the text from the API and leave it, but exact-text searches show that there are literally hundreds of other sites using the exact same duplicate content... I would like to persuade higher management to invest the time in rewriting each individual page, but it would be a huge task and difficult to maintain as changes continually happen. Sorry for the wordy post, but this is a big decision that potentially has drastic effects on our business, as the vast majority of it is conducted online. Thanks in advance for any helpful replies!
Intermediate & Advanced SEO | ExperienceOz0
-
Duplicate Content on WordPress b/c of Pagination
On my recent crawl, there were a great many duplicate content penalties. The site is http://dailyfantasybaseball.org. The issue is: there's only one post per page. Therefore, because of WordPress's (or Genesis's) pagination, a page gets created for every post, thereby leaving basically every piece of content I write as a duplicate. I feel like the engines should be smart enough to figure out what's going on, but if not, I will get hammered. What should I do moving forward? Thanks!
Intermediate & Advanced SEO | Byron_W0