Duplicate Content / 301 Redirect Article Issue
-
Hello,
We've got some articles floating around on our site
nlpca(dot)com
like this article:
http://www.nlpca.com/what-is-dynamic-spin-release.html
that is not linked to from anywhere else. The article exists as it's supposed to here:
http://www.dynamicspinrelease.com/what-is-dsr/
(our other website)
Would it be safe, in the eyes of both Google's algorithm (as far as you know) and Panda, to just 301 redirect from
http://www.nlpca.com/what-is-dynamic-spin-release.html
to
http://www.dynamicspinrelease.com/what-is-dsr/
or would no-indexing be better?
Thank you!
-
I don't think it will "weaken" the domain, but it might provide a better experience for users if, instead of clicking a link and being 301'd, they could click a link straight through to the target page.
You can 301 the duplicate pages as well if you like.
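For anyone implementing the 301 discussed here, a single cross-domain redirect is one line of server config. A minimal sketch, assuming the site runs on Apache with mod_alias enabled (the URLs are the ones from the question):

```apache
# In the .htaccess at the root of nlpca.com:
# permanently redirect the duplicate article to the
# canonical version on dynamicspinrelease.com
Redirect 301 /what-is-dynamic-spin-release.html http://www.dynamicspinrelease.com/what-is-dsr/
```

You can verify it with `curl -I http://www.nlpca.com/what-is-dynamic-spin-release.html` and confirm the response is a `301` with a `Location:` header pointing at the target page.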
-
Thanks Peter and Ben,
I don't know that we have access to the code in the &lt;head&gt; tag for separate pages in our version of Joomla, but I don't want to leave this duplicate content floating out there. What is your suggestion?
Will a 301 redirect from nlpca to the site with the original articles weaken nlpca(dot)com?
-
When you say that it's "not linked to from anywhere else," does that include internal links or just inbound? If it has no internal OR inbound links, then it hardly matters either way. If it gets traffic but has no inbound links, then I'm inclined to agree with Ben - use the canonical tag. That way, the page can "live" on both sites/domains, but only one of them will have search value.
I'm actually looking to take two blogs and consolidate them into one brand new domain, and I think I may use the canonical tag for a couple of months first and then 301-redirect them. In that case, though, it's because I'll eventually shut off the other domains. If there's value to having the page exist (for users) both places, then the canonical is a solid, long-term solution.
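For reference, the canonical tag being suggested here is a single line in the &lt;head&gt; of the duplicate page. A sketch using the URLs from this thread:

```html
<!-- Placed in the <head> of http://www.nlpca.com/what-is-dynamic-spin-release.html -->
<!-- Tells search engines that the dynamicspinrelease.com version is the one to index -->
<link rel="canonical" href="http://www.dynamicspinrelease.com/what-is-dsr/" />
```

The page stays live for users on both domains, but only the canonical URL accumulates search value.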
-
If I remove it won't that cause a 404 error?
Shouldn't I 301 redirect it to the nlpca.com home page?
I can't use rel="canonical" because we're on Joomla.
-
If there's a valid reason to have the article on nlpca.com (as in, it adds a benefit to users), then you could use a rel=canonical.
If it's not adding any value for users and is generally a dead page, then why bother no-indexing when you could just remove it altogether and not have it wasting crawl budget?
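If removal isn't an option, the no-indexing alternative raised earlier in the thread is also a single tag in the page's &lt;head&gt;. A sketch:

```html
<!-- Keeps the page accessible to users but asks search engines
     not to include it in their index (links are still followed) -->
<meta name="robots" content="noindex, follow" />
```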