Too many 301s?
-
Hi there. If a website has accidentally generated, say, 1,000 pages of duplicate content, would the SEO be hurt if all those pages were redirected to the original source of the content?
There are no plans to rewrite the 1,000 duplicate pages; they are already cached and indexed by Google.
I thought about canonical tags, but as the pages have some traffic and a little SEO value, I thought a 301 redirect to the relevant pages would be more appropriate?
Am I also right in thinking you would be able to remove the 301s from the .htaccess file once the index has updated?
Also, once the 301s are removed, could I use those URLs again from scratch if I wanted?
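To make the .htaccess side concrete, here is a minimal sketch of what I mean, assuming Apache with mod_alias enabled (the paths are just placeholders):

    # 301 each duplicate URL to the original source of the content
    Redirect 301 /duplicate-page-1 /original-page
    Redirect 301 /duplicate-page-2 /original-page
    # Deleting these lines later removes the redirects, leaving the
    # old URLs free to be used again from scratch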
Any info much appreciated.
-
Great insight, Highland!
-
If they have links, I would 301 the pages with links. Everything else I would 404.
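In .htaccess terms, a rough sketch of that approach, assuming Apache with mod_alias (the paths are hypothetical):

    # 301 only the duplicates that actually have backlinks
    Redirect 301 /duplicate-with-backlinks /original-page
    # Duplicates with no links get no rule at all, so they simply 404;
    # alternatively, return an explicit 410 so Google drops them faster
    Redirect gone /duplicate-without-links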
-
How are these pages generating traffic? Are they being found in the search engines?
The real question: do these pages have links pointing to them?
There is little value in a 301 redirect if you are not moving link traffic in the direction you are pointing. If you are outranking the original content, then perhaps a 301 could help. How well does the original content rank?
-
Ha, yes you can, my friend.
-
But you can do it, yes?
-
Bringing back URLs that you didn't want, then deciding that you do want them, is pretty annoying to Google...
-
I would see if they have links, and get rid of the rest; it may look to Bing like you are trying to be tricky. It's not natural.
-
OK, I probably won't, but in what instance would you not recommend this?
I understand PA, PR, etc. will be back to nothing, but it's the keyword URL I might want to use again from scratch.
-
Yes, but I wouldn't really recommend this.
-
Also, last one: if I wanted to revive the 301s, say, in a year, would I be allowed to, and would the pages index again?
-
Thanks, Highland.
-
I would 301 the pages and get them out of your site's index. Even if you canonical all of them, Google will still have to crawl 1,000 pages instead of 1. The 301 will transfer most of your rank to the new page, and you'll improve your crawl budget.
Why take the 301s out? Just leave them in place in case there are links pointed at them.
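If the duplicates share a recognizable URL pattern, one mod_rewrite rule can cover all 1,000 of them rather than 1,000 individual lines; a rough sketch, assuming Apache with mod_rewrite and a made-up path structure:

    RewriteEngine On
    # Send everything under /duplicates/ to the one original article with a 301
    RewriteRule ^duplicates/.*$ /original-article [R=301,L]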
-
Well, they seem to be generating traffic.
In principle, is what I intend to do OK? Will it hurt the SEO, or be seen as fine, do you know?
Many thanks,
-
That sounds weird! If you generated thousands of pages automatically and they are all duplicate content, why don't you just remove them? Google will end up removing them from its cache as well after a short period!