Fixing a large number of 404s
-
Hi,
Newbie SEO here...
I converted quite a massive site to WordPress (www.yourbusinesschannel.com). For a number of reasons, over 1,000 pages were not converted, and the URLs could not easily have been preserved in any event.
I am now going to add these pages back by importing a large CSV into WordPress.
However, what is the best way to tell Google that the pages are now back and at new addresses? Is it to submit a sitemap and get the site recrawled from scratch?
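As a side note, the same CSV of restored pages can double as the input for a sitemap. A minimal sketch (the `url` column name and the example.com addresses are made-up placeholders, not the asker's actual data):

```python
import csv
import io

def sitemap_from_csv(csv_text, url_column="url"):
    """Render a minimal XML sitemap from CSV text with one page URL per row."""
    rows = csv.DictReader(io.StringIO(csv_text))
    entries = "".join(
        f"  <url><loc>{row[url_column]}</loc></url>\n" for row in rows
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries
        + "</urlset>\n"
    )

sample = (
    "url,title\n"
    "https://www.example.com/page-1,Page 1\n"
    "https://www.example.com/page-2,Page 2\n"
)
print(sitemap_from_csv(sample))
```

In practice a sitemap plugin generates this for you from the published pages, so a hand-rolled file is mainly useful as a sanity check that every imported URL made it in.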
-
Right on... use one of the plugins for WordPress.
We use Google XML Sitemaps. Works like a charm. After you have added the content, go in and regenerate the sitemap and you are in business once they recrawl... and make sure you clean up any other issues.
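One extra step that used to be common after regenerating the sitemap: pinging Google with its URL. The ping endpoint below is the one Google documented historically (it has since been deprecated in favor of submitting the sitemap in Search Console), so treat this as a sketch of the legacy mechanism:

```python
from urllib.parse import quote

def sitemap_ping_url(sitemap_url):
    """Build Google's legacy sitemap 'ping' URL for a given sitemap address."""
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

print(sitemap_ping_url("https://www.example.com/sitemap.xml"))
```

Fetching that URL asked Google to schedule a recrawl of the sitemap; today, submitting the sitemap in Search Console achieves the same thing.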
Another plugin that might be helpful is WordPress Importer. Here is the copy-and-paste from the plugin on one of our sites:
The WordPress Importer will import the following content from a WordPress export file:
- Posts, pages and other custom post types
- Comments
- Custom fields and post meta
- Categories, tags and terms from custom taxonomies
- Authors
For further information and instructions, please see the Codex page on Importing Content.
Related Questions
-
How to fix a 404?
I had to delete a page of my site, and now when I search for it I see it's showing a 404. How do I get that page to stop showing on Google? Thank you in advance!
On-Page Optimization | Coppell
-
Googlebot found an extremely high number of URLs on your site:
Website: www.gobol.in
Although I have blocked my search pages from crawling by adding /catalogsearch to robots.txt, we are still getting the same error again and again. Here's a list of sample URLs with potential problems:
http://www.gobol.in/catalogsearch/result/index/?category=&mobile_feature=4575_4578&q=panasonic+NR-BU303LH1H+REFRIGERATOR+296+L+GREY&special_price=32%2C456&x=0&y=0
http://www.gobol.in/mobile-and-accessories/mobiles-and-brands.html?manufacturer=4753_3355_455_4435_4720_3407_2412_4728_4784_4790_2010_4789_4376_2469&operating_system_mobile=4612
Please help.
On-Page Optimization | Obbserv
-
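As an aside on the robots.txt approach mentioned in the question above: a disallow rule only blocks crawling; it does not remove URLs Google already knows about from the index, which is why the report can keep flagging them. A typical rule for blocking a Magento-style search path looks like this (a generic sketch, not the asker's actual file):

```
User-agent: *
Disallow: /catalogsearch/
```

Already-discovered URLs can continue to appear until they drop out on their own or are removed another way.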
Impact of the number of outgoing links on the PageRank of an optimized page?
What is the current best practice on the preferred number of outbound links on a page you are trying to rank? According to online resources, from a pure PageRank perspective a high number of followed outbound links can have a negative impact not only on child pages but also on the page itself:
http://pr.efactory.de/e-outbound-links.shtml
Other resources suggest that placing high-quality outbound links on a page (nofollow) in particular increases the trust and authority of the page. Are there any other elements to keep in mind? Is the best practice to avoid any followed links on a page you want to rank well in Google? Thanks /T
On-Page Optimization | thomaspro
-
Is it impossible to get out of Panda? Matt Cutts says if you fix the problem you "pop back", but if so, why are there so few examples?
In this video (http://www.youtube.com/watch?v=8IzUuhTyvJk, at about 15), Matt Cutts says: "once we re-run our data (every few weeks), if we determine your site is of higher quality you would pop back out of being affected."
Panda has affected thousands of sites, and a lot of smart people have been working on the problem for about 2 years since the first Panda was launched, but I can only find 1 site that has "popped back" to its original rankings, e.g. http://searchengineland.com/google-panda-two-years-later-losers-still-losing-one-real-recovery-149491
Apart from Motortrend.com, I can't find any sites (of reasonable size) or case studies of sites that have solved the Panda problem and were definitely hit by Panda. Which doesn't feel right; some people have deleted a ton of pages, redesigned their site, improved their content, etc., with no success. Is it therefore a pointless exercise? Is it better to simply give up and start a new site?
On-Page Optimization | julianhearn
-
Reducing the number of crawlable links?
Hello, I'd just like to ask for best practice when it comes to reducing the number of internal links on a site with a mega menu. Since the mega menu lists all categories and all their subcategories, it creates a problem: all categories link to all categories directly. Would the method below (links built with JavaScript and the HTML5 "data-" attribute) reduce the number of links and prevent link juice flowing directly from category to category? I'm thinking of using these links for categories in the menu that are not directly below the parent category.
On-Page Optimization | AJPro
-
How to fix Joomla duplicate page titles?
Hello, I have been using SEOmoz and sh404SEF for almost a month now and can't seem to figure out the duplicate page title issue on my Joomla site. I have changed everything I thought necessary, but no luck.
I have one page that I am trying to make disappear from the duplicate page title errors, and if I can figure it out I'm sure I can use the same fix for the other 1,000 duplicate page titles being reported in my SEOmoz crawl report.
I have one link that I am working with (SEF URL): http://www.mysite.com/contactus.html
I have used the sh404SEF URL manager and added a page title, description, and keywords for this link, and for 3 weeks straight now it still shows up as a duplicate page title in my SEOmoz crawl report.
Is there something I am missing somewhere, or did I set up a bad campaign that is looking at the wrong things on the site? I've attached an image of my sh404SEF settings for this link. Thanks in advance. (Attachment: screenshot1.jpg)
On-Page Optimization | 41global
-
How should I fix duplicate content in WordPress pages?
In GWMT I see Google found 41 duplicate content issues in my WordPress blog. I am using the Yoast SEO plugin to avoid those types of duplicates, but the problem still persists. You can check the screenshot here: http://prntscr.com/dxfjq Please help.
On-Page Optimization | mamuti
-
How to fix a duplicate issue among multiple root domains
Hello, I'm doing SEO for one e-commerce website, Lily Ann Cabinets, and I have around 300 different root domains with the same linking structure, same design, and even the same product database across all 300 websites. Currently I'm focusing only on the Lily Ann Cabinets website and trying to rank for some targeted keywords, but the website is not performing well in Google.com. For example:
http://www.lilyanncabinets.com/ (main website)
http://www.orlandocabinets.com/
http://www.chicagocabinets.org/
http://www.miamicabinets.org/
http://www.newyorkcabinets.org/
http://www.renocabinets.org/
So can anyone tell me: will this create a duplicate-content issue in search engines, and could this be the reason the website doesn't have good rankings? How can I fix this issue? Do I have to make a different structure for Lily Ann Cabinets?
On-Page Optimization | CommercePundit