Sitemap Question - E-commerce - Magento
-
Good Morning...
I have an e-commerce site running on Magento, and the sitemap is automatically generated by Magento from the categories, subcategories, and products.
I have recently created new categories that I want to replace the old categories with, but both sets are in the auto-generated sitemap. The old categories are "active" (they still exist if you know the URL to type) but not visible (you can't find them just by navigating through the site). The new category pages are active and visible...
If I want Google to rank one page (the new category page) and not the other (the old category page), should I remove the old page from the sitemap? Would removing the old page, which used to target the same keywords, improve my rankings for the newer category page?
Sitemap currently contains:
www.example.com/oldcategorypage
www.example.com/newcategorypage
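(In the sitemap.xml Magento generates, these appear as standard sitemap-protocol entries, roughly like the following, where the URLs are just the placeholders above:)
<url><loc>http://www.example.com/oldcategorypage</loc></url>
<url><loc>http://www.example.com/newcategorypage</loc></url>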
Did I confuse you yet?
Any help or guidance is appreciated.
Thanks,
-
The first thing would be to 301 redirect the old URLs to the new ones so the new pages have a chance to rank. If you don't, you might also run into keyword cannibalisation issues where both the old and new pages try to rank for the same keywords.
In Magento, I believe that if you disable the old category, it will also be removed from the sitemap.xml it generates for you.
If you're generating the sitemap manually, then yes, definitely remove the old URLs from the sitemap after setting up the redirects.
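If the redirect ends up being handled at the web-server level rather than through Magento's own URL rewrites, a 301 for the example URLs above could be as simple as this in an Apache .htaccess file (just a sketch using the placeholder paths from the question):
# Permanently redirect the retired category to its replacement
Redirect 301 /oldcategorypage http://www.example.com/newcategorypage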
-
Hey Ian, thanks for the response.
The new categories have already been created, so it seems like it's too late to rename the older categories and URLs.
The question is: should I remove the ones I don't want to rank from the sitemap...
Thanks
-
Is there a reason you need to keep those old categories? In Magento you can rename the category and its URL to your new category, and it will automatically 301 redirect the old URL to whatever new category URL structure you give it, passing SEO value along.
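Once the rename is in place, it's worth confirming the redirect is actually firing, for example by checking the response headers from the command line (the URL below is the placeholder from the question):
# Fetch only the headers for the old category URL
curl -I http://www.example.com/oldcategorypage
# You should see a 301 Moved Permanently status and a Location header pointing at the new category URL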
Related Questions
-
Video sitemap
Hello, I'm no WordPress developer so I need a little help, please. I have manually created a video sitemap, and it needs to be uploaded to the website. Where should the .xml file be uploaded in WordPress, and into which directory? Is it OK to add the code to a Notepad file and upload that? I'm trying to avoid the plugin route if possible. Thanks
Technical SEO | AL123al
-
Duplicated content & url's for e-commerce website
Hi, I have an e-commerce site where I sell greeting cards. Products sit under different categories (birthday, Christmas, etc.) with subcategories (for Mother, for Sister, etc.), and the same product can be under 3 to 6 subcategories, for example:
url: .../greeting-cards/Christmas/product1/for-mother
url: .../greeting-cards/Christmas/product1/for-sister
etc. In the CMS I have one description record per card (product1) with multiple subcategories attached, which naturally creates URLs for each subcategory. The Moz system (and Google, for sure) picks up these URLs (and content) as duplicated.
Any ideas how to solve this problem? Thank you very much!
Technical SEO | jurginga
-
Rel=canonical on landing page question
Currently we have two versions of a category page on our site (listed below).
Version A: www.example.com/category
• lives only in the SERPs but does not live in our site navigation
• has links
• user experience is not the best
Version B: www.example.com/category?view=all
• lives in our site navigation
• has a rel=canonical to version A
• very few links and doesn't appear in the SERPs
• user experience is better than version A
Because the user experience of version B is better than version A, I want to take out the rel=canonical on version B pointing to version A and instead put a rel=canonical on version A pointing to version B. If I do this, will version B eventually show up in the SERPs and replace version A? If so, how long do you think this would take? Will this essentially pass PageRank from version A to version B?
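To make that concrete, after the swap the head of version A would carry a canonical pointing at version B, something like this (the URL is the example one above, with the scheme added for illustration):
<link rel="canonical" href="http://www.example.com/category?view=all" />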
Technical SEO | znotes
-
Meta Title Tags - Quick question!
Hi all, Our category meta title tags are a little woeful, so I'm in the process of rewriting them. Let's say you have a product for sale: some inkjet cartridges for a Canon BJ10V printer, for example. In an effort to keep things concise, I was thinking that for this category I should have the meta title set simply as 'Canon BJ10V Inkjet Cartridges', perhaps with our company name after this text (and a pipe delimiter). This takes us to just under 50 characters, which is ideal, but it doesn't include any real keyword variation and will result in the company name being duplicated at the tail of the title tag on 6,000-odd pages. A large number of my competitors have title tags along the lines of 'Canon BJ10V Cheap Inkjet Cartridges for Canon BJ-10V Ink Printers'. I understand the reasoning behind this, but does the variation of keywords compensate for the fact that the title looks spammy (to both humans and search engines)? What would you do: keep it clean and concise, or stuff the title full of keywords? In the event of the former, would you include the company name in each title, knowing the titles would be well under 50 characters without it? Thanks for your help.
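In other words, the two options on the table look roughly like this (the product names are the examples discussed above; the company-name placeholder is mine):
<title>Canon BJ10V Inkjet Cartridges | Company Name</title>
versus
<title>Canon BJ10V Cheap Inkjet Cartridges for Canon BJ-10V Ink Printers</title>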
Technical SEO | ChrisHolgate
-
Sitemap error
Hi, when I search for my blog post in Google I get sitemap results, and when I click on them I get an error. Here are the screenshots: http://screencast.com/t/lXOIiTnVZR1 and http://screencast.com/t/MPWkuc4Ocixy How can I fix that? It looks like if I just add www. it works just fine. Thanks
Technical SEO | tonyklu
-
Indexation question
Hi guys, I have a small problem with our development website. Our development website is website.dev.website.nl. This page shouldn't be indexed by Google, but unfortunately it is. What can I do to deindex it and ask Google not to index this website? In the robots.txt, or are there better ways to do this? Kind regards, Ruud
Technical SEO | RuudHeijnen
-
Avoiding duplicate content with national e-commerce products and localized vendors
Hello 'mozzers! For our example purposes, let's say we have a national cog reseller, www.cogexample.com, focusing on B2C cog sales. The website's SEO efforts revolve around keywords with high search volumes -- no long-tail keywords here! CogExample.com sells over 35,000 different varieties of cogs online, broken into search-engine-friendly categories and using both HTML and meta pagination techniques to ensure adequate deep linking and indexing of their individual product pages. With their recent fiscal success, CogExample.com has signed 2,500 retailers across the United States to resell their cogs. CogExample.com's primary objective is B2C online sales for their highly sought search terms, i.e. "green cogs". However, CogExample.com also wants their retailers to show up for local/geo search, i.e. "seattle green cogs". The geo/location-based retailers' web content will be delivered from the same database as the primary online store, and thus is very likely to cause duplicate content issues.
Questions
1. If the canonical meta tag is used to point the geo-based product to the online primary product, the geo-based product will likely be placed in the supplementary index. Is this correct?
2. Given the massive product database (35,000) and retailers (2,500), it is not feasible to rewrite 87,500,000 pages of content to sate unique content needs. Is there any way to prevent the duplicate content penalty?
3. Google product feeds will be used to localize content and feed Google's product search. Is this "enough" to garner sizable amounts of traffic and/or retain SERP ranks?
Technical SEO | CatalystSEM
-
Robots.txt questions...
All, my site is rather complicated, but I will try to break down my question as simply as possible. I have a robots.txt document in the root level of my site to disallow robot access to /_system/, my CMS. It looks like this:

# /robots.txt file for http://webcrawler.com/
# mail webmaster@webcrawler.com for constructive criticism
User-agent: *
Disallow: /_system/

I have another robots.txt file another level down, in my holiday database - www.mysite.com/holiday-database/ - to disallow access to /holiday-database/ControlPanel/, my database CMS. It looks like this:

User-agent: *
Disallow: /ControlPanel/

Am I correct in thinking that this file must also be in the root level, and not in the /holiday-database/ level? If so, should my new robots.txt file look like this:

# /robots.txt file for http://webcrawler.com/
# mail webmaster@webcrawler.com for constructive criticism
User-agent: *
Disallow: /_system/
Disallow: /holiday-database/ControlPanel/

Or like this:

# /robots.txt file for http://webcrawler.com/
# mail webmaster@webcrawler.com for constructive criticism
User-agent: *
Disallow: /_system/
Disallow: /ControlPanel/

Thanks in advance.
Matt
Technical SEO | Horizon