How to deal with URLs when changing shopping cart software to ensure SEO
-
NSFW ALERT (LINK BELOW)
We are changing the shopping section of our website. Currently the products sit on our own website, and when a user goes to checkout they are taken to Mals (a shopping cart site). This means our URLs look like this:
NSFW: https://www.aprilnites.com.au/mascara_vibe.html
The new software is Ecwid, and we are using it with a site created in RapidWeaver, so the URLs will not be clean and will contain ? and # parameters. I'm wondering whether this will hurt the SEO of our whole site or just the product pages. I'm also unsure how best to deal with the current URLs. Should I use a 301 redirect on all of them to take the user back to the home page of the shop? For us the shop is more of a catalogue; our main website is the most important part, but I want to make sure we are following best practice when making this change. Hope someone can help. Many thanks
-
Hello,
To deal with URLs when changing shopping cart software to ensure SEO:
-
Redirect Old URLs: Set up 301 redirects from old URLs to their corresponding new URLs to maintain SEO authority.
-
Maintain Keyword Consistency: Ensure that your new URLs incorporate relevant keywords and match the structure of the old URLs as closely as possible.
-
Submit XML Sitemap: Update your XML sitemap and submit it to search engines to help them discover and index the new URLs.
-
Check for Broken Links: Scan your website for broken links and fix them to ensure a smooth user experience.
-
Monitor Performance: Continuously monitor the SEO performance of your new URLs to identify and address any issues that may arise during the transition.
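The redirect step above can be sketched as Apache .htaccess rules, assuming the old static product URLs can be mapped one-to-one onto new store URLs (the paths below are hypothetical placeholders, not real Ecwid URLs):

```apache
# One-to-one 301 redirects from old static product pages to their new
# store equivalents. Replace these hypothetical paths with your real
# old and new URLs.
Redirect 301 /mascara_vibe.html /shop/mascara-vibe
Redirect 301 /another_product.html /shop/another-product
```

With a large catalogue it is usually worth generating these lines from a spreadsheet of old-to-new URL pairs rather than typing them by hand, so no product page is missed.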
-
-
If you are moving a page permanently, the best approach is to give the old page a 301 redirect. The 301 tells search engines that the page has moved permanently to its new URL, so rankings and link equity pass to the new location.
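In .htaccess terms (assuming Apache hosting; all paths here are hypothetical), a single permanent redirect and a catch-all that sends every old product page to the shop's home page look like this:

```apache
# Redirect one moved page permanently to its new URL.
Redirect 301 /old-product.html https://www.example.com/shop/new-product

# Or, as a blunter fallback, send every old .html product URL to the shop
# home page in one rule. A per-page mapping is better for SEO because it
# preserves page-level relevance; also narrow the pattern so it only
# matches old shop URLs, not the rest of the site.
RedirectMatch 301 ^/.*\.html$ https://www.example.com/shop/
```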
Related Questions
-
Acquired domains for SEO
Hi there, for one of our insurance websites we acquired a domain, and this domain is going to be redirected to our domain. After some research we discovered that the deal also includes other domains which 301 redirect to specific insurance product pages on the domain we've bought. What is the best technical solution for redirecting these specific product domains? They already redirect to the product pages of the domain we've bought, so after redirecting that domain, the sub (product) domains will also point to us. It would be like this: A) www.sub-carinsurancesite.nl (301) -> www.domain-we-bought.com/car-insurance -> www.ourdomain.com/car-insurance, or B) www.sub-carinsurancesite.nl (301) -> www.ourdomain.com/car-insurance & www.domain-we-bought.com/car-insurance -> www.ourdomain.com/car-insurance, etc.
Technical SEO | remkoallertz
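Option B above (pointing each acquired product domain straight at its final page, avoiding a two-hop redirect chain) is generally preferred. A minimal .htaccess sketch for one sub-domain, assuming Apache and using the hypothetical hostnames from the question, could look like this:

```apache
# On www.sub-carinsurancesite.nl: 301 straight to the final destination,
# skipping the intermediate hop through the acquired domain (option B).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?sub-carinsurancesite\.nl$ [NC]
RewriteRule ^ https://www.ourdomain.com/car-insurance [R=301,L]
```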
Big SEO Changes
Firstly, this is quite extensive, so thank you to anyone who answers some or all of the below! This is quite a lengthy ordeal, and I'm going to start by saying that I'm no SEO expert (yet). I've paid for SEO for years and only on the odd occasion has it made any real difference. It has come to the point now where I've spent so much money on SEO over the years with practically no benefit that I can't afford to do it anymore, so I am teaching myself. Back in July my website was hacked for a total of three weeks. My SEO/hosting company at the time didn't do anything about it, let the hack sit there, and didn't even take the site offline. It just so happened that I was changing over to a new site anyway, so I launched the new site (completely different in structure to the old one), did all of the relevant 301 redirects, and my traffic hasn't recovered since. I have gone from around 100-150 daily visits to 0-10. The descriptions, keywords, alt image tags, h1 and h2, meta data, etc. are all much better on the new site than on the previous one (a lot of it was empty on the previous site), so I assumed it would perform better, but it hasn't. Anyone got any suggestions as to why this might be? Here are some specific questions. Canonical problem? My site is ecommerce and lists some products in several categories, which has resulted in a high duplicate content rate. Is it expected/accepted by Google that this would be the case for an ecommerce website, or do I need to sort out some serious canonical URLs to fix the issue? The site structure of my website could also be a problem, but I'm not qualified enough to know for sure. If you view a product/sub-category, then remove the category section of that link, the product will still appear. I don't know if this structure is good or not, i.e. if you click both links below, the same page will appear: http://thespacecollective.com/space-clothing/nasa-and-space-t-shirts http://thespacecollective.com/nasa-and-space-t-shirts Is this a problem for SEO? Duplicate product tag problem? I have many duplicate product tags appearing on many products; should these be blocked in the robots.txt? i.e. http://thespacecollective.com/space-memorabilia/space-flown/apollo-11-flown-cm-meteorite-acrylic http://thespacecollective.com/space-memorabilia/space-flown/apollo-11-flown Site code structure: when choosing the template I would use for my website, I did not stop to consider whether the code was SEO-friendly; that was due to my ignorance of the subject. Is the site structure SEO-friendly or is it hindering my efforts? Website: http://thespacecollective.com Again, thank you to anyone who takes the time to read/care about the issues facing a newbie. My only option now is to learn SEO myself (which is well overdue), so any advice/answers are appreciated!
Technical SEO | moon-boots
Personalization software and SEO
Hi guys, I'm just testing personalization software on our website, basically changing the "location" text depending on the user's IP. I can see in my software that when the Googlebot comes to our site, the personalization software triggers an action changing the location-based text to "California". Could this make Google understand that our website targets only users in California and thereby hurt our rankings in other locations nationwide? I'll appreciate your opinions.
Technical SEO | anagentile
How to change noindex to index?
Hey, I've recently upgraded to a Pro SEOmoz account and have realised I have 14574 issues to do with 'blocked by meta-robots': 'This page is being kept out of the search engine indexes by the meta tag, which may have a value of "noindex", keeping this page out of the index.' How can I change this so my pages get indexed? I read somewhere that I need to change my privacy settings, but that thread was 3 years old and the WP Dashboard has since been updated. Please let me know. Many thanks, Jamie. P.S. I'm using WordPress 3.5 and the XML sitemap plugin, and I have no idea where to look for this robots.txt file.
Technical SEO | markgreggs
What is the best way to deal with pages whose content changes?
My site features businesses that offer activities for kids. Each business has its own page on my site. Business pages contain a listing of the different activities that organization is putting on (such as events, summer camps, and drop-in activities). Some businesses only offer seasonal activities (for example, during Christmas break and summer camps); the rest of the year, the business has no activities and the page is empty. This creates two problems: it's a poor user experience (which I can fix, no problem), but it is also thin content and sometimes treated as duplicate content. What's the best way to deal with pages whose content can be quite extensive at certain points of the year and shallow or empty at other parts? Should I include a meta robots tag to not index when there is no content, and change the tag to index when there is content? Should I just ignore this problem? Should I remove the page completely and do a redirect? Would love to know people's thoughts.
Technical SEO | ChatterBlock
Dealing with 404 pages
I built a blog on my root domain while I worked on another part of the site at .....co.uk/alpha. I was really careful not to have any links go to alpha, but it seems Google found and indexed it. The problem is that part of alpha was a copy of the blog, so now we have a lot of duplicate content. The /alpha part is now ready to be taken over to the root domain; the initial plan was then to delete /alpha. But now that it's indexed, I'm worried that I'll have all these 404 pages, and I'm not sure what to do. I know I can just set up a 301 redirect for all those pages in case a link comes in, but I need to delete those pages as the server is already very slow. Or does a 301 redirect mean that I don't need those pages anymore? Will those pages still get indexed by Google as separate pages? Please assist.
Technical SEO | borderbound
Geotargeting by IP and SEO
Hi, part of our site displays localized results based on the user's IP (we get the zip code based on IP). For example, a user in NY would get a list of NY-based stores, while a user in CA would get a list of CA-based stores. So if a CA Googlebot comes to our site, it will get results based on Mountain View, CA. Given the pages are generated based on your zip code, I'm not sure how we'd indicate to Google that we have results for lots of locations and not just the Googlebot IP locations (users can change their zip code, but by default we use geolocation). Our landing pages contain localized content and unique URLs with the zip code etc., but it isn't clear how Google will find results for KY etc.
Technical SEO | NicB1
URL Rewrite
Using the .htaccess file, how do I rewrite a URL from www.exampleurl.com/index.php?page=example to www.exampleurl.com/example, removing index.php?page=? Any help is much appreciated.
Technical SEO | CraigAddyman
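A common mod_rewrite pattern for this, assuming Apache and that page names contain only letters, digits, and hyphens, is to 301-redirect the old query-string form to the clean URL, then rewrite the clean URL internally back to index.php:

```apache
RewriteEngine On

# 1) Externally redirect /index.php?page=example to /example with a 301.
#    THE_REQUEST (the raw request line) is checked so the internal
#    rewrite below cannot trigger a redirect loop.
RewriteCond %{THE_REQUEST} \s/index\.php\?page=([^&\s]+) [NC]
RewriteRule ^index\.php$ /%1? [R=301,L]

# 2) Internally rewrite /example back to /index.php?page=example.
#    Adjust the character class if your page names use other characters.
RewriteRule ^([a-zA-Z0-9-]+)/?$ /index.php?page=$1 [L]
```

The trailing `?` in the redirect target strips the old query string from the new URL.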