Reuse an old juicy URL or create a new one with best practices?
-
I'm optimizing a site with all-new URLs, categories, titles, and descriptions. All URLs will change, but I have old URLs with a lot of backlinks and SEO juice. Which is better for SEO:
1 - Change those URLs and 301 redirect traffic to the new page.
2 - Keep the URL and work just on the new title, description, etc.

In option 1 I understand that I'll lose some SEO juice because of the redirect, but the new URL will be correct. In option 2 everything will stay strong except the URL, which will make less sense than with option 1: it will not exactly match the product name or title. It's a reuse of a strong URL.
-
Hi Aviad,
It depends.
I recently managed a full eCommerce re-platform where URLs changed due to a new site structure. I mapped thousands of old URLs to new ones (your option 1) and had minimal ranking fluctuations. Be aware that there will be fluctuations; however, if you're meticulous they should be minimal.
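A mapping like the one described above can be sketched as a simple lookup table. This is an illustrative sketch only; the paths and the `resolve` helper are hypothetical, not taken from any real site:

```python
# Illustrative one-to-one 301 redirect map for a re-platform.
# Every old path points straight at its final new URL -- a single
# hop, since redirect chains tend to leak more link equity.
REDIRECT_MAP = {
    "/old-category/blue-widget": "/widgets/blue-widget",
    "/old-category/red-widget": "/widgets/red-widget",
}

def resolve(path):
    """Return (status, target) for a requested path.

    Known old paths get one permanent redirect; anything else is
    served as-is.
    """
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 200, path
```

In practice the same table would live in the web server or load balancer config rather than application code, but the principle (one old URL, one hop, one final destination) is the same.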
In the day-to-day running of the eCommerce store I actively keep old, well-ranking URLs alive rather than unpublishing them when items sell out (your option 2). When similar items come into stock I can simply change the image, title, etc. and leave the URL. The slight discrepancies between URL and page content have not been an issue for usability at all.
So depending on your situation, both options could work. Send through some more information and the community will be better placed to recommend the best way forward.
Jake
-
Hi Aviad,
I would like to vote for option 1.
Thanks
Related Questions
-
Google Only Indexing Canonical Root URL Instead of Specified URL Parameters
We just launched a website about 1 month ago and noticed that Google was indexing, but not displaying, URLs with "?location=" parameters such as: http://www.castlemap.com/local-house-values/?location=great-falls-virginia and http://www.castlemap.com/local-house-values/?location=mclean-virginia. Instead, Google has only been displaying our root URL http://www.castlemap.com/local-house-values/ in its search results -- which we don't want, as the URLs with specific locations are more important and each has its own unique list of houses for sale.

We have Yoast set up with all of these ?location values added in our sitemap, which has successfully been submitted to Google's Sitemaps: http://www.castlemap.com/buy-location-sitemap.xml

I also tried going into the old Google Search Console and setting the "location" URL Parameter to Crawl Every URL with the Specifies Effect enabled... and I even see the two URLs I mentioned above in Google's list of Parameter Samples... but the pages are still not being added to Google. Even after Requesting Indexing again after making all of these changes a few days ago, these URLs are still displaying as Allowing Indexing, but Not On Google in the Search Console, and not showing up on Google when I manually search for the entire URL.

Why are these pages not showing up on Google, and how can we get them to display? The only solution I can think of would be to set our main /local-house-values/ page to noindex in order to have Google favor all of our other URL parameter versions... but I'm guessing that's probably not a good solution, for multiple reasons.
Intermediate & Advanced SEO | | Nitruc0 -
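One common cause of this behaviour is Google folding parameter URLs into the root page as duplicates. A self-referencing canonical tag on each location variant signals that each is a distinct page. A minimal sketch, assuming the real site URLs from the question and a hypothetical `canonical_tag` helper:

```python
from urllib.parse import urlencode

BASE = "http://www.castlemap.com/local-house-values/"

def canonical_tag(location):
    """Build a self-referencing canonical tag for a ?location= page,
    so each location variant declares itself -- not the bare root
    URL -- as the canonical version."""
    url = BASE + "?" + urlencode({"location": location})
    return '<link rel="canonical" href="%s" />' % url
```

Whether this resolves the issue depends on what canonical Yoast is currently emitting on those pages; checking the rendered `<head>` of a parameter URL would be the first diagnostic step.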
Old URLs that have 301s to 404s not being de-indexed.
We have a scenario on a domain that recently moved to enforcing SSL. If a page is requested over non-SSL (http), the server automatically redirects to the SSL (https) URL using a good old-fashioned 301. This is great, except for any page that no longer exists, in which case you get a 301 going to a 404. Here's what I mean:

Case 1 - Good page: http://domain.com/goodpage -> 301 -> https://domain.com/goodpage -> 200
Case 2 - Bad page that no longer exists: http://domain.com/badpage -> 301 -> https://domain.com/badpage -> 404

Google is correctly re-indexing all the "good" pages and just displaying search results going directly to the https version. Google is stubbornly hanging on to all the "bad" pages and serving up the original URL (http://domain.com/badpage) unless we submit a removal request. But there are hundreds of these pages and this is starting to suck.

Note: the load balancer does the SSL enforcement, not the CMS. So we can't detect a 404 and serve it up first. The CMS does the 404'ing.

Any ideas on the best way to approach this problem? Or any idea why Google is holding on to all the old "bad" pages that no longer exist, given that we've clearly indicated with 301s that no one is home at the old address?
Intermediate & Advanced SEO | | boxclever0 -
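A first step with hundreds of URLs like this is auditing them in bulk and flagging every chain that 301s into a 404. A small sketch of the classification step; the status-code lists are assumed to have been collected separately (e.g. by a crawler), and the function name is illustrative:

```python
def classify_chain(hops):
    """Classify a redirect chain given its ordered HTTP status codes.

    hops: the status codes observed while following a URL, e.g.
    [301, 404] for an http->https redirect landing on a missing page.
    Returns "ok", "redirect-to-404", or "other".
    """
    if not hops:
        return "other"
    final = hops[-1]
    if final == 200:
        return "ok"
    # A redirect hop followed by a final 404 is the problem case:
    # Google keeps seeing a "permanent move" rather than a dead end.
    if final == 404 and any(code in (301, 302, 307, 308) for code in hops[:-1]):
        return "redirect-to-404"
    return "other"
```

The flagged URLs are the ones worth fixing so they return a 404 (or 410) directly at the first hop, since a clean 404/410 generally gets dropped from the index faster than a 301 pointing at one.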
How to Create an Infographic
Kindly let me know how to create an infographic; I am a non-designer. Any best tool or template to buy to create an infographic?
Intermediate & Advanced SEO | | marknorman0 -
Link Anchor Text - Best Practice?
Moz - Open Site Explorer using the following setup:
Tab: Inbound Links
Show: "all"
From: "Only Internal"
I have run a number of random tests and have noticed the following results in the link anchor text:
[No Anchor Text]
company name
website url
Home
etc.
What is the best practice and naming convention to be used? Regards, Mark
Intermediate & Advanced SEO | | Mark_Ch0 -
What is the best practice SEO approach to restructuring a website with multiple domains and associated search engine rankings for each domain?
Hello Mozzers, I'm trying to improve and establish rankings for my website, which has never really been optimised. I've inherited what seems to be a mess and have a challenge for you! The website currently has 3 different www domains all pointing to the one website; two are .com domains and one is a .com.au - the business is located in Australia and the website is primarily targeting Australian traffic. In addition to this there are a number of other non-www domains for the same addresses pointing to the website in the CMS, which is Adobe Business Catalyst. When I check Google, each of the www domains for the website has the following number of pages indexed:
www.Domain1.com 5,190 pages
www.Domain2.com 1,520 pages
www.Domain3.com.au 149 pages
What is the best practice approach from an SEO perspective to reorganising this current domain structure?
1. Do I need to use the .com.au as the primary domain, given that we are in this market and targeting traffic here? That's what I have been advised, and it seems to be backed up by what I have read here.
2. Do we redirect all domains to the primary .com.au domain? This is easily done in the Adobe Business Catalyst CMS; however, is this the same as a 301 redirect, which is the best approach from an SEO perspective?
3. How do we consolidate all of the current separate domain rankings for the 3 different domains into the one domain's rankings within Google to ensure improved rankings and a best practice approach?
The website is currently receiving very little organic search traffic, so if it's simpler and faster to start again fresh rather than go through a complicated migration or restructure and you have a suggestion here, please feel free to let me know your ideas! Thank you!
Intermediate & Advanced SEO | | JimmyFlorida0 -
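Consolidating alias domains onto one primary host is usually done with host-level 301s that preserve the path and query. A minimal sketch of the mapping logic; the host names here are placeholders standing in for the question's Domain1/Domain2/Domain3:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical hosts: the primary would be the real .com.au domain,
# and ALIASES every other domain (www and non-www) pointing at the site.
PRIMARY_HOST = "www.domain3.com.au"
ALIASES = {"www.domain1.com", "domain1.com", "www.domain2.com", "domain2.com", "domain3.com.au"}

def redirect_target(url):
    """Return the 301 target on the primary host for an alias URL,
    preserving path and query, or None when no redirect is needed."""
    parts = urlsplit(url)
    if parts.netloc.lower() in ALIASES:
        return urlunsplit(("https", PRIMARY_HOST, parts.path, parts.query, parts.fragment))
    return None
```

The actual redirects would be configured at the server or CMS level, but this is the behaviour to verify afterwards: every alias URL should answer with a single 301 to the equivalent page on the primary domain.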
Page URL keywords
Hello everybody, I've read that it's important to put your keywords at the front of your page title, meta tags, etc., but my question is about the page URL. Say my target keywords are exotic, soap, natural, and organic. Will placing the keywords further back in the URL address affect the SEO ranking? If that's the case, what are the first n words Google considers? For example: www.splendidshop.com/gift-set-organic-soap vs www.splendidshop.com/organic-soap-gift-set. Will the first be any less effective than the second one simply because the keywords are placed later?
Intermediate & Advanced SEO | | ReferralCandy0 -
Spaces in URL line
Hi Gurus, I recently made the mistake of putting a space into a URL line between two words that make up my primary keyword. Think www.example.com/Jelly Donuts/mmmNice.php instead of www.example.com/JellyDonuts/mmmNice.php. This mistake now needs fixing to www.example.com/JellyDonuts/mmmNice.php to pass W3, but it has been in place for a while, and most articles/documents under 'Jelly Donuts' are not ranking well (which is probably the obvious outcome of the mistake). I am wondering whether the best solution from an SEO ranking viewpoint is to: 1. Change the article directory immediately to www.example.com/JellyDonuts/mmmNice.php and rel=canonical each article to the new correct URL, taking out the 'trash' using robots.txt, or 2. 301 www.example.com/Jelly Donuts to the www.example.com/JellyDonuts directory? Or perhaps something else? Thanks in advance for your help with this sticky (but tasty) conundrum, Brad
Intermediate & Advanced SEO | | BM70 -
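If the 301 route is taken, the rewrite amounts to collapsing the space (which browsers send as %20) out of the path and redirecting to the clean form. A small sketch of that normalisation, using the question's example paths; the helper name is illustrative:

```python
from urllib.parse import unquote

def normalized_path(path):
    """Collapse spaces (or their %20 encoding) out of a URL path.

    Returns the space-free path to 301 to when the requested path
    contained spaces, or None when the path is already clean.
    """
    decoded = unquote(path)
    clean = decoded.replace(" ", "")
    return clean if clean != decoded else None
```

Each spaced URL then gets exactly one permanent redirect to its clean counterpart, which also keeps the old directory's accumulated links pointing somewhere useful.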
Best practice for removing pages
I've got some crappy pages that I want to delete from a site. I've removed all the internal links to those pages and resubmitted new sitemaps that don't show the pages anymore; however, the pages are still indexed in search (as you would expect). My question is: what's the best practice for removing these pages? Should I just delete them and be done with it, or make them 301 redirect to a nicer generic page until they are removed from the search results?
Intermediate & Advanced SEO | | PeterAlexLeigh0
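One common way to plan this kind of cleanup is per URL: 301 only where a genuinely relevant replacement page exists, and return 410 (Gone) otherwise so the URL drops out of the index cleanly instead of redirecting visitors to an unrelated generic page. A minimal sketch under that assumption; the paths and helper are hypothetical:

```python
# Hypothetical cleanup plan: each deleted page either has a truly
# relevant replacement (301) or is simply gone (410).
RELEVANT_TARGETS = {
    "/old/widget-guide": "/guides/widgets",
}

def disposition(path):
    """Return (status, target) for a page being removed."""
    if path in RELEVANT_TARGETS:
        return 301, RELEVANT_TARGETS[path]
    return 410, None
```

Whether 410 or a plain 404 is preferable is debatable; the main point is avoiding blanket redirects of dead pages to a page that doesn't match the original content.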