Removing dates from WordPress blog URLs
-
Hi all,
Our website's blog is built with WordPress. Like many other sites, we used to have a dated URL pattern: www.website.com/blog/2016/04/10/topic-on-how-to-optimise-blog.
Recently we removed the date, so the URL pattern is now just: www.website.com/blog/topic-on-how-to-optimise-blog
All internal links across the blog have been regenerated with the new URLs, but the old URLs are still being reported as crawl errors in Search Console. Is there a rule that can automatically redirect all the old URLs to the new ones?
Thanks
-
Hi,
You could add the following rule to your .htaccess to redirect all dated URLs to the non-dated version:
RedirectMatch 301 ^/blog/([0-9]{4})/([0-9]{2})/([0-9]{2})/(.*)$ http://www.domain.com/blog/$4
Replace domain.com with your own domain name.
This redirects http://www.website.com/blog/2016/04/10/topic-on-how-to-optimise-blog to http://www.website.com/blog/topic-on-how-to-optimise-blog (and every similar URL). Anchoring the pattern to ^/blog/ means numbers elsewhere on the site won't accidentally trigger the redirect.
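If mod_alias isn't available, a sketch of the same redirect with mod_rewrite (assuming the rule lives in an .htaccess file at the document root, where per-directory patterns have no leading slash) would be:

```apache
# Redirect /blog/YYYY/MM/DD/slug to /blog/slug with a permanent (301) redirect.
# Only URLs with three numeric date segments under /blog/ are matched.
RewriteEngine On
RewriteRule ^blog/[0-9]{4}/[0-9]{2}/[0-9]{2}/(.*)$ /blog/$1 [R=301,L]
```

Either approach works; just use one or the other, not both, or you may end up with duplicate redirect rules.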
-
Apache
-
Hi,
In some cases your SEO plugin will have set up these redirects for you. Otherwise, what you can do is check whether the URL contains date segments (number/number/number) and rewrite/redirect it, so the old URLs all point to the new ones. In the end, the only thing you have to do is make sure the dates are stripped.
Are you using NGINX or Apache?
Martijn.
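If the site turns out to run on nginx rather than Apache, a hypothetical equivalent of the .htaccess rule above (placed inside the relevant server {} block, with the /blog/ prefix adjusted to your setup) would be:

```nginx
# Permanently redirect /blog/YYYY/MM/DD/slug to /blog/slug.
rewrite ^/blog/[0-9]{4}/[0-9]{2}/[0-9]{2}/(.*)$ /blog/$1 permanent;
```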