Unpublish and republish old articles
-
This might be a dumb question, but we had an incident where a new SEO guy thought it would be a good idea to unpublish and republish all of our 200+ blog posts, which we had carefully scheduled over the last 6 months. He did not update the content and did not change anything; his intention was to send Google a signal to recrawl the pages, or something like that. Now the entire blog looks like it went live in one day, which I don't think is good. Should we load a backup and get our old publishing dates back, or should we keep the new publishing dates? What are the consequences? Will it affect our SEO?
-
You guys are awesome!! Thank you so much! That is exactly what we thought. Thank you for confirming; we loaded a backup.
-
From a user standpoint, you should definitely roll back to the old dates. Otherwise it's going to raise suspicion as to why the posts were all published on the same date, and for current-event articles, having the wrong date is not helpful at all to readers looking at the relevant time frame.
If the content had been updated, on the other hand, then it would be worth changing the publish date on each article to the date it was updated, and the article would probably be all the better for it in terms of SEO!
As it stands, though, I don't think having the dates as they were or as they are now will have any noticeable effect on SEO either way. But think of the users.
Related Questions
-
Moving a site from HTML to WordPress: should I port all old pages and redirect?
Any help would be appreciated. I am porting an old legacy .html site, which has about 500,000 visitors/month and over 10,000 pages, to a new custom WordPress site with a responsive design (long overdue, of course) that has been written and only needs a few finishing touches, and which includes many database features to generate new pages that did not previously exist. My questions are: Should I bother to port over older pages that are "thin" and have no incoming links, given that reworking them would take time away from the need to port quickly? I will be restructuring the legacy URLs to be lean and clean, so 301 redirects will be necessary. I know that there will be some link juice loss, but how long does it usually take for the redirects to "take hold"? I will be moving to HTTPS at the same time to avoid yet another porting issue. Many thanks for any advice and opinions as I embark on this massive data entry project.
Technical SEO | | gheh20130 -
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content as there was no place to redirect to. Unfortunately, all these pages still appear in Google's SERPs (not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs, you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain. We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input: 1. Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress); this of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise. 2. Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days. 3. Update robots.txt to block access to the redirecting directories. Thank you. Rosemary
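As a sketch of the robots.txt approach (the directory names here are hypothetical placeholders, not taken from the site in question):

```
# robots.txt (hypothetical directory names)
User-agent: *
Disallow: /old-junk/
Disallow: /removed-content/
```

One caveat worth knowing: once a directory is disallowed, Googlebot can no longer crawl those URLs to see their 301, 404, or 410 responses, so blocked URLs can linger in the index. Blocking via robots.txt therefore works against the redirect/410 approaches rather than with them.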
Technical SEO | | RosemaryB2 -
Upgrading an old sitemap to a new sitemap index: how to do it safely?
Hi Moz users and friends. I have a website with a PHP template we developed ourselves, and a WordPress blog in the /blog/ subdirectory. Currently we have a sitemap.xml file in the root domain containing all the subsections and the blog's posts, and we update it manually once a month, adding the new posts created on the blog. I want to automate this process, so I created a sitemap index with two sitemaps inside it: the old sitemap without the blog posts, and a new one created with the "Google XML Sitemap" WordPress plugin inside the /blog/ subdirectory. That is, in the sitemap_index.xml file I have: Domain.com/sitemap.xml (the old sitemap, with the blog post URLs removed) and Domain.com/blog/sitemap.xml (an auto-updating sitemap created with the Google XML plugin). Now I have to submit this sitemap index to Google Search Console, but I want to be completely sure about how to do this. I think all I have to do is delete the old sitemap in Search Console and submit the new sitemap index. Is that correct?
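For reference, a sitemap index of the kind described would follow the sitemaps.org format; the domain below is a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: points to the two child sitemaps; example.com is a placeholder -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/blog/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

Search Console treats the index file as a single submission; the child sitemaps listed inside it are discovered and processed from there.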
Technical SEO | | ClaudioHeilborn0 -
My article's title appears in both the h1 and the h2. Is this good SEO?
I'm seeing a common practice in WordPress themes where the h1 tag for a page has the logo in it, and the h2 holds the title of the article. I've decided to place the title in the h1 dynamically, in the form "[Post Title] - Joe's Auto Store", where [Post Title] is the actual title of the post; the logo is still being used as a background image on the h1. So, for example, the h1 would show: "How to install a car battery - Joe's Auto Store". I think this is still good SEO, but the other issue is that the first h2 on the page also has nearly the exact same title, because the theme uses the first h2 to display the post title. So the page would read: "How to install a car battery - Joe's Auto Store" (h1), then "How to install a car battery" (h2), then the content: "At Joe's we teach how to install batteries on site. There are mor...(etc.)". Is it an issue that the post title in both the h1 and h2 is nearly the same (except for the company name)? Is this still good SEO?
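A minimal sketch of the heading structure being described; the class names are hypothetical, and the titles are the poster's own example:

```html
<!-- Hypothetical sketch of the duplicated-heading setup described above -->
<h1 class="site-logo">
  <!-- logo applied as a CSS background image on this element -->
  How to install a car battery - Joe's Auto Store
</h1>
<!-- ...navigation and header markup omitted... -->
<h2 class="post-title">How to install a car battery</h2>
<p>At Joe's we teach how to install batteries on site. ...</p>
```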
Technical SEO | | johnnydigital0 -
Best way to get SEO-friendly URLs on a huge old website
Hi folks, I hope someone may be able to help with this conundrum. A client site runs on old tech (IIS 6) and has circa 300,000 pages indexed in Google. Most pages are dynamic, with a horrible URL structure such as http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888, and I have been trying to implement rewrites and redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/ I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl, and to implement all the URLs would take 10x that volume of config. I am starting to wonder if there is a better way: 1. Upgrade to Windows Server 2008 / IIS 7 and use the better URL rewrite functionality included. 2. Rebuild the site entirely (preferably on PHP with a decent URL structure). 3. Accept that the URLs can't be made friendly on a site this size and focus on other aspects. 4. Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live. None of the options are great, as they either involve lots of work/cost or they mean keeping a site which performs well but could do so much better, with poor URLs. Any thoughts from the great minds in the SEOmoz community appreciated! Cheers, Simon
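To illustrate the IIS 7 route, here is a sketch of a web.config rule using the URL Rewrite module, mapping a clean path onto the legacy query-string handler. The clean URL pattern is an assumption invented for illustration; only the results.aspx parameters come from the example above:

```xml
<!-- web.config fragment (IIS 7 URL Rewrite module); patterns are illustrative -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Clean search URLs" stopProcessing="true">
        <!-- e.g. /search/19191/56/2888 is served by the legacy handler -->
        <match url="^search/(\d+)/(\d+)/(\d+)$" />
        <action type="Rewrite" url="search/results.aspx?ida={R:1}&amp;idb={R:2}&amp;idc={R:3}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

A companion rule with action type="Redirect" and redirectType="Permanent" would send the old query-string URLs to their clean equivalents, so only one form of each URL stays indexed.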
Technical SEO | | SCL-SEO1 -
Blog archives vs individual articles
On a client's blog, each article has its own individual page, but it also appears in full in aggregate archive pages, per month or sometimes per day. The problem is that each article therefore appears twice: once on its dedicated article page, and once alongside other articles in the archive. Is there a specific SEO approach to this type of situation? Is this duplicate content? What page title should I give each archive (if any), as there are quite a few? Thank you
Technical SEO | | DavidSpivac0 -
Why are old versions of images still showing for my site in Google Image Search?
I have a number of images on my website with a watermark. We changed the watermark on all of our images in May, but when I search for my site, getmecooking, in Google Image Search, it still shows the old watermark (the old one is grey, the new one is orange). Is Google not updating the images in its search results because they are cached by Google? Or because it is ignoring my images, having downloaded them once? Should we be giving our images a version number (at the end of the file name)? Our website cache is set to 7 days, so that's not the issue. Thanks.
Technical SEO | | Techboy0 -
Adding more content to an old site
We have a site which was demoted from PR4 to PR3 in the latest Google update. We have not done any SEO for the site in a long time, and its content, over 100 pages, has stayed the same. My question is: in order to update the site, which is the best approach? 1. Replace old content with newly introduced content 2. Rewrite the old content 3. Add new pages Many thanks in advance.
Technical SEO | | seomagnet0