How do you archive content?
-
In this video from Google Webmasters about content, https://www.youtube.com/watch?v=y8s6Y4mx9Vw, around 0:57 it is advised to "archive any content that is no longer relevant".
My question is: how exactly do you do that? By adding noindex to those pages, by removing all internal links to them, or by completely removing them from the website?
How do you technically archive content?
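For reference, the noindex option mentioned here can be applied to a whole archive section without editing each page, via an X-Robots-Tag response header. A minimal Apache sketch (the /archive/ path and the use of a per-directory .htaccess file are hypothetical assumptions, not something from this thread):

```apache
# Sketch: placed in an /archive/.htaccess file, this asks search engines
# not to index anything in the archive section while still following its
# links. Requires mod_headers.
Header set X-Robots-Tag "noindex, follow"
```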
-
Hard to say what is meant by that video; Google is often purposely vague.
If the content is truly no longer relevant, I would 301 it to more relevant URLs on a page-by-page basis. This removes low-performing pages from Google's index and can improve your rankings.
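For what it's worth, a page-by-page 301 in an Apache .htaccess might look like the sketch below (both paths are placeholders, not URLs from this thread):

```apache
# Sketch: permanently redirect a retired page to its closest
# relevant replacement. Requires mod_alias.
Redirect 301 /old-article-2012/ /current-article/
```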
On the other hand, if the content still has value but doesn't need to be front and center, a clearly organized archive based on date or some other organizational method should work fine.
-
Hi Sorina,
Archiving is more about classifying content that is either outdated or no longer accessed frequently by visitors into a separate section of your site. I would not noindex those pages, because they might be ranking well in the search engines and still bringing traffic to the site. You can do this by creating an "archives" section on your site, so that visitors who want to access the old content can still do so through that section.
Here is a useful post on archiving content on your site
Related Questions
-
Content update on 24hr schedule
Hello! I have a website with over 1,300 landing pages for specific products. These individual pages update on a 24-hour cycle through our API, which pulls reviews/ratings from other sources and then writes/updates that content onto the page. Is that "bad"? Can it be viewed as spammy or dangerous in the eyes of Google? (My first thought is no, it's fine.) Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I figured I would ask.) On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80, or 80 articles once a month? (I feel like trickle posting is probably preferable, but I figured I would ask.) Are there any negatives to an API writing/updating content? Should we have 800+ words of static content on each page? Thank you, all mozzers!
Intermediate & Advanced SEO | HashtagHustler
-
Duplicate Content
Let's say a blog (blog A) is publishing original content. Now let's say a second blog steals that original content via bot and publishes it as its own. Further assume the original blog doesn't notice this for several years. How much damage could this do to blog A in Google results? Any opinions?
Intermediate & Advanced SEO | CYNOT
-
Content suggestions and topics
Hello, in the list of topics that Moz recommends, how many of the recommended topics should I cover: just 2 or 3, or 10 of them? Is more better? Then, let's say one of the recommended topics is "tennis": should I just add the topic once in my content, or do I need to cover it multiple times, meaning write about it 3 times across my content? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Big problem with duplicate page content
Hello! I am a beginner SEO specialist and I have a problem with duplicate page content. The site I'm working on is an online shop made with Prestashop. The Moz crawl report shows me over 4,000 duplicate page content issues; two weeks ago it was 1,400. The majority of links that show duplicate content look like this:
http://www.sitename.com/category-name/filter1
http://www.sitename.com/category-name/filter1/filter2
At first I thought the filters didn't work. But when I browse the site and test it, I see that the filters are working and generate links like this:
http://www.sitename.com/category-name#/filter1
http://www.sitename.com/category-name#/filter1/filter2
The links without the # do not work; they mess up the filters.
Why are the pages indexed without the #, thus generating duplicate content for me?
How can I fix the issue?
Thank you very much!
Intermediate & Advanced SEO | ana_g
-
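An aside on the filter question above: one common remedy (a sketch, assuming the filtered URLs should consolidate to the bare category page; the hostname and paths are the placeholders from the question) is a rel="canonical" sent as an HTTP response header, which avoids editing the Prestashop templates:

```apache
# Sketch: flag filter URLs under /category-name/ and attach a
# Link: rel="canonical" header pointing at the category page.
# Requires mod_setenvif and mod_headers.
SetEnvIf Request_URI "^/category-name/" IS_FILTER_PAGE
Header set Link "<http://www.sitename.com/category-name>; rel=\"canonical\"" env=IS_FILTER_PAGE
```
-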
Opinions on Boilerplate Content
Howdy, ideally uniqueness for every page's title, description, and content is desired, but when a site is very, very large it becomes impossible. I don't believe our site can avoid boilerplate content for title tags or meta descriptions. We will, however, mark up the pages with proper microdata so Google can use this information as they please.
What I am curious about is boilerplate content repeated throughout the site for the purpose of helping the user, as well as to tell Google what the page is about (rankings). For instance, this page and this page offer the same type of services, but in different areas. Both pages (and millions of others) offer the exact same paragraph on each page. The information is helpful to the user, but it's definitely duplicate content; all they've changed is the city name.
I'm curious: what makes this obvious duplicate-content issue okay? The additional unique content throughout (in the form of different businesses), the small yet obvious differences in on-site content (title tags clearly represent different locations), or just the fact that the site is hugely authoritative and gets away with it? I'm very curious to hear your opinions on this practice, potential ways to avoid it, and whether or not it's a passable practice for large but new sites. Thanks!
Intermediate & Advanced SEO | kirmeliux
-
Having a hard time with duplicate page content
I'm having a hard time redirecting website.com/ to website.com. The crawl report shows both versions as duplicate content. Here is my .htaccess:
RewriteEngine On
RewriteBase /
#Rewrite bare to www
RewriteCond %{HTTP_HOST} ^mywebsite.com
RewriteRule ^(([^/]+/)*)index.php$ http://www.mywebsite.com/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [NC,L]
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}$1 [R=301,L]
I added the last 2 lines after seeing a Q&A here, but I don't think it has helped.
Intermediate & Advanced SEO | cgman
-
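An aside on the .htaccess above: a commonly used trailing-slash-removal rule (a sketch under the assumption that real directories should keep their slash; not necessarily the fix for this exact site) makes the slash between host and path explicit:

```apache
# Sketch: in per-directory .htaccess context the captured path has no
# leading slash, so it must be added back explicitly after %{HTTP_HOST}.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ http://%{HTTP_HOST}/$1 [R=301,L]
```
-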
Homepage Content
I have a website which performs very well for some keywords and much less well for others, and I would like to try to optimize the under-performing ones. Let's say our website offers two main services: KEYWORD A and KEYWORD Z. KEYWORD Z is a very important keyword for us in terms of revenue.
KEYWORD A gives us position No. 1 on our local Google and properly sends visitors to xxxxxx.com/keyword-a/keyword-a.php. KEYWORD Z performs badly and gives us position No. 7 in local Google search; 90% of its Google traffic is sent to xxxxxx.com/keyword-z/keyword-z.php and the other 10% to the homepage of the website. The homepage is a "soup" of all the services our company offers, some important (KEYWORD Z) and others much less so.
In order to optimize KEYWORD Z, we were thinking of permanently redirecting xxxxxx.com/keyword-z/keyword-z.php to xxxxxx.com and rewriting the homepage content to describe ONLY KEYWORD Z. I am not sure whether Google gives more importance to the content of the homepage or not. Of course, links from the homepage to other pages like xxxxxx.com/keyword-a/keyword-a.php will still exist. The point for us is to optimize the homepage better and give more importance to KEYWORD Z. Does it make sense or not?
Intermediate & Advanced SEO | netbuilder
-
Duplicate Content | eBay
My client is generating templates for his eBay listings based on content he has on his eCommerce platform. I'm 100% sure this will cause duplicate-content issues. My question is this (and I'm not sure where eBay policy stands on it): will adding the canonical tag to the template work if it's coming from a different page, i.e. eBay? Update: I'm not finding any information regarding this in the eBay policies: http://ocs.ebay.com/ws/eBayISAPI.dll?CustomerSupport&action=0&searchstring=canonical So it does look like I can have a rel="canonical" tag in custom eBay templates, but I'm concerned this can be considered "cheating", since rel="canonical" acts much like a 301; but as this says: http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html it's legitimate duplicate content. The question is now: should I add it or not? UPDATE: it seems eBay templates are embedded in an iframe, but the snapshot on Google actually shows the template. This makes me wonder how they are handling iframes now. Looking at http://www.webmaster-toolkit.com/search-engine-simulator.shtml does show the content inside the iframe. Interesting. Anyone else have feedback?
Intermediate & Advanced SEO | joseph.chambers