"Revisit-after" Metatag = Why use it?
-
Hi Mozfans,
Just been thinking about the robots revisit meta tag. All pages on my website (200+ pages) have the following tag on them:
<meta name="revisit-after" content="7 days" />
I'm wondering what is the purpose of the tag?
Surely it's best to allow robots (such as Googlebot or Bingbot) to crawl your site as often as possible, so the index and rankings get updated as quickly as possible?
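For anyone in the same boat wanting to audit which of their pages still carry this tag, a quick sketch of a scanner (assuming the site is a folder of static HTML files on disk; the function name and regex are illustrative, not from any library) could look like:

```python
import re
from pathlib import Path

# Match a "revisit-after" meta tag regardless of case or extra attributes.
# This assumes name= appears first, as in the snippet above; a stricter
# audit would use a real HTML parser instead of a regex.
REVISIT_RE = re.compile(
    r'<meta\s+name=["\']revisit-after["\'][^>]*>', re.IGNORECASE
)

def pages_with_revisit_tag(root):
    """Return paths of .html files under root that contain the tag."""
    hits = []
    for path in Path(root).rglob("*.html"):
        if REVISIT_RE.search(path.read_text(errors="ignore")):
            hits.append(path)
    return hits
```

Running it over the site root lists every file to hand to the developer for cleanup.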
Thanks in advance everyone!
Ash
-
Haha thanks for the example Ryan.
OK, I think I should let my web developer know, he seems to put it on all of his sites (he knows his stuff so maybe it's an old habit he's never bothered to research).
Your example prompted me to find the following page: http://www.seoconsultants.com/clueless/seo/tips/meta/
Quite a good read IMO.
-
The "revisit-after" tag has absolutely no value in HTML nor SEO. At no point of time did this tag ever have any value. There was a single search engine which was never of any significance which created this tag, but it was never adopted by Google nor anyone else.
If anyone disagrees, I would suggest they add the following meta tag to their page:
It is no more effective than the "revisit-after" tag, but at least it's original!
-
At one point this was taken as a "suggestion", but I believe almost all search engines automatically ignore it nowadays.
I think even when it was treated as a valid directive, it was still more often than not ignored by Googlebot.
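If the goal is to hint at how often a page changes, the sitemap protocol's <changefreq> element is the closest supported analogue, though search engines treat it as a hint at best, much as revisit-after was ignored. A minimal sketch, with example.com standing in for your own domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <!-- changefreq is a hint only; crawlers may ignore it entirely -->
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```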
Shane