Use of the NOINDEX meta robots tag
-
Hi Everyone,
Encountered a NOINDEX meta robots tag within the <head> section of a website, and I am not really sure how it affects the website's performance in search. This meta robots tag has been used on numerous pages, like the home page, content pages, and forum threads. Is it good to keep it there, or should it be removed, and why?
Thanks in advance!
Steve
-
Hi,
I'm not certain of your exact setup, but that tag's purpose is to 'hide' pages from the robots, not really anything else. It is used to exclude content from being indexed: you can add it manually to any pages you don't want search engines to index. But why is this tag on the home page and content pages? Do you not want those pages indexed?
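For reference, the standard meta robots noindex tag looks like this (a minimal sketch; the exact attribute values on your site may differ):

    <head>
      <!-- Tells compliant crawlers not to include this page in their index -->
      <meta name="robots" content="noindex">
      <!-- Variant that also asks crawlers not to follow links on the page -->
      <meta name="robots" content="noindex, nofollow">
    </head>

If this appears on the home page and content pages you want ranking, removing it is usually the right call, since pages carrying noindex are dropped from search results while it is present.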
Related Questions
-
Exclude sorting options using nofollow to reduce duplicate content
I'm getting reports of duplicate content for pages that have different sorting options applied, e.g.:
/trips/dest/africa-and-middle-east/
/trips/dest/africa-and-middle-east/?sort=title&direction=asc&page=1
/trips/dest/africa-and-middle-east/?sort=title&direction=des&page=1
I have the added complication of having pagination combined with these sorting options, and I also don't have the option of a view-all page. I'm considering adding rel="nofollow" to the sorting controls so they are just taken out of the equation, then using rel="next" and rel="prev" to handle the pagination, as per Google's recommendations (using the default sorting options). Has anyone tried this approach, or have an opinion on whether it would work?
On-Page Optimization | benbrowning
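The markup being considered would look roughly like this (a sketch; the URLs are taken from the question, the link text and page numbers are illustrative):

    <!-- Sorting control marked rel="nofollow" so each sort order isn't treated as a link worth crawling -->
    <a href="/trips/dest/africa-and-middle-east/?sort=title&amp;direction=asc&amp;page=1" rel="nofollow">Sort by title</a>

    <!-- Pagination hints in the <head>, following Google's rel="next"/rel="prev" guidance, here for a hypothetical page 2 -->
    <link rel="prev" href="/trips/dest/africa-and-middle-east/?page=1">
    <link rel="next" href="/trips/dest/africa-and-middle-east/?page=3">

-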
Authorship - How bad is it to use a fictional character instead of the actual author?
Hi, I know authorship is all about the author, and hence should be a real person (the author), but in the case of children's books etc., is it really not OK to develop authorship around the character rather than the actual author? For example, kids searching Google will recognise the character but not the author. Even if G doesn't like it, how bad is it to ignore them and proceed with the character? All Best, Dan
On-Page Optimization | Dan-Lawrence
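At the time this was asked, Google Authorship was established with rel="author" markup pointing at a Google+ profile; building it around a character would use the same mechanism, just with a profile created for the character (a sketch; the profile URL is a hypothetical placeholder):

    <!-- Byline link associating the page with an author (or character) profile -->
    <a href="https://plus.google.com/000000000000000000000" rel="author">Character Name</a>

    <!-- Equivalent page-level version placed in the <head> -->
    <link rel="author" href="https://plus.google.com/000000000000000000000">

-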
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh
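For reference, blocking an entire secondary site would mean serving this robots.txt from that site's root (a minimal sketch; note that robots.txt only prevents crawling, so pages already indexed can still appear in results until they drop out or carry a noindex):

    # robots.txt on each secondary domain - disallow all crawlers site-wide
    User-agent: *
    Disallow: /

-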
Using example.info when example.com is a link farm. Ok? Bad? Doesn't matter?
Second question of the day: I'm helping a friend with his law firm site. He is using example.info because example.com is being used by a link farm. Is this hurting his search efforts? Thanks
On-Page Optimization | ahossom
-
Multi-Site SEO using Host Headers
Hi - I'm working on a proposal for a client who runs 3 different career websites. He uses "host headers" to direct each visitor to the correct website. For example, if the visitor comes from Washington, he goes to one URL; if he is in Kansas, he goes to another URL. Does anyone have any experience doing SEO with this type of system? What do I need to know? What hurdles will I encounter? Thanks, -Hunter
On-Page Optimization | HunterW
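"Host headers" here presumably means name-based virtual hosting: one server decides which site to serve based on the Host header of the incoming request. An Apache-style sketch, with hypothetical domains and paths:

    # Each hostname maps to its own document root on the same server/IP
    <VirtualHost *:80>
        ServerName careers-washington.example.com
        DocumentRoot /var/www/washington
    </VirtualHost>

    <VirtualHost *:80>
        ServerName careers-kansas.example.com
        DocumentRoot /var/www/kansas
    </VirtualHost>

To a crawler, each hostname is simply a separate site, so the usual multi-site concerns apply: avoid duplicating the same content across the hostnames, and give each its own sitemap and webmaster-tools profile.

-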
New CMS system - 100,000 old URLs - use robots.txt to block?
Hello. My website has recently switched to a new CMS system. Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel="canonical". Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing, "older" URLs to their new counterparts. However, according to Google Webmaster Tools' 'Not Found' report, there are literally over 100,000 additional URLs out there it's trying to find. My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently, we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
On-Page Optimization | Blenny
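If you do go the robots.txt route, blocking the retired directories would look something like this (a sketch; the paths are hypothetical placeholders for wherever the old CMS systems lived):

    User-agent: *
    # Directories belonging to the retired CMS installations
    Disallow: /old-cms/
    Disallow: /legacy/

-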
How do I get rid of duplicate page titles when using a PHP site?
Hi. I have an e-commerce site that sells a list of products. The list is divided into categories, and those categories make up the various pages on the site. An example of a page URL would be root/products.php?c=40; another page would be root/products.php?c=41. Is there a way to structure the site with SEO in mind?
On-Page Optimization | curtisgibbsiii
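Since every category is served by the same products.php, the usual fix is to generate the <title> dynamically from the c parameter. A minimal sketch, assuming a hypothetical get_category_name() database lookup:

    <?php
    // Resolve the category name from the query string (e.g. products.php?c=40)
    $categoryId = isset($_GET['c']) ? (int) $_GET['c'] : 0;
    $categoryName = get_category_name($categoryId); // hypothetical lookup
    ?>
    <title><?php echo htmlspecialchars($categoryName); ?> | Store Name</title>

Rewriting the query-string URLs to named category paths is a common complementary step, though that's separate from the title fix.

-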
Using Transcriptions
Hi everyone, I've spent a long time trying to figure this one out, so I'm looking forward to your insights. I've recently started having our videos transcribed and keyworded. The videos are hosted on YouTube and already embedded on our website. Each embedded video is accompanied by an existing keyword-rich article that covers pretty much the same content as the video, but in a little more detail. I'm now going back and having these videos transcribed. The reason I started doing this was essentially to lengthen the article and get more keywords on the page.
Question A. My concern is that the transcription covers the same content as the article, so it doesn't add that much for the reader. That's why when I post the transcription (below the embedded video), I use a little JavaScript link for people to click if they want to read it; then it becomes visible, otherwise it's not. Note that I am NOT trying to hide it from Google by doing this - it will still show up for people who don't have JavaScript on - so I'm not trying to cheat Google at all, and I think I'm doing it the way they want it done. You can see an example here: http://www.healthyeatingstartshere.com/nutrition/healthy-diet-plan-mistakes So my first question is: do you think the JavaScript method is a good way of doing it?
Question B. Does anyone have any insight on whether it would be better to put the transcription:
1. On the same page as the embedded video/article (which I am doing now), or
2. On a different page, linked to from the above page, or
3. On various other websites (WordPress, Blogspot, Web 2.0 sites) that link back to the video/article on our site?
I know it's usually best practice to put it on the same page as the video, but from an SEO point of view I wonder if I'm wasting a 500-word transcription by posting it on the same page as a 500-word article that covers the same topic and uses the same keywords, and whether it would be better to use the transcription elsewhere. Do you have any thoughts on which of the above methods would be best? Thanks so much for reading and any advice you may have.
On-Page Optimization | philraymond
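A minimal version of the JavaScript reveal described in Question A might look like this (a sketch; the IDs and wording are illustrative - the transcript is hidden by script rather than by inline CSS, so visitors without JavaScript still see it, matching the behaviour described above):

    <a href="#transcript" id="transcript-toggle">Read the transcript</a>
    <div id="transcript">
      ... full video transcription ...
    </div>
    <script>
      // Hide the transcript via JS so non-JS visitors (and crawlers) still get the text;
      // clicking the link reveals it again.
      var panel = document.getElementById('transcript');
      var toggle = document.getElementById('transcript-toggle');
      panel.style.display = 'none';
      toggle.onclick = function () {
        panel.style.display = 'block';
        return false;
      };
    </script>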