When not to index
-
We are working on a brand new site http://www.shedfinders.com/
The site has some login-only sections, e.g. agent profiles for property agents, registration, and so on.
I figured there was no harm in submitting all of these in the sitemap and using Yoast to make sure the on-page SEO is as good as it can be.
If a user stumbles across any of them, they would just be redirected back to a register/login page.
Not sure what the best practice is?
Laura
-
Only submit pages for indexing if they have unique, quality content on them. Right now the site's indexed pages look properly indexed: main pages, PDFs, and no duplicate parameter pages at the moment. The last thing you want to do is create hundreds or thousands of indexed user pages with little to no content on them.
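A minimal sitemap along those lines might look like the sketch below — list only the unique, indexable pages and leave the login, register, and profile URLs out entirely (the specific paths here are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Include only unique, indexable content -->
  <url><loc>https://www.shedfinders.com/</loc></url>
  <url><loc>https://www.shedfinders.com/properties/</loc></url>
  <!-- Omit /login, /register, and individual agent profile pages -->
</urlset>
```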
By the way, I would move the brand name to the end of the title tag instead of the front, if you want it included at all. Title tags should always start with the main keyword.
Existing:
Shedfinders | Modern Warehousing With Office Accommodation
Better:
Modern Warehousing With Office Accommodation | Shedfinders
-
You would usually want to noindex pages like login pages and other pages that don't help a searcher. This keeps thin, private pages out of the index, avoids sending visitors to dead-end pages that only inflate your bounce rate, and protects your site's own privacy.
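A common way to do that, sketched here for a hypothetical login page, is a meta robots tag in the page's head (or the equivalent X-Robots-Tag HTTP header):

```html
<!-- On /login, /register, agent-profile pages, etc. -->
<!-- noindex keeps the page out of search results; follow (the default)
     still lets crawlers follow any links on the page -->
<meta name="robots" content="noindex, follow">
```

One caveat: don't also block these URLs in robots.txt if you want the noindex to take effect — a crawler that can't fetch the page never sees the tag.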
Related Questions
-
The particular page cannot be indexed by Google
Hello, Smart People!
On-Page Optimization | Viktoriia1805
We need help solving the problem with Google indexing.
All pages of our website are crawled, and all of them, including the one mentioned, meet Google's requirements and are eligible for indexing. However, this one page is still not indexed.
Robots.txt is not blocking it.
The page does not have a "noindex" or "nofollow" tag.
We have it in the sitemap file.
We have internal links for this page from indexed pages.
We requested indexing many times, and it is still grey.
The page was established one year ago.
We are open to any suggestions or guidance you may have. What else can we do to expedite the indexing process?
-
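For anyone triaging a case like this, two blockers are worth ruling out explicitly, because neither is visible in the page's HTML source (the path below is a hypothetical example):

```text
# 1. A robots.txt rule that happens to match the URL:
User-agent: *
Disallow: /some-path/

# 2. A noindex sent as an HTTP response header rather than a meta tag:
X-Robots-Tag: noindex
```

Checking the actual response headers (in the browser dev tools or Search Console's URL Inspection tool) catches an X-Robots-Tag set by the server or CDN, which is easy to miss when only the HTML has been reviewed.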
Can lazy loading of images affect indexing?
I am trying to diagnose a massive drop in Google rankings for my website, and I noticed that the date of the ranking and traffic drop coincides with Google suddenly indexing only about 10% of my images, whereas previously it indexed about 95% of them. Could adding a lazy-load script to images (so they don't load from the server until they are visible in the browser) cause this kind of index blocking?
On-Page Optimization | Gavin.Atkinson
-
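It can, depending on how the lazy loading is implemented. As a rough sketch (file paths and class names here are hypothetical): a script that keeps the real URL in a `data-src` attribute and only writes it to `src` on scroll can hide images from a crawler that never scrolls, whereas the native attribute leaves the URL crawlable:

```html
<!-- Crawlable: native lazy loading keeps the real URL in src -->
<img src="/images/photo.jpg" alt="Example" loading="lazy">

<!-- Riskier: the crawler may never see data-src promoted to src;
     a <noscript> fallback keeps the image discoverable -->
<img data-src="/images/photo.jpg" alt="Example" class="lazyload">
<noscript><img src="/images/photo.jpg" alt="Example"></noscript>
```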
Single Page on my client's website is not crawling and indexing new changes. What could be the possible reason?
I made several changes on a client's website across different pages: changed titles, added content to a few pages, and moved the blog from a subdomain to a subdirectory. Everything is crawled, but there is one page on the website (not part of the blog) that isn't getting crawled by Google or picking up changes. The last crawl of the website was 2 days ago, whereas that page was last crawled on 30th Sep. I just wanted to know the possible reasons, and has anyone encountered this before?
On-Page Optimization | MoosaHemani
-
Noindex, or noindex plus nofollow?
Wondering if I could garner some views on this issue please. I'm about to add an affiliate store to a website I own. The site has a couple of pages of unique content (blogs, articles, advice etc. on home improvement, all written by my team). Obviously, the affiliate store will not be unique content; it will be built from the datafeeds from cj.com et al., so I don't want to pick up any duplicate-content penalties from Google for this store. Should I add a noindex to the pages and allow the bots to still crawl them, or should I add noindex and nofollow? Ideally I would like to get the affiliate store category pages indexed, as they will be a mixture of lots of different merchants and fairly unique. Can Google still mark the site down for duplicate content if it can crawl it, even if it is noindexed? Thanks, Carl
On-Page Optimization | Grumpy_Carl
-
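For a case like the one described above, the usual pattern is noindex with follow (the default), which as a meta-tag sketch looks like this:

```html
<!-- On the duplicate datafeed product pages -->
<meta name="robots" content="noindex, follow">

<!-- Category pages you DO want indexed simply omit the tag,
     or state the default explicitly -->
<meta name="robots" content="index, follow">
```

Noindexed pages that Google can still crawl aren't treated as a duplicate-content problem — they are simply kept out of the index — and leaving follow in place lets the crawler reach and pass signals to the category pages you do want indexed.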
Help, a certain directory is not being indexed
Before I start, don't expect this to be too easy. This really has me puzzled, and I am surprised I am yet to find a solution for it.

We have a WordPress website, launched over 6 months ago, and have never had an issue getting content such as pages, posts, and categories indexed. However, somewhat recently (about 2 months ago) I installed a directory plugin (Business Directory Plugin) which lists businesses via unique URLs accessible from a subfolder. It's these business listings that I absolutely cannot get indexed. The index page of the directory, which links to the business pages, is indexed; however, for some reason Google is not indexing the listing pages linked from it.

I don't think the content is uncrawlable, because when I run crawlers on the site, such as XML sitemap crawlers, they find all the pages, including the directory pages, so I am sure the search engines can find the content. I have created XML sitemaps and uploaded them to Webmaster Tools; Tools recognises that there are many pages in the XML sitemap, but Google continues to index only a small percentage (everything but my business listings). The directory has been there for about 8 weeks now, so I know there is an issue, as it should have been indexed by now. See our main website at www.smashrepairbid.com.au and the business directory index page at www.smashrepairbid.com.au/our-shops/

To throw in a curve ball: while looking into this issue and setting up Tools, we noticed a lot of 404 error pages (nearly 4,000). We were very confused about where these were coming from, as they were only being generated by search engines; humans could not access the 404s, so we are guessing the search engines were firing some JavaScript code to generate them, or something else weird. We could see the 404s in the logs, so we know they were legit, but again we believe it was only search engines; this was validated when we added some rules to robots.txt and saw the errors in the logs stop. We put the rules in the robots.txt file to try to stop Google from indexing the 404 pages, as we could not find any way to fix the site/code (no idea what is causing them). If you do a site: search in Google, you will see all the pages that are omitted from the results.

Since adding the rules to robots.txt, our impressions shown in Tools have jumped right up (increased by 5 times), so we thought this was a good indication of improvement, but we are still not getting the results we want. Does anyone have any clue what's going on, or why Google and other search engines are not indexing this content? Any help would be greatly appreciated; if you need any other information to assist, just ask. Really appreciate anyone who can spare their time to help me; I sure do need it. Thanks.
On-Page Optimization | ziller
-
On-site search indexed by Google - nofollow or noindex?
Google indexes all our internal searches: the search box covers brand, clothes type, and size type, and for each query it creates a page, which produces duplicate page titles and unnecessary content. Should I add a nofollow to the advanced search, or a noindex? Many thanks for the info. Sonja
On-Page Optimization | reallyitsme
-
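For internal search results that are already in the index, a meta noindex on the results template is usually the safer first step; a robots.txt block should only come later, once the pages have dropped out, because a URL blocked from crawling never gets refetched to see the noindex. A sketch, assuming the results live under a /search path (that path is an assumption):

```text
# Step 1 - on the search-results template, while pages are still indexed:
#   <meta name="robots" content="noindex">

# Step 2 - robots.txt rule to stop future crawling, added only after
# the search pages have dropped out of the index:
User-agent: *
Disallow: /search
```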
Are a lot of tag pages in the index a bad signal for low quality? (Panda update)
Hello everybody, first of all please excuse my bad English. I'm from Germany; I'll try my best. 😉 The case: I have a WordPress SEO project which ranks very well. At the moment I have all "archive" pages indexed: archives, categories, and tags. I use the more-tag for every archive/category/tag page, so duplicate content is not really a problem. But in view of the Panda update, which will surely arrive in Germany soon, I wonder if all these tag/archive/category pages in the index might be seen as low quality and hurt the ranking of my whole site. Low quality because: with the more-tag, each such page is just a list of internal links with content snippets. I have 500 articles and 700 tag pages (all in the index). So my fear is that when Google (with the Panda update) looks at my site and sees all these (maybe) low-quality tag pages in the index, I get penalised, because there is not a good proportion between my normal (good-quality) articles and the archive/tag pages. I hope you guys can understand my thoughts. Do I have a legitimate fear that the mass of tag pages in the index could be a problem? Are there any data from the USA on how blogs with tag pages in the index rank after the Panda update, or on whether pages consisting of internal links with content snippets, like these tag pages, count as low quality in Google's eyes? Or am I worrying too much? Thank you very much! Oliver
On-Page Optimization | channelplus
-
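If the tag archives do turn out to be a problem, one common mitigation is to noindex the archive templates while leaving the articles themselves indexable — a sketch of what the tag/archive pages would then emit (in practice an SEO plugin setting usually outputs this for you):

```html
<!-- Emitted in the <head> of tag, category, and archive pages only;
     individual article pages carry no such tag -->
<meta name="robots" content="noindex, follow">
```

With follow kept in place, the archives still help crawlers discover and pass signals to the articles, while the thin link-list pages themselves stay out of the index.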
Does frequency of content updates affect the likelihood that outbound links will be indexed?
I have several pages on our website with low PR that themselves link to lots and lots of pages that are service/product specific. Since there are so many outbound links, I know the small amount of PR will be spread thin as it is. My question is: if I were to supply fresh content to the top-level pages, and change it often, would that influence whether or not Google indexes the underlying pages? Also, if I supply fresh content to the underlying pages, once Google crawls them, would that guarantee Google considers them 'important' enough to be indexed? I guess my real question is: can freshness of content and frequency of updates convince Google that the underlying pages are 'worthy of being indexed', and can producing fresh content on those pages 'keep Google's interest', so to speak, despite their having little if any PageRank?
On-Page Optimization | ilyaelbert