Prevent indexing of dynamic content
-
Hi folks!
I discovered a bit of an issue with a client's site. The site consists primarily of static HTML pages; however, within one page (a car photo gallery), a line of PHP code dynamically generates 100 or so pages comprising the photo gallery, all with the same page title and meta description. The photo gallery script resides in the /gallery folder, which I attempted to block via robots.txt, to no avail. My next step will be to include a robots meta tag within the head section of the HTML page, but I am wondering if this will stop the bots dead in their tracks, or will they still be able to pick up on the pages generated by the call to the PHP script residing a bit further down on the page?
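Something along these lines is what I have in mind (exact directives still to be decided):

    <head>
      <!-- keeps this page out of the search index, but lets bots follow its links -->
      <meta name="robots" content="noindex, follow">
    </head>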
Dino
-
Hello Steven,
Thank you for providing another perspective. However, all factors considered, I agree with Shane's approach on this one. The pages add very little merit to the site and exist primarily to provide site users with eye candy (i.e., photos of classic cars).
-
Just personally, I would still deindex or canonicalize them. They are just pages with a few images, so not of much value, and unless all the titles and descriptions target varying keywords and content is added, they will cannibalize each other, and possibly even drag down the site due to hundreds of pages of thin content.
So from an SEO perspective, it probably IS better to deindex or canonicalize. Three to five years ago, the advice might have been to keep them and target keywords, but not in the age of content
(unless the images were optimized for image searches for saleable products, but I do not think that is the case).
-
Hi Dino,
I know this won't solve the immediate problem you asked about, but wouldn't it be better for your client's site (and for SEO) to alter the PHP so that the title and meta description are replaced with variables that can also be dynamic, depending on which of the 100 or so pages gets created?
That way, rather than worrying about a robot seeing 100 pages as duplicate content, it could see 100 pages as 100 pages.
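A rough sketch of what I mean, assuming the gallery script already knows which car it is rendering (the variable and field names below are hypothetical):

    <?php
    // Hypothetical record for the car shown on this gallery page
    $car = ['year' => 1957, 'make' => 'Chevrolet', 'model' => 'Bel Air'];

    // Build a unique title and meta description for each generated page
    $pageTitle = sprintf('%d %s %s - Classic Car Photo Gallery',
        $car['year'], $car['make'], $car['model']);
    $metaDescription = sprintf('Photos of a %d %s %s from our classic car gallery.',
        $car['year'], $car['make'], $car['model']);
    ?>
    <head>
      <title><?php echo htmlspecialchars($pageTitle); ?></title>
      <meta name="description"
            content="<?php echo htmlspecialchars($metaDescription); ?>">
    </head>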
-
It depends on how the pages are being created (I would assume it is off of a template page).
So within the template of this dynamically created page, you would place a meta noindex tag.
But if this is the global template, you cannot do this, as it will noindex every page, which of course is bad (one workaround is sketched below).
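A sketch of that workaround, assuming the shared template can inspect the requested path (the check below is illustrative, not taken from your actual code):

    <?php
    // Hypothetical check: only noindex pages served from under /gallery/
    $isGalleryPage = strpos($_SERVER['REQUEST_URI'], '/gallery/') === 0;
    ?>
    <head>
      <?php if ($isGalleryPage): ?>
        <meta name="robots" content="noindex">
      <?php endif; ?>
    </head>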
If you want to PM me the URL of the page, I can take a look at your code and see what is going on and how to rectify it, as right now I think we are talking about the same principles, but different words are being used.
What I am saying really is pretty straightforward: the pages that you want kept out of the index DO NOT need a nofollow; they need a meta noindex.
But there are many variables. If you have already disallowed the directory in robots.txt, then no bot will crawl it to pick up the updated noindex directive...
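In other words, if your robots.txt rule looks something like the block below (an assumption about your file, not something I have seen), it would need to come out before the bots can re-crawl those pages and see the noindex:

    User-agent: *
    Disallow: /gallery/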
If there is no way to add a meta noindex, then you need to nofollow and put in for a manual removal.
-
I completely understand and agree with all the points you have conveyed. However, I am not certain of the best approach to "noindex" the URLs being created dynamically from within the static HTML page. Maybe I am making this more complex than it needs to be...
-
So it is the pages themselves that are dynamically created that you want out of the index, not the page that contains the links?
If this is so: noindex the pages that are created dynamically.
"Therein lies the problem. I did have the nofollow directive in place specifying the /gallery/ folder, but apparently, the bots still crawled it."
Nofollow does not remove a page from the index; it only tells the bot not to pass authority. It is still feasible that the bot will crawl the link, so without the noindex, nofollow is not the correct directive: the page (even though nofollowed) is still being reached and indexed.
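For clarity, a link-level nofollow looks like this (the URL and anchor text are made up):

    <!-- tells bots not to pass authority through this link; it does NOT
         keep the target page out of the index -->
    <a href="/gallery/photo.php?id=57" rel="nofollow">1957 Bel Air photos</a>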
P.S. If you have the nofollow on the links, you may want to remove it so the bots will go straight through to the pages and grab the noindex directive. But if you want to keep any authority from "evaporating," you can continue to nofollow; you may then need to request that the dynamically generated URLs be removed using webmaster tools.
-
The goal is to have the page remain in the index, but not follow any dynamically generated links on the page. The nofollow directive (in place for months) has not done the job.
-
?
If a link is coming into the page and you have noindex, nofollow, this would remove the page from the index and prevent the following of any links.
This is NOT instant, and can take months to occur depending on depth of page, crawl schedule, etc. (you can try to speed it up by using webmaster tools to remove the URL).
What is the goal you are attempting to achieve?
To get the page out of index, but still followed?
Or remain in index, but just not follow links on page?
?
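For what it's worth, those two goals map to different meta robots values; example tags (values to be adapted to your situation):

    <!-- remain in the index, but do not follow links on the page -->
    <meta name="robots" content="index, nofollow">

    <!-- drop out of the index, but still follow links on the page -->
    <meta name="robots" content="noindex, follow">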
-
Therein lies the problem. I did have the nofollow directive in place specifying the /gallery/ folder, but apparently, the bots still crawled it. I agree that the noindex removes the page, but I wasn't certain whether it prevented crawling of the page, as I have read mixed opinions on this.
I just thought of something else... perhaps an external URL is linking to this page, allowing it to be crawled. I am off to examine that possibility.
Thanks for your response!
-
Noindex will only remove the page from the index and disallow indexing of the specific page (or pages created off the template) you place the tag in, upon the next crawl of that page.
Bots will still crawl the page and follow any readable links, as long as there is not a nofollow directive.
I am not sure I fully understand the situation, so I would not say this is my "recommendation," but rather an answer to the specific question:
"...but I am wondering if this will stop the bots dead in their tracks or will they still be able to pick up on the pages generated..."
Hope this helps!