Prevent indexing of dynamic content
-
Hi folks!
I discovered a bit of an issue with a client's site. The site consists primarily of static HTML pages; however, within one page (a car photo gallery), a call to a PHP script dynamically generates 100 or so pages comprising the photo gallery, all with the same page title and meta description. The photo gallery script resides in the /gallery folder, which I attempted to block via robots.txt, to no avail. My next step will be to include a robots meta tag within the head section of the HTML page, but I am wondering whether this will stop the bots dead in their tracks, or whether they will still be able to pick up on the pages generated by the call to the PHP script residing a bit further down on the page?
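For anyone following along, assuming the tag Dino has in mind is the standard robots meta tag, it would sit in the head of each generated page, something like:

```html
<!-- Placed in the <head> of each dynamically generated gallery page.
     "noindex" asks compliant crawlers not to add the page to their index. -->
<meta name="robots" content="noindex">
```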
Dino
-
Hello Steven,
Thank you for providing another perspective. However, all factors considered, I agree with Shane's approach on this one. The pages add very little merit to the site and exist primarily to provide the site users with eye-candy (e.g. photos of classic cars).
-
Just personally, I would still deindex or canonicalize them. They are just pages with a few images, so not of much value, and unless all the titles and descriptions target varying keywords and content is added, they will cannibalize each other, and hundreds of pages of thin content could even drag down the site as a whole....
So from an SEO perspective it probably IS better to deindex or canonicalize. Three to five years ago the advice might have been to keep them and keyword-target each one, but not in the age of content
(unless the images were optimized for image search for saleable products, but I do not think that is the case here).
-
Hi Dino,
I know this won't solve the immediate problem you asked about, but wouldn't it be better for your client's site (and for SEO) to alter the PHP so that the title and meta description are replaced with variables that can also be dynamic, depending on whichever of the 100 or so pages gets created?
That way, rather than a robot seeing 100 pages as duplicate content, it could see 100 pages as 100 distinct pages.
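A minimal sketch of what that might look like, assuming the gallery template receives a photo ID and has some way to look up a caption (the variable names and the getPhotoById helper here are hypothetical, not from Dino's actual script):

```php
<?php
// Hypothetical gallery template: each generated page gets its own
// title and meta description derived from the photo being displayed.
$photo = getPhotoById($_GET['id']);  // assumed lookup helper
$title = htmlspecialchars($photo['caption'] . ' | Classic Car Gallery');
$desc  = htmlspecialchars('Photo of a ' . $photo['caption'] .
                          ' from our classic car collection.');
?>
<!DOCTYPE html>
<html>
<head>
  <title><?php echo $title; ?></title>
  <meta name="description" content="<?php echo $desc; ?>">
</head>
```

With unique titles and descriptions per photo, the 100 pages stop looking like duplicates to crawlers.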
-
It depends on how the pages are being created (I would assume it is off of a template page).
So within the template of this dynamically created page you would place the meta noindex tag.
But if this is the global template, you cannot do this, as it will noindex every page, which of course is bad.
If you want to PM me the URL of the page, I can take a look at your code and see what is going on and how to rectify it; right now I think we are talking about the same principles, but different words are being used.
It really is pretty straightforward (what I am saying): the pages that you want kept out of the index DO NOT need a nofollow, they need a meta noindex.
But there are many variables. If you have already disallowed the directory in robots.txt, then no bot will go there to pick up the new noindex directive....
If there is no way to add a meta noindex, then you need to nofollow the links and put in for a manual removal.
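That interaction is worth spelling out: a robots.txt disallow blocks crawling, so a blocked page's meta noindex is never fetched and never honored. A sketch of the change being described, assuming the directory in question is /gallery/:

```
# Before: crawling is blocked, so bots never fetch the gallery
# pages and never see the meta noindex inside them.
User-agent: *
Disallow: /gallery/

# After: allow crawling again so the noindex tag can be read
# and the pages can drop out of the index on the next crawl.
User-agent: *
Disallow:
```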
-
I completely understand and agree with all the points you have conveyed. However, I am not certain of the best approach to "noindex" the URLs being created dynamically from within the static HTML page. Maybe I am making this more complex than it needs to be...
-
So it is the pages themselves that are dynamically created that you want out of the index, not the page that contains the links?
If this is so ---
noindex the pages that are created dynamically.
"Therein lies the problem. I did have the nofollow directive in place specifying the /gallery/ folder, but apparently, the bots still crawled it."
Nofollow does not remove a page from the index; it only tells the bot not to pass authority. It is still feasible that the bot will crawl the link, so without the noindex, nofollow is not the correct directive: the page, even though nofollowed, is still being reached and indexed.
PS. If you have the nofollow on the links, you may want to remove it, so the bots go straight through to the page and grab the noindex directive. But if you want to keep any authority from "evaporating," you can continue to nofollow; you may then need to request that the dynamically generated pages (URLs) be removed using Webmaster Tools.
-
The goal is to have the page remain in the index, but not follow any dynamically generated links on the page. The nofollow directive (in place for months) has not done the job.
-
?
If a link is coming into the page, and you have noindex, nofollow, this would remove the page from the index and prevent the following of any links.
This is NOT instant, and can take months to occur depending on depth of page, crawl schedule, etc. (you can try to speed it up by using Webmaster Tools to remove the URL).
What is the goal you are attempting to achieve?
To get the page out of the index, but still have its links followed?
Or remain in the index, but just not follow links on the page?
?
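For reference, those two goals map to two different robots meta values (a sketch; one or the other would go in the head of the relevant pages):

```html
<!-- Option 1: drop the page from the index, but still let bots
     follow its links and pass signals through them -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: keep the page in the index, but do not follow
     any links on it (including the dynamically generated ones) -->
<meta name="robots" content="index, nofollow">
```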
-
Therein lies the problem. I did have the nofollow directive in place specifying the /gallery/ folder, but apparently, the bots still crawled it. I agree that the noindex removes the page, but I wasn't certain whether it prevented crawling of the page, as I have read mixed opinions on this.
I just thought of something else... perhaps an external URL is linking to this page, allowing it to be crawled. I am off to examine that possibility.
Thanks for your response!
-
noindex will only remove the specific page (or pages created off the template) from the index, and disallow the act of indexing it, upon the next page crawl.
Bots will still crawl the page, and follow any readable links, as long as there is not a nofollow directive.
I am not sure I fully understand the situation, so I would not call this my "recommendation," just an answer to the specific question:
"...but I am wondering if this will stop the bots dead in their tracks or will they still be able to pick-up on the pages generated"
Hope this helps!