Does Google index dynamically generated content/headers, etc.?
-
To avoid duplicate content, we are moving away from a model where we have 30,000 pages, each with a separate URL that looks like /prices/<product-name>/<city><state>. The product overlaps from city to city, so much of the content is duplicated, and it's hard to keep 30,000 pages unique when sometimes the only distinction is the price and the city/state.
We are moving to a model with around 300 unique pages, where some of the info that used to be in the URL will move onto the page itself (headers, etc.) to cut down on duplicate content across those 300 pages.
My question is this: if we have 300 unique-content pages with unique URLs, and we then put some dynamic info (year, city, state) into the page itself, will Google index this dynamic content?
The question behind this one is: how do we continue to rank for searches for that product in the city/state being searched without having that info in the URL?
Any best practices we should know about?
-
Hi there,
Not sure I have enough information to weigh in on the first part of your question - Google will index whatever it sees on the page. If you deliver the content to Googlebot, it gets indexed. The problem comes when you deliver different content to different users (cloaking). Try a tool like SEO Browser to see how Googlebot views your site.
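Along the same lines, a quick homemade check for accidental cloaking is to save your page's HTML twice - once fetched with a Googlebot User-Agent, once with a normal browser one - and diff the visible text. A minimal sketch of the diffing half, assuming you already have both HTML strings (the class and function names here are my own, not any library's):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def visible_text(html):
    """Return the visible text fragments of an HTML document, in order."""
    parser = TextExtractor()
    parser.feed(html)
    return parser.chunks

def cloaking_diff(googlebot_html, user_html):
    """Text fragments that appear in one version of the page but not the other."""
    return set(visible_text(googlebot_html)) ^ set(visible_text(user_html))
```

If `cloaking_diff` comes back non-empty for real pages, you are serving Googlebot different text than your users, which is exactly the situation to avoid.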
To answer your second question, it's often hard to rank near-duplicate pages for specific cities/states without running into massive duplicate content problems. Matt Cutts actually addressed this a while back. He basically stated that if you have multiple pages all targeting different locations, it's best to include a few lines of unique content on each page (I recommend at the top) to make each one unique.
“In addition to address and contact information, 2 or 3 sentences about what is unique to that location and they should be fine.” (source)
But this technique would be very hard with only 300 product pages. The alternative, stuffing these pages with city/state information for every possible combination, is not advised.
http://www.seomoz.org/q/on-page-optimization-to-rank-for-multiply-cities
So in the end, it's actually not hard to rank for city-state keywords without having them in the URL, as long as the information is in the content or in other places like the title tag or internal link structure. But doing this for thousands of locations with only 300 pages, without keyword stuffing, is near impossible.
The best thing to do is figure out how to create unique content for every page you want to rank for, and take that route.
For example, I might create a "Seattle" page, create unique content for the top of the page, then list 50 or so products with the unique Seattle prices. (This is a rough strategy - you'd have to refine it greatly to work for your situation.)
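If you go this route, one way to keep yourself honest about the "few lines of unique content" rule is to mechanically compare the intro copy across location pages and flag pairs that are too similar. A rough sketch using Python's difflib; the 0.8 threshold is an arbitrary assumption on my part, not any published duplicate-content cutoff:

```python
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicate_intros(intros, threshold=0.8):
    """Flag pairs of location pages whose intro copy is suspiciously similar.

    `intros` maps a page label (e.g. "seattle-wa") to its intro paragraph.
    The default threshold of 0.8 is a guess, not a published cutoff.
    """
    flagged = []
    for (page_a, text_a), (page_b, text_b) in combinations(sorted(intros.items()), 2):
        ratio = SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()
        if ratio >= threshold:
            flagged.append((page_a, page_b, round(ratio, 2)))
    return flagged
```

Pages that only swap the city name into otherwise identical copy will score high here - those are the ones to rewrite first.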
Hope this helps! Best of luck with your SEO.
-
I see. To get the city-state pages indexed, they must have their own URLs. If a page can only be accessed by posting a form (as I assume is the case with your search feature), a search engine can't see it.
To get around this, you could put links underneath the search box to popular searches. This will get them indexed.
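Concretely, those "popular searches" links can be rendered server-side as plain anchor tags, so crawlers see them without submitting any form. A hypothetical sketch, assuming a /prices/<product>/<city>-<state> URL pattern (swap in your own routing):

```python
def slugify(text):
    """Lowercase and hyphenate, keeping only URL-safe alphanumerics."""
    cleaned = "".join(ch if ch.isalnum() else " " for ch in text.lower())
    return "-".join(cleaned.split())

def popular_search_links(product, locations):
    """Render static, crawlable anchor tags for popular city/state searches."""
    links = []
    for city, state in locations:
        href = f"/prices/{slugify(product)}/{slugify(city)}-{state.lower()}"
        links.append(f'<a href="{href}">{product} prices in {city}, {state}</a>')
    return "\n".join(links)
```

Because these are ordinary `<a href>` links in the initial HTML, they double as descriptive internal anchor text for the location pages.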
Does that answer the questions?
Thanks
Iain - Reload
-
Thanks for the reply. The city-state content wouldn't be driven by the URL, it would be driven by the city-state the user searched for. I.e. if the person searched for <product> <city> <state>, I would want our /product/ page to show up and show them content for their local city/state.
-
Hi Editable Text,
In short: if you show Google a crawlable link to the page with the dynamic header/content, and that content is driven by the unique URL, then yes, it will index it.
As with any SEO (or life) question, there are a few terms and conditions:
- The pages are unique enough not to be classed as duplicate content
- They are intelligently linked internally
- You have external links pointing deep into the site
- You have a decent site architecture
To answer your second question, you'll need unique pages for each location, unless the content would be so thin that you'd need to group them. The URL doesn't have to include the keyword, but it's damn helpful if it does.
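On the point that the keyword can live in places other than the URL: the title tag, h1, and meta description can each carry the product-plus-location phrase on every page. A hypothetical templating sketch (the field names and formats are my assumptions, not any standard):

```python
def page_head(product, city, state, brand="Example Co"):
    """Build location-specific head elements for a product page."""
    phrase = f"{product} in {city}, {state}"
    return {
        "title": f"{phrase} | {brand}",
        "h1": phrase,
        "meta_description": f"Compare {product.lower()} prices in {city}, {state}.",
    }
```

The same template run over each location gives every page a distinct, keyword-bearing title even when the URL itself stays generic.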
Hope that helps
Iain - Reload Media