Certain Product Pages Not Indexing
-
Hey All,
We discovered an issue where new product pages on our site were not getting indexed because a "noindex" tag was inadvertently being added to the <head> section of those pages when they were created.
We removed the noindex tag in late April, and some of the previously unindexed pages are now showing up, but others are still not getting indexed. I'd appreciate some help figuring out why.
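For anyone wanting to audit for this kind of thing, a page can be blocked from indexing either by a meta robots tag or by an X-Robots-Tag response header, so it's worth checking both. Here's a rough sketch using only the Python standard library (the helper name and sample markup are mine, not from any particular tool):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Scan a page's HTML for a <meta name="robots"> tag containing noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots" and "noindex" in d.get("content", "").lower():
                self.noindex = True

def has_noindex(html, headers=None):
    """Return True if the page is blocked via meta robots or X-Robots-Tag."""
    if headers and "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

# Sample checks against made-up markup
print(has_noindex('<head><meta name="robots" content="noindex, follow"></head>'))
print(has_noindex('<head><meta name="robots" content="index, follow"></head>'))
```

You could pair `has_noindex` with `urllib.request.urlopen` to sweep a list of product URLs and catch a stray noindex before it lingers for weeks.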
Here is an example of a page that was not in the index but is now showing after removal of noindex:
http://www.cloud9living.com/san-diego/gaslamp-quarter-food-tour
And here is an example of a page that is still not showing in the index:
http://www.cloud9living.com/atlanta/race-a-ferrari
UPDATE: The above page is now showing after I manually submitted it in WMT. I had previously submitted another page about a month ago and it still didn't index, so I thought manual submission was a dead end. However, the above URL also just had its Page Title and H1 updated to something more specific and less duplicative, so I am currently running a test to see whether that's what's keeping these pages out of the index. Will update this soon.
Any suggestions? Thanks!
-
Significantly changing the Page Title and H1 is working. Second page now indexing after not indexing for some time. Probably shoulda thought of that a long time ago but that noindex tag sidetracked me!
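In case it helps anyone hitting the same wall: templated product pages that share a Page Title and H1 are easy to flag in bulk once you have a crawl export. A quick sketch, with made-up sample data standing in for the crawl (the URL paths and helper name are illustrative only):

```python
from collections import defaultdict

def find_duplicates(pages):
    """Group pages by (title, h1) and return only groups that share both,
    i.e. templated pages a search engine may treat as duplicates."""
    groups = defaultdict(list)
    for url, meta in pages.items():
        key = (meta["title"].strip().lower(), meta["h1"].strip().lower())
        groups[key].append(url)
    return {key: urls for key, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl export: url -> extracted Page Title and H1
pages = {
    "/atlanta/race-a-ferrari": {"title": "Race a Ferrari", "h1": "Race a Ferrari"},
    "/dallas/race-a-ferrari": {"title": "Race a Ferrari", "h1": "Race a Ferrari"},
    "/san-diego/food-tour": {"title": "Gaslamp Quarter Food Tour", "h1": "Gaslamp Quarter Food Tour"},
}
print(find_duplicates(pages))
```

Running something like this against the whole catalog would have surfaced the duplicative titles long before the noindex red herring did.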
-
It's hard to say, and I admit that I would have also expected them to reindex the pages by now.
A while back I was working for a client who accidentally turned on noindex, nofollow site-wide in their WordPress SEO by Yoast plugin. I didn't catch it for a week, and after I turned it off it took an additional 3 weeks before a single page of the site was reindexed. Granted, this was a low-ranking site and probably wasn't high on Google's priority list, but recovery took much longer than I hoped.
Unfortunately I think you just have to wait it out. Just keep doing what you're doing: creating new content, etc. Maybe if you build a new link to the page, Google will recrawl it sooner?
-
Good point. But why then would they continue to not index a month after manual submission?
-
If the page was set to "noindex" for a long time, Google may have flagged the page as such and chosen to skip over it when it was crawling your site.
-
Also, historically indexation happened very quickly on this site (less than 24 hours) so that's why I think something else is afoot here. And it has been like 6 weeks... which I don't think makes sense for a site with this level of domain authority.
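One way to tell whether Google is even recrawling these URLs (versus crawling them and declining to index) is to grep the server access logs for Googlebot hits on the affected paths. A rough sketch for combined-log-format lines; the sample lines are invented, and real audits should also verify the IP via reverse DNS since user agents can be spoofed:

```python
import re

def googlebot_hits(log_lines, path):
    """Return the log lines where Googlebot requested the given path.
    Assumes Apache/Nginx combined log format; adjust the regex otherwise."""
    hits = []
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        m = re.search(r'"(?:GET|HEAD) (\S+)', line)
        if m and m.group(1) == path:
            hits.append(line)
    return hits

# Made-up sample lines in combined log format
log = [
    '66.249.66.1 - - [12/Jun/2013:10:00:00 +0000] "GET /atlanta/race-a-ferrari HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [12/Jun/2013:10:01:00 +0000] "GET /atlanta/race-a-ferrari HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(len(googlebot_hits(log, "/atlanta/race-a-ferrari")))
```

If Googlebot is hitting the page regularly but it still isn't indexed, the problem is quality/duplication rather than crawl priority, which fits the Page Title/H1 theory.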
-
Hey Bradley,
Thanks for the response. Yes, I had manually fetched a few of these pages about a month back and that didn't change indexation, so I thought it was a dead end. However, one I tried again this morning suddenly indexed, and it also happened to have had its Page Title and H1 tag changed to be significantly more unique than before, so I'm wondering if that's the problem. I'm currently running a test with another page I manually submitted a month ago without updating the Page Title/H1; I've now resubmitted it with the changed info.
We'll see if that does the trick.
Will let you know.
-
You can ask Google to crawl your page: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1352276
Ask Google to crawl a page or site:
- On the Webmaster Tools Home page, click the site you want.
- On the Dashboard, under Health, click Fetch as Google.
- In the text box, type the path to the page you want to check.
- In the dropdown list, select Web. (You can select another type of page, but currently we only accept submissions for our Web Search index.)
- Click Fetch. Google will fetch the URL you requested. It may take up to 10 or 15 minutes for Fetch status to be updated.
- Once you see a Fetch status of "Successful", click Submit to Index, and then click one of the following:
- To submit the individual URL to Google's index, select URL and click Submit. You can submit up to 500 URLs a week in this way.
- To submit the URL and all pages linked from it, click URL and all linked pages. You can submit up to 10 of these requests a month.
It's probably just Google slowly making their way around to re-crawling these pages. I would fetch the page, and just wait a little while longer.