Certain Pages Not Being Indexed - Please Help
-
We are having trouble getting the bulk of our pages indexed in Google. Any help would be greatly appreciated!
The following page types are being indexed through the escaped-fragment mechanism:
http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion/4097-Casadei-BLADE-PUMP/Product/175199
www.cbuy.tv/celebrity/155-Sophia-Bush#!
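For context on how such pages get crawled: under Google's (now-deprecated) AJAX crawling scheme, a crawler rewrites a #! URL into a ?_escaped_fragment_= request before fetching it. A rough sketch of that mapping in Python (the function name is just for illustration):

```python
from urllib.parse import quote

def to_escaped_fragment(url: str) -> str:
    """Rewrite a #! (hashbang) URL into the ?_escaped_fragment_=
    form that crawlers requested under the old AJAX crawling scheme."""
    if "#!" not in url:
        return url
    base, _, fragment = url.partition("#!")
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")

print(to_escaped_fragment("http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion"))
# http://www.cbuy.tv/celebrity?_escaped_fragment_=65-Ashley-Tisdale%2Ffashion
```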
However, none of our pages that look like this are being indexed:
-
Hi Takeshi,
We have a sitemap, but the pages are also all interlinked. I didn't know that Google puts an upper bound on indexing based on PR - that's interesting.
Since there is a black-and-white difference for one set of pages (zero of these pages are being indexed), I suspect there is some other issue. Is it at all possible that Google does not like the URLs of these pages?
1. Does Google dislike the parameters?
2. Should we shorten our GUID and move it to the end of the URL?
-
Where are these pages being linked from? If you want these pages indexed, you may want to try making them more prominent in your site's navigation and architecture. Listing them in a sitemap can help them get discovered by Google, but actually linking to them from your site will have much more impact.
Also, I notice that the site is only PageRank 2 and already has 5,000+ pages indexed in Google. Google limits the number of pages it indexes on a site based on its PageRank, so you may want to work on improving your PR so that Google indexes more pages from your site.
-
Hi Mike,
I am sure you've probably already barked up this tree, but do those pages contain 100% substantially unique content?
Also, have you had an SEO developer review your robots.txt and .htaccess files to make sure there isn't something in there preventing crawlers from having access?
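A robots.txt review like that can also be scripted. A minimal sketch using Python's standard urllib.robotparser (the rules below are made up for illustration, not cbuy.tv's actual file):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the file's lines directly, so rules can be tested offline
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "http://www.cbuy.tv/celebrity"))  # True
print(rp.can_fetch("Googlebot", "http://www.cbuy.tv/private/x"))  # False
```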
Dana
-
Hello Dana,
Thanks for your reply.
We have thousands of #! pages being indexed. Googlebot is sent to our escaped-fragment page through a redirect, and our dynamic sitemap helped us get many pages indexed. However, there is a subset of pages that Google does not like at all, and we cannot figure out why. For example, when you visit our homepage, http://www.cbuy.tv, and then navigate through the images in our carousel (each assigned a unique URL), none of those pages are being indexed.
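To illustrate the kind of routing Mike describes (the snapshot path and helper name here are hypothetical, not cbuy.tv's actual setup): when a request carries the _escaped_fragment_ parameter, the server serves a pre-rendered HTML snapshot instead of the JavaScript page.

```python
from urllib.parse import urlparse, parse_qs

def snapshot_path(url: str):
    """If the request is the escaped-fragment form, return the path of a
    pre-rendered HTML snapshot to serve; otherwise None (serve the JS app)."""
    query = parse_qs(urlparse(url).query, keep_blank_values=True)
    fragment = query.get("_escaped_fragment_")
    if fragment is None:
        return None
    return "/snapshots/" + (fragment[0] or "home") + ".html"

print(snapshot_path("http://www.cbuy.tv/celebrity?_escaped_fragment_=65-Ashley-Tisdale"))
# /snapshots/65-Ashley-Tisdale.html
```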
Mike
-
Hi Mike,
I am not a developer, but I think the problem is the hash in your URLs. This is a problem for search engines because anything following the "#" is completely ignored by them.
Depending on your platform, I would consider rewriting all of your URLs to omit the hash completely. Search engines (and humans!) can respond in unpredictable ways to anything other than alphanumeric characters. Then I would implement 301 redirects if necessary (depending on how old the site is and how many inbound links each page has).
I don't think sitemap submission is even going to help right now because of the hash issue, but I'd love to hear from a developer on this for verification.
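For what it's worth, the fragment problem described above can be seen directly with a couple of lines of Python: the path is all a server (or a crawler that doesn't implement the hashbang scheme) ever receives, while the fragment stays on the client.

```python
from urllib.parse import urlparse

parts = urlparse("http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion")
print(parts.path)      # /celebrity  (what the server receives)
print(parts.fragment)  # !65-Ashley-Tisdale/fashion  (client-side only)
```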
I hope this helps!
Dana