Certain Pages Not Being Indexed - Please Help
-
We are having trouble getting the bulk of our pages indexed in Google. Any help would be greatly appreciated!
The following page types are being indexed through the escaped fragment:
http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion/4097-Casadei-BLADE-PUMP/Product/175199
www.cbuy.tv/celebrity/155-Sophia-Bush#!
However, all of our pages that look like this are not being indexed:
-
Hi Takeshi,
We have a sitemap, but the pages are also all interlinked. I didn't know that Google puts an upper bound on indexing based on PR; that's interesting.
Since there is a black-and-white difference for a set of pages of a certain kind (zero of these pages are being indexed), I suspect there is some other issue. Is it at all possible that Google does not like the URLs of these pages?
1. Does Google not like the parameters?
2. Should we reduce the length of our GUID and move it to the end of the URL?
-
Where are these pages being linked from? If you want these pages indexed, you may want to try making them more prominent in your site's navigation and architecture. Listing them in a sitemap can help them get discovered by Google, but actually linking to them from your site will have much more impact.
Also, I notice that the site is only PageRank 2 and already has 5,000+ pages indexed in Google. Google limits the number of pages it indexes for a site based on its PageRank, so you may want to work on improving your PR so that Google indexes more pages from your site.
-
Hi Mike,
You've probably already barked up this tree, but do those pages contain 100% substantially unique content?
Also, have you had an SEO developer review your robots.txt and .htaccess files to make sure there isn't something in there preventing crawlers from having access?
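For the robots.txt part of that check, you can run a quick sanity test yourself with Python's standard-library `urllib.robotparser`. A minimal sketch, assuming a hypothetical rule set (substitute the site's real robots.txt content):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- not the site's actual file.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under "User-agent: *" here, so /private/ is blocked
# while ordinary pages remain crawlable.
print(parser.can_fetch("Googlebot", "http://www.cbuy.tv/private/page"))   # False
print(parser.can_fetch("Googlebot", "http://www.cbuy.tv/celebrity/155"))  # True
```

Running the unindexed URLs through a check like this quickly rules robots.txt in or out as the culprit.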
Dana
-
Hello Dana,
Thanks for your reply.
We have thousands of #! pages being indexed. Googlebot is sent to our escaped-fragment page through a redirect. Our dynamic sitemap helped us get many pages indexed. However, there is a subset of pages that Google does not like at all, and we cannot figure out why. For example, when you visit our homepage, http://www.cbuy.tv, and then navigate through the images in our carousel (each assigned a unique URL), none of those pages are being indexed.
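The mapping Mike describes follows Google's (since-deprecated) AJAX crawling scheme: the crawler requests a `#!` URL by moving everything after the `#!` into an `_escaped_fragment_` query parameter. A simplified sketch of that transformation (the real spec escapes only a specific character set; this version over-escapes, which is still a valid encoding):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a #! (hashbang) URL to the URL the crawler requests
    under the AJAX crawling scheme (simplified)."""
    if "#!" not in url:
        return url
    base, _, fragment = url.partition("#!")
    # The fragment is percent-encoded and appended as _escaped_fragment_.
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"

print(escaped_fragment_url("http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion"))
# -> http://www.cbuy.tv/celebrity?_escaped_fragment_=65-Ashley-Tisdale%2Ffashion
```

A useful debugging step is to fetch the `_escaped_fragment_` version of one of the unindexed carousel URLs directly and confirm the server returns a full HTML snapshot for it.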
Mike
-
Hi Mike,
I am not a developer, but I think the problem is the hashtag in your URL. This is a problem for search engines in that anything following the "#" is completely ignored by them.
Depending on your platform, I would consider rewriting all of your URLs to omit the hashtag completely. Search engines (and humans!) can respond in unpredictable ways to anything other than alphanumeric characters. Then I would implement 301 redirects as necessary (depending on how old the site is and how many inbound links there are to each page).
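The rewrite Dana suggests could start from a simple mapping step like the sketch below, which promotes the hashbang fragment into the real path (a hypothetical scheme; adjust to however the site's router actually resolves these pages):

```python
def clean_url(hashbang_url: str) -> str:
    """Rewrite a #! URL into a plain-path URL by promoting the
    fragment into the path (hypothetical scheme)."""
    base, sep, fragment = hashbang_url.partition("#!")
    if not sep:
        return hashbang_url  # nothing to rewrite
    return base.rstrip("/") + "/" + fragment

old = "http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion"
print(clean_url(old))
# -> http://www.cbuy.tv/celebrity/65-Ashley-Tisdale/fashion

# The old -> new pairs can then feed the 301 rules on the server,
# one redirect per mapping.
```

One caveat on the 301 step: browsers never send the part after `#` to the server, so a server-side redirect cannot see the old hashbang URLs themselves; the hop from old to new would have to happen client-side in JavaScript (or apply only to the `_escaped_fragment_` form), while server-side 301s cover any old non-fragment URLs.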
I don't think sitemap submission is even going to help right now because of the hashtag issue, but I'd love to hear from a developer on this for verification.
I hope this helps!
Dana