There seems to be something obvious stopping our product pages from being listed in organic results - help!
-
Hello All
Firstly, I'm new to SEOmoz, but what a fantastic resource - good work!
I help run a platform at ethical community (dot) com (I've phrased it like that so Google doesn't pick up this thread - hope that's OK).
We seem to have something glaringly obvious wrong with the SEO of our product pages. We now have over 7,000 products on the site, and we'd like to think we've done a pretty good job of optimising them: lots of nice keywords, relevant page titles, good internal links, and we've even recently reduced loading times a fair amount. We have a sitemap set up feeding URLs to Google, and some of them are now nearly a year old.
The problem: when doing an EXACT Google search on a product title, the product pages don't show up for the majority of the 7,000 products.
HOWEVER... we get fantastic rankings in Google Products, and we get sales through other areas of the site, which seems even more odd. For example, if you type in "segway" you'll see us ranking on the first page of Google in Google Products, but the product page itself is nowhere to be seen.
For example, "DARK CHOCOLATE STRANDS 70G CAKE DECORATION" gets no results on Google (aside from Google Products), even though we have this page at OURDOMAIN/eco-shop/food/dark-chocolate-strands-70g-cake-decoration-5592
Can anyone help identify whether there is a major bottleneck here? Our gut feeling is that there is one major factor causing this.
-
Can Stephen (or anyone else) help with my latest question here?
-
Hi Stephen, did you manage to look any further into that new example I gave?
-
Thanks for this, Stephen - very helpful.
AJAX-wise, we made some amendments last week which already seem to be having an impact (our pagination on search results had /# links, and now they have crawlable URLs). Google has already crawled more pages so far. We are also going to move away from the AJAX search altogether in the next few weeks.
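For anyone following along, the reason /# pagination links are invisible to crawlers is that a URL's fragment (everything after the #) is handled client-side and never sent to the server, so every paginated "page" collapses into one fetchable URL. A minimal Python sketch of the difference (the example URLs are made up):

```python
from urllib.parse import urlparse, urlunparse

def crawler_request_url(url: str) -> str:
    """Return the URL a crawler actually fetches: the fragment (#...)
    is resolved client-side and is never sent to the server."""
    parts = urlparse(url)
    return urlunparse(parts._replace(fragment=""))

# Hash-based pagination: pages 2, 3, 4... all collapse to one URL.
print(crawler_request_url("http://example.com/eco-shop/search#page=2"))
# -> http://example.com/eco-shop/search

# Crawlable pagination: each page is a distinct server-side URL.
print(crawler_request_url("http://example.com/eco-shop/search?page=2"))
# -> http://example.com/eco-shop/search?page=2
```

That's why switching the pagination to real query-string or path URLs immediately gives Google more pages to crawl.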
One thing that still bugs us all is this example: http://www.google.co.uk/search?sourceid=chrome&ie=UTF-8&q=CEBRA+RHASSOUL+ORGANIC+MOROCCAN+LAVA+CLAY This one drives us mad because:
a) The product page for this (http://www.ethicalcommunity.com/eco-shop/toiletries-and-cosmetics/cebra-rhassoul-organic-moroccan-lava-clay-8206) IS in our sitemap.
b) It features on our homepage (and has done for at least a week), so in theory Google should have crawled this link by now.
c) We get a huge amount of traffic (relatively speaking - about 40% of total traffic on Sunday) for the term RHASSOUL CLAY. What's annoying is that this traffic lands on a content piece about the clay, but the product itself is nowhere to be seen. Even googling the exact URL of the product page returns nothing, so if Google does know about the page, it's blocking it.
This would also tend to invalidate the theory that it's the AJAX search causing all of this, as this product actually has accessible links from the homepage.
I have a feeling that solving this one case would solve the issue for most of the other products on the site.
Any ideas here?
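As an aside, for anyone wanting to sanity-check point (a) on their own site, one rough way is to parse the sitemap and test whether the URL is actually listed. A minimal sketch - the sitemap snippet and URL below are made up for illustration:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_in_sitemap(sitemap_xml: str) -> set:
    """Return the set of <loc> URLs listed in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

# Made-up sitemap snippet for illustration:
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/eco-shop/product-123</loc></url>
</urlset>"""

print("http://www.example.com/eco-shop/product-123" in urls_in_sitemap(sample))  # True
```

If the URL is in the sitemap but still not indexed, the problem is crawling or indexing, not submission.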
-
I think navigation issues may well be what's tripping you up.
On http://www.ethicalcommunity.com/eco-shop/buy/Food-and-Drink, when I navigate through the product pages, I am not seeing the HTML change to reflect the new contents - the HTML still reflects the coffee content of the first page.
However, when I copy and paste the URL directly to page 3, I get the correct HTML for the products on that page.
AJAX isn't my strong point, but you need to aim to give search engines access to your products via clear categorisation and pagination. Has your developer considered using a hashbang (#!)?
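To illustrate what the hashbang buys you: under Google's AJAX crawling scheme, Googlebot rewrites a #! URL into an _escaped_fragment_ query, which your server can answer with a static HTML snapshot of that state. A rough Python sketch of the URL mapping only (example URLs are made up; the server-side snapshot is the part your developer would build):

```python
def escaped_fragment_url(pretty_url: str) -> str:
    """Map a hashbang (#!) URL to the ?_escaped_fragment_= URL that
    Googlebot requests under the AJAX crawling scheme."""
    if "#!" not in pretty_url:
        return pretty_url  # no hashbang: crawled as a normal URL
    base, fragment = pretty_url.split("#!", 1)
    # Per the scheme, %, #, & and + in the fragment are percent-encoded
    # ("%" first, so the escapes themselves aren't re-escaped).
    for ch, enc in (("%", "%25"), ("#", "%23"), ("&", "%26"), ("+", "%2B")):
        fragment = fragment.replace(ch, enc)
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + fragment

print(escaped_fragment_url("http://example.com/eco-shop/buy/Food-and-Drink#!page=3"))
# -> http://example.com/eco-shop/buy/Food-and-Drink?_escaped_fragment_=page=3
```

So each paginated state gets its own crawlable URL, provided the server returns the right snapshot for each _escaped_fragment_ value.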
Here's an SEOmoz post that may help: http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Let me know if this turns out to be the issue - interested to know.
S
-
Thanks, will do - look forward to it, Stephen.
-
Check your AJAX and what happens to your HTML when your products load. Will look at it more later.
-
Also!...
- You touched on slow loading speeds - can you give me something quantifiable here? As far as we know these should be fairly competitive; we have run various speed tests and they seem to come out OK in terms of loading time. Can you back up that statement with something more solid? Many thanks again.
-
RE: 3) Perhaps I misunderstood here, but maybe what you meant by duplicate content was that a lot of the content on the pages is "re-used" stuff, for example:
ETHICAL CREDENTIALS
Click the icons below to learn about this product's ethical credentials.
etc......
Is that more what you meant, rather than the product descriptions being copied and pasted from other shop sites (as only some of the descriptions will be copy/paste jobs)?
- Benchmark...
http://www.etsy.com/listing/70625156/waving-leaves-crisp-white-and-lime-green?ref=fp_treasury_2 is a huge benchmark for us in terms of high-volume optimised pages (and a similar model to ours, so one to review for sure). I guess if you break down the elements on that page, a lot of the content is individual to the page... is this what you mean?
-
Hi Stephen
Thank you for your feedback here...
-
Stripping the pages down: thanks for this, a valid comment. We'll get our developer to look into what we can strip out to put more focus on the content.
-
That server would appear to be from Woopra, which, as you may know, is a live tracking tool we use to see real-time site activity, pathway analysis, etc. We use it quite a lot and have not heard that it could impact the site - do you know otherwise?
-
Duplicate content: I wouldn't say the majority of pages are duplicated. Of course, some sellers have their own product pages and copy/paste into ours, but a lot of them also write fresh descriptions. Even the pages with freshly written descriptions are nowhere to be seen.
I personally think there is another factor influencing this, as duplicate content still doesn't explain why some of the pages don't come up at all with an exact product search (even when searching the URLs).
Look forward to hearing further thoughts, and thanks again for your time on this.
-
http://www.ethicalcommunity.com/eco-shop/coffee/papua-new-guinean-coffee-454g-10261
All these coffee pages look like crappy duplicate content - you have such a tiny percentage of actual content (product description) vs. all the other stuff on the page. On product pages, strip out all the extraneous crap and just focus on selling your product.
Also, the site is very slow for me - run YSlow / Google Page Speed and see what you can improve.
Also check woopra-ns.com