Best practices for structuring an ecommerce site
-
I'm revamping my wife's ecommerce site. It's currently a very low-traffic website that isn't indexed very well in Google. My plan is to restructure it based on best practices that help me avoid duplicate-content penalties and make it easier to index.
The store has about 7 types of products. Each product has approximately 30 different size variations that are sometimes specifically searched for.
For example: 20x10x1 air filters, 20x10x2 air filters, 20x10x1 allergy reducing air filters, etc
So, is it best for me to create 7 products, each with 30 size variations (a size selector at the product level that changes the price), or is it better to create 210 product pages, one for each style/size combination?
-
People do both. They will search for particular sizes as well as general broad searches and then once on the site, drill down to their particular size.
-
Here is what I think about this:
If people actually search for specific sizes, go with 210 separate products, as it will be easier for each page to rank for its specific size.
If people search by product name, go with 7 products and offer the different sizes on each product page.
Hope this helps!
-
Thanks for the response.
I was leaning that way simply because writing unique content for 7 pages would be so much easier. Plus, managing 300+ unique pages would be a pain to deal with.
-
The chances of a product page earning inbound links are better if the size variations are on the same page. That way, if someone links to you for carrying the 10x20x1 air filter, that same page also serves (and gains authority for) someone looking for the 10x20x2.
On the page itself you can show pictures and copy for the various sizes. Given how much Google clusters results based on similarity, you're better off with one page that's as strong as possible than with lots of low-strength ones.
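The "one strong page" approach above can be sketched in code. Here is a minimal, illustrative Python sketch (the URL scheme, product slugs, and helper names are all hypothetical, not from the thread): each size-variant URL declares the main product page as its canonical, so a link earned for any one size consolidates on the page that ranks for every size.

```python
# Hypothetical sketch: consolidating size variants onto one canonical
# product page. Slugs, sizes, and the URL scheme are illustrative only.

PRODUCTS = ["air-filters", "allergy-reducing-air-filters"]  # 7 in practice
SIZES = ["20x10x1", "20x10x2", "20x10x4"]                   # ~30 in practice

def variant_url(product: str, size: str) -> str:
    """URL a shopper reaches via the size selector."""
    return f"/products/{product}/?size={size}"

def canonical_url(product: str) -> str:
    """The single strong page that all size variants point to."""
    return f"/products/{product}/"

# Every variant emits the same canonical tag, so inbound links to any
# size strengthen one page instead of 30 weak ones.
for product in PRODUCTS:
    for size in SIZES:
        tag = f'<link rel="canonical" href="{canonical_url(product)}">'
        print(variant_url(product, size), "->", tag)
```

With 7 products and ~30 sizes each, this yields 210 variant URLs but only 7 pages competing in the index.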