An Infrastructure Change for a Large eCommerce Site - Any advice?
-
Hello Mozers,
We're currently undergoing quite a large infrastructure change to our website and I would like to hear your thoughts on the type of things we should be careful of.
We currently have close to 4,000 individual products, each with its own page. The SEO work is then driven behind certain pages which house a catalogue display of groups of products, grouped by style. For example, we have a page called "Style A" which displays 8 different colours of Style A. We then SEO the Style A page, and the individual items receive minimal SEO work.
The change would involve having one individual product page for each style; on that page the user would be able to purchase the different colours/variations via menus. This will result in approximately a 70% reduction in the size of our site (as several products will no longer be published).
The things we are currently concerned with are:
1. The loss of equity to those unwanted 'Style A' pages - I think a series of carefully planned 301s will be the solution.
2. Possible loss of long tail traffic to the individual products which might not be caught by one individual page per style.
3. Internal link structure will need to be monitored to make sure that we're still highlighting the most important pages as, well, important.
Sorry for the long post - it's a difficult change to explain without revealing the client's name. Any other things we should be thinking about would be greatly appreciated!
Thanks
Nigel
-
1. The loss of equity to those unwanted 'Style A' pages - I think a series of carefully planned 301s will be the solution.
If you 301 redirect the discarded pages, you might even see a gain in equity.
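To make those "carefully planned 301s" concrete, here is a minimal sketch (with a hypothetical URL structure - your actual paths will differ) that generates Apache RewriteRule lines from a simple old-to-new mapping, so every retired colour page points at its consolidated style page:

```python
# Hypothetical old-URL -> new-URL mapping: every retired per-colour
# product page redirects to its consolidated style page.
old_to_style = {
    "/products/style-a-red": "/styles/style-a",
    "/products/style-a-blue": "/styles/style-a",
    "/products/style-b-green": "/styles/style-b",
}

def redirect_rules(mapping):
    """Emit one permanent (301) Apache rewrite rule per retired URL."""
    return [
        f"RewriteRule ^{old.lstrip('/')}$ {new} [R=301,L]"
        for old, new in sorted(mapping.items())
    ]

for rule in redirect_rules(old_to_style):
    print(rule)
```

Keeping the mapping in one place like this also makes it easy to spot redirect chains or orphaned old URLs before the switch-over.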
2. Possible loss of long tail traffic to the individual products which might not be caught by one individual page per style.
Actually, with lots more words on a page you might see a gain in long-tail traffic. The only way to know is to try it; I only mention this because it might not be a loss.
More importantly, you might be moving away from a potential duplicate content problem, as these pages might be very similar.
3. Internal link structure will need to be monitored to make sure that we're still highlighting the most important pages as, well, important.
This job is always present.
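One way to keep on top of that ongoing job is to audit inbound internal links per page after the restructure. A minimal sketch, assuming a crawl has already produced a list of (source, target) internal link pairs:

```python
from collections import Counter

# Hypothetical crawl output: (source page, target page) internal links.
links = [
    ("/", "/styles/style-a"),
    ("/", "/styles/style-b"),
    ("/styles/style-b", "/styles/style-a"),
]

def inbound_counts(link_pairs):
    """Count internal links pointing at each page, so the most
    important style pages can be checked against the rest of the site."""
    return Counter(target for _, target in link_pairs)

counts = inbound_counts(links)
# Pages with the most internal links pointing at them should be the
# pages you most want to rank.
print(counts.most_common(2))
```

Running this before and after the change gives a quick diff of which pages gained or lost internal link weight.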
-
Your concerns are certainly valid, but in my opinion you should definitely go forward with your plan. Especially in the post-Panda world, we're seeing Google really reward simplicity in design and infrastructure. Moreover, consolidating all of the different colors of one style onto one page makes the most sense for users, in terms of creating an intuitive, faster, and smoother browsing experience.
301 redirects are the right move for the product pages that you phase out, and I think you will find link building and SEO work at the product level much easier with fewer pages to focus on. As for the long-tail traffic loss, this is a valid concern, but you can obviously list the different available colors on each product page. I would also beef up long-tail optimization with a push for user-generated content in the form of user reviews. If you don't already accept these, consider doing so; if you do, consider a promotion of some kind to stimulate a big push for more. You can have users select the color of the item they are reviewing in order to get those terms on the page more frequently.
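On the long-tail point, a small sketch (hypothetical function and data, not anyone's actual template) of making sure every color variant is named in plain text on the consolidated page, so variant queries like "style a red" can still match the single page:

```python
def colour_copy(style_name, colours):
    """On-page sentence naming every available colour of a style."""
    if len(colours) == 1:
        listed = colours[0]
    else:
        listed = ", ".join(colours[:-1]) + " and " + colours[-1]
    return f"{style_name} is available in {listed}."

print(colour_copy("Style A", ["red", "blue", "green"]))
```

The same idea applies to review snippets: if reviewers pick a colour, that term lands on the page in natural text rather than only in a dropdown's markup.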