Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
What are your best tips for SEO on a shopping cart?
-
So, I am working on a shopping cart platform (X-Cart) and so far don't like it. Also, the web designer is not someone I've worked with before and he is understandably conservative about access--which limits what I can and cannot do from the back end.
One of the things I like to do is include descriptive text for the search engines. However, for conversion's sake, I think the product images on a landing page (main brand info with the specific products) should appear first.
So I am thinking of adding the text below the product images on the brand pages: the viewer sees the products first, while the content still works for SEO. My practice is to use a minimum of 300-350 words per page.
Just wondering what best practices you have for a shopping cart. Care to share?
Any tips or hints? Thoughts on what I might do that would be most effective?
As always, thanks in advance for your sage advice!
-
Thanks Ian...I'll take a look. Nothing I can do myself, but depending on what I see, I can add it to the "to do" list.
-
What he is saying (I think) is that sessions can generate a limitless number of URLs related to shopping carts. If you use GET form actions for a shopping cart, the entire form's data shows up in the URL, and each variation can be interpreted by a search spider as a unique site link, leading to thousands of products being added to shopping carts by spiders. This isn't anything you can fix yourself, but you can test for it by adding an item to a shopping cart and seeing whether you get a URL that basically says "viewcart" or one that says "addtocart&item=234235".
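While you wait on the developer, one stopgap is to keep spiders away from those cart URLs in robots.txt. A minimal sketch, assuming the cart lives at a path like /cart.php (the actual paths depend on your X-Cart setup, so check your own URLs first):

```
# Hypothetical robots.txt entries - substitute your cart's real paths
User-agent: *
Disallow: /cart.php
Disallow: /*?mode=add
```

This doesn't fix the underlying GET-form problem, but it stops well-behaved crawlers from stuffing carts in the meantime.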
-
I'm not sure what you are discussing Highland. It would be great if you could clarify what SEO technique you are talking about.
-
Hi Aron,
Yes, I write original content and am focusing on unique information that has not been given. The big problem on this site is that there is little information, if any.
I think allowing customers to rate and comment on products is a great idea for user-generated content. Since I cannot do much from my end with the limited access, I'll suggest it to the client.
He does have social media accounts set up with a couple of icons on the site but I think creating more interactions would be a great idea.
-
One thing we do is we don't give you a session on your first hit. Since your cart is tied to your session, going to the cart without one just generates a "Your cart is empty" page. If you're a customer you'd need at least 2 hits to get to the cart and once you have a session it lasts 30 days. This cuts down a ton on bot problems with the cart. Yes, this is a technical thing (and probably not something your developer can do) but I've always found it useful.
Also, I once made the newbie mistake of allowing an item to be added to the cart via a GET request. Don't do it! Adding to the cart should always be a POST.
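To illustrate the GET/POST point, here is a minimal, framework-agnostic sketch (the function and parameter names are hypothetical, not X-Cart's actual API): a GET is something spiders will follow, so it must never change the cart; only a POST mutates state.

```python
# Sketch: route cart mutations through POST only.
# GET requests are "safe" - crawlers follow them freely, so they must be read-only.

def handle_cart_request(method: str, params: dict, cart: list) -> str:
    """Add an item only on POST; a GET never mutates the cart."""
    if method == "POST" and "item" in params:
        cart.append(params["item"])
        return "added"
    if method == "GET":
        # A spider hitting this link can't damage anything.
        return "empty cart" if not cart else f"{len(cart)} item(s)"
    return "ignored"
```

Combined with the "no session on first hit" rule above, a crawler that has never posted a form only ever sees "empty cart" pages.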
-
My best tip for shopping sites is to ensure your content is unique and well written, giving as much genuine information on the product as possible. I always like to include bullet points to highlight features too.
Allowing customers to rate and comment on products is a great way to build up some awesome UGC on product pages.
Both of these options will help SEO and potentially help conversion too.
Including social sharing buttons so people can Like or tweet products can't hurt either.
-
Thanks EGOL, yes I agree. One of the main problems with this site is the lack of optimized links to the categories and to the products. Your point about related links is a great one: "If you like this product, you will like this one..." Using the product names as anchor text is a good strategy.
-
One of the most overlooked concerns is the navigation links that will drive power and anchor text into the product pages.
These could be from category pages, related product pages, article pages, homepage, blog pages, sitemap and more.
If you want your product pages to be vigorous in the index, you need to drive some link juice into them. In addition, on-site text links can work magic for your rankings.
Related links on product pages can also lead to increased sales and larger shopping carts.
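As an illustration of that last point, a related-products block whose anchor text is the full product name passes both link equity and a keyword signal to each product page (product names and URLs here are hypothetical):

```html
<!-- Related-product links: the anchor text is the product name itself -->
<ul class="related-products">
  <li><a href="/widgets/blue-widget-pro">Blue Widget Pro</a></li>
  <li><a href="/widgets/blue-widget-mini">Blue Widget Mini</a></li>
</ul>
```

Compare that with generic "click here" or image-only links, which pass power but no anchor-text relevance.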
-
Hi Stephan...no worries, I am just working on the brand and product pages but thanks for sharing your concerns.
-
Thanks again Ryan. No, the developer will not give admin privileges, which is a pain, but I do have control over URLs, titles, descriptions, and metas. I sent over instructions for canonicals, the 404 page, .htaccess edits, and a few other requests, which he implemented and then billed the client for.
He actually billed the client for setting me up with non-admin access (which I did not need).
Most of the web designers I work with let me in to do my thing, so this is a bit difficult and annoying. LOL. I can't get to the images but do have access to some of the alt text.
The big issue is that it is costing my client unexpected expenditures, but we are doing our best to work around it.
I am heading to the X-Cart forums in the a.m. for some other ideas and appreciate your consistent input--most valuable.
-
If you are talking about the shopping cart itself, after folks put stuff in it: I would make that a noindex. I get paranoid about security and privacy, so I would keep anything that is personalized in any way out of the search index. You don't want a link to a shopping cart with a specific shopper ID getting into Google. Even if the shopper ID doesn't tie back to any personal data, you could end up with multiple people sharing the same cart ID.
Anyway... I would suggest making the cart a noindex. I'm paranoid about it, so I'd play it safe.
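For anyone unsure how to do that: a minimal sketch of the noindex directive in a cart page template (exact placement depends on your cart software; an X-Robots-Tag HTTP header works just as well if you can't edit the template's head):

```html
<!-- In the <head> of cart/checkout templates:
     keep personalized pages out of the search index -->
<meta name="robots" content="noindex, nofollow">
```

The equivalent HTTP header form is `X-Robots-Tag: noindex, nofollow`, which some hosts let you set in .htaccess for a whole path.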
-
The best practices for a shopping cart would be the same as any other web page. Each cart software offers varying abilities to natively conform to SEO best practices. Many popular carts offer extensions to bridge the gap between the software design and the changes necessary for improved SEO performance.
If I were in your situation, I would request control over the URL, page title, header, text, alt text, etc. All the normal SEO factors. If the web developer is not willing to provide you access, then ask the developer to make the changes. Likely either the developer will tire of making so many changes and grant you permission, or the site owner won't like paying a developer to make text changes and will request that you be given access. Either way, the result is that the needed changes get made.
My best suggestion would be to inquire on the X-Cart software forums as to what SEO extensions and customizations are available.