Canonical tags pointing to "page=all" pages on an e-commerce website
-
I find it alarming that my client's canonical tags point to "page=all" product gallery pages. Some of these gallery pages list over 100 products, and I think this could affect load time, especially on mobile. I would like some insight from the community on this, thanks!
-
Currently my 301s redirect to the relevant pages. For example:
www.shoes.com/category/redshoes.do <-- current redirect target (301 from www.shoes.com/category/myredshoes.do)
www.shoes.com/category/redshoes.do?sortby=page1
www.shoes.com/category/redshoes.do?sortby=page2
www.shoes.com/category/redshoes.do?sortby=page=all <-- **current canonical**
www.shoes.com/category/redshoes.do?sortby=page=all <-- should I redirect www.shoes.com/category/redshoes.do here instead?
I basically want to consolidate my authority onto one page, and I'm weighing whether redirecting to the "page=all" URL, in addition to my canonical, would improve that page's overall performance.
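For illustration, this is roughly what the current setup amounts to: each paginated URL declares the view-all version as canonical in its `<head>`. A minimal sketch, using the hypothetical URLs above:

```html
<!-- In the <head> of a paginated page such as ...?sortby=page1 -->
<!-- Tells search engines to consolidate ranking signals on the view-all version -->
<link rel="canonical" href="http://www.shoes.com/category/redshoes.do?sortby=page=all" />
```

The canonical only hints at consolidation; the paginated URLs still exist and still load for users, which is why the redirect question above is separate.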
-
I asked John Mueller in a recent hangout about 301 redirects, and he stated that if you have multiple 301s from the same domain all pointing at a single page, e.g. the homepage, then Google may discount many of those 301s and treat them as 404s. In my case, I had done a migration and, being lazy, 301'd all the URLs to the homepage. He was saying to map them like for like or you could lose out.
So I guess it depends on your 301s.
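For what it's worth, a like-for-like mapping looks something like this as an Apache `.htaccess` sketch (this assumes Apache with mod_alias, and the blue-shoes URLs in the second rule are invented for illustration):

```apache
# Like-for-like: each old URL 301s to its closest equivalent page,
# rather than a blanket redirect of everything to the homepage.
Redirect 301 /category/myredshoes.do  /category/redshoes.do
Redirect 301 /category/myblueshoes.do /category/blueshoes.do
```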
Pete
-
The rel=canonical tag is said to pass about the same amount of link juice (ranking power) as a 301 redirect, so should I also point my redirects to the "view=all" page to aggregate Page Authority?
-
We use both rel=next and rel=prev along with a canonical tag pointing to the view-all pages on our eCommerce site. As Greenstone mentions above, this is what Google recommends.
We also use the Cloudflare CDN (content delivery network), which takes care of any speed issues. They offer a free package you can use to trial it, and the paid packages are also very good value, roughly $20-30 per month from memory, but it makes the website lightning quick. It's very easy to set up, too.
Pete
-
Implementing rel=canonical from a paginated series to a "view all" page is certainly recommended practice from a technical standpoint.
That said, it should only be the course you take if it enhances the user experience. If the page takes too long to load and users abandon it altogether, it helps no one. I would certainly run speed tests and check its usability.
- If it takes longer than a few seconds, I would certainly recommend checking for ways to speed it up.
- If that proves difficult, there is certainly room to consider a more manageable paginated series with rel=prev and rel=next tags, so search engines are aware these pages form a related series.
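As a sketch of that paginated alternative, a middle page in the series would point at its neighbors from its `<head>`; this assumes a hypothetical page 2, and the page3 URL is invented for illustration:

```html
<!-- In the <head> of page 2 of the series -->
<link rel="prev" href="http://www.shoes.com/category/redshoes.do?sortby=page1" />
<link rel="next" href="http://www.shoes.com/category/redshoes.do?sortby=page3" />
```

The first page in the series carries only rel=next and the last only rel=prev, so crawlers can infer the boundaries of the series.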