Trailing Slash Problems
-
Link equity is being split between the trailing-slash and non-trailing-slash versions of my URLs, e.g. ldnwicklesscandles.com/scentsy-uk and ldnwicklesscandles.com/scentsy-uk/.
I initially asked in here and was told to do a rewrite in the .htaccess file.
I don't have access to this with Squarespace, nor can I add canonical tags on a page-by-page basis.
A 301 redirect from scentsy-uk to scentsy-uk/ didn't work either: the browser showed an error saying the redirect wasn't completing (a redirect loop).
Squarespace hasn't been very helpful at all.
My question is....is there another way to fix this? or should I just call it a day with squarespace and move to wordpress?
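For later readers: the .htaccess rewrite mentioned above would, on a self-hosted Apache site, typically look something like the sketch below (Squarespace does not expose this file, which is the whole problem here). This assumes you standardize on the trailing-slash version:

```apache
# Sketch of an .htaccess rule that 301-redirects non-trailing-slash URLs
# to their trailing-slash equivalents, skipping real files (images, CSS).
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```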
-
I know this is an old thread, but I'm just wondering if anyone ever found a solution in Squarespace, or did everyone just move over to WordPress?
-
You'll be hard-pressed to find a hosted platform that is technically optimized for search engines. Adobe Catalyst, Squarespace, Wix, etc. will all have minor (or major) issues. I don't know of many really popular sites hosted on these platforms, but that's not to say those hosted sites won't rank well for their chosen keywords. Anyway, here's what Google has to say about it: http://www.youtube.com/watch?v=CTrdP7lJ2HU
-
Hi Christine:
Did you ever find a solution for this? I have a client whose Squarespace site shows rel=canonical issues in my recent crawl. And to your point, you can't implement that on a per-page basis. Squarespace hasn't responded (yet) to a service request. Any suggestions would be helpful. Thank you!
-
Is there a way to get around this without moving to WordPress? I will only do that if there's absolutely no other way to help my site.
-
Looks like a move to WordPress is a safe bet then, as your current platform seems very SEO-unfriendly.
When you do move to WordPress, be sure to check out the Yoast SEO plugin: http://yoast.com/wordpress/seo/
-
Aran, I can't add canonical tags on a page-by-page basis. x
-
Have you tried using the canonical tag?
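For reference, a canonical tag is a single line in the page's <head> pointing at the preferred version of the URL; a sketch, assuming the trailing-slash URL is the one being standardized on (the catch in this thread is that Squarespace didn't allow adding it per page):

```html
<!-- Hypothetical example: declare the trailing-slash URL as the canonical one -->
<link rel="canonical" href="http://ldnwicklesscandles.com/scentsy-uk/" />
```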
-
Essentially, it's only an issue when you have links to both the slash and non-slash versions. I would standardize on either having trailing slashes or not, and make sure all links on the site follow that standard. However, given that the platform lacks basic SEO controls, I would convert to WP.
Related Questions
-
I have two robots.txt pages for the www and non-www versions. Will that be a problem?
There are two robots.txt pages: one for the www version and another for the non-www version, though I have moved to the non-www version.
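One common fix (a sketch, not from this thread, assuming an Apache server with mod_rewrite) is to 301 the www host to the non-www host, so only one robots.txt is ever served:

```apache
# Sketch: redirect all www requests to the non-www host, so a single
# robots.txt (and a single version of every URL) is served.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301]
```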
Technical SEO | ramb0
-
How to force Wordpress to remove trailing slashes?
I've searched around quite a bit for a solution here, but I can't find anything. I apologize if this is too technical for the forum. I have a WordPress site hosted on Nginx by WP Engine. Currently it resolves requests to URLs either with or without a trailing slash. So, both of these URLs are functional: <code>mysite.com/single-post</code> and <code>mysite.com/single-post/</code> I would like to remove the trailing slash from all posts, forcing mysite.com/single-post/ to redirect to mysite.com/single-post. I created a redirect rule on the server: ^/(.*)/$ -> /$1 and this worked well for end users, but rendered the admin panel inaccessible. Somewhere, WordPress is adding a trailing slash back onto the URL mysite.com/wp-admin, resulting in a redirect loop. I can't see anything obvious in .htaccess. Where is the rule that adds a trailing slash to 'wp-admin' established? Thanks very much
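A common Nginx pattern for this situation (a sketch, not from the thread; it assumes the host allows custom Nginx rewrite rules, and that WordPress itself is forcing the trailing slash on admin URLs) is to exempt the admin paths from the slash-stripping redirect:

```nginx
# Sketch: strip trailing slashes site-wide, but leave wp-admin and
# wp-login URLs alone to avoid the redirect loop described above.
rewrite ^(?!/wp-(?:admin|login))(.+)/$ $1 permanent;
```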
Technical SEO | james-tb0
-
Duplicate content problem
Hi there, I have a couple of related questions about the crawl report finding duplicate content: We have a number of pages that feature mostly media (just a picture or just a slideshow) with very little text. These pages are rarely viewed and they are identified as duplicate content, even though the pages are indeed unique to the user. Does anyone have an opinion about whether we'd be better off just removing them, since we do not have the time to add enough text at this point to make them unique to the bots? The other question: we have a redirect for any 404 on our site that follows the pattern immigroup.com/news/*; the redirect merely sends the user back to immigroup.com/news. However, Moz's crawl seems to be reading this as duplicate content as well. I'm not sure why that is, but is there anything we can do about it? These pages do not exist; they just come from someone typing in the wrong URL or clicking on a bad link. But we want the traffic; after all, the users land on a page that has a lot of content. Any help would be great! Thanks very much! George
Technical SEO | canadageorge0
-
Weird problems with Google's rich snippet markup
Once upon a time, our site was ranking well and had all the markups showing up in the results. We then lost some of our rankings due to dropped links and poor maintenance. Now we are gaining back the rankings, but the markups don't show up in the organic search results. When we Google site:oursite.com, the markups show up, but not in the organic search. There are no manual actions against our site. Any idea why this would happen?
Technical SEO | s-s0
-
Long title problem
I'm getting an incredible number of 4xx errors and long titles from a small website (northstarpad.com): over 13k 4xx errors and almost 20k "title element is too long" warnings. The number keeps climbing, but the site shouldn't have more than a couple hundred pages. When I look at the 4xx errors, they are clearly being generated by some program, since they have multiple, repeating keywords. Here's an example: http://northstarpad.com/category/wedding-photographer-farmington-michigan/pet-photography/wedding-photography/pet-photography/wedding-photography/wedding-photography/wedding-photography/pet-photography/wedding-photography/wedding-photography/pet-photography/ I looked at the FTP files and plugins and couldn't see anything that could cause it, but I'm a beginner, so no surprise there. Any suggestions on where to look or how to fix this?
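A stopgap sometimes used for runaway generated URLs like these (not from the thread; note that * wildcards in Disallow are honored by Google and Bing but are not part of the original robots.txt standard) is to block the repeating pattern in robots.txt while the plugin creating the bad links is tracked down:

```text
# Hypothetical robots.txt stopgap: block crawling of URLs where a
# category segment repeats, i.e. the recursively generated paths.
User-agent: *
Disallow: /*/wedding-photography/*/wedding-photography/
Disallow: /*/pet-photography/*/pet-photography/
```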
Technical SEO | dwerkema0
-
Issues with trailing slash URLs
Recently we changed our website to www.example.com/super-rentals/ (example) and set up a 301 redirect from the old URLs to the new ones. We have noticed in Google Webmaster Tools that URLs without a trailing slash, e.g. www.example.com/super-rentals, show up as 404 errors. Please let us know how to fix this issue as soon as possible. Note: our previous URLs were not the URLs without the trailing slash; they were different URLs (www.example.com/super-rentals.htm) that we rewrote to www.example.com/super-rentals/. I would like to know why GWT pulls out the URLs without the trailing slash and reports them as 404 errors. Thanks for your time
Technical SEO | massimobrogi0
-
How to publish duplicate content legitimately without Panda problems
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columnists. Your visitors love these articles and columns, but the search engines see them as duplicate content. You worry about being viewed as a "content farm" because of this duplicate content and getting the Panda penalty. So, you decide to continue publishing the content and use... <meta name="robots" content="noindex, follow"> This allows you to display the content for your visitors, but it should stop the search engines from indexing any pages with this code. It should also allow robots to spider the pages and pass link value through them. I have two questions: If you use "noindex", will that be enough to prevent your site from being considered a content farm? Is there a better way to continue publishing syndicated content while protecting the site from duplicate content problems?
Technical SEO | EGOL0
-
Slashes in URLs
If your CMS has created two URLs for the same piece of content that look like the following, www.domainname.com/stores and www.domainname.com/stores/, will this be seen as duplicate content by Google? Your tools seem to pick it up as errors. Does one of the URLs need a 301 to the other to clear this up, or is it not a major problem? Thanks.
Technical SEO | gregster10000