Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Google also indexed trailing slash version - PLEASE HELP
-
Hi Guys,
We redesigned the website and somehow our canonical extension decided to add a trailing slash to all URLs. Previously our canonical URLs didn't have a trailing slash.
During the redesign we didn't change the URLs. They remained the same, but we now have two versions indexed: one with a trailing slash and one without.
I've now fixed the issue and removed the trailing slash from the canonical URLs.
Is this the correct way of fixing it? Will our rankings be affected in a negative way?
Is there anything else I need to do?
The website went live last Tuesday.
Thanks
-
That's great! The canonical URLs are showing the versions without the slash, as they are probably reflecting the original URLs, which had no slash. Hope Google clears the duplicates out soon.
-
Seems like you got the 301-redirect resolved below - if you've got that in place and fixed the canonical tag, it should be ok. It'll just take some time (usually longer than you'd like) for Google to clear out the pages, especially the deeper ones. If you see gradual de-indexation, though, you'll probably be fine.
-
-
Actual rel="canonical" tags.
-
As soon as we realised, everything was fixed. The canonical tag is showing URLs without the slash, and we also applied an .htaccess rule to redirect the slash version to the non-slash version.
We're using www.shopify.com.
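For reference, the fixed tag on both the slash and non-slash versions of a page should point at the same non-slash URL - something like this, with a placeholder URL standing in for a real page:
<link rel="canonical" href="http://www.mydomain.com/jason.html" />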
-
-
Could you clarify a couple of things:
(1) When you say canonical URLs, do you mean your internal links, or the actual URLs in your rel="canonical" tags?
(2) If it was just the canonical tags, is everything consistent now (tags, internal links, etc.)?
Since both versions will resolve, just fixing the canonical tags (if that's the issue) should be enough - it's just going to take a little time. They should be as effective as a 301-redirect in this case. Either way, though, it can take Google a while to kick out the duplicates. I'd just monitor the index closely and make sure the top-level pages are clearing up (i.e. your home-page and major category duplicates should be disappearing). If that's happening, you're ok - you just need to wait a bit. If that's not happening, then you may have some other mixed signals in play.
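If you want to sanity-check the tags while you monitor the index, fetching each version and grepping for the canonical line is a quick way to confirm both point at the non-slash page (placeholder URL, assuming curl and grep are available):
curl -sL http://www.mydomain.com/jason.html/ | grep -i 'rel="canonical"'
curl -sL http://www.mydomain.com/jason.html | grep -i 'rel="canonical"'
The -L flag follows any redirect first, so both commands should end up printing the same canonical URL.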
-
You are welcome.
Well, you did submit the sitemap correctly the first time, but since Google has found the trailing-slash URLs on your website and indexed them, it would be good to notify the big G that they are no longer part of your website - resubmitting would not hurt.
About the redirections, Google does take a bit of time to understand that the URLs have permanently moved and will gradually remove them from the index. So, keep checking the index for the trailing slash URLs and when they are gone, you can remove the redirections.
Cheers,
-
Thanks a lot.
Now when I click the slash version of an indexed URL in Google, it goes to the non-slash version. So it seems we're safe now.
The other thing is that when I submitted the sitemap.xml after launch, it was without the slash. All internal links also target the non-slash URLs. I think Google should understand that this was a technical issue and that it has now been solved.
When should I remove that redirect?
-
Yup, it's done. Just need to be sure the home page is fine - the indexed version of the home page should stay as it is, without any redirection.
Cheers,
-
I checked with this website: http://www.internetofficer.com/seo-tool/redirect-check/
It says:
http://www.mydomain.com/jason.html/
Type of redirect: 301 Moved Permanently
http://www.mydomain.com/jason.html
So it looks as if it's done the job. Right?
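If you ever want to double-check without a third-party tool, requesting just the response headers from the command line (same placeholder URL as above) should show the 301 status and the Location it points to:
curl -I http://www.mydomain.com/jason.html/
The Location line in the output should be the non-slash URL.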
-
Sounds good - do keep checking to make 100% sure. I believe the search engines will be fine now.
Cheers,
-
# 301-redirect any .html URL that picked up a trailing slash back to the clean .html URL
RewriteRule ^([^/]+/)*([^/.]+)\.html/$ http://www.mydomain.com/$2.html [R=301,L]
Looks like the above did the trick.
-
I think this post can help you understand:
http://html5boilerplate.com/docs/Proper-usage-of-trailing-slash-redirects/
Do try this in a test environment and take a backup of the .htaccess file before making any changes, and have a programmer look it over.
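As a rough sketch of the kind of rule that guide describes - treat it as a starting point only, since it assumes mod_rewrite is enabled and that none of your URLs are meant to end in a slash:
RewriteEngine On
# Leave real directories alone so they keep their trailing slash
RewriteCond %{REQUEST_FILENAME} !-d
# 301-redirect anything else ending in a slash to the same path without it
RewriteRule ^(.+)/$ /$1 [R=301,L]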
Cheers,
-
Please can you tell me how to redirect URLs with a trailing slash to non-slash URLs using .htaccess?
-
Jvalops,
This is a common scenario in SEO: you have two versions of the same URL indexed, which basically creates a duplicate content issue. The solution involves two things:
1. Fix it from the search engines' perspective.
2. Make changes at the server level.
You did remove the trailing slash, so you fixed it at the server level, but you left the search engines wondering: where did the URL go? Am I supposed to show a 404 for it, or what?
So it is important to fix things for the search engines first and then make any server-level changes, because you never know how quickly the crawlers will revisit the disappeared URL and take their own action. Since this is a recent change, I hope the search engines will not evaluate it negatively, but you should be quick to inform them. Now, since you have already removed the slash, do add a rule in the .htaccess file so that any URL with a trailing slash redirects to the URL without it. I hope there are no URLs that have to end with a slash (just have another look at this - the home page and others).
After this is done, to make things clearer to the search engines, resubmit your XML sitemap with all the correct URLs on the website, and I think you will be just fine.
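For what it's worth, the resubmitted sitemap entries would simply list the non-slash form of each URL - for example, using the placeholder URL that appears elsewhere in the thread:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mydomain.com/jason.html</loc>
  </url>
</urlset>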
On the rankings front, I don't think they will be affected unless there was a re-crawl after the indexation.
Cheers,
-
I'm not 100% sure how to answer your question, but an .htaccess 301 might work.
Redirect 301 /example.html/ /example.html
Try that to see if it works.
Related Questions
-
Can I safely assume that links between subsites on a subdirectory-based multisite will be treated as internal links within a single site by Google?
I am building a multisite network based in subdirectories (of the mainsite.com/site1 kind), where the main site is like a company site and the subsites are focused on brands or projects of that company. There will be links back and forth between the main site and the subsites, as if the subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all). Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate using the URL structure, but the structure is the same as for a single site, so you should assume that for SEO purposes the network will be treated as one site." This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because then, if Google establishes that my multisite structure is actually a collection of different sites, links between the subsites and the main site would be considered backlinks between my own sites, which could therefore be considered a link wheel - that is, a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site? P.S. - The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that the subdirectory-based multisite feature will let me map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really was a different website.
Web Design | PabloCulebras
-
No meta description pulling through in SERP with React website - Requesting Indexing & Submitting to Google with no luck
Hi there, A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render and index for "this URL and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description. I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are:
1. If Google didn't reindex anything when I submitted the sitemap, what might be wrong with my sitemap?
2. Is submitting each URL manually bad, and if so, why?
3. Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs?
Any other suggestions?
Web Design | DigitalMarketingSEO
-
Any risks involved in removing a sub-domain from the search index or taking it down completely? Ranking impact?
Hi all, One of our sub-domains has thousands of indexed pages, but the traffic is very low and irrelevant. There are links between this sub-domain and our other sub-domains. We are planning to take this sub-domain down completely. What happens if we do? Will Google respond with a ranking change? Thanks
Web Design | vtmoz
-
Problems preventing WordPress attachment pages from being indexed and from being seen as duplicate content.
Hi, According to a Moz crawl, it looks like the WordPress attachment pages from all image uploads are being indexed and seen as duplicate content... or is it the Yoast sitemap causing it? I see two options in Yoast SEO: (1) Redirect attachment URLs to parent post URL; (2) Media... Meta Robots: noindex, follow. I set it to (1) initially, which didn't resolve the problem. Then I set it to option (2) so that all images won't be indexed, but search engines would still associate those images with their relevant posts and pages. I understand what both options (1) and (2) mean, but because I chose option (2), will that mean all of the images on the website won't stand a chance of being indexed in search engines, Google Images, etc.? As far as duplicate content goes, search engines can get confused, and there are two ways for search engines to reach the correct page content destination. But when, e.g., Google makes the wrong choice, a portion of traffic drops off (is lost, hence errors), which leaves the searcher frustrated, and this affects the SEO and ranking of the site, which worsens with time. My goal here is this: I would like all of the web images to be indexed by Google, and all of the image attachment pages to not be indexed at all (Moz shows the image attachment pages as duplicates, and the referring site causing this is the sitemap URL which Yoast creates); that sitemap URL has been submitted to the search engines already, and I will resubmit once I can resolve the attachment pages issues. Please can you advise? Thanks.
Web Design | SEOguy1
-
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Hi MozCommunity, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version. This new site is a fully AngularJS-driven site. There are currently over 2 million pages, and as we develop the new site in the backend, we would like to test out the tech with Google and Bing. We're looking at a pre-render option to be able to create static HTML snapshots of the pages that we care about the most, which will be available in the sitemap.xml.gz. However, with 3 completely static HTML control pages established - one page with no robots meta tag, one with the robots NOINDEX meta tag in the head section, and one with a dynamic header (X-Robots-Tag) carrying the NOINDEX directive as well - we expected the one without the meta tag to at least get indexed along with the homepage of the test site. In addition to those 3 control pages, we had 3 more pages: an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header. With Google, the correct indexation occurred, with only 3 pages being indexed: the homepage, the listing page and the control page without the meta tag. However, with Bing, there's nothing. No page indexed at all. Not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt directive open to all engines across all pages, yet nothing. I used the Fetch as Bingbot tool, the SEO Analyzer tool and the Preview Page tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking it not to index those pages. I'm stumped. I don't know what to do next to understand whether Bing can accurately process dynamic headers or AngularJS content. Upon checking BWT, there's definitely been crawl activity, since it marked the XML sitemap as successful and put a 4 next to the number of crawled pages. Still no result when running a site: command, though. Google responded perfectly and understood exactly which pages to index and crawl. Has anyone else used dynamic headers or AngularJS who might be able to chime in, perhaps after running similar tests? Thanks in advance for your assistance.
Web Design | AU-SEO
-
Lots of Listing Pages with Thin Content on Real Estate Web Site - Best to Set Them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan I run a web site with over 600 pages. Basically the pages are organized in the following categories:
1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) - 25 pages, low bounce rate
2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) - 15 pages, low bounce rate
3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) - 30 pages, medium/high bounce rate
4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) - 3 pages, high bounce rate
5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do) - 4 pages, high bounce rate
6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) - 300 pages, high bounce rate (65%), thin content
7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway) - 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index, Follow" it could interpret this as the site seeking to hide something and penalize us. Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index, Follow" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index"? Any harm in doing this for about half the pages on the site? I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year, only to start tanking again. It was down to 3,600 in June. About 30 toxic links were removed in late April, and a disavow file was submitted with Google in late April for removal of links from 80 toxic domains. Thanks in advance for your responses!! Alan
Web Design | Kingalan1
-
From Google Sites to WordPress - Anyone Ventured This SEO Terrain?
We have a few sites in Google Sites - and they are ugly! We have a majority (40+) of our websites in WordPress, but we have a few websites just stuck on Google Sites, and since Google won't let you fully edit the HTML, add scripts, or implement any technology released since 2000, we want to move. The sad problem: the Google sites are ranking well. We rank well in Manhattan, Atlanta, Dallas, and Philadelphia. The problem is that the sites do not give much room for growth, and the bounce rate is high because they are so ugly. Has anyone moved from Google Sites to WordPress? Should we just stay with Google and bite the ugly bullet? My fear is that these sites will not allow for growth. It is hard to update them and even harder to make them look nice. To get a sample - beware: www.counselingphiladelphia.com. Even another reason to leave: the slider is non-semantic and terrible for SEO. Google won't allow a slider script with tags and a hrefs, so the only way to implement a slider is through a Google Docs presentation that keeps sliding. I know - terrible SEO (#donthate), but we needed something. Any advice and thoughts would help! Thanks Mozzers!
Web Design | _Thriveworks
-
Custom 404 Page Indexing
Hi - We created a custom 404 page based on SEOmoz recommendations. But... the page seems to be receiving traffic via organic search. Does it make more sense to set this page to "noindex" via its meta tag?
Web Design | sftravel