Noindex follow on checkout pages in 2017
-
Hi,
My website really consists of two separate sites.
Product site:
• Website with product pages.
• These product pages have SEO-optimised content.
Booking engine & checkout site:
• When a user clicks 'Book' on one of the product pages on the aforementioned product site, they go to a separate website which is a booking engine and checkout.
• These pages are not quality, SEO-optimised content; they only perform the function of booking and buying.
Q1) Should I set 'noindex, follow' via the meta tag on all pages of the 'Booking engine and checkout' site?
Q2) Should I add anything to the 'Book' buttons on the product site?
I am hoping all this will help concentrate the SEO value on the Product Site's pages by declaring the Booking engine and Checkout site's pages to be 'not of any content value'.
-
-
Hi
Ironically, Moz will pick this up as a problem, as it reports anything that is noindexed!
For me, I just ignore noindex as a 'problem' in certain cases, as it clearly makes perfect sense to noindex certain pages, and indeed sometimes whole directories.
I sometimes find that developers have noindexed directories like /new-products or /sale, but there are clearly better ways of handling the potential duplicate-content problem there, such as adding a canonical. In your case it makes no sense to have Google index the checkout pages.
Regards Nigel
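For anyone implementing this, a 'noindex, follow' directive is just a standard robots meta tag in the <head> of each booking/checkout page (a generic sketch, not tied to any particular platform):

```html
<!-- In the <head> of each booking engine / checkout page:
     keeps the page out of the index, but still lets crawlers
     follow its links and pass signals through them -->
<meta name="robots" content="noindex, follow">
```

Note that Google needs to be able to crawl the page to see this tag, so the checkout pages should not also be blocked in robots.txt.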
-
Hi Martin / Nigel,
Thanks for your responses. In regards to Q1:
By adding the 'noindex, follow' meta tag to the 'Booking engine and checkout' site's pages, will this also stop Moz from crawling these pages, and consequently remove 'issues' from the Moz Site Crawl issues count? It currently crawls these pages and picks up issues.
-
Hi Nigel,
You're right, I didn't think about the duplicates from UTM previously.
Thanks for the update.
Best, Martin
-
Hi Martin
Surely if the traffic was coming from a different source, then that source would already show in the referring URL. Adding a UTM would simply create duplicate page content between the plain URL and the UTM-tagged URL.
He'd then be faced with the tricky and potentially dangerous task of messing with URL parameters. I just wouldn't mess with creating UTM-tagged URLs.
Apologies - I didn't mean to argue I just couldn't understand your logic.
Regards Nigel
-
Hey Nigel,
As far as I understand the setup of his websites, it consists of two separate websites (unless he meant "page" by "site").
In that case, I think it would be useful to add the UTM so he can see exactly which source a user comes from (since those are two separate websites).
Also, I suppose that by clicking the book buttons on the product site, users will be redirected to the booking site, so you would basically add the UTMs to that URL.
If by "site" he meant only "page", then the solution would be different, of course.
Cheers, Martin
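If you do go the UTM route described here, tagging the book-button URLs is just string manipulation. A minimal sketch in Python (the domain, path, and parameter values below are placeholders, not from the original site):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_utm(url, source, medium, campaign):
    """Append UTM parameters to a URL, preserving any existing query string."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical checkout URL for a book button on the product site:
print(add_utm("https://booking.example.com/checkout",
              "product-site", "referral", "book-button"))
# → https://booking.example.com/checkout?utm_source=product-site&utm_medium=referral&utm_campaign=book-button
```

As Nigel points out above, this does create a second URL for the same content, so it only makes sense alongside a canonical tag (or noindex) on the booking pages.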
-
Hi Martin
Please can you explain why and how you would add UTM parameters to the book buttons on his website?
Thanks Nigel
-
Hey there,
Regarding Q1, I'd set 'noindex, follow', as you've said. Since the booking site has no content value for the visitor, there's no need for it to be found in the Google SERPs.
Regarding Q2, you can add UTM parameters to make the analytics easier in GA.
Since the booking site has no "content value", there's nothing more you can really pass.
Hope it helps. Cheers, Martin