Affiliate Link is Trumping Homepage - URL parameter handling?
-
An odd and slightly scary thing happened today: we saw an affiliate string version of our homepage ranking number one for our brand, along with the normal full set of site-links.
We have done the following:
1. Added this to our robots.txt:
User-agent: *
Disallow: /*?
2. Reinserted a canonical on the homepage (we had removed this when we implemented hreflang, as we had read the two interfered with each other). We haven't had a canonical for a long time now without issue. Is this anything to do with the algo update, perhaps?!
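For context on point 2, rel=canonical and hreflang can generally coexist in the same head as long as each hreflang alternate canonicalises to itself rather than to another locale's URL. A minimal sketch of what the homepage head might look like (all URLs here are hypothetical placeholders, not the poster's actual setup):

```html
<!-- On https://www.example.com/ (the default homepage) -->
<link rel="canonical" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.us/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

The conflict people report usually arises when the canonical points at a different URL than the hreflang annotations do, which sends Google contradictory signals about which page is the preferred version.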
The third thing we're reviewing, which I'm slightly confused about, is URL Parameter Handling in GWT. As advised for affiliate strings, to the question "Does this parameter change page content seen by the user?" we have selected "No", which means Google should crawl one representative URL. But isn't it the case that we don't want them crawling or indexing ANY affiliate URLs? You can tell Googlebot not to crawl URLs containing a particular parameter, but only if you select "Yes: the parameter changes the page content." Shouldn't Google be able to tell an affiliate URL from the original and not index it? I read a quote from Matt Cutts which suggested this (along with putting a "nofollow" tag on affiliate links, just in case).
Any advice in this area would be appreciated. Thanks.
-
I'm glad to hear you've been sorted out, Lawrence Neal. I find it interesting that the other Lawrence saw something similar, and I'll ask around to see if it was a glitch that other people have noticed too.
For anyone reading this wondering what Mr. Neal was referring to regarding the rel=canonical / hreflang conflict, there's a good writeup of it over at Dejanseo.com, and Gianluca Fiorelli mentions it in his comment on Dr. Pete's rel=canonical uber post here on Moz.
-
Luckily it's disappeared today, which leads me to believe it was a Google-side algo error that was swiftly corrected (I doubt anything we have done would have been reflected in the SERPs so quickly).
-
Let's say your site is using PHP.
Your system no doubt picks up the parameter with a PHP GET and stores it as a session variable.
That is likely all that needs to happen before the page is 301 redirected.
The best thing to do is create a test page with the code mentioned above on your site and try it:
have the page redirect to the homepage and see whether the affiliate code is stored.
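A minimal sketch of that test, assuming the affiliate parameter is called `aff` (the actual parameter name and domain will differ on your site):

```php
<?php
// Sketch: capture the (assumed) "aff" affiliate parameter, persist it
// in the session, then 301 to the clean URL so crawlers only ever see
// the parameter-free homepage.
session_start();

if (isset($_GET['aff'])) {
    // Store the affiliate ID before dropping it from the URL.
    $_SESSION['affiliate_id'] = $_GET['aff'];

    header('HTTP/1.1 301 Moved Permanently');
    header('Location: https://www.example.com/');
    exit;
}
```

After the redirect, `$_SESSION['affiliate_id']` should still be available on the homepage, which is what the test above is checking for.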
-
I don't know if this has anything to do with the algo update, but at least you're not the only one. I saw a competitor ranking with a second version of their homepage; the second version had UTM parameters appended.
Luckily the page with the UTM parameters disappeared from the SERPs this morning. He was actually ranking first with the normal version and second with the version with the URL parameters. This was on some pretty competitive keywords and lasted almost three days.
-
Thanks for your reply, Gary. I'm not entirely sure how our (far-reaching and lucrative) affiliate tracking/logging works, but I would have thought 301ing all the links to the original page would sabotage it, no?!
The canonical will certainly work, but we've only reinstated it on the homepage, as we have six other sites with hreflang alternates in place and the canonical seems to interfere with their function.
-
Hmmm... it seems like Google is getting some strong linking signals that this is the popular page to arrive at.
The canonical tag on the homepage is the right way to go.
You could also 301 redirect any customer that lands on your site with an affiliate code in the URL. This would be a very simple bit of code; you could even put it in an include at the top of each page. This way those parameterised pages never even exist, and you keep all the link juice.
One other thing might be to put a noindex on any page that has an affiliate parameter, but you would lose the link juice.
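If a redirect would break the affiliate tracking, one way to apply that noindex selectively is via the `X-Robots-Tag` response header, emitted only when the (assumed) `aff` parameter is present. A hedged sketch, before any output is sent:

```php
<?php
// Sketch: conditionally noindex any request carrying the hypothetical
// "aff" affiliate parameter, while leaving clean URLs indexable.
// "follow" keeps Google crawling the links on the page even though
// the page itself is kept out of the index.
if (isset($_GET['aff'])) {
    header('X-Robots-Tag: noindex, follow');
}
```

Note that for Google to see this header at all, the affiliate URLs must remain crawlable, so this approach would conflict with the `Disallow: /*?` robots.txt rule mentioned above.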