All pages going through 302 redirect - bad?
-
So, our web development company did something I don't agree with and I need a second opinion.
Most of our pages are statically cached (the CMS creates .html files), which is required because of our traffic volume. To get geotargeting to work, they've set up every page to 302-redirect to a geodetection script, which then redirects back to the geotargeted version of the page.
E.g.: www.example.com/category 302-redirects to www.example.com/geodetect.php?ip=ip_address. That script then 302-redirects back to either www.example.com/category, or www.example.com/geo/category for the geo-targeted version.
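As a rough illustration, the flow described above can be sketched like this (the script name, parameter names, and the geo lookup are assumptions based on the description, not the actual implementation):

```python
# Sketch of the double-302 flow described above. The geo lookup,
# script name, and return-path parameter are all hypothetical.
GEO_TARGETED_IPS = {"203.0.113.7"}  # visitors who get the /geo/ version

def handle_static_page(path, client_ip):
    """Step 1: every cached page first 302s to the geodetection script."""
    return 302, f"/geodetect.php?ip={client_ip}&return={path}"

def handle_geodetect(client_ip, return_path):
    """Step 2: the script 302s back to either the plain URL
    or the geo-targeted /geo/ version of the same page."""
    if client_ip in GEO_TARGETED_IPS:
        return 302, f"/geo{return_path}"
    return 302, return_path
```

Both hops are temporary (302) because the destination depends on the visitor rather than on the page having moved, which is why 302 rather than 301 fits this setup.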
**So all of our pages - thousands - go through a double 302 redirect. It's fairly invisible to the user, and 302 is more appropriate than 301 in this case, but it really worries me. I've done lots of research and can't find anything specifically saying this is bad, but I can't imagine Google being happy with this.**
Thoughts? Is this bad for SEO? Is there a better way (keeping in mind all of our files are statically generated)? Is this perfectly fine?
-
I would think there has to be a better way to do that. Sites detect IP addresses and deliver dynamically created local content all the time, and there are surely scripts out there which would do what you want without all the 302 redirects. It would be cleaner and better for SEO. Unfortunately, I'm not a developer and don't have a specific suggestion, but I'm sure there's a better solution.
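One redirect-free pattern, sketched here with hypothetical names (a real setup would do this in the web server or CMS layer): since the pages are statically cached, the server can pick which cached file to serve based on the visitor's IP and return it directly, so the canonical URL never changes and no 302 hops are needed.

```python
GEO_TARGETED_IPS = {"203.0.113.7"}  # hypothetical geo lookup

def cached_file_for(path, client_ip):
    """Serve the geo variant internally instead of redirecting to it.
    The URL the visitor (and Googlebot) sees stays the same."""
    variant = "geo/" if client_ip in GEO_TARGETED_IPS else ""
    return f"/var/cache/site/{variant}{path.lstrip('/')}.html"
```

The same idea is what server-level geoIP modules do: rewrite the request internally rather than bouncing the client through redirects.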
-
If you can avoid the redirects, then I would definitely go with that option. I'm not a big fan of redirects, because there will always be some loss in the authority that is passed on.
-
This is what I've been struggling with. It's not a link-juice issue, and the page hasn't moved. We're just showing a slightly different version of the page based on where you are coming from. So even though www.example.com/category and www.example.com/geo/category both exist, www.example.com/category is the canonical URL and we don't want the /geo version indexed (because it's essentially duplicate content).
So from a technical perspective, it's essentially being used correctly. My concern is that when Google suddenly sees thousands of pages double-302 redirecting, some kind of red flag will go up and we'll be penalized.
-
It's only bad if you want those pages to rank and there are links (internal or external) pointing to the redirecting URLs.
In other words, 302 redirects do not pass link juice the way a 301 does. Unless you are noindexing these pages anyway, it's just not a good idea. If it were me, I'd ask why we were using 302s at all. I've only ever used one once, and that was because I didn't want the blackhat-SEO links carrying over to the new domain... But this is a different case.
Related Questions
-
Advice / Help on Bad Link Removals
Hey everyone.
White Hat / Black Hat SEO | TheITOteam
I'm new to the community and new to backlinks - hence the question to the community today.
I would like help understanding the options and workload around backlinks and removing them.
I have a client with over 8,000 backlinks; a few years ago he paid someone about £10 to boost his rankings by adding thousands of backlinks.
We fear this is having a bad effect on their site and organic rankings, as 90% of these backlinks have a spam score of over 50% and are also nofollow. My questions to the community (if you could be so kind to share) are:
1. What's the best way to decide if a backlink is worth keeping or removing?
2. Is there a tool somewhere on the internet to decide this or assist with it? I've had advice stating that if it's not hurting the page we should keep it. However, again: how do I know what damage each backlink is causing to the domain? I appreciate anyone's time in offering some advice to a novice looking to clear these.
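For question 2, a sketch of the kind of triage a tool could automate. The CSV columns, spam scores, and threshold below are made up for illustration; the `domain:` line format is Google's real disavow-file syntax.

```python
import csv
import io
from urllib.parse import urlparse

# Hypothetical backlink export: url, spam_score (0-100), nofollow flag.
BACKLINKS_CSV = """url,spam_score,nofollow
http://spammy-directory.example/widgets,72,no
http://local-news.example/story-about-client,8,no
http://forum.example/profile/12345,65,yes
"""

def build_disavow(csv_text, threshold=50):
    """Collect domains whose followed links exceed a spam-score threshold.
    Nofollowed links are skipped: they pass no equity either way."""
    domains = set()
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["nofollow"] == "yes":
            continue
        if int(row["spam_score"]) >= threshold:
            domains.add(urlparse(row["url"]).netloc)
    return ["domain:" + d for d in sorted(domains)]

print("\n".join(build_disavow(BACKLINKS_CSV)))
```

A score threshold is only a starting point; each flagged domain still deserves a manual look before it goes in the disavow file.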
Robots.txt file in Shopify - Collection and Product Page Crawling Issue
Hi, I am working on one big eCommerce store which has more than 1,000 products. We just moved platforms from WordPress to Shopify and are getting a noindex issue. When I checked robots.txt I found the code below, which is very confusing for me. **I am not getting the meaning of the below rules:**
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
I can understand that my robots.txt disallows search engines from crawling and indexing all my product pages ( collection/*+* ). Is this the rule which is affecting the indexing of product pages? Please explain to me how this robots.txt works in Shopify, and once my page is crawled and indexed by google.com, then what is the use of Disallow? Thanks.
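Read as plain prefix rules, those lines only block paths that start with a literal `+` (or its percent-encoded form `%2B`), which Shopify uses to join tags on filtered collection pages; they do not block ordinary collection or product URLs. This can be checked with Python's standard-library robots.txt parser (domain and paths below are made up; note this parser does prefix matching and does not implement Google's `*` wildcard extension):

```python
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# An ordinary collection page is not matched by any rule:
print(rp.can_fetch("*", "https://shop.example.com/collections/shoes"))
# A path beginning with a literal "+" is disallowed:
print(rp.can_fetch("*", "https://shop.example.com/collections/+red-shoes"))
```

So if product pages are noindexed, the cause is more likely a noindex meta tag or header from the migration than these Disallow lines.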
White Hat / Black Hat SEO | HuptechWebseo
.com geotagging redirect to subdomains - will it affect SEO?
Hi guys, We have a .com domain and we've got geoIP on it, so UK goes to .co.uk and USA goes to .com/us. We're just migrating over to another platform, so we're thinking of keeping a "dummy" server just to do this geoIP pointing for us. Essentially .com will just point over to the right place and hold a specific .com/abc (which is generic for everyone worldwide).
Current scenario:
- .com (Magento + geoIP)
- .com/us (US Magento)
- .co.uk (UK - geoIP redirect to Shopify)
- .com/abc (sits on the Magento server)
Wanted scenario:
- .com - used for geoIP and a specific .com/abc (for all users)
- .co.uk (UK) - Shopify eCom
- .com/us -> migrating to us.xx.com (USA) - Shopify eCom
I just wanted to know if this will affect our rankings on Google? Also, any advice as to the best practices here would be great. Thanks! Nitesh
White Hat / Black Hat SEO | Infruition
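A sketch of what the "dummy" geoIP server in the wanted scenario would be doing (the country codes, hostnames, and fallback are assumptions based on the scenario above, not real configuration):

```python
# Hypothetical country-to-destination map for the wanted scenario.
DESTINATIONS = {
    "GB": "https://www.example.co.uk",  # UK -> .co.uk Shopify store
    "US": "https://us.example.com",     # US -> migrated US subdomain
}

def geo_redirect(country_code, path="/"):
    """302 the visitor to their regional store; everyone else
    stays on the generic .com/abc section."""
    base = DESTINATIONS.get(country_code)
    if base is None:
        return 302, "https://www.example.com/abc"
    return 302, base + path
```

Since the same content would exist on several hosts, hreflang annotations between the regional versions (rather than redirects alone) are the usual way to tell Google which version belongs to which region.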
What is really a bad link in 2017?
Hi, The routine answer is: a link which doesn't provide any value. I'm tired of hearing this statement when we can see any number of backlinks being generated in different scenarios. There are still many low-DA websites which speak exactly about a brand and link to that brand naturally. So, is this a bad link or a good link? Let's be honest here: no one is going to visit such pages and browse through our website; it's all about what the link is doing in terms of SEO. Do these websites need to be in the disavow list? Besides the context of how a brand is mentioned, what are the other metrics for disavowing a domain? Expecting some real answers to this straight question. If it's a low-DA site speaking exactly about our website: good or bad? Vice versa: a high-DA website mentioning our website with less matching content. What is the proportion of website authority to content context? Can we keep medium-DA backlinks with some Moz spam score?
White Hat / Black Hat SEO | vtmoz
How do I optimize pages for content that changes everyday?
Hi guys, I run daily and weekend horoscopes on my site. The daily horoscopes change every day for obvious reasons, and the weekend horoscopes change every weekend. However, I'm stuck on how the pages need to be structured. I also don't know how I should go about creating title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page. As you can see here http://bit.ly/1FV6x0y you can see today's horoscope. Since our weekend horoscopes cover Friday, Saturday and Sunday, there is no daily for Friday, so it shows duplicate pages across Friday, Saturday and Sunday. If you click on today, tomorrow and weekend, all pages shown are duplicates, and this will happen for each star sign from Friday to Sunday. My question is: will I be penalized for doing this, even if the content changes? How can I optimize the title tags and meta tags for pages that are constantly changing? I'm really stuck on this one and would appreciate some feedback on this tricky beast. Thanks in advance
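One common pattern for date-driven pages is to template the title tag from the sign and the date, so every day's page gets a unique title automatically and the weekend page gets one shared title for its whole date range. A minimal sketch (the wording of the templates is just an example, not a recommendation from the thread):

```python
from datetime import date, timedelta

def horoscope_title(sign, day):
    """Unique, self-describing title for each daily page."""
    return f"{sign.title()} Daily Horoscope - {day.strftime('%A %d %B %Y')}"

def weekend_title(sign, friday):
    """One shared title for the Friday-to-Sunday weekend page."""
    sunday = friday + timedelta(days=2)
    return (f"{sign.title()} Weekend Horoscope - "
            f"{friday.strftime('%d %B')} to {sunday.strftime('%d %B %Y')}")
```

Because the dates differ, the Friday daily slot can simply link to the weekend page instead of duplicating it, and the generated titles never collide.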
White Hat / Black Hat SEO | edward-may
Unnatural Link Notification - Third Go Round, specific questions
Hi all, I'm posting what is sure to be a common question, but I can't seem to find much information by searching Q&A over the last month, so I thought I'd throw this out there. There are a lot of 'what do I do??' questions about the 'unnatural link notification', but most of them are from first-timers. We're pretty far along in the process and it feels like we're going nowhere, so I was hoping to pick the brains of anyone else who's been there. We have a client that we inherited with an unnatural link profile; they were warned shortly after we took them on (around March was the first warning). We compiled an apologetic letter, specifically identified a previous agency who >was< doing bad things, mentioned things would be different from now on, and provided a list of links we were working to remove based on WMT and OSE and some other sources. This was submitted in early June. Traffic on the main keyword plummeted; ranking went from top 5 to about mid-page 4. We got hit with that same rash of unnatural link warnings on July 23 that everyone else did, and after looking around I decided not to respond to those. We then got a response to the reinclusion request submitted in June, saying the site was still violating guidelines. This time I went all out and provided a Google Docs spreadsheet of the over 1,500 links we had removed, listed the other links that had no contact info (not even in WHOIS), listed the links we had emailed or contacted via form but got no response from; everything. They responded to that recently, simply saying 'site still violates guidelines' with no other details, and I'm not sure what else I can do. The campaign above was quite an investment of resources and time, but I'm not sure how to most efficiently continue. I promised specific questions, so here they are:
1. Are the link removal services (rmoov, removeem, linkdelete, et al) worth investigating? To remove the 1,500 links I mentioned above I had a full-time (low-paid) person working for a week.
2. Does Google even reconsider after long engagements like this? Most of what I've read says that inclusion gets cleared up on the first or second request, and we're at bat for the third now. Due to the lack of feedback I don't know if their opinion is "nope, you just missed some" or "you are so blackhat you shouldn't even bother asking anymore".
3. One of the main link holders is a shady guy who runs literally thousands of directories the client appears in, thanks to the previous SEO agency, and he wants $5 per link he removes. Should I mention this to Google? Do they even care, or is it solely our responsibility?
Thanks in advance for any advice;
White Hat / Black Hat SEO | icecarats
301 Redirect?
Is it black-hat SEO to send bluewidget.com, redwidget.com, and greenwidget.com to widgetbrands.com (sending all to the same page, or to the same domain's home page)? Since matching the domain name with keywords is important, this strategy makes sense. Is it allowed? How many domains can I 301 redirect to widgetbrands.com if I want to target all the colors of widgets? I'd be willing to put up a one-page keyword article of original content for the search engines to crawl before redirecting the 'bluewidget' page. Would this be necessary?
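If the extra domains are kept, the conventional advice is to 301 each one to the most relevant page rather than pointing everything at the home page. A sketch of that mapping (all domains and paths here are hypothetical):

```python
# Hypothetical one-to-one map from keyword domains to relevant pages.
REDIRECT_MAP = {
    "bluewidget.com": "https://widgetbrands.com/blue-widgets",
    "redwidget.com": "https://widgetbrands.com/red-widgets",
    "greenwidget.com": "https://widgetbrands.com/green-widgets",
}

def permanent_redirect(host):
    """301 (permanent) because the consolidation is intended to be final."""
    target = REDIRECT_MAP.get(host)
    return (301, target) if target else None
```

A 301 is the right status here, unlike in the geotargeting thread above, because the destination is the same for every visitor and the move is permanent.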
White Hat / Black Hat SEO | Uramark