Poor Load Balancer Implementation, now the site is indexed 4 times
-
I was brought onto a project where the network admin set up a load balancer to distribute traffic, but the deployment was misconfigured: the same site is now listed four times in Search Console, as four separate links to the primary domain. How can I remove these from the index? I have already asked him to no-index them, but they still appear in Search Console. What else can I do to ensure Google only sees this as a single site?
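One common way to make Google collapse the duplicates is to have every non-primary hostname answer with a permanent 301 redirect to the primary domain. Below is a minimal sketch of how you might verify that each duplicate hostname canonicalizes correctly; the hostnames are invented placeholders, not the poster's actual domains, and the responses are simulated rather than fetched live.

```python
from urllib.parse import urlparse

# Hypothetical primary hostname (placeholder, not the poster's real domain).
PRIMARY = "www.example.com"

def is_canonicalized(requested_host, status_code, location_header):
    """A hostname canonicalizes correctly if it IS the primary host,
    or permanently redirects (301/308) to the primary host."""
    if requested_host == PRIMARY:
        return True
    if status_code in (301, 308) and location_header:
        return urlparse(location_header).hostname == PRIMARY
    return False

# Simulated responses from four indexed hostnames:
responses = {
    "www.example.com":  (200, None),
    "www2.example.com": (301, "https://www.example.com/"),
    "lb1.example.com":  (200, None),   # misconfigured: serves content directly
    "lb2.example.com":  (301, "https://www.example.com/"),
}

for host, (code, loc) in responses.items():
    ok = is_canonicalized(host, code, loc)
    print(f"{host}: {'OK' if ok else 'NEEDS 301 to ' + PRIMARY}")
```

In a real check you would issue the HTTP requests (e.g. with `urllib.request` and redirects disabled) against each hostname that appears in Search Console; once every duplicate 301s to the primary, the extra listings should drop out of the index over subsequent crawls.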
Related Questions
-
How To Implement Breadcrumbs
Hi, I'm looking to implement breadcrumbs for an e-commerce store so they appear in the SERP results like the attached image. In terms of implementing them on a site, would you simply add HTML to each page, as in Google's example? Which looks like this: Books › Science Fiction › Award Winners. Then is there anything else you need to do to get this showing in the SERPs, e.g. something in Search Console? Or do you just wait until Google has crawled the pages and hopefully starts showing them in the SERPs? Cheers. (Attached image: SERP results with breadcrumbs)
Intermediate & Advanced SEO | jaynamarino
Need some expert help – my client bought out a competitor and now wants to completely duplicate the current site, with the same stock and categories, under the competitor's brand
I am the SEO consultant for a large online homewares store that currently ranks very well in Google. I can PM the domain name if anyone needs it; I don't want to post it on this forum. The company has bought out a competitor and plans to use the same warehouse, same products, and same back-end system as the current site, so they want to completely duplicate the current website. Titles, meta descriptions, and product descriptions will all be renamed/rewritten/reworded (though keep in mind there are not many ways to reword a 3-piece saucepan set). Pricing will mostly be the same (with some differences), images cannot be renamed, categories cannot be renamed... the structure of the site will be exactly the same, placement etc. (though it will have different banners, logo etc.). I personally don't believe the new site will rank, because it will be too similar. Can someone please offer me a second opinion? Thanks
Intermediate & Advanced SEO | ryanlenton
Duplicate site (disaster recovery) being crawled and creating two indexed search results
I have a primary domain, toptable.co.uk, and a disaster recovery site for it named uk-www.gtm.opentable.com. In the event of a disaster, toptable.co.uk would get CNAMEd (DNS alias) to the .gtm site. Naturally, the .gtm disaster recovery domain is an exact match of the toptable.co.uk domain. Unfortunately, Google has crawled the uk-www.gtm.opentable site, and it's showing up in search results. In most cases the gtm URLs don't get redirected to toptable; they actually appear to the user as an entirely separate domain. The strong feeling is that this duplicate content is hurting toptable.co.uk, especially as .gtm.ot is part of the .opentable.com domain, which has significant authority. So we need a way of stopping Google from crawling gtm. There seem to be two potential fixes. Which is best for this case? 1) Use robots.txt to block Google from crawling the .gtm site, or 2) canonicalize the gtm URLs to toptable.co.uk. In general Google seems to recommend a canonical, but in this special case the robots.txt change could be best. Thanks in advance to the SEOmoz community!
Intermediate & Advanced SEO | OpenTable
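If both hostnames are served by the same application, option (1) from the question can be implemented by varying robots.txt on the request's Host header: permissive on the primary domain, block-everything on the disaster-recovery hostname. A minimal sketch, assuming the two hostnames from the question and a framework that exposes the Host header:

```python
# Serve a host-dependent robots.txt: permissive on the primary domain,
# block-everything on any other (disaster-recovery) hostname.
PRIMARY_HOST = "toptable.co.uk"

BLOCK_ALL = "User-agent: *\nDisallow: /\n"
ALLOW_ALL = "User-agent: *\nDisallow:\n"

def robots_txt_for(host):
    """Return the robots.txt body appropriate to the requested hostname."""
    host = host.lower().split(":")[0]  # normalize case, strip any port
    return ALLOW_ALL if host == PRIMARY_HOST else BLOCK_ALL

print(robots_txt_for("toptable.co.uk"))
print(robots_txt_for("uk-www.gtm.opentable.com"))
```

One caveat worth weighing: robots.txt stops future crawling but does not remove URLs already in the index, and a blocked page can never show Google a canonical tag. A cross-domain `rel=canonical` (option 2) lets Google consolidate the existing duplicates onto toptable.co.uk, which is why it is often the recommended route even when robots.txt feels simpler.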
Site revamp for neglected site - modifying site structure, URLs and content - is there an optimal approach?
A site I'm involved with, www.organicguide.com, was at one stage (long ago) performing reasonably well in the search engines, ranking highly for several keywords. The site has since been neglected for a considerable period of time. A new group of people are interested in revamping the site: updating content, removing some of the existing content, and generally refreshing the site entirely. In order to go forward, significant changes need to be made, likely including moving the entire site across to WordPress. The directory software currently in use (edirectory.com) was not designed with SEO in mind, and as a result numerous similar pages of directory listings (all with similar titles and descriptions) are in Google's results, albeit with very weak PA. After reading many of the articles/blog posts here I realize that a significant revamp and some serious SEO work are needed. So, I've joined this community to learn from those more experienced. Apart from doing 301 redirects for pages we need to retain, is there an optimal way of removing/repairing the current URL structure as the site gets updated? Also, is it better to make all changes at once, or is an iterative approach preferred? Many thanks in advance for any responses/advice offered. Cheers, MacRobbo
Intermediate & Advanced SEO | macrobbo
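The 301 redirects mentioned above are often easiest to manage as an explicit old-to-new lookup table built before launch, so every retained legacy URL maps to exactly one new WordPress URL. A minimal sketch, with invented placeholder paths (the directory software's real URL patterns would go here):

```python
# Hypothetical old-URL -> new-URL map; paths are placeholders for
# illustration, not the site's actual URL structure.
REDIRECTS = {
    "/listing.php?id=1042": "/directory/organic-cafe-sydney/",
    "/cat.php?c=7":         "/category/organic-food/",
}

def resolve(path):
    """Return (status, target): known legacy URLs 301 to their new home,
    anything else is served (or 404s) as normal."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/listing.php?id=1042"))
print(resolve("/about/"))
```

On the all-at-once vs. iterative question, a complete map like this is one argument for a single cutover: the redirects can go live in the same deploy as the new URL structure, so there is never a window where old URLs 404.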
Optimal site structure for travel site
Hi there, I am SEO-managing a travel website for which we are going to build a new site structure next year. We have about 4,000 pages on the site at the moment. The structure currently has only two levels. Level 1: the homepage. Level 2: all other pages (4,000 individual pages, all with different URLs). We are adding another 2-3 levels, but we have a challenge: there are potentially two roads to the same product (e.g. a "phuket diving product"): domain.com/thailand/activities/diving/phuket-diving-product.asp and domain.com/activities/diving/thailand/phuket-diving-product.asp. I would very much appreciate your view on the problem: how do I solve this dilemma from an SEO standpoint? I want to avoid duplicate content if possible, and I also want only one landing page, for many reasons. And usability is of course also very important. Best regards, Chris
Intermediate & Advanced SEO | sembseo
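One way to guarantee a single landing page is to pick one of the two orderings as canonical and deterministically map the other onto it (via 301 or rel=canonical), so both navigation paths resolve to the same URL. A sketch of the mapping logic, assuming the canonical pattern is /country/activities/activity/product (that choice is arbitrary for illustration):

```python
def canonical_path(segments):
    """Given URL segments in either ordering --
    [country, 'activities', activity, product] or
    ['activities', activity, country, product] --
    return the single canonical /country/activities/activity/product path."""
    if segments[0] == "activities":
        activity, country, product = segments[1], segments[2], segments[3]
    else:
        country, activity, product = segments[0], segments[2], segments[3]
    return f"/{country}/activities/{activity}/{product}"

# Both "roads" from the question collapse to one canonical URL:
print(canonical_path(["thailand", "activities", "diving", "phuket-diving-product"]))
print(canonical_path(["activities", "diving", "thailand", "phuket-diving-product"]))
```

Users can still browse by activity or by country; the breadcrumb and internal links just always point at the one canonical URL, so no duplicate ever gets linked or indexed.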
Is my site being penalized?
I launched http://rumma.ge in February of this year. Because I'm using a domain hack (the Georgian ccTLD), I'd really like to rank for just the word "rummage". After launching, I was steady at around page 4/5 on searches for "rummage", but since then I've tumbled out of the first 100 results; in fact I can't even find the site in the first 20 pages on Google for that search. Even a search for my exact homepage title text doesn't bring up the site, despite the fact that the site is still in the index. I'm wondering if one of the following could be the root cause: 1) We have a ccTLD (.ge). I'm not sure about the impact of this, but it seems unlikely to be the root cause because we were ranking for "rummage" when we first launched. 2) We tried running an AdWords campaign, but the site was flagged as a "bridge page" (working on getting this addressed); could this carry over into natural search rankings? 3) We've done some press and built up a decent number of backlinks over the past couple of months, many with "rummage" in the anchor text. This was all organic, but it happened over the span of a month, which may be too fast. Am I being penalized? Beyond checking indexing of the site, is there a way to tell if I've been flagged for bad behavior? Any help or thoughts would be greatly appreciated. I'm really confused by this, since I feel like I've been doing things right and my rankings have been travelling downward. Thanks!! Matt
Intermediate & Advanced SEO | minouye
One platform, multiple niche sites: Worth $60/mo so each site has different class C?
Howdy all. The short of it is that I currently run a very niche business directory/review website and am in the process of expanding the system to support running multiple sites out of the same database/codebase. In a normal setup I'd run all the sites off the same server, sharing a single IP address, but thanks to the wonders of the cloud it would be fairly simple for me to run each site on its own server at a cost of about $60/mo/site, giving each site a unique IP on a unique C-block (in many cases a unique A-block, even). The ultimate goal here is to leverage the authority I've built up for the one site I currently run to help grow the next site I launch, and repeat the process. The question is: is the SEO value that the sites can pass to each other worth the extra cost and management overhead? I've gotten conflicting answers on this topic from multiple people I consider pretty smart, so I'd love to know what other people say.
Intermediate & Advanced SEO | qurve
On-Site Optimization Tips for Job site?
I am working on a job site that only ranks well for the homepage, with very low-ranking internal pages. My job pages do not rank whatsoever; they are database-driven and often turn into 404 pages after the job has been filled. The job pages have no content either. Does anybody have any technical on-site recommendations for a job site, especially regarding the internal pages? (Cross Country Allied.com)
Intermediate & Advanced SEO | Melia
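One technical piece of the expired-job problem above is choosing the right status code per job record instead of letting filled listings fall into generic 404s: a 410 Gone tells search engines the page was removed deliberately, which tends to get the URL dropped faster than a 404. A minimal sketch of that decision logic; the job fields below are invented for illustration.

```python
from datetime import date

def job_page_status(job):
    """Return 200 for open jobs and 410 Gone once a job is filled or
    past its expiry date, rather than a generic 404."""
    if job.get("filled"):
        return 410
    expires = job.get("expires")
    if expires and expires < date.today():
        return 410
    return 200

open_job   = {"title": "Travel RN", "filled": False, "expires": date(2999, 1, 1)}
filled_job = {"title": "Travel RN", "filled": True}
print(job_page_status(open_job), job_page_status(filled_job))
```

An alternative, where a close substitute exists, is a 301 to the matching job category page; either way the fix pairs naturally with adding unique content (location, pay, requirements) to the live job pages so they have something to rank on.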