Development/Test Ecommerce Website Mistakenly Indexed
-
My question is - relatively speaking, how damaging to SEO is it to have BOTH your development/testing site and your live version indexed/crawled by Google and appearing in the SERPs?
We just launched about a month ago, and made a change to the robots.txt on the development site without noticing ... which led to it being indexed too. So now the ecommerce website is duplicated in Google ... each copy under different URLs of course (and on different servers, DNS, etc.)
We'll fix it right away ... and block crawlers from the development site. But again, my general question is: what is the general damage to SEO ... if any ... created by this kind of mistake? My feeling is nothing significant.
-
No my friend, no! I'm saying we'll point the existing staging/testing environment to the production version and stop using it as staging, instead of closing it completely like I mentioned earlier. And we'll launch a fresh instance for the staging/testing use case.
This will help us transfer the majority of the link juice of the already indexed staging/testing instance.
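As a rough sketch of what that bridge could look like (assuming an Apache front end; the hostnames are placeholders, not from this thread), a host-level 301 from the old staging instance to production might be:

```apache
# .htaccess on the old staging instance: 301 every request,
# path and query string included, to the production host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^staging\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

mod_rewrite carries the original query string over to the target URL by default, so paginated and filtered ecommerce URLs keep redirecting to their exact production equivalents.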
-
Why would you want to 301 a staging/dev environment to a production site? Unless you plan on making live changes to the production server (not safe), you'd want to keep them separate. Especially for eCommerce it would be important to have different environments to test and QA before pushing a change live. Making any change that impacts a number of pages could damage your ability to generate revenue from the site. You don't take down the development/testing site, because that's your safe environment to test changes before pushing updates to production.
I'm not sure I follow your recommendation. Am I missing a critical point?
-
Hi Eric,
Well, that's a valid point that bots might have considered your staging instances to be the main website, and hence this could end up giving you nothing but a face palm.
The solution you suggested is similar to the one I suggested, in that neither gets any benefit from the existing instance: we either remove it or put noindex everywhere.
My bad! I assumed your staging/testing instance(s) got indexed only recently and are not very powerful from a domain and page authority perspective. In fact, being a developer, I should have considered the worst case from the start.
Thanks for pointing out the worst case, Eric, i.e. when your staging/testing instances are decently old and you don't want to lose their SEO value while fixing this issue. Here's my proposed solution for it: don't remove the instance, and don't put a noindex everywhere either. The better solution is establishing a 301 redirect bridge from your staging/testing instance to your original website. In this case, roughly 90% of the link juice that your staging/testing instances have earned will get passed. Make sure each and every URL of the staging/testing instance properly 301 redirects to the corresponding URL on the original instance.
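To sanity-check that "each and every URL" requirement, it helps to have the expected mapping written down so redirects can be spot-checked. A minimal sketch in Python (the hostnames here are placeholders, not from the thread):

```python
from urllib.parse import urlsplit, urlunsplit

STAGING_HOST = "staging.example.com"   # placeholder hostname
PRODUCTION_HOST = "www.example.com"    # placeholder hostname

def production_url(staging_url: str) -> str:
    """Return the production URL a staging URL should 301 to,
    preserving the path and query string."""
    parts = urlsplit(staging_url)
    return urlunsplit(("https", PRODUCTION_HOST, parts.path, parts.query, ""))

# Example: a paginated staging category page should redirect to the
# same path and query string on the production host.
print(production_url("https://staging.example.com/shoes?page=2"))
# https://www.example.com/shoes?page=2
```

You could then fetch each staging URL (without following redirects) and assert that the response is a 301 whose Location header equals `production_url(...)` for that URL.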
Hope this helps!
-
It could hurt you in the long run (Google may decide the dev site is more relevant than your live site), but this is an easy fix: noindex your dev site. Just slap a site-wide noindex meta tag across all the pages, and when you're ready to move that code to the production site, remove that instance of the tag.
Disallowing in the robots.txt file will help, but that's only a soft request: it blocks crawling, not indexing. The best way to keep the dev site from being indexed is to use the noindex tag. Since it seems like you want to QA in a live environment, the noindex tag would prevent search engines from indexing the site while still allowing you to test in a production-like scenario.
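Concretely, the site-wide tag is just one line in the page template's head (a sketch; if editing templates isn't convenient, the same directive can be sent server-wide as an `X-Robots-Tag: noindex` HTTP header instead):

```html
<!-- In the <head> of every page on the dev/staging site only.
     Remove this before the template ships to production. -->
<meta name="robots" content="noindex, nofollow">
```

One caveat worth knowing: for the tag to work, crawlers must be allowed to fetch the page. A dev site that serves noindex but is also blocked in robots.txt can stay in the index, because the bot never sees the tag.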
-
Hey,
I recently faced the same issue when our staging instances got indexed accidentally and we were exposed to a duplicate content penalty (well, that's not cool). After a decent bit of research, I took the following steps and got rid of the issue:
- I removed my staging instances, i.e. staging1.mysite.com, staging2.mysite.com, and so on. Removing the instances gets already indexed pages deindexed faster than just blocking the whole website in robots.txt.
- I relaunched the staging instances under slightly different names, like new-staging1.mysite.com and new-staging2.mysite.com, and disallowed bots on these instances from day zero to avoid this mess again.
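The day-zero block is a two-line robots.txt at the root of each new staging host (this prevents crawling, and pages that are never crawled never enter the index in the first place, which is why the approach works when applied from day zero rather than after the fact):

```
User-agent: *
Disallow: /
```

For extra safety, many teams also put HTTP authentication in front of staging hosts, which keeps both crawlers and curious visitors out entirely.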
This helped me fix the issue quickly. Hope this helps!