Should I let Google crawl my production server if the site is still under development?
-
I am building out a brand-new site. It's built on WordPress, so I've been tinkering with the themes and plugins on the production server. To my surprise, less than a week after installing WordPress, I already have pages in the index.
I've seen advice in this forum about blocking search bots from dev servers to prevent duplicate content, but this is my production server so it seems like a bad idea.
Any advice on the best way to proceed? Block or no block? Or something else? (I know how to block, so I'm not looking for instructions).
- We're around 3 months from officially launching (possibly less).
- We'll start to have real content on the site sometime in June, even though we won't have officially launched yet.
- We should have a development environment ready in the next couple of weeks.
Thanks!
-
Thank you for the detailed response, Paul. I'll get cracking on your suggestions.
I was mostly worried that if I blocked Google now, it would be mad at me later. You've given me a way to deal with the bot concerns.
I am less concerned that anyone will find these pages. I only knew about their indexed status because one of my monitoring services alerted me that Google was crawling the site.
-
Thanks for the confirmation, Dan! Looks like you're up & working early on a Sunday morning.
-
In my opinion, no, you definitely should NOT allow the production server to be indexed while it's in this state. For all intents and purposes it IS your dev server at the moment, and the last thing you want is for the search crawlers to think that what's there will be representative of the quality of your site when it's finished.
My recommendation:
- get the current site out of the SERPs. (Use the WordPress setting under Settings -> Reading to check the "Discourage search engines from indexing this site" box. DON'T add a Disallow rule in robots.txt until the pages have all dropped out of the SERPs; if the crawlers can't fetch the pages, they'll never see the noindex and won't remove them.)
- when the dev site goes into operation, make _certain_ right from the start that it cannot be crawled (vastly better than trying to fix the problem after it gets accidentally indexed).
- as soon as you have time, build a proper front page and a few content pages on the production site that indicate what the full site will be about, with some strong, well-written basic content that will remain after go-live. (Keep ALL the rest of the prod site's pages out of the SERPs with meta noindex tags.)
- once you have the new, stable, basic content up on prod, allow the SEs to start indexing it.
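For anyone following along, here's roughly what the mechanics of the steps above look like (the exact markup is just an illustration; the WordPress "Discourage search engines" checkbox emits an equivalent tag for you). The per-page meta noindex goes in each page's head, and crucially the page must remain crawlable so the tag can be seen:

```
<!-- keeps this page out of the index while still letting crawlers fetch it,
     so they can actually see the noindex and drop the page from the SERPs -->
<meta name="robots" content="noindex, follow">
```

Then, once the separate dev environment exists, a blanket robots.txt at its root keeps crawlers out of it from day one (use this on the dev server only, not on prod while you're waiting for pages to drop out):

```
# dev server only - block all crawlers from everything
User-agent: *
Disallow: /
```

Many people also put the dev site behind HTTP authentication, which is the only block crawlers can't ignore.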
This gets the messy stuff out of the SERPs before it can pollute the index (and give you a bad reputation with any actual visitors who shouldn't be seeing your tinkering). By getting some real content up as soon as possible, even on a very basic template, you'll start giving the SEs a quality signal of what is to come. It also wouldn't hurt to start building a few backlinks once the basic content is up on prod, e.g. links from the site's new social profiles.
This way, when the full site goes live, you'll already have some quality visibility in the engines, so it will be quicker to get the rest of the new site crawled and indexed.
Does that make sense?
Paul
P.S. If at all appropriate, use the basic prod content to show why/how they should connect with you on social media, and offer them a chance to sign up for your newsletter notification of when the site goes live. (It's never too early to start trying to get those subscribers!)