What is the best practice to eliminate my IP address content from showing in SERPs?
-
Our eCommerce platform provider has our site load balanced in a few data centers. Our site has two of our own exclusive IP addresses associated with it (one in each data center).
The problem is that Google is showing our IP addresses in the SERPs with what I assume is harmful duplicate content (of our own site, at that).
I brought this to the attention of our provider, and they say they must keep the IP addresses open so their site monitoring software can work. Their solution was to add robots.txt files for both IP addresses with site-wide/root disallows.
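For reference, the robots.txt they deployed at the root of each IP address is, I assume, a minimal site-wide block along these lines:

    User-agent: *
    Disallow: /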
As a side note, we just added canonical tags so that the pages indexed under the IP addresses ultimately point to the correct (non-IP-address) URL via the canonical.
So here are my questions.
-
Is there a better way?
-
If not, is there anything else we need to do to get Google to drop the several hundred thousand indexed pages at the IP address level? Or do we sit back and wait now?
-
I would allow Google to crawl those pages for a little while longer just to ensure that they see the rel canonical tags. Then, once you feel they have recrawled the IP address pages, you can disallow them again if you want, though that isn't entirely necessary if you have the rel canonical tag set up properly.
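As a sanity check, each page served from the IP addresses should carry a tag like this in its head, pointing at the real URL (www.example.com is a placeholder for your actual domain):

    <link rel="canonical" href="http://www.example.com/product-page" />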
Another option would be to 301 redirect the IP version of each page to the corresponding www version.
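If the platform runs Apache, a rough sketch of that redirect in .htaccess might look like this (the IPs and domain below are placeholders - RFC 5737 documentation addresses, not yours):

    RewriteEngine On
    # Redirect any request whose Host header is one of the raw IP addresses
    RewriteCond %{HTTP_HOST} ^192\.0\.2\.10$ [OR]
    RewriteCond %{HTTP_HOST} ^192\.0\.2\.20$
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Keep in mind your provider said the IPs must stay open for their monitoring software, so a blanket redirect may need an exception for their monitoring requests.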
If they still don't drop from the index, you can use the URL Removal Tool in Google Webmaster Tools (GWT), but you will have to set up and verify a GWT account for each of the IP addresses.
-
Thanks. Any suggestions on how to get Google to drop these pages (make them inactive)?
-
Hi,
Since doing the disallow on the IP address sites, they are no longer getting crawled.
** The disavow list won't stop Google from crawling those domains/pages. Google will just treat those links as nofollow, so they won't pass PageRank.
You will still see those links in Webmaster Tools; they will still be active.
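For context, a disavow file is just a plain text list submitted to Google, quite different from a robots.txt disallow; a minimal sketch (the domain and URL are placeholders) looks like:

    # Disavow all links from this domain
    domain:spammy-example.com
    # Disavow a single linking page
    http://another-example.com/bad-link-page.html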
-
Sorry - I just thought of something that could pose a problem and was hoping to get your advice.
Since doing the disallow on the IP address sites, they are no longer getting crawled. Does that mean the canonical tags within those IP address sites won't be able to do their work?
Or
Will the canonicals picked up from the proper domain help the search engines know they should consolidate the indexed pages from the now disallowed IP addresses?
I am seeing that the IP addresses are no longer being crawled, and the number of pages in the index is staying about the same (not going down).
Thoughts?
-
Thanks!
-
Thanks. We are getting large daily crawls (nearly 100k pages a day), so fingers crossed this will sort itself out soon.
-
Hi,
The canonical solution should be enough; however, I would still build some XML sitemaps and submit them via Webmaster Tools to speed up the process. You can also build some HTML sitemaps with a clear structure and add them in the footer - again, to speed up the process a little bit.
If you split the content into multiple XML sitemaps, you can also track the crawling progress for each one.
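For illustration, a sitemap index tying the split XML sitemaps together could look like this (file names and domain are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-products-1.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-products-2.xml</loc>
      </sitemap>
    </sitemapindex>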
You should also check your crawl rate in Webmaster Tools to see how many pages, on average, Googlebot is hitting each day - based on those numbers you can run a rough prediction of how long it will take Google to recrawl your pages.
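As a back-of-the-envelope sketch in Python (the numbers below are assumptions based on the figures mentioned in this thread, not your real stats):

    # Rough estimate of how long a full recrawl might take.
    pages_indexed = 300_000          # assumed: pages indexed under the IP addresses
    pages_crawled_per_day = 100_000  # assumed: average daily crawl from Webmaster Tools

    days_to_recrawl = pages_indexed / pages_crawled_per_day
    print("Estimated full recrawl: ~%.1f days" % days_to_recrawl)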
If your numbers are "bad", you will need to improve them somehow to help the process along - it can do wonders...
Hope it helps.
-
The canonical solution you have implemented is perfect. If you have decent authority and get deep crawls every couple of days, you should be fine, and pages from your IP addresses should start to disappear shortly.
I would not worry about it anymore. You are on the right track. Sit back, relax, and enjoy your flight.