Can I disallow my subdomain for Penguin recovery?
-
Hi,
I have a site, BannerBuzz.com. Before the last Penguin update, all of my site's keywords ranked well in Google, but after Penguin hit my website, my keywords have been dropping day by day. I have made some changes to my website to improve things, but I am unsure about one of them.
I have a sub-domain (http://reviews.bannerbuzz.com/) that displays user reviews for all of my website's keywords, and 15 reviews from each category are also displayed on my main website, http://www.bannerbuzz.com. Are those user reviews considered duplicate content between the sub-domain and the main website?
Can I disallow the sub-domain from all search engines? It is currently open to all of them. Would blocking it be helpful?
Thanks
-
Hello Rafi,
I am going to make the necessary changes. I have also started building backlinks to the home page with the "Vinyl Banners" keyword from various sources. I hope this helps me recover my old rankings!
-
No problem my friend. You are most welcome.
So if you are using a third-party service to fill in the reviews content on the sub-domain, you can do the following:
1. Stop using the sub-domain for the reviews content from now on, and have the reviews filled into the new reviews sub-folder instead.
2. Redirect the old reviews content on the sub-domain to the new reviews sub-folder via a 301.
This will make sure that you don't lose the SEO value the sub-domain has acquired to date, and almost all of it will be passed on to the new sub-folder.
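To make step 2 concrete, here is a minimal sketch of what the 301 redirect might look like in an Apache .htaccess file serving the sub-domain. This assumes the reviews pages keep the same paths under a /reviews/ sub-folder on the main domain; the exact rules depend on your server setup, so treat this as an illustration rather than a drop-in config:

```apache
# Hypothetical .htaccess for reviews.bannerbuzz.com:
# 301-redirect every URL on the sub-domain to the matching
# path under the new /reviews/ sub-folder on the main domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^reviews\.bannerbuzz\.com$ [NC]
RewriteRule ^(.*)$ http://www.bannerbuzz.com/reviews/$1 [R=301,L]
```

With a rule like this, a request for http://reviews.bannerbuzz.com/some-page would be permanently redirected to http://www.bannerbuzz.com/reviews/some-page, which is how most of the sub-domain's link equity gets passed to the sub-folder.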
Please feel free to post any further questions you have in this regard.
Best regards,
Devanur Rafi.
-
Thanks Devanur Rafi for your information.
You gave us really great information, but I have one question: I am currently using a third-party reviews service (powerreviews.com) for customer reviews, so is it still possible to create a sub-folder and redirect the sub-domain to it?
-
Hi there,
Here are my two cents in this regard. Instead of showing 10 or 15 reviews on the root domain, show no more than 2, and send visitors to the reviews sub-domain for the rest (using a 'view more reviews' button as you currently have). This will mitigate any duplicate content issues to a great extent. I do not recommend blocking the sub-domain from the search engines. However, you can move the content of the sub-domain to something like a reviews sub-folder instead.
From an SEO standpoint, a sub-folder is a safer bet than a sub-domain. Here is what Rand Fishkin has to say in this regard (http://www.seomoz.org/q/subdomains-vs-subfolders):
"All the testing, research and examples I've seen in the past few years (and even the past few months) strongly suggest that the same principles still hold true.
Subdomains SOMETIMES inherit and pass link/trust/quality/ranking metrics between one another.
Subfolders ALWAYS inherit and pass link/trust/quality/ranking metrics across the same subdomain.
Thus, having a single subdomain (even just domainname.tld with no subdomain extension) with all of your content is absolutely ideal from an SEO perspective. It's also more usable and brandable, too IMO."
Here is an interesting discussion on the same topic on Moz:
http://www.seomoz.org/q/multiple-subdomains-my-worst-seo-mistake-now-what-should-i-do
Hope these help.
Best regards,
Devanur Rafi.