What's the best practice for adding a blog to your site post-Panda? Subdomain or subdirectory?
-
Should I use a subdomain or a subdirectory? I was going to use a subdirectory; however, I have been reading a lot of articles on the use of subdomains post-Panda and the advantages of using them instead of subdirectories.
Thanks
Ari
-
I guess that's what SEO is all about: you need to test the waters and find what works for you :-).
Think I'm going to stick with subfolders for this one! Thanks for getting involved anyway, Alan!
Cheers
Ari
-
Yes, I know Rand's thoughts on subfolders and subdomains. In the article I read, Rand said it was his personal belief; unless he has any later info on the subject, I don't know.
I have never seen any compelling evidence to show a difference; the best evidence I have is what Google said in the links I supplied.
My own experience with subfolders and subdomains has led me to think there is little difference, not that that is proof either.
-
Just spoke to Rand; here's what he said: "unless you're specifically trying to segment content that you think Google might penalize because it's low quality/Panda-target, I'd stick with subfolders."
-
Yes, I am aware of the theory, but not aware of any proof.
In SEO there is a lesson I have learnt: don't believe the hype.
-
Hey Alan, I read the blogs; however, I'm still not sure. Keep in mind that those posts were created in 2007/2008, and a lot has changed since then. In the industry I work in, I've seen a lot of sites migrate their blogs to a subdomain. I asked the question because I read a recent thread on SEOBook about the use of subdomains vs. subdirectories.
Check it out
-
It may be so, but I have never seen any evidence of, or opinion about, any difference.
The link you posted refers to advice from Matt Cutts; I have never seen that advice, and I find it odd that Matt would tell you how to get around Panda.
I can only go by what I have seen as fact, and HubPages have an interest in their pages ranking well.
The best evidence is in those links I posted; if you read the Matt Cutts blog link, read his answer to Deb in the comments.
-
So it doesn't make a difference whether I use a subdirectory or a subdomain? Surely it must make a difference. I know for a fact that best practice (pre-Panda) was to use a subdirectory for hosting a blog on your site, but I'm not sure what the case is post-Panda. There's a lot of chat online about migrating categories from subdirectories to subdomains in order to ensure better rankings in the long term.
See- http://www.whitefireseo.com/site-architecture/subdomain-or-subfolder-post-panda/360/
-
I suggest using a sub-directory for your blog. But make sure to write original content. Further, it should be for the users, not the search engines.
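If you already have a blog on a subdomain and decide to consolidate it into a subdirectory, the usual approach is a blanket 301 redirect. Here's a minimal sketch, assuming Apache with mod_rewrite enabled and hypothetical `blog.example.com` / `www.example.com` hostnames (placed in the subdomain's `.htaccess` or virtual host config):

```apacheconf
# Hypothetical sketch: 301-redirect every URL on blog.example.com
# to the matching path under www.example.com/blog/
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/blog/$1 [R=301,L]
```

Permanent (301) redirects, rather than 302s, are what pass the old URLs' equity along to the new subfolder paths.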
-
I don't think it makes any difference to rankings or to Panda duplicate-content issues.
http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
http://www.mattcutts.com/blog/subdomains-and-subdirectories/
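For anyone unclear on what the two options actually look like in a URL, here's a quick sketch (using hypothetical `example.com` URLs and a made-up helper name) that classifies a blog URL as subdomain-hosted or subdirectory-hosted:

```python
from urllib.parse import urlparse

def classify_blog_url(url, root_domain):
    """Return 'subdomain' or 'subdirectory' for a blog URL on root_domain."""
    host = urlparse(url).hostname or ""
    if host.startswith("www."):
        host = host[4:]  # treat www as the root site, not a separate subdomain
    if host != root_domain and host.endswith("." + root_domain):
        return "subdomain"     # e.g. blog.example.com/my-post
    return "subdirectory"      # e.g. www.example.com/blog/my-post

print(classify_blog_url("http://blog.example.com/my-post", "example.com"))      # subdomain
print(classify_blog_url("http://www.example.com/blog/my-post", "example.com"))  # subdirectory
```

The practical distinction the thread keeps circling: a subdirectory sits on the same hostname as the main site, while a subdomain is a different hostname that search engines can treat as a separate site.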