Subdomain vs Subdirectory - Specific Case: A big blog on a subdomain
-
Hi.
First of all, I love Moz and have learned a lot about SEO by reading articles here. Thanks for all the knowledge I've gained.
I've read all the articles about "subdomain vs. subdirectory" in the Moz community, and I have no doubt that subdirectories are the better option for a blog.
But the company I work for now has a blog with more than 17,000 articles and 1,000 categories and tags, hosted on a subdomain.
The website has a Domain Authority of 78 (I am working to improve that), and the blog subdomain has the same (78). We get 2.7 million visits per month on the blog and 4.5 million per month on the main site.
I am advising the company to move the blog into subfolders on the main domain, but I'm meeting resistance to the idea: the amount of work involved is enormous, and there is a fear of losing traffic.
My questions are:
Is there any risk of losing traffic, given the number of articles we have?
What would we likely gain by moving the blog into subfolders? Increased authority for the domain? More traffic?
How can I explain to my superiors that we would probably see increased traffic for our keywords?
Is there any way to prove or test the gains from this change before we run it?
Thanks in Advance.
-
One of the advantages of migrating a blog into a site's structure lies outside the realm of SEO, but it can be helpful in making the case. I have migrated several blogs into sections of existing e-commerce sites as "resource centers". These were not pure migrations, but rather restructurings of the content into pages which co-exist with merchandising on the e-commerce site. Here, the advantage is that once a visitor is attracted to a content page on the site, they are more effectively converted into an e-commerce prospect. This is especially true if the blog articles are restructured to include merchandising elements. But even if they are not, the mere fact that the visitor is now on a site whose navigation structure includes all of the e-commerce merchandising makes them more likely to convert. So the benefit is more about converting customers than attracting visitors.
One way I've tested this is to pick a single, relatively valuable page and migrate just that page to an on-site URL, with a one-to-one 301 redirect (a sketch of such a redirect follows below). You will have to wait out an initial period during which both pages may co-exist in the SERPs. But once the old blog page drops out and the new page comes in, you can measure the longer-term impact. Again, the number of visitors may actually be smaller, but you may more than compensate for that by converting more of those visitors.
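To make the test concrete, here is a minimal sketch of what that one-to-one redirect could look like in an Apache .htaccess file on the blog subdomain. Every URL and path below is a hypothetical placeholder rather than something from this thread, so adapt them to your own structure.

```apache
# Hypothetical sketch: redirect a single test article from the blog
# subdomain (blog.example.com) to its new home on the main domain.
# All URLs are placeholders -- substitute your own.

RewriteEngine On

# One-to-one 301 for the single test page:
# blog.example.com/my-valuable-article/ -> www.example.com/blog/my-valuable-article/
RewriteRule ^my-valuable-article/?$ https://www.example.com/blog/my-valuable-article/ [R=301,L]

# If the test pans out, the full migration could use a catch-all rule
# that maps every blog URL to the matching path under /blog/:
# RewriteRule ^(.*)$ https://www.example.com/blog/$1 [R=301,L]
```

Once the new URL has replaced the old one in the index, comparing traffic and conversions for that single page before and after gives you concrete numbers to put in front of your superiors.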
Related Questions
-
Specific page does not index
Hi, First question: While working on the indexation of all pages for a specific client, there's one page that refuses to index. Google Search Console says there's a robots.txt file, but I can't find any trace of it in the backend, nor in the code itself. Could someone reach out to me and tell me why this is happening? The page: https://www.brody.be/nl/assistentiewoningen/ Second question: Google is showing a different meta description than the one our client entered in the Yoast Premium snippet. Could another plugin be overwriting this description? Or do we just have to wait for it to change after a certain period of time? Hope you guys can help
Intermediate & Advanced SEO | conversal0 -
Big problems with site traffic
Hello! I have big problems with website promotion. It's been 7 months, and traffic on the site is 1-5 people a day. I do not understand the reason. Can you tell me what I'm doing wrong? Site: www.azartlist.com Many thanks.
Intermediate & Advanced SEO | Bobic1 -
Changing URLs from sentence case to lower case
Hi Guys, We are contemplating changing our site's URL structure from sentence case to all lowercase: www.example.com/All-Products/Bedroom-Furniture/ www.example.com/all-products/bedroom-furniture/ We will 301-redirect old URLs to new ones. It's a 3-year-old e-commerce site that currently ranks decently in the SERPs. The agency that does our SEO is recommending this change and reckons all-lowercase URLs are preferred over our current structure. My worry is that we will lose our current rankings, but the agency advises that rankings will probably drop or fluctuate for some time and then return to their original positions, or may even improve in due course, since we are doing 301 redirects and Google will recognize the change once the site is crawled. We are approaching Christmas, and the next 2 months are the busiest period of the year; we don't want to put traffic at risk. I would really appreciate the community experts' advice (there is also a rewrite sketch after this question): Are lowercase URLs really better than our current URL structure? By doing 301s, will our rankings return in "due course"? How much of a risk is it to make these changes at this time of the year? Thanking you in advance, Sohail
Intermediate & Advanced SEO | tigersohelll1 -
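As a technical aside to the question above: a lowercase migration is usually implemented as a single rewrite rule rather than thousands of individual redirects. Below is a minimal sketch for Apache; it assumes access to the server or virtual-host configuration, since RewriteMap is not permitted in .htaccess, and the domain name is a placeholder.

```apache
# Hypothetical sketch: 301-redirect any URL containing uppercase letters
# to its all-lowercase equivalent. RewriteMap must live in the server or
# virtual-host config, not in .htaccess.

RewriteEngine On

# Map that lowercases its input using Apache's internal tolower function.
RewriteMap lowercase int:tolower

# If the requested path contains any uppercase letter...
RewriteCond $1 [A-Z]
# ...redirect to the lowercased version of the same path.
RewriteRule ^/?(.*)$ https://www.example.com/${lowercase:$1} [R=301,L]
```

Because this is a pattern rule, it also catches stray mixed-case links from external sites, not just the URLs you already know about.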
How to Transfer Content from One Blog to Another
Hello Guys, I have 2 blogs with content. One gets a lot of visitors and the other gets a lot less. I'm thinking of transferring all the content from the "weak" blog to the "strong" blog. Both websites are on WordPress. My question is pretty simple: how can I transfer this content without losing traffic, and how can I avoid duplicate content? What are the best SEO practices (see the redirect sketch after this question)? Thanks!
Intermediate & Advanced SEO | Kalitenko20140 -
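For the consolidation question above, the usual pattern is to move the posts with WordPress's export/import tools and then 301-redirect every old URL to its new counterpart, so traffic and link equity follow the content. A minimal .htaccess sketch for the weak blog's server is below; the domain names are placeholders, and it assumes post slugs stay identical after the import.

```apache
# Hypothetical sketch, placed on the weak blog's server: redirect every
# URL to the same path on the strong blog. Assumes slugs are unchanged
# after the WordPress export/import.

RewriteEngine On
RewriteRule ^(.*)$ https://strong-blog.example.com/$1 [R=301,L]
```

The 301s also answer the duplicate-content worry: once the old URLs permanently redirect, search engines treat the new URLs as the canonical copies, so the same article never competes with itself in the index.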
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components: 1. Vehicle Listings Pages: the pages where the user applies various filters to narrow the vehicle listings and find the vehicle they want. 2. Vehicle Details Pages: the pages where the user actually views the details about a given vehicle, served up via Ajax in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day. We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results (example Google query). We did not originally think Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant to be navigated to directly! If a user navigates to the URL directly from the SERPs, they see a page that isn't styled correctly. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages: super easy to implement; conserves crawl budget for large sites; ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages: doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow those internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages: does prevent Vehicle Details pages from being indexed; allows ALL pages to be crawled (an advantage?).
Noindex disadvantages: difficult to implement, since Vehicle Details pages are served via Ajax and give us nowhere to put a meta tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex directive based on query-string variables, similar to a Stack Overflow solution I found (see the header sketch after this question). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it). It also forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required; the crawler could get stuck or lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed. And it cannot be used in conjunction with robots.txt: the crawler never reads the noindex tag if it's blocked by robots.txt.
Hash (#) URL advantages: by using hash URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow or crawl those links. Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that got robots.txt-disallowed pages indexed are gone. It accomplishes the same thing as nofollowing those links, but without looking like PageRank sculpting, and it does not require complex Apache configuration.
Hash (#) URL disadvantages: is Google suspicious of sites with (some) internal links structured like this, since it can't crawl or follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate Vehicle Details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on the internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting. If we implement noindex on these pages (a difficult task in itself), we can be certain they aren't indexed, but to do so we would have to remove the robots.txt disallow, in order to let the crawler read the noindex directive. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of Vehicle Details pages, all of which are noindexed; the crawler could easily get stuck or lost, it seems like a waste of resources, and in some shadowy way bad for SEO. My developers are pushing for the third solution, the hash URLs: it works on all hosts, keeps all functionality self-contained in the plugin (unlike noindex), and conserves crawl budget while keeping Vehicle Details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured this way. Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive -
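On the X-Robots-Tag approach mentioned in the question above, a hedged sketch of what it could look like in Apache 2.4+ is below. The query-string parameter name is a made-up placeholder, since the real one isn't given in the thread, and mod_headers must be enabled.

```apache
# Hypothetical sketch (Apache 2.4+, requires mod_headers): send a noindex
# header for vehicle details pages, identified here by a made-up
# "vehicle_id" query parameter.

<If "%{QUERY_STRING} =~ /(^|&)vehicle_id=/">
    Header set X-Robots-Tag "noindex, follow"
</If>
```

Unlike a robots.txt disallow, this still lets Googlebot fetch the page and see the directive, which is exactly why the question notes the two approaches can't be combined.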
Removing A Blog From Site...
Hi Everyone, One of my clients, for whom I am doing marketing consulting, is a big law firm. For the past 3 years they have been paying someone to write blog posts every day in hopes of improving search traffic to the site. The blog did indeed increase traffic to the site, but analyzing the stats, the firm generates no leads (via form or phone) from any of the search traffic that lands on the blog. Furthermore, I'm seeing Google send many search queries to blog pages when it would be much more beneficial to have that traffic go to the main part of the website. In short, the law firm's blog provides little to no value to end users and was written entirely for SEO purposes. Now, the law firm's website has 6,000 unique pages, and only 400 of them are NON-blog pages (the good stuff, essentially). About 35% of the site's total traffic lands on the blog pages from search, but again, this traffic does not convert, has a very high bounce rate, and I doubt there is any branding benefit either. With all that said, I don't know whether it would be best to delete the blog, redirect blog pages to some other page on the site, etc. The law firm has ceased writing new blog posts upon my recommendation as well. I am afraid of doing something ill-advised with the blog, since it now accounts for 95% of the website's pages. But again, it's useless drivel in my eyes that adds no value and was simply a misguided SEO effort from another marketer who heard blogs are good for SEO. I would certainly appreciate any guidance or advice on how best to handle this situation. Thank you for your kind help!
Intermediate & Advanced SEO | gbkevin0 -
Local search vs. Organic Listings
Hi ~ I was interested to see if anyone feels there might be an advantage to keeping a business out of Google's local search listings, or at least out of the 7-pack display. It seems to me that sites which are not listed in the 7-pack can often rank above the maps/7-pack area in the regular organic listings. Also, is there any way for a homepage to be listed on the first page in both the local and organic listings? Thanks!
Intermediate & Advanced SEO | hhdentist0 -
SEO value in backlink from blog.domain vs. domain
Will a backlink from "domain.com/abc" and one from "blog.domain.com/abc" have the same value from an SEO perspective? Assume the same article is written on both sites. I have been told that the bots look at the domain's value, and that the only blog links with less value are those in comments. As long as the "blog.domain/abc" page includes a full article and not a blog comment, it counts fully for SEO. Is this correct?
Intermediate & Advanced SEO | knielsen