Virtual Hub Page Impact
-
I currently have a website structure that has multiple subfolders.
One of the primary subfolders has hundreds of pages within it (e.g. www.mydomain.com/subfolder1/page).
The pages are all accessible through other subfolders, as contextually appropriate, but there is no existing hub page for the specific pages. In other words, while www.mydomain.com/subfolder1/page1....n are all valid URLs, www.mydomain.com/subfolder1/ is a 404.
My question: given that the pages within the subfolder are accessible through multiple other subfolders, how much of an issue is it that the subfolder URL itself 404s?
Does this negatively impact the site in any way?
-
TL;DR - Yes and no.
There are two cases, and I will be as descriptive as possible.
The first case is when the subfolder URL is linked from other pages, e.g. /subfolder2/page2 -> /subfolder1/ or /subfolder1/page1 -> /subfolder2/. Here you are linking directly to a 404 page, which frustrates users and bots alike. There is also some evidence that an unhelpful 404 page is itself a low-quality signal: http://themoralconcept.net/pandalist.html (#10 in the list of low-quality signals). That's why you should run a crawler and fix 404s: good for users and good for bots.
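As a sketch of what "run a crawler and fix 404s" means in practice, here is a minimal Python check. The page map and URL set are hypothetical stand-ins; in a real audit they would come from a crawler's output.

```python
def find_broken_links(pages, valid_urls):
    """Report internal links that point at URLs the site does not serve.

    pages: dict mapping each crawled URL to the internal links found on it.
    valid_urls: set of URLs that actually return 200.
    """
    broken = {}
    for page, links in pages.items():
        dead = [link for link in links if link not in valid_urls]
        if dead:
            broken[page] = dead
    return broken

# Hypothetical example: /subfolder1/ is linked but 404s,
# exactly the situation described in the question.
site = {
    "/subfolder2/page2": ["/subfolder1/", "/subfolder1/page1"],
    "/subfolder1/page1": ["/subfolder2/page2"],
}
live = {"/subfolder1/page1", "/subfolder2/page2"}
print(find_broken_links(site, live))  # {'/subfolder2/page2': ['/subfolder1/']}
```

Any page that shows up in the report either needs its link fixed or needs the target URL to start resolving.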
The second case is when users are curious. I'm one of them, sometimes. So let's say we have these URLs:
http://www.moz-team.com/randfishkin/article1
http://www.moz-team.com/randfishkin/article2
http://www.moz-team.com/randfishkin/article3
http://www.moz-team.com/cyrussheppard/article1
http://www.moz-team.com/cyrussheppard/article2
http://www.moz-team.com/cyrussheppard/article3
As you can see, the URLs are clean and very descriptive. Now add a curious user (pick me!) who wants to see more about Rand or Cyrus: a page with a CV, a short bio, or a list of all their articles. So they trim the URL down to http://www.moz-team.com/randfishkin/ or http://www.moz-team.com/cyrussheppard/ with the backspace key. In a perfect world this would return that information... but in your case it returns a 404, and that is not good for users. That's why it is much better to create a "category" page for each subfolder, even if it isn't linked from other pages. This has been explained many times as a "silo structure":
https://moz.com/blog/site-architecture-for-seo
http://www.bruceclay.com/eu/seo/silo.htm
http://www.stateofdigital.com/optimising-urls-seo-ux/
I hope this answer helps. You MUST optimize the site for users and bots alike.
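If you decide to build those category pages, the grouping step is trivial to script. A minimal Python sketch (reusing the example URLs from above) that groups article URLs by subfolder, so each hub page can list its own articles:

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_subfolder(urls):
    """Group article URLs by their first path segment, so each
    subfolder can get a hub page listing its own pages."""
    hubs = defaultdict(list)
    for url in urls:
        parts = urlparse(url).path.strip("/").split("/")
        if len(parts) >= 2:  # looks like /subfolder/article
            hubs["/" + parts[0] + "/"].append(url)
    return dict(hubs)

urls = [
    "http://www.moz-team.com/randfishkin/article1",
    "http://www.moz-team.com/randfishkin/article2",
    "http://www.moz-team.com/cyrussheppard/article1",
]
for hub, articles in group_by_subfolder(urls).items():
    print(hub, "->", len(articles), "article(s)")
```

Each key of the returned dict is a hub URL that should exist, and its value is the list of pages that hub should link to.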
Related Questions
-
Should we rename and update a page or create a new page entirely?
Hi Moz Peoples! We have a small site with a simple navigation, with only a few links on the nav bar. We have been doing some work to create a new page, which will eventually replace one of the links on the nav bar. The question we have is: is it better to rename the existing page, replace its content, and then wait for the great indexer to do its thing, or to permanently delete the page and replace it with the new page and content? Or is this a case where it really makes no difference as long as the redirects are set up correctly?
On-Page Optimization | Parker8180
-
Noindex pages being indexed
Hi all. Wondering if anyone could offer a pointer on a problem I am having, please. I am developing an affiliate store, and to prevent problems with duplicate content I have added <meta name="robots" content="noindex,follow" /> to all the product pages to avoid Google penalties. However, Google appears to be indexing the product pages. When I do a site: search I see a few hundred product pages in the engine. This is odd, as the site has always had noindex on these pages. Even viewing the cache of an indexed page shows the noindex meta tag to be in place. I'm at a loss as to why these pages are being indexed and could do with removing them ASAP to stop any penalties on the site. Many thanks for any help.
On-Page Optimization | carl_daedricdigital0
-
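A quick sanity check in cases like this is to parse the served HTML for the robots meta tag. If the directive is present but pages still get indexed, a common culprit is robots.txt blocking the URLs, because Googlebot then never fetches the page and never sees the noindex. A minimal Python sketch using only the standard library (the sample HTML is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tag, to confirm
    the noindex directive is actually present in the served HTML."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

html = '<html><head><meta name="robots" content="noindex,follow" /></head></html>'
finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives)  # ['noindex,follow']
```

Run this against the live product-page HTML (as served to bots, not a cached copy) and also confirm robots.txt is not disallowing those paths.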
Local Service Pages
We've all been here before if you do local. What type of content should go on a local service page when dealing with multiple service locations? You could:
- Describe services
- List local news articles
- List staff in that location (although I would prefer that on the staff page for that city)
- Testimonials from that location or service
But what happens when you are describing something that needs no explanation, or a medical procedure that requires no localization, where altering the wording could actually cause legal problems if misstated? Matt Cutts recommends a few sentences to a paragraph to describe a service, but my experience hasn't found this to hold up locally. Any ideas or suggestions about how this could be remedied?
On-Page Optimization | allenrocks0
-
Too many links on page -- how to fix
We are getting reports that there are too many links on most of the pages of one of the sites we manage. Not just a few too many... 275 (versus the target of fewer than 100). The entire site is built with a very heavy global navigation, which contains a lot of links; while users don't see all of that, Google does. Short of re-architecting the site, can you suggest ways to provide site navigation that don't violate this rule?
On-Page Optimization | novellseo2
-
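Counting the links Google actually sees is easy to script. A minimal Python sketch using the standard library (the sample HTML is hypothetical; in practice you would feed it each page's full served HTML):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> links in a page's HTML to audit it against a link budget."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href attribute are crawlable links.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

counter = LinkCounter()
counter.feed('<nav><a href="/a">A</a><a href="/b">B</a></nav><a name="x">no href</a>')
print(counter.count)  # 2
```

Running this per template (header, footer, sidebar, body) shows which shared component contributes most of the total, which is usually where the trimming has to happen.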
Too Many Links on Page
I'm having a problem with a crawl warning for our main site. The warning is that every one of my pages has too many links, a little over 1,000 on almost all of them. I think this is because the category list in our left-hand sidebar has so many categories, and that sidebar appears on every last one of our pages, even all the way into our products. Can anyone take a look and tell me if this is the reason why, and what I could possibly do about it? Thanks in advance! www.Ocelco.com
On-Page Optimization | Mike.Bean0
-
To Reduce (pages)... or not to Reduce?
Our site has a large business directory with millions of pages. For example's sake, let's say it's a directory of restaurants. Each restaurant has 4 pages on the site, tied together through a row of tabs across the top of the page: Tab 1 - basic super 7 info: name, location, contact info. Tab 2 - restaurant menu. Tab 3 - restaurant reviews. Tab 4 - photos of food. The Tab 1 page generates 95% of our traffic and 90% of conversions. The conversion rate on the Tab 2 - Tab 4 pages is 6-10x greater than Tab 1 conversions. Total conversions from search queries on menus, reviews and food are 20% higher than conversions resulting from searches on restaurant name and info alone. We're working with a consultant on a redesign, who wants to consolidate the 4 pages into one. Their advice is to focus on making a better page featuring all of the content, sacrifice a little organic traffic, and make up any losses by improving conversion. My counterpoint is that we shouldn't scrap the Tab 2-4 pages just because they have lower traffic; we should make the pages BETTER. The content we display is thin, and we have plenty of data we could expose to make the pages more robust. By consolidating, it will also be hard to optimize one page for people searching for name/location AND menu AND reviews AND photos. We're asking that one page to do too much, and it's likely we will see diminished search volume for queries on menus, reviews and food. I think the decline will be much more significant than the consultant estimates. The consultant says there will be little change to organic traffic, since Tab 1 already generates 95% of traffic; through basic math, they're saying the risk is a 5% decline in organic traffic. Further, they see little chance of queries for menus, reviews, and food declining, because most of those queries tend to send people to the home page or Tab 1 page anyway.
Finally, the designer of the new wireframes admitted that potential organic-traffic risks were not taken into consideration when they recommended consolidating the pages. I sincerely appreciate your thoughts and consideration! Trisha
On-Page Optimization | lzhao0
-
How different does each page title need to be?
I've got a site that is all about wood countertops. There are a few ways people can find info on wood tops: (main) wood countertops, (main) butcher block, butcher block counters, wood counters, hardwood countertops, etc. For the most part I want to rank for the two top key phrases, because they pretty much cover all the other bases, with Google being as smart as it is. So the question is: how different should each page title be? Examples: Wood Countertops - Butcher Block Counters | by J. Aaron = index page; Wood Counter tops - Butcher Block Counters - About Us | J. Aaron = about us page; Cleaning Butcher Block - Wood Countertop Maintenance | J. Aaron = care & maintenance page. Would it be OK to use <title>Wood Countertops - Butcher Block Counters | by J. Aaron</title> as the template for the whole site, with the actual page subject added as an additional piece of the sentence, like example 2, or would that be too similar? Also, is that a good idea, or should I commit to optimizing each page for a different key phrase? If so, would you optimize the home page for the most-searched-for phrase and let the other pages back it up with the other search terms?
On-Page Optimization | JAARON0
-
If a site has https versions of every page, will the search engines view them as duplicate pages?
A client's site has HTTPS versions of every page, and it is possible to view both the http and https version of each page. Do the search engines view this as duplicate content?
On-Page Optimization | harryholmes0070
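They can, unless you signal which version is canonical. The usual fixes are a site-wide 301 redirect from http to https, or rel=canonical tags pointing at the https URLs. A sketch of the redirect, assuming an Apache server with mod_rewrite enabled (adjust for your actual stack):

```apache
# Redirect every http request to its https equivalent with a permanent 301,
# so search engines consolidate signals onto a single version of each page.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With the redirect in place, only the https versions remain reachable, so there is nothing left to count as a duplicate.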