Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Are dead-end pages really an issue?
-
Hi all,
We have many pages that serve as help guides for our features. These pages have no outgoing links at all (internal or external). We haven't added any links because these are already fourth-level pages, each specific to a particular topic, so they are technically dead-end pages. Do these pages really hurt us? Do we need to link out to other pages from them?
Thanks
-
Thanks for the response. Actually, we serve these pages differently to Googlebot and to users. Users can see related pages, but when we browse as the bot those links aren't there, so ultimately most of these pages have no other links as far as Google is concerned. From your reply it's clear that we definitely can add different links. But as I said, these pages are user guides for every feature we provide, so can we link each guide back to the feature page it's linked from? We'll also think about what other links we can add to these pages.
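For reference, here is a rough way to double-check what the bot is actually served on one of these guide pages: a minimal Python sketch (standard library only) that fetches the page once with a normal browser user-agent and once with a Googlebot user-agent string, then diffs the links found in each response. The URL and user-agent strings below are illustrative placeholders, not values from this thread; a server that detects Googlebot by IP rather than user-agent won't react to this, so Search Console's URL Inspection tool remains the authoritative check.

```python
# Minimal sketch: fetch one guide page twice (browser UA vs. Googlebot UA)
# and compare the <a href> links found in each response.
# URL and user-agent strings are illustrative placeholders.
from html.parser import HTMLParser
from urllib.request import Request, urlopen


class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)


def links_seen(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    return parser.links


URL = "https://www.example.com/help/some-feature-guide"  # hypothetical guide page
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

user_links = links_seen(URL, BROWSER_UA)
bot_links = links_seen(URL, GOOGLEBOT_UA)

print(f"Links served to a browser UA:   {len(user_links)}")
print(f"Links served to a Googlebot UA: {len(bot_links)}")
print("Links missing for the bot:", sorted(user_links - bot_links))
```

If the bot response comes back with no links in the body, those guides really are dead ends as far as Google is concerned.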
-
Hi vtmoz,
Are you saying that you don't have a header/footer on these pages? No main navigation? No breadcrumbs?
To me, that sounds like a terrible user experience. What is a user supposed to do when they get to these pages? Click the back button in the browser or manually edit the URL in the address bar?
If you do have a header/footer with navigation links, then it's not a dead-end page.
Cheers,
David
-
Yes, dead-end pages are harmful to websites.
Solution 1: Optimize these pages to include links back to the homepage or to other important pages on the site.
Solution 2: Add a sidebar, footer, or banner that carries navigation links.
Thanks
Related Questions
-
How to fix non-crawlable pages affected by CSS modals?
I stumbled across something new when doing a site audit in SEMRUSH today: modals. The case: several pages could not be crawled because of (modal:) in the URL. What I know: "A modal is a dialog box/popup window that is displayed on top of the current page", built with CSS and JS. What I don't know: how to prevent crawlers from finding them.
Web Design | Dan-Louis0
-
Is having a site map page necessary?
Hello all! I know it's important to have an XML sitemap file and to reference it in your robots.txt file. I also know it's important to submit your XML sitemap to Google and Bing. However, I am wondering whether it is beneficial for your site's SEO to also have a sitemap page displayed on your website, or whether that's just redundant once you've done the two things above with your XML sitemap. Thanks in advance!
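For what it's worth, an HTML sitemap page doesn't have to be maintained by hand: it can be generated from the XML sitemap you already submit. Below is a rough Python sketch under that assumption; the file paths are hypothetical and it expects the standard <urlset>/<url>/<loc> sitemap structure.

```python
# Rough sketch: generate a simple HTML sitemap page from an existing XML
# sitemap. File paths are hypothetical; assumes the standard sitemap namespace.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def build_html_sitemap(xml_path, out_path):
    """Read <loc> entries from the XML sitemap and write a plain HTML list."""
    root = ET.parse(xml_path).getroot()
    urls = sorted(loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS))
    items = "\n".join(f'      <li><a href="{u}">{u}</a></li>' for u in urls)
    page = (
        "<!DOCTYPE html>\n"
        "<html>\n  <head><title>Site map</title></head>\n"
        "  <body>\n    <h1>Site map</h1>\n    <ul>\n"
        f"{items}\n"
        "    </ul>\n  </body>\n</html>\n"
    )
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(page)
    return len(urls)


if __name__ == "__main__":
    count = build_html_sitemap("sitemap.xml", "sitemap.html")
    print(f"Wrote {count} URLs to sitemap.html")
```

Generating the page this way at least keeps it in sync with the XML sitemap at no extra maintenance cost, whatever its SEO value turns out to be.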
Web Design | Myles920
-
Https pages indexed but all web pages are http - please can you offer some help?
Dear Moz Community, please could you take a look and offer some definite steps or advice?

I contacted the host provider, and his initial thought was that WordPress was causing the https problem: e.g. when the https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no actively configured SSL; it's just waiting as part of the hosting package, just in case. But I found that the SSL certificate still shows up during a crawl.

It's important to eliminate the https problem before external backlinks point to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version.

I checked a few more URLs to see whether it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using https://, and the https page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors stay on the http version of the site and don't get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts on that?

As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.

One suggestion would be to disable https entirely, since there should be no trace of SSL when the site is crawled. I don't want to enable https in the .htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable function of the site at all:

RewriteEngine On
RewriteCond %{HTTPS} off

or disable SSL completely for now until it becomes a necessity for the website.

I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term.

Kind regards
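As a quick way to see how the server currently answers https requests, here is a small sketch (Python, standard library) that requests the https version of a few URLs without following redirects and reports the status code and Location header. domain.co.uk is the placeholder from the question above and the inner page path is made up; the script only observes behaviour and changes nothing on the server.

```python
# Small sketch: request the https:// version of some URLs without following
# redirects, and report the status code and Location header (if any).
# "domain.co.uk" is the placeholder from the question; the inner path is made up.
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw response is visible."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


opener = urllib.request.build_opener(NoRedirect)

URLS = [
    "https://domain.co.uk/",
    "https://domain.co.uk/some-page/",  # hypothetical inner page
]

for url in URLS:
    try:
        resp = opener.open(url, timeout=10)
        print(f"{url} -> {resp.status}: served directly over https, no redirect")
    except urllib.error.HTTPError as e:
        if e.code in (301, 302, 307, 308):
            print(f"{url} -> {e.code} redirect to {e.headers.get('Location', '?')}")
        else:
            print(f"{url} -> HTTP error {e.code}")
    except Exception as e:
        print(f"{url} -> request failed: {e}")
```

A 301 to the http version is the behaviour described above as desirable; a 200 means the page is being served directly over https and is a candidate for a rewrite rule or for disabling SSL, as discussed in the question.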
Web Design | SEOguy10
-
2 Menu links to same page. Is this a problem?
One of my clients wants to link to the same page from several places in the navigation menu. Does this create any crawl issues or indexing problems? It's the same page (same URL), so there are no duplicate content problems. Since the page is promotional, the client wants it accessible from different places in the nav bar. Thanks, Dino
Web Design | Dino640
-
SEO and Squarespace? Is this Really an Option?
Hi all, any feedback on Squarespace, its SEO capabilities, and ranking factors? I have a client wishing to use the platform, and despite the good reviews (which appear to be from resellers, by the way), the forums say otherwise. Although apparently Rand Fishkin, SEOMoz (yes, right here!) gave them a big thumbs up: "The square space team have put together a remarkable platform, SEO friendliness!" I'm really not sure here and don't agree; there are many limitations, and hosting with a template provider is always a big no-no. Cheers
Web Design | VirginiaC
Virginia0
-
What is the best tool to view your page as Googlebot?
Our site was done with asp.net and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool that duplicates Googlebot? I have found several but they seem old or inaccurate.
Web Design | EcommerceSite0
-
Two home pages?
One of my campaigns shows duplicate page content for domain xxx and xxx/index. There is only one index (home) page, so why does it report on two?
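If it helps to see why the crawler counts them as two pages, here is a quick sketch (Python, standard library) that fetches both URLs and reports the final URL, status code, and any rel="canonical" tag. The example.com addresses are placeholders for the campaign's domain, and the regex is only a rough check.

```python
# Quick sketch: compare the root URL and /index to see whether one redirects
# to the other and whether either page declares a rel="canonical".
# The example.com URLs are placeholders; the regex is a rough check only.
import re
import urllib.request

PAIR = [
    "http://www.example.com/",       # hypothetical root URL
    "http://www.example.com/index",  # hypothetical /index variant
]

for url in PAIR:
    resp = urllib.request.urlopen(url, timeout=10)
    html = resp.read().decode("utf-8", errors="replace")
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I
    )
    print(url)
    print(f"  final URL: {resp.geturl()}   status: {resp.status}")
    print(f"  canonical: {match.group(1) if match else 'none found'}")
```

If both URLs return 200 with the same content and neither redirects nor declares a canonical, that is exactly what the duplicate-content report is flagging; a 301 from /index to the root, or a canonical tag on both, is the usual fix.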
Web Design | Beemer0
-
Site-wide footer links or single "website credits" page?
I see that you already answered this question back in 2007 (http://www.seomoz.org/qa/view/2163), but I wanted to ask your current opinion on the same question: should I add a site-wide footer link on my clients' websites pointing to my website, or should I create a "website credits" page on each client's site, add that to the footer, and then link from within that page out to my website?
Web Design | eseyo0