Multi-page articles, pagination, best practice...
-
A couple of months ago we migrated a 12-year-old site -- about 2,000 pages -- to WordPress.
The transition was smooth (301 redirects), and we haven't lost much search juice. We have about 75 multi-page articles (posts); we're using a plugin (Organize Series) to manage the pagination.
On the old site, all of the pages in a series had the same title. I've since heard this is not good SEO practice (duplicate titles). The URLs were nearly identical too, with a number (designating the page number) appended to the title text.
Here are my questions:
1. Is there a best practice for titles and URLs of multi-page articles?
Let's say we have an article named: 'This is an Article' ... What if I name the pages like this:
-- This is an Article, Page 1
-- This is an Article, Page 2
-- This is an Article, Page 3
Is that a good idea? Or, should each page have a completely different title? Does it matter?
** I think for usability, the examples above are best; they give the reader context.
What about URLs? Are these a good idea? /this-is-an-article-01, /this-is-an-article-02, and so on... Does it matter?
2. I've read that multi-page articles may not be such a good idea -- from both usability and SEO standpoints. We tend to limit our articles to about 800 words per page. So, is it better to publish long, single-page articles instead of multi-page ones? Does it matter? I think I'm seeing a trend on content sites toward long, one-page articles.
3. Are there any other gotchas we should be aware of related to SEO and multi-page articles?
Long post... we've gone back and forth on this a couple of times and need to get it settled.
Thanks much!
Jim
-
Guys, thanks.
-
Just to weigh in, I agree with Jeff that one long page is much better from both a usability and an SEO standpoint.
In my view, multiple pages should only exist in the context of a hub page. For example, consider a page for slow cooker recipes. Instead of having hundreds of recipes on one page, it would make sense to have a sub-page for each recipe, e.g.:
- example.com/slow-cooker-recipes/
- example.com/slow-cooker-recipes/lamb-stew
- example.com/slow-cooker-recipes/chicken-casserole
Check out the site architecture section on the following link for a good explanation:
http://moz.com/blog/how-to-rank
Best of luck!
-Oli
-
Jim-
I'm not a big fan of articles that are broken up onto many pages.
The thinking in the past has been:
- Break up the pages, and you get more page views. (Great if you are serving advertising.)
- The page will (possibly) load more quickly because you have less content on each page.
- Many marketing agencies want everything above the fold, so shorter pages "look better."
The reality, I think:
- Users hate having to scroll to the bottom of a page, click the "more" link, and then wait 3-6 seconds for the next page to load -- especially on a mobile device.
- Duplicate page titles add complexity. I'd recommend the rel="prev" / rel="next" tags; they could help in this case (example markup below).
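For illustration, here is roughly what that markup might look like in the head of page 2 of a three-page article, using Jim's hypothetical titles and URL pattern (the example.com domain is assumed; this is a sketch, not a definitive implementation):
<title>This is an Article, Page 2</title>
<link rel="prev" href="http://example.com/this-is-an-article-01/">
<link rel="next" href="http://example.com/this-is-an-article-03/">
Each page keeps a distinct title (the ", Page N" suffix avoids exact duplicates), while rel="prev"/rel="next" signals that the pages belong to one paginated series.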
My $0.02 is that you should go with single, long-page articles. I have found that search engines love, love, love pages that have a lot of content (as long as it's well written). A page with 12,000 words of content will often outrank a page with 250 words of marketing fluff.
If in doubt, though, test it: convert one or two articles over and see how they rank.
Thanks!
- Jeff
Related Questions
-
How to fix non-crawlable pages affected by CSS modals?
I stumbled across something new when doing a site audit in SEMRUSH today: modals. The case: several pages could not be crawled because of (modal:) in the URL. What I know: "a modal is a dialog box/popup window that is displayed on top of the current page," built with CSS and JS. What I don't know: how to prevent crawlers from finding them.
Web Design | Dan-Louis
-
How is a Single Page Application (SPA) bad for SEO?
Hi guys. I am quite inspired by the SPA technique. It's really amazing when all your interaction with the site happens on the fly and you don't see any page reloads. I've started implementing the site following this instruction and have already found some nice guys to do the design. The only downside of using an SPA that I can see is the SEO part. That's because the URL does not really change and different pages don't have their own unique URL addresses.
Actually they do, but they look like: yoursite.com/#/products, yoursite.com/#/prices, yoursite.com/#/contact. So all of them come after the # and are just anchors. For Google this means all of these pages are just yoursite.com/. My question is: what is a proven method to implement the URL structure in a Single Page Application so that all the pages are indexed by Google correctly? (Sorry, I don't mention the other search engines because of market share.) The other question, of course, is examples. It would be great to see real-life examples of sites -- preferably authority sites -- that use the SPA technique and are well indexed by search engines.
Web Design | Billy_gym
-
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice?
I contacted the host provider, and his initial thought was that WordPress was causing the https problem: e.g., when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL -- it's just waiting as part of the hosting package, just in case -- but I found that the SSL certificate is still showing up during a crawl.
It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version.
I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be. Search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts regarding that?
As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.
One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the .htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable function of the site at all:
RewriteEngine On
RewriteCond %{HTTPS} off
...or to disable the SSL completely for now until it becomes a necessity for the website.
I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term.
Kind regards
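For reference, here is a minimal sketch of the https-to-http 301 redirect being discussed, assuming an Apache .htaccess file (note that such a rule tests whether HTTPS is on; the %{HTTPS} off condition quoted above matches plain-http requests instead):
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
With a rule like this in place, any https request is permanently redirected to its http equivalent, which is one way to address the indexed https pages described above -- whether that or disabling SSL entirely is the better option is exactly what is being asked.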
Web Design | SEOguy1
-
Problems preventing WordPress attachment pages from being indexed and from being seen as duplicate content.
Hi. According to a Moz crawl, it looks like the WordPress attachment pages from all image uploads are being indexed and seen as duplicate content -- or is it the Yoast sitemap causing it? I see two options in Yoast SEO: (1) redirect attachment URLs to the parent post URL, or (2) Media > Meta Robots: noindex, follow. I set it to (1) initially, which didn't resolve the problem. Then I set it to option (2) so that all images won't be indexed but search engines would still associate those images with their relevant posts and pages. However, I understand what both of these options mean, but because I chose option (2), will that mean all of the images on the website won't stand a chance of being indexed in search engines, Google Images, etc.?
As far as duplicate content goes, search engines can get confused, and there are two ways for search engines to reach the correct page content destination. But when, e.g., Google makes the wrong choice, a portion of traffic drops off (is lost, hence errors), which then leaves the searcher frustrated, and this affects the SEO and ranking of the site, which worsens with time. My goal here is: I would like all of the web images to be indexed by Google, and for all of the image attachment pages to not be indexed at all (Moz shows the image attachment pages as duplicates, and the referring URL causing this is the sitemap URL which Yoast creates); that sitemap URL has been submitted to the search engines already, and I will resubmit once I can resolve the attachment page issues. Please can you advise? Thanks.
Web Design | SEOguy1
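As an illustration of option (2) above: a noindex, follow setting generally results in a robots meta tag like the following in the head of each attachment page (a generic sketch, not necessarily Yoast's exact output), which asks search engines to keep that page out of the index while still following its links -- so the image files themselves can still be discovered and indexed via the posts that embed them:
<meta name="robots" content="noindex, follow">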
-
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: consider an ecommerce site that has multiple sites around the world which are all variations of the same thing, just in different languages. Now let's say some of these exist on just a normal .com page while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using:
<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />
...would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting about 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
Web Design | DRSearchEngOpt
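For context, a minimal sketch of how an entry like this usually sits inside the sitemap -- with the xhtml namespace declared on the urlset and a self-referencing alternate for the page's own language included alongside the others (URLs are the hypothetical ones from the question):
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.co.uk/en_GB/</loc>
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/" />
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
    <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />
  </url>
</urlset>
The same set of alternates is then repeated, with the matching <loc>, in each ccTLD's own sitemap.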
-
Best Webhosting Suggestions??
Good morning, my fellow Mozzers! I am currently looking at adding some diversity to my current web hosting, and I was hoping I could get some suggestions. I don't currently need a VPS or dedicated server; I just need some shared hosting -- you know, packages that are sub-$20 a month... I mean, I will pay more than that, but so far everything I look at that meets my needs (basic hosting, email, etc.)... This is for client sites, and they are growing in number somewhat rapidly. I currently host with GoDaddy and they are amazing in the support department, but I do question whether their servers are causing slow page loads, etc. -- but all in all I am happy with them. I have used Network Solutions in the past, but left them because I was not a big fan of talking to support people in India and Malaysia. I do think that their servers might have performed better than GoDaddy's, so I am not ruling them out. At this point I am looking for a provider that has excellent support and whose servers are not so overloaded that they render pages and content slowly. Performance is very important to me. I am not looking for the cheapest; I am looking for the overall best. Thanks in advance, SEOmoz family!!!
Web Design | WebbyNabler
-
Best Practice issue: Modx vs WordPress
Lately I've been working a lot with Modx to create a new site for our own firm as well as for other projects. But so far I haven't seen the advantages for SEO purposes, other than the fact that with ModX you can manage almost everything yourself, including snippets etc., without too much effort. WordPress is a known factor for blogging and, for the last two years or so, for websites as well. My question is: which platform is better suited for SEO purposes? Which should I invest my time in? ModX or WordPress? Hope to hear your thoughts on the matter.
Web Design | JarnoNijzing
-
Where is the best place to put reciprocal links on our website?
Where should reciprocal links be placed on our website? Should we create a "Resources" page? Should the page be "hidden" from the public? I know there is a right answer out there! Thank you for your help!
Jay
Web Design | theideapeople