
The Moz Q&A Forum


Moz Q&A is closed.

After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Can PDF be seen as duplicate content? If so, how to prevent it?

Intermediate & Advanced SEO
  • Gestisoft-Qc (Subscriber) last edited by Jan 23, 2012, 7:13 PM

    I see no reason why PDFs couldn't be considered duplicate content, but I haven't seen any threads about it.

    We publish loads of product documentation provided by manufacturers, as well as white papers and case studies. These give our customers and prospects a better idea of our solutions and help them along in their buying process.

    However, I'm not sure if it would be better to make them non-indexable to prevent duplicate content issues. Clearly we would prefer a solution where we benefit from the keywords in the documents.

    Does anyone have insight on how to deal with PDFs provided by third parties?

    Thanks in advance.

    • ilonka65 last edited by Apr 10, 2015, 2:38 AM

      It looks like Google is not crawling tabbed content anymore, so if your PDFs are tucked into tabs within your pages, this might not be an issue: https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html

      • ASriv (Subscriber) last edited by May 1, 2014, 4:05 PM

        Sure, I understand - thanks EGOL

        • EGOL @ASriv last edited by May 1, 2014, 2:41 PM

          I would like to give that to you but it is on a site that I don't share in forums.  Sorry.

          • ASriv (Subscriber) last edited by May 1, 2014, 2:15 PM

            Thanks EGOL

            That would be ideal.

            For a site with multiple authors, where it's impractical to get a developer involved every time a web page or blog post and its PDF are created, is there a single line of code that could be used to accomplish this in .htaccess?

            If so, would you be able to show me an example please?

            • EGOL last edited by May 1, 2014, 2:08 PM

              I assigned rel=canonical to my PDFs using htaccess.

              Then, if anyone links to the PDFs, the link value gets passed to the webpage.
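              EGOL doesn't share his actual rule, but the technique can be sketched in .htaccess with Apache's mod_headers. Every file name and URL below is hypothetical, and note that each PDF needs its own rule, because each PDF's canonical must point at its own HTML counterpart:

```apache
# Hypothetical sketch (not EGOL's actual rule); requires mod_headers.
# Sends an HTTP Link header with the PDF -- the header-level
# equivalent of an HTML rel=canonical tag. One <Files> block per PDF,
# since each PDF points at a different HTML page.
<Files "ultimate-guide.pdf">
    Header set Link "<https://www.example.com/ultimate-guide>; rel=\"canonical\""
</Files>
```

              Because the target URL differs per document, a single generic line covering a whole folder of unrelated PDFs isn't really possible with this approach.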

              • ASriv (Subscriber) last edited by May 1, 2014, 2:04 PM

                Hi all

                I've been discussing the topic of making content available as both blog posts and pdf downloads today.

                Given that there is a lot of uncertainty and complexity around this issue of potential duplication, my plan is to house all the pdfs in a folder that we block with robots.txt

                Anyone agree / disagree with this approach?
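                That plan, assuming a hypothetical /pdfs/ folder name, amounts to two lines in robots.txt:

```
# Hypothetical robots.txt entry: blocks crawling of the PDF folder.
# Blocked files can't be flagged as duplicates, but they also stop
# ranking and stop passing value from any links they contain.
User-agent: *
Disallow: /pdfs/
```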

                • Dr-Pete Staff @ATMOSMarketing56 last edited by Aug 1, 2013, 6:54 PM

                  Unfortunately, there's no great way to have it both ways. If you want these pages to get indexed for the links, then they're potential duplicates. If Google filters them out, the links probably won't count. Worst case, it could cause Panda-scale problems. Honestly, I suspect the link value is minimal and outweighed by the risk, but it depends quite a bit on the scope of what you're doing and the general link profile of the site.

                  • ATMOSMarketing56 (Subscriber) last edited by Aug 1, 2013, 3:30 PM

                    I think you can set it to public or private (logged-in only) and even put a price-tag on it if you want. So yes setting it to private would help to eliminate the dup content issue, but it would also hide the links that I'm using to link-build.

                    I would imagine that since this guide would link back to our original site that it would be no different than if someone were to copy the content from our site and link back to us with it, thus crediting us as the original source. Especially if we ensure to index it through GWMT before submitting to other platforms. Any good resources that delve into that?

                    • Dr-Pete Staff last edited by Aug 1, 2013, 3:14 PM

                      Potentially, but I'm honestly not sure how Scribd's pages are indexed. Don't you need to log in or something to actually see the content on Scribd?

                      • ATMOSMarketing56 (Subscriber) last edited by Aug 1, 2013, 11:30 AM

                        What about this instance:

                        (A) I made an "ultimate guide to X" and posted it on my site as individual HTML pages for each chapter

                        (B) I made a PDF version with the exact same content that people can download directly from the site

                        (C) I uploaded the PDF to sites like Scribd.com to help distribute it further, and build links with the links that are embedded in the PDF.

                        Would those all be dup content? Is (C) recommended or not?

                        • EGOL @Gestisoft-Qc last edited by Jan 25, 2012, 11:39 PM

                          Thanks! I am going to look into this. I'll let you know if I learn anything.

                          • Dr-Pete Staff @Gestisoft-Qc posted Jan 25, 2012, 8:22 PM, last edited Jan 26, 2012, 12:54 PM

                            If they duplicate your main content, I think the header-level canonical may be a good way to go. For the syndication scenario, it's tough, because then you're knocking those PDFs out of the rankings, potentially, in favor of someone else's content.

                            Honestly, I've seen very few people deal with canonicalization for PDFs, and even those cases were small or obvious (like a page with the exact same content being outranked by the duplicate PDF). It's kind of uncharted territory.

                            • EGOL @Gestisoft-Qc last edited by Jan 25, 2012, 8:13 PM

                              Thanks for all of your input, Dr. Pete. The example that you use is almost exactly what I have - hundreds of .pdfs on a fifty-page site. These .pdfs rank well in the SERPs, accumulate PageRank, and pass traffic and link value back to the main site through links embedded within the .pdf. They also have natural links from other domains. I don't want to block them or nofollow them, but your suggestion of using the header directive sounds pretty good.

                              • Dr-Pete Staff @Gestisoft-Qc posted Jan 25, 2012, 7:15 PM, last edited Jan 26, 2012, 12:53 PM

                                Oh, sorry - so these PDFs aren't duplicates with your own web/HTML content so much as duplicates with the same PDFs on other websites?

                                That's more like a syndication situation. It is possible that, if enough people post these PDFs, you could run into trouble, but I've never seen that. More likely, your versions just wouldn't rank. Theoretically, you could use the header-level canonical tag cross-domain, but I've honestly never seen that tested.

                                If you're talking about a handful of PDFs, they're a small percentage of your overall indexed content, and that content is unique, I wouldn't worry too much. If you're talking about 100s of PDFs on a 50-page website, then I'd control it. Unfortunately, at that point, you'd probably have to put the PDFs in a folder and outright block it. You'd remove the risk, but you'd stop ranking on those PDFs as well.

                                • EGOL @Gestisoft-Qc last edited by Jan 25, 2012, 1:56 PM

                                  @EGOL: Can you expand a bit on your Author suggestion?

                                  I was wondering if there is a way to do rel=author for a PDF document.  I don't know how to do it and don't know if it is possible.

                                  • Gestisoft-Qc (Subscriber) @Dr-Pete last edited by Jan 24, 2012, 5:08 PM

                                    To make sure I understand what I'm reading:

                                    • PDFs don't usually rank as well as regular pages (although it is possible)
                                    • It is possible to configure a canonical tag on a PDF

                                    My concern isn't that our PDFs may outrank the original content, but rather getting slammed by Google for publishing them.

                                    Am I right in thinking a canonical tag prevents the PDF from accumulating link juice? If so, I would prefer not to use it, unless skipping it would get us slammed by Google.

                                    Has anyone experienced Google retribution for publishing PDFs that come from a third party?

                                    @EGOL: Can you expand a bit on your Author suggestion?

                                    Thanks all!

                                    • Dr-Pete Staff last edited by Jan 24, 2012, 3:10 PM

                                      I think it's possible, but I've only seen it in cases that are a bit hard to disentangle. For example, I've seen a PDF outrank a duplicate piece of regular content when the regular content had other issues (including massive duplication with other, regular content). My gut feeling is that it's unusual.

                                      If you're concerned about it, you can canonicalize PDFs with the header-level canonical directive. It's a bit more technically complex than the standard HTML canonical tag:

                                      http://googlewebmastercentral.blogspot.com/2011/06/supporting-relcanonical-http-headers.html

                                      I'm going to mark this as "Discussion", just in case anyone else has seen real-world examples.
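                                      Concretely, the header-level directive described in that post rides on the HTTP response rather than inside the document. For a hypothetical PDF (example.com stands in for a real domain), the response would carry it like this:

```
HTTP/1.1 200 OK
Content-Type: application/pdf
Link: <https://www.example.com/ultimate-guide>; rel="canonical"
```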

                                      • EGOL last edited by Jan 23, 2012, 9:30 PM

                                        I am really interested in hearing what others have to say about this.

                                        I know that .pdfs can be very valuable content.  They can be optimized, they rank in the SERPs, they accumulate PageRank and they can pass link value.  So, to me it would be a mistake to block them from the index...

                                        However, I see your point about dupe content... they could also be thin content.  Will Panda whack you for thin and dupes in your PDFs?

                                        How can canonical be used... what about author?

                                        Anybody know anything about this?

                                        • MargaritaS posted Jan 23, 2012, 7:20 PM, last edited Jan 24, 2012, 3:10 PM

                                          Just like any other piece of duplicate content, you can use canonical link elements to specify the original piece of content (if there's indeed more than one identical piece). You could also block these types of files in the robots.txt, or use noindex-follow meta tags.

                                          Regards,

                                          Margarita
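                                          Of the options Margarita lists, the meta-tag route needs a twist for PDFs: a PDF has no HTML head to hold a meta tag, so a noindex directive has to travel as an X-Robots-Tag HTTP header instead. A sketch in .htaccess (mod_headers assumed; the pattern is illustrative, not from this thread):

```apache
# Hypothetical sketch: serve every PDF with a noindex, follow directive.
# Unlike a robots.txt block, Google still crawls the file and can see
# the links inside it, but the PDF itself stays out of the index.
<FilesMatch "\.pdf$">
    Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```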


