    May know what's the meaning of these parameters in .htaccess?

    Intermediate & Advanced SEO
    • esiow2013
      esiow2013 last edited by

      # Begin HackRepair.com Blacklist

      RewriteEngine on

      # Abuse Agent Blocking

      RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Bolt\ 0 [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} CazoodleBot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Custo [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Default\ Browser\ 0 [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^DIIbot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^DISCo [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} discobot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^eCatch [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ecxi [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^FlashGet [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^GetRight [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^GrabNet [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Grafula [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} GT::WWW [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} heritrix [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^HMView [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} HTTP::Lite [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ia_archiver [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} IDBot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} id-search [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} id-search.org [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^InterGET [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} IRLbot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ISC\ Systems\ iRc\ Search\ 2.1 [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Java [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^JetCar [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^larbin [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Link [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} LinksManager.com_bot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} linkwalker [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} lwp-trivial [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Maxthon$ [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} MFC_Tear_Sample [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^microsoft.url [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} Microsoft\ URL\ Control [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} Missigua\ Locator [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*NEWT [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Navroad [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^NearSite [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^NetAnts [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^NetSpider [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^NetZIP [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Octopus [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} panscient.com [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^pavuk [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} PECL::HTTP [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^PeoplePal [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} PHPCrawl [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} PleaseCrawl [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^psbot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^RealDownload [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^ReGet [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Rippers\ 0 [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} SBIder [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^SeaMonkey$ [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} Snoopy [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} Steeler [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^SuperBot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Surfbot [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Toata\ dragostea\ mea\ pentru\ diavola [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} URI::Fetch [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} urllib [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} User-Agent [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} Web\ Sucker [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} webalta [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebAuto [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} WebCollage [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebCopier [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebFetch [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebReaper [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebSauger [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebStripper [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WebZIP [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} Wells\ Search\ II [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} WEP\ Search [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Widow [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WWW-Mechanize [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} zermelo [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ^(.*)Zeus.*Webster [NC,OR]
      RewriteCond %{HTTP_USER_AGENT} ZyBorg [NC]
      RewriteRule ^.* - [F,L]

      # Abuse bot blocking rule end

      # End HackRepair.com Blacklist

      • esiow2013
        esiow2013 last edited by

        Now it's clear. Thanks a lot ThompsonPaul! 🙂

        • ThompsonPaul
          ThompsonPaul @esiow2013 last edited by

          Thanks! 🙂

          Typically these blacklists are created and maintained by security specialists who have done testing on the different bots to determine which are legit/beneficial and which are crapbots. They then provide these lists for others to use. Often the lists are amalgamations of bots detected and analysed on a number of different sites and by a number of different specialists to act as a double-check for each other.

           You do need to be careful that you're using a well-curated list, as carelessly blocking bots can cause problems for legitimate ones. Check out the creator of such a list the same way you'd check out the creator of a plugin you're considering using: read reviews, and look at comments and responses on the post that provides the blacklist, etc.

          That answer your question?

          Paul

          • esiow2013
            esiow2013 last edited by

            Hi ThompsonPaul,

             Wow! Superb explanation. One thing I just want to clarify: how would I know if these bots are "bad bots"?

            Thanks a lot! 🙂

            • ThompsonPaul
              ThompsonPaul last edited by

               As Lynn mentions, these entries form a blacklist for "bad bots". These are bots that have been identified as harmful (or at least non-helpful) to the real use of a website. Bots are essentially spiders that crawl and record the pages of your site the same way GoogleBot does. There are two main reasons for blocking them:

              1. Too many unnecessary bots can put a real strain on server resources, causing the site to slow down for real users. This can be especially problematic with bad bots as they do not respect the entries in your robots.txt file and so will crawl even blocked pages. This can mean huge numbers of extra pages get crawled, leading to even more load.

              2. Many (most?) of these bots are collecting data for nefarious purposes. Some are scrapers to collect your site content in order to re-use it illegally on another site, some are scanning for certain files/plugins on your site known to be insecure so they can target them for attack, etc.

              Best case scenario, these bots waste your bandwidth and can cause site slowdowns on low-powered (e.g. shared) servers. Worst case, they can actually cause harm to your site.

               There are literally many thousands of these types of bots out there, and their creators often change their identifying user agents just to get around blacklists like this one. But many have been around for some time and still use the same identifier, so having a blacklist to block the most common of them is actually very good security practice. To be totally proactive, however, you'd need to update the list every couple of months.

              Bottom line - those entries are providing some security and overload protection for your site, and there's essentially no downside to having them in place even if they're not catching everything.

              Hope that helps - if any of my explanation isn't clear, just holler 🙂

              Paul
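
               For illustration, here's a minimal sketch of the difference described above, using a made-up user-agent name rather than one from the list: robots.txt only asks crawlers to stay away, while the mod_rewrite rules in .htaccess are enforced by the server whether or not the bot cooperates.

               # robots.txt - advisory only; well-behaved crawlers honor it, bad bots can ignore it
               User-agent: ExampleBadBot
               Disallow: /

               # .htaccess - enforced server-side; a matching request is refused with 403 before any page is served
               <IfModule mod_rewrite.c>
               RewriteEngine On
               RewriteCond %{HTTP_USER_AGENT} ExampleBadBot [NC]
               RewriteRule ^.* - [F,L]
               </IfModule>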

              • esiow2013
                esiow2013 last edited by

                Thanks Lynn! I'll just remove these parameters and leave this one:

                 # BEGIN WordPress

                 <IfModule mod_rewrite.c>
                 RewriteEngine On
                RewriteBase /
                RewriteCond %{REQUEST_FILENAME} !-f
                RewriteCond %{REQUEST_FILENAME} !-d
                RewriteRule . /index.php [L]
                Rewritecond %{http_host} ^domain.com [NC]
                 Rewriterule ^(.*)$ http://www.domain.com/$1 [R=301,NC]
                 </IfModule>

                 # END WordPress

                • LynnPatchett
                  LynnPatchett @esiow2013 last edited by

                   I don't use something like this myself. I suppose if you're having a problem with bots it might be useful; maybe someone else can chime in if they have experience with this kind of blocking.

                  • esiow2013
                    esiow2013 last edited by

                    Thanks Lynn! Is this really necessary?

                    • LynnPatchett
                      LynnPatchett last edited by

                       Hi,

                       It is checking to see if the visiting user agent contains any of these strings (NC tells it the match is not case sensitive) and, if it does, to return a 403 Forbidden response.
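
                       To make that concrete, here is a trimmed-down sketch of the same pattern; the two user-agent strings are made up, not taken from the list above.

                       RewriteEngine On
                       # Each RewriteCond tests the User-Agent header; [NC] makes the match case-insensitive
                       # and [OR] chains the condition with the next one (the final condition has no OR)
                       RewriteCond %{HTTP_USER_AGENT} ^BadBotOne [NC,OR]
                       RewriteCond %{HTTP_USER_AGENT} badbottwo [NC]
                       # If any condition matched, [F] answers 403 Forbidden and [L] stops further rewriting
                       RewriteRule ^.* - [F,L]

                       A leading ^ anchors the pattern, so the user agent has to start with that string; patterns without ^ match anywhere in the header.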
