    The Moz Q&A Forum

    Moz Q&A is closed.

    After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.

    What's the best possible URL structure for a local search engine?

    Intermediate & Advanced SEO
    • _nitman

      Hi Mozzers,

      I'm working at AskMe.com, a local search engine in India - i.e., if you're standing somewhere looking for pizza joints nearby, we pick up your current location and show you the nearby pizza outlets along with ratings, reviews, etc. for those outlets.

      Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for city-specific category pages (here, "Delhi" is the city and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area of a city (here, "Saket"). The URL looks a little different if you're searching for something that is not a category (if the query is mapped to a category, we 301 redirect you to that category page): it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for Pizza Hut outlets in Saket, Delhi, since "pizza huts" is neither a category nor mapped to any category. We also deal in ads and deals, along with our own e-commerce brand AskMeBazaar.com, to create a better user experience and a one-stop shop for our customers.

      Now we're working on a URL restructuring project, and my question to all you SEO rockstars is: what's the best possible URL structure we could have? Assume we have kick-ass developers who can manage any given URL structure on the backend.
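      To make the current scheme concrete, here's a rough Python sketch of the routing behaviour described above - not our actual code; CATEGORY_MAP and resolve_url are just illustrative names:

      # Hypothetical sketch of the current routing: queries mapped to a category
      # get a 301 to the category page, unmapped queries fall through to /search/.
      CATEGORY_MAP = {
          "pizza": "pizza-outlets",
          "pizza outlets": "pizza-outlets",
      }

      def resolve_url(city, query, area=None):
          """Return (status, path) for a query in a city, optionally scoped to an area."""
          slug = query.strip().lower()
          category = CATEGORY_MAP.get(slug)
          if category:
              # Mapped query -> 301 redirect to the canonical category page.
              return 301, f"/{city}/{category}" + (f"/in/{area}" if area else "")
          # Unmapped query -> plain search-results URL.
          return 200, f"/{city}/search/{slug.replace(' ', '-')}" + (f"/in/{area}" if area else "")

      print(resolve_url("delhi", "pizza outlets", "saket"))  # (301, '/delhi/pizza-outlets/in/saket')
      print(resolve_url("delhi", "pizza huts", "saket"))     # (200, '/delhi/search/pizza-huts/in/saket')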

      • AlanBleiweiss @_nitman

        In regard to shorter URLs:

        The goal is to find the proper balance for your needs. You want to group things into sub-groups based on a proper hierarchy; however, you also don't want to go too deep if you don't have enough pages or individual listings far down the chain.

        So the Moz post you point to refers to that - at a certain point, having too many layers can be a problem. However, there is no one single correct answer.

        The most important thing to be aware of and consider is your own research and evaluation process for your situation in your market.

        However, as far as what you found most people searching for, be aware that with location-based search, many people don't actually type in a location when they search. Yet Google DOES factor in the location when deciding what to present in the results. So the location matters even though people don't always include it themselves.

        The point is not to become completely lost in making a decision, though - consider all the factors, make a business decision on what you come up with, and be consistent in applying that plan across the board.

        What I mean in regard to URLs and Breadcrumbs:

        If the URL is www.askme.com/delhi/saket/pizza/pizza-hut/, the breadcrumb should be:

        Home > Delhi > Saket > Pizza > Pizza Hut

        If the URL is www.askme.com/pizza-huts/saket-delhi/, the breadcrumb should be:

        Home > Pizza Hut > Saket-Delhi
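        In code terms, the point is simply that the breadcrumb is derived from (or at least ordered like) the URL path. A rough illustrative sketch, assuming the path segments are readable slugs (breadcrumb_from_path is a made-up helper):

        def breadcrumb_from_path(path):
            """Turn /delhi/saket/pizza/pizza-hut/ into Home > Delhi > Saket > Pizza > Pizza Hut."""
            segments = [s for s in path.strip("/").split("/") if s]
            labels = ["Home"] + [s.replace("-", " ").title() for s in segments]
            return " > ".join(labels)

        print(breadcrumb_from_path("/delhi/saket/pizza/pizza-hut/"))
        # Home > Delhi > Saket > Pizza > Pizza Hut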

        • _nitman @AlanBleiweiss

          While thinking about the ideal URL structure, I did consider some blog posts (including this one by Rand: https://moz.com/blog/15-seo-best-practices-for-structuring-urls - check point #11; attaching a screenshot as well) and websites that are doing really well with their one-level static URLs.

          I actually did some keyword research on users' search patterns and Google Suggest data. Generally, our target search term ("pizza huts" in this case) comes before the geo-location - maybe people search differently in India. Hence, I thought of keeping the URL structure that way.

          I'm a little confused about this point, though: "URL, breadcrumb both should match the sequence. If one has one sequence, and the other has a different sequence, that confuses search algorithms." I have seen many websites doing tremendously well that don't follow these principles.

          (Screenshot attached: U2vC1Ua.png)

          • AlanBleiweiss @_nitman

            Proximity to root is not a valid best practice, especially in this instance.

            Here's why:

            More people search based on geo-location than on the actual business name when looking for location-based businesses. So putting "Pizza Hut" first contradicts this; it implies that "more people look for Pizza Hut than the number of people looking for all the different businesses in this geo-location".

            Also, the URL you suggest is blatant over-optimization - an attempt to stuff exact-match keywords into the URL. In reality, people use a very wide range of keyword variations, so that's another conflict that harms your overall focus.

            All of the individual factors need to reinforce each other as much as is reasonable for human readability. So URL, breadcrumb both should match the sequence.  If one has one sequence, and the other has a different sequence, that confuses search algorithms.

            • _nitman @AlanBleiweiss

              Thank you so much once again Sir Alan.

              Well, I'm just thinking aloud here. How about putting my primary keyword in the first level instead of using this well-structured URL syntax? For instance:

              • www.askme.com/pizza-huts-in-saket-delhi instead of www.askme.com/delhi/saket/pizza/pizza-hut/

              Here,

              • The complete primary keyword (or target search string) is closer to the domain. "The closer your keywords are to the domain, the better" - I heard this somewhere. Is it still true, and does it add any additional value?
              • We don't have a deep URL directory structure, and our primary keyword stays together. In the well-structured URL (the one you suggested), the target keyword is broken into multiple pieces across the URL directories.
              • But I'm not exposing the hierarchy/navigation flow via the URL. I hope that's okay as long as I'm handling it cleanly through breadcrumbs and rich snippets (see the markup sketch right after this list). What's your take on this?
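              For reference, the rich snippet side would be standard schema.org BreadcrumbList markup, which carries the hierarchy whatever shape the URL takes. A rough Python sketch that emits the JSON-LD (the helper name and the example URLs are just illustrative, borrowed from this thread):

              import json

              def breadcrumb_jsonld(crumbs):
                  """crumbs: list of (name, url) tuples in hierarchy order."""
                  return {
                      "@context": "https://schema.org",
                      "@type": "BreadcrumbList",
                      "itemListElement": [
                          {"@type": "ListItem", "position": i, "name": name, "item": url}
                          for i, (name, url) in enumerate(crumbs, start=1)
                      ],
                  }

              crumbs = [
                  ("Delhi", "https://www.askme.com/delhi/"),
                  ("Saket", "https://www.askme.com/delhi/saket/"),
                  ("Pizza", "https://www.askme.com/delhi/saket/pizza/"),
                  ("Pizza Hut", "https://www.askme.com/delhi/saket/pizza/pizza-hut/"),
              ]
              print(json.dumps(breadcrumb_jsonld(crumbs), indent=2))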

              I know there are chances of URL conflicts. For instance, an area "foo" in the city "bar" vs. a city named "foo bar": I'd end up with the same URL in both cases, i.e. /<search-query>-in-foo-bar. There are many such edge cases; I'm on it.
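              Just to show why the flattened slug is ambiguous, a tiny sketch (flat_slug is a made-up helper):

              def flat_slug(query, *location_parts):
                  # Flatten query + location into one hyphenated slug,
                  # losing the boundary between area and city.
                  parts = [query, "in"] + list(location_parts)
                  return "/" + "-".join(p.lower().replace(" ", "-") for p in parts)

              print(flat_slug("pizza huts", "foo", "bar"))  # area "foo" in city "bar" -> /pizza-huts-in-foo-bar
              print(flat_slug("pizza huts", "foo bar"))     # city "foo bar"           -> /pizza-huts-in-foo-bar (collision)

              Keeping city and area as separate path segments (or using distinct delimiters) preserves that boundary.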

              • AlanBleiweiss @seowoody

                The local pack exists, yet it is far from complete or consistently helpful. Business directories thrive even in the age of local packs. It's all about finding the best way to provide value, and the internet is large enough for many players to play the game.

                • seowoody

                  Sorry for my ignorance here, but does google.in not show the local pack in its SERPs, with reviews and ratings?

                  If so, isn't the business model flawed, assuming you're going to charge companies to be listed in your directory when they can get listed as a local business on Google right now for free?

                  Perhaps I've overlooked something...

                  • AlanBleiweiss

                    Business listing directory environments have a big challenge when it comes to URL structure / information architecture and content organization because:

                    1. Many businesses are searched for based on geo-location
                    2. Many of those require hyper-local referencing, while many others can be "in the general vicinity"
                    3. Many other businesses are not as relevant to geo-location

                    So what is a site to do?

                    The best path is to recognize that as mobile becomes more and more critical to searcher needs, hyper-local optimization becomes ever more important. It becomes the most important focus for SEO.

                    As a result, URL structure needs to reflect hyper-local first and foremost. So:

                    • www.askme.com/delhi/
                    • www.askme.com/delhi/saket/
                    • www.askme.com/delhi/saket/pizza/
                    • www.askme.com/delhi/saket/pizza/pizza-hut/

                    This way, if someone searches for "Pizza Hut Delhi", all of the Delhi Pizza Huts will show up regardless of neighborhood, while anyone searching for "Pizza Hut Saket" will get more micro-locally relevant results.

                    And for businesses that serve a wider geo-area: even though they too will be assigned a hyper-local final destination page, they will still be related to their broader geo-area as well. So someone searching "plumbers in Delhi" will get the right results and can then choose any of the plumbers in Delhi, regardless of which neighborhood they are in.

                    Note how I removed /search/ from the URL structure as well. It's an irrelevant level.
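                    As a rough sketch, the whole structure comes down to one nesting rule - each page's URL is its parent's segments plus its own slug (listing_url is an illustrative helper, not a prescription):

                    def listing_url(city, area=None, category=None, business=None):
                        """Build /city/, /city/area/, /city/area/category/, /city/area/category/business/."""
                        segments = [s for s in (city, area, category, business) if s]
                        return "/" + "/".join(segments) + "/"

                    print(listing_url("delhi"))                                 # /delhi/
                    print(listing_url("delhi", "saket"))                        # /delhi/saket/
                    print(listing_url("delhi", "saket", "pizza"))               # /delhi/saket/pizza/
                    print(listing_url("delhi", "saket", "pizza", "pizza-hut"))  # /delhi/saket/pizza/pizza-hut/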

