SEO effect of URL with subfolder versus parameters?
-
I'll make this quick and simple. Let's say you have a business located in several cities. You've built individual pages for each city (linked to from a master list of your locations).
For SEO purposes is it better to have the URL be a subfolder, or a parameter off of the home page URL:
https://www.mysite.com/dallas (which is essentially https://www.mysite.com/dallas/index.php)
or
https://www.mysite.com/?city=dallas (which is essentially https://www.mysite.com/index.php?city=dallas)
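For context, here is a minimal front-controller sketch of how the two styles typically end up serving the same page, assuming a plain PHP setup (the $cityPages list and the routing arrangement are hypothetical, not taken from the site in question):

```php
<?php
// index.php - illustrative front controller (a sketch, not the actual site code).
// Serves the same city page whether the request is /dallas or /?city=dallas.

// Hypothetical list of cities with landing pages; in practice this might
// come from a database or the filesystem.
$cityPages = ['dallas', 'houston', 'austin'];

// Subfolder style: https://www.mysite.com/dallas
$path = trim((string) parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');

// Parameter style: https://www.mysite.com/?city=dallas
$city = $path !== '' ? $path : ($_GET['city'] ?? '');

if (in_array($city, $cityPages, true)) {
    echo 'Landing page for ' . htmlspecialchars($city);   // city-specific content
} else {
    echo 'Master list of locations';                       // fallback / home page
}
```

Note that for the subfolder style to reach index.php at all, the web server normally needs a rewrite rule (Apache mod_rewrite, nginx try_files, or similar) sending unmatched paths to the front controller; the parameter style works without one.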
-
Thanks, Miriam. This is very helpful and makes a lot of sense. What do you think of towns and villages, or boroughs of a large city? Do you think the close proximity is dangerous territory re: keyword permutations?
I take your point about unique content tailored to the people of the city - it makes a lot of sense. But what about locations that are closer to each other?
I know it's a tricky question, but any insight would be most welcome.
-
That's a good question, Andrew. It's true that it's no longer a best practice to build out a set of pages featuring slightly different permutations of a keyword (car repair, auto repair, repairing cars, fixing cars, etc.). That approach is now quite dated. Honestly, it never made any sense beyond the fact that when Google wasn't quite so sophisticated, you could trick your way into some additional rankings with this type of redundant content.
The development of location landing pages is different. These are of fundamental use to consumers, and the ideal is to create each city's landing page in a way that is uniquely helpful to a specific audience. So, for example, your store in Detroit is running a special on winter clothing right now, because it's still snowing there. Meanwhile, your store in Palm Beach is already stocking swim trunks. For a large, multi-location enterprise, location landing pages can feature highly differentiated content, including highlights of regionally appropriate inventory and specials, as well as unique NAP, driving directions, reviews from local customers, and so much more.
The key to avoiding the trap of simply publishing a large quantity of near-duplicate pages is to put in the effort to research the communities involved and customize these location pages to best fit local needs.
-
Hi Searchout,
Good for you for creating a unique page for each of your locations. I like to keep URLs as simple as possible, for users, so I'd go with:
https://www.mysite.com/dallas
etc.
From an SEO perspective, I don't think there's a big difference between parameter-based URLs off the root and subfolders. If you're already using one structure, I doubt you'd see any difference from switching to the other (unless you were using subdomains, which is a different conversation).
-
Of course cities will be counted.
That's why I'm always reinforcing the idea of creating UNIQUE and special pages for each keyword.
Google is getting smarter and smarter, so simple variations in a few words are easily detected.
Hope it helps.
Best of luck.
GR.
-
Hi
Thanks for your response; I'm interested in this too. I've been targeting cities with their own pages, but I heard recently that Google is going to be clamping down on multiple keyword permutations. Do you think cities will be counted in this?
-
Hi there!
In my opinion, for SEO purposes it is correct to have a unique page (really different from the others, not just changing the city name and location) for each big city you are optimizing.
That said, a subfolder is useful in order to show Google the name of the city in the URL. Google commonly treats parameters differently than folders. Also, remember to avoid duplicate content: /dallas/ and /dallas/index.php should not both be accessible and indexable for Google. Redirect one to the other, or canonicalize one to the other. The same goes for the www, non-www, http, and https versions.
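To make that concrete, here is a minimal sketch of the redirect-plus-canonical approach in PHP (the preferred host and the choice to keep the /dallas/ form rather than /dallas/index.php are assumptions for illustration, not part of the original answer):

```php
<?php
// Illustrative sketch: collapse duplicate URL variants (index.php, non-www,
// http) onto one preferred URL with a 301, then declare it as canonical.

$preferredHost = 'www.mysite.com';            // assumption: www + https preferred
$path          = $_SERVER['REQUEST_URI'];

// /dallas/index.php -> /dallas/
$cleanPath = preg_replace('#/index\.php$#', '/', $path);

$isHttps = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';

if (!$isHttps || $_SERVER['HTTP_HOST'] !== $preferredHost || $cleanPath !== $path) {
    // Any non-preferred variant gets a permanent redirect to the clean URL.
    header('Location: https://' . $preferredHost . $cleanPath, true, 301);
    exit;
}

// On the preferred version, also state the canonical URL in the <head>.
echo '<link rel="canonical" href="https://' . $preferredHost . htmlspecialchars($cleanPath) . '">';
```

Many sites handle the host and protocol redirects at the web-server level instead (Apache or nginx config), which is fine too; the point is that crawlers end up with exactly one indexable URL per city page.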
Hope it helps.
Best of luck.
GR.