SEO effect of URL with subfolder versus parameters?
-
I'll make this quick and simple. Let's say you have a business located in several cities. You've built individual pages for each city (linked to from a master list of your locations).
For SEO purposes, is it better for the URL to be a subfolder or a parameter off the home page URL:
https://www.mysite.com/dallas which is essentially https://www.mysite.com/dallas/index.php
or
http://www.mysite.com/?city=dallas which is essentially https://www.mysite.com/index.php?city=dallas
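For context, both shapes can be served by the same script, so choosing the subfolder form doesn't require a physical folder per city. As a rough sketch (assuming an Apache server with mod_rewrite enabled; the "city" parameter name comes from the examples above, everything else is illustrative), an .htaccess rule could map the subfolder URL onto the parameter version internally:

    RewriteEngine On
    # Serve /dallas (or any other lowercase one-segment slug) from
    # index.php?city=dallas without changing the visible URL.
    # A real rule would need to exclude non-city paths, since this
    # pattern catches any single-segment URL.
    RewriteRule ^([a-z-]+)/?$ index.php?city=$1 [L,QSA]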
-
Thanks, Miriam. This is very helpful and makes a lot of sense. What do you think about towns and villages, or boroughs of a large city? Do you think their close proximity is dangerous territory re: keyword permutations?
I take your point about unique content tailored to the people of each city. But what about locations that are closer to each other?
I know it's a tricky question, but any insight would be most welcome.
-
That's a good question, Andrew. It's true that it's no longer a best practice to build out a set of pages featuring slightly different permutations of a keyword (car repair, auto repair, repairing cars, fixing cars, etc.). That approach is now quite dated. Honestly, it never made any sense beyond the fact that when Google wasn't quite so sophisticated, you could trick your way into some additional rankings with this type of redundant content.
The development of location landing pages is different. These are of fundamental use to consumers, and the ideal is to create each city's landing page in a way that is uniquely helpful to a specific audience. So, for example, your store in Detroit is running a special on winter clothing right now, because it's still snowing there. Meanwhile, your store in Palm Beach is already stocking swim trunks. For a large, multi-location enterprise, location landing pages can feature highly differentiated content, including highlights of regionally appropriate inventory and specials, as well as unique NAP, driving directions, reviews from local customers, and so much more.
The key to avoiding the trap of simply publishing a large quantity of near-duplicate pages is to put in the effort to research the communities involved and customize these location pages to best fit local needs.
-
Hi Searchout,
Good for you for creating a unique page for each of your locations. I like to keep URLs as simple as possible, for users, so I'd go with:
https://www.mysite.com/dallas
etc.
From an SEO perspective, I don't think there's a big difference between parameters off the root URL and subfolders. If you're already using one structure, I doubt you'd see any difference from switching to the other (unless you were using subdomains, which is a different conversation).
-
Of course cities will be counted.
That's why I'm always reinforcing the idea of creating UNIQUE and special pages for each keyword.
Google is getting smarter and smarter, so simple variations in a few words are easily detected. Hope it helps.
Best of luck.
GR.
-
Hi
Thanks for your response; I'm interested in this too. I've been targeting cities with their own pages, but I heard recently that Google is going to be clamping down on multiple keyword permutations. Do you think cities will be counted in this?
-
Hi there!
In my opinion, for SEO purposes it is correct to have a unique page (really different from the others, not just changing the city name and location) for each big city you are optimizing.
That said, a subfolder is useful in order to show Google the name of the city in the URL. It is common for Google to treat parameters differently than folders. Also, remember to avoid duplicate content: /dallas/ and /dallas/index.php should not both be accessible and indexable for Google. Redirect one to the other or canonicalize one to the other. Do the same with the www, non-www, http, and https versions.
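As a minimal sketch of that advice (assuming an Apache server with mod_rewrite; the domain and paths are just the examples from this thread, not a live configuration), the redirect route could look like this in .htaccess:

    RewriteEngine On
    # Collapse the duplicate /dallas/index.php onto /dallas/ with a 301.
    RewriteRule ^(.+)/index\.php$ /$1/ [R=301,L]
    # Redirect non-https and non-www requests to one preferred version.
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ https://www.mysite.com/$1 [R=301,L]

The canonicalization route instead leaves the URLs in place and adds a rel="canonical" link in the head of each page pointing at the preferred version.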
Hope it helps.
Best of luck.
GR.