URL Extensions (with or without??!!)
-
Hello, SEOers~
Today I have a question about URL extensions.
Which is more search-engine friendly: a URL with an extension or one without?
e.g.
URL with extension: www.example.com/tv/lcd.jsp
URL without extension: www.example.com/tv/lcd
I've heard that URLs without extensions are the current trend, for user-experience reasons.
User experience is important too, but I would like to know from an SEO perspective.
Please people~ Help me out with this~! Thanks.
-
Agree with Marek that you can certainly make URLs a lot friendlier for normal human beings. It also helps keep URLs shorter, which matters when people share them socially.
One significant advantage of dropping the extension is that it makes it so much easier if you ever want/need to change the platform on which your site runs.
Don't make changes to your URL structure on a whim, though. Take a careful look at both your internal and inbound links, and plan any redirects you need to put in place.
Remember that any change can be risky, so understand the risks and effort required before you start!
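As a rough sketch of how dropping the extension might work without breaking existing .jsp URLs (this assumes Apache with mod_rewrite; adapt it to whatever server you actually run, and test on a staging copy first):

```apache
RewriteEngine On

# 301-redirect old extension URLs to the clean form,
# e.g. /tv/lcd.jsp -> /tv/lcd
RewriteCond %{THE_REQUEST} \s/([^.\s]+)\.jsp[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# Internally serve the clean URL from the .jsp file, if one exists
RewriteCond %{REQUEST_FILENAME}.jsp -f
RewriteRule ^(.+)$ $1.jsp [L]
```

The first pair handles the SEO side (one permanent redirect per old URL); the second pair is what makes a later platform switch painless, because only the internal rule changes while the public URLs stay the same.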
-
Generally, without the extension is better: it shortens the URL, which improves the focus on the keywords used.
-
Hi,
In my opinion you can treat this with KISS: keep it simple, stupid. If you ask somebody on the street what lcd.jsp means, you won't get a correct answer, but if you ask about "lcd", many people will answer correctly.
Search robots are built to think much as we do, and "friendly links" are a big trend now.
So www.example.com/tv/lcd is correct and more accurate.
Following this rule, you can build a URL hierarchy:
www.example.com/tv/lcd/sony/model-xxx
and so on; it creates a very good link structure.
Related Questions
-
Create 100% new content in existing page/URL?
We have about 30 pages of content that we are hiding from the website because these articles had some issues. If these pages ranked well, would you recommend that the new content be written within these pages? That is, we would replace the content on those pages with new content on the same topic and keywords. Or do you think it's best to start a new page instead?
With 301 Redirects, Does Changing URLs Matter?
We are redesigning our website to give it a more modern visual look. For the most part the content will remain the same. Our old site is built on .asp, so all of our current URLs look something like this: www.example.com/products/food.asp
We plan on using 301 redirects to update every URL and remove the .asp. Since we are going to be 301-redirecting every existing URL anyway, does it matter, from an SEO and ranking standpoint, if we also change the content and structure of the URL? For example, would we see a ranking impact if we changed the above example URL to www.example.com/food? Obviously we want to retain as much link juice and as many ranking factors as possible during this redesign.
Another issue we are seeing is with the file names of our existing website images. We are moving to a new CMS platform (WordPress) that automatically saves images using a folder path similar to this: wp-uploads/2015-08/food. Will that change affect our SEO or ranking at all? When Google crawls an image, does it care about the full path? Any insight would be much appreciated! 🙂
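For the extension-stripping part of a migration like this, a minimal sketch (assuming Apache with mod_rewrite; the paths are illustrative, not from the actual site):

```apache
RewriteEngine On

# Blanket rule: strip the .asp extension,
# e.g. /products/food.asp -> /products/food
RewriteRule ^(.+)\.asp$ /$1 [R=301,L]

# If the structure also changes (e.g. /products/food -> /food),
# explicit one-to-one redirects are safer than pattern rules:
# Redirect 301 /products/food.asp /food
```

A blanket rule covers a pure extension removal in one line; once the path structure changes too, a mapped list of old-to-new URLs avoids accidental redirect collisions.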
Dynamic URL Parameters + WooCommerce create 404 errors
Hi Guys,
Our latest Moz crawl shows a lot of 404 errors for pages that create dynamic links for users (I guess they are dynamic links; not sure). Situation: on a page showing products from brand X, users can use the pagination icons at the bottom, or click View: 24/48/All. When a user clicks 48, the link ends in /?show_products=48
I think there were some pages that could show 48 products but no longer exist (because products sold out, for example), and that's why they return 404s and Moz reports them. How do I deal with these 404 errors? I can't set a single 301 redirect because it depends on how many products are shown (it changes every time). Should I just ignore these kinds of 404 errors, or what is the best way to handle this situation?
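One possible approach (a sketch, assuming Apache and assuming the unparameterized category page still exists) is to 301 the parameterized variants back to their base URL and let the base page answer:

```apache
RewriteEngine On

# Redirect e.g. /brand-x/?show_products=48 to /brand-x/
RewriteCond %{QUERY_STRING} (^|&)show_products= [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
```

The trailing `?` in the substitution strips the query string. If the 48-product view should stay usable for visitors, a rel="canonical" tag pointing at the base page is a gentler alternative to a redirect.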
Correct .htaccess settings for canonical URL?
I want to forward all URLs to http://www.mysite.com but am a little confused because I am getting a duplicate content error: Pages with Duplicate Page Content as of Jan 15:
http://titanappliancerepair.com/ (1 duplicate)
http://titanappliancerepair.com (1 duplicate)
http://titanappliancerepair.com/index.html (1 duplicate)
What should I put in the .htaccess file so I can forward
http://titanappliancerepair.com/index.html
http://titanappliancerepair.com
http://titanappliancerepair.com/
to http://www.titanappliancerepair.com, or what is the correct way to do it? I'm confused because when I enter http://titanappliancerepair.com/ in a browser it shows http://titanappliancerepair.com, so how can it be considered duplicate content? Can someone help? I have GoDaddy and they gave me this code to put in:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^coolexample.com [NC]
RewriteRule ^(.*)$ http://www.coolexample.com/$1 [R=301,NC]
What is correct?
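Adapting GoDaddy's generic snippet to the domain in question, and also collapsing /index.html onto the root, might look like this (a sketch, untested; verify against your own host's configuration):

```apache
RewriteEngine On

# Collapse /index.html onto the site root (handles the third duplicate)
RewriteRule ^index\.html$ http://www.titanappliancerepair.com/ [R=301,L]

# Force the www host (handles the bare-domain duplicates)
RewriteCond %{HTTP_HOST} ^titanappliancerepair\.com$ [NC]
RewriteRule ^(.*)$ http://www.titanappliancerepair.com/$1 [R=301,L]
```

On the trailing-slash confusion: browsers display http://titanappliancerepair.com and http://titanappliancerepair.com/ identically, but crawlers can record them as distinct URLs, which is why they show up as duplicates in reports.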
How to exclude URL filter searches in robots.txt
When I look through my Moz reports I can see it has included 'pages' which it shouldn't have, i.e. URLs created by filtering rules, such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in robots.txt? I think it'll be: Disallow: /*?color=$ Is that the correct syntax, with the $ sign in it? Thanks!
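For what it's worth, in the wildcard syntax Google supports, `$` anchors the end of the URL, so `Disallow: /*?color=$` would only block URLs that end exactly in `?color=`. To block any URL carrying a color filter, something like this is closer (a sketch; test it against real URLs in Search Console's robots.txt tester before deploying):

```
User-agent: *
# Block filtered pages whether color is the first or a later parameter
Disallow: /*?color=
Disallow: /*&color=
```

Note that robots.txt only stops crawling, not indexing of already-known URLs; for filter pages that are already indexed, rel="canonical" or a noindex tag is usually the better tool.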
New CMS system - 100,000 old urls - use robots.txt to block?
Hello. My website has recently switched to a new CMS. Over the last 10 years or so, we've used 3 different CMSs on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301-redirect or use any page-level indexation techniques like rel="canonical". Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing "older" URLs to their new counterparts; however, according to Google Webmaster Tools' 'Not Found' report, there are literally over 100,000 additional URLs out there it's trying to find. My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
SEF URLs. Should I use / or - ?
I have to activate SEF URLs on a website. Regarding SEO, is there any difference between using / or -? I mean, is it better to write URLs like this: http://www.domain.com/folder/folder/page or like this: http://www.domain.com/folder-folder-page? Is there any difference? Thanks!
How many URLs per page is too many?
I know it used to be 100 URLs per page, but recently Matt Cutts has said that they can count a lot more now. I was wondering what you guys thought was too many per page.
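If you ever want to sanity-check your own pages against whatever limit you settle on, counting the links takes a few lines of scripting. A minimal sketch using only Python's standard library (the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for this tag
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

html = '<p><a href="/tv/lcd">LCD TVs</a> and <a href="/tv/oled">OLED TVs</a></p>'
parser = LinkCounter()
parser.feed(html)
print(parser.count)  # 2
```

Run it over a saved copy of any page to get a quick per-page link count; anchors without an href (named anchors) are deliberately not counted.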