Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
There's a website I'm working with where all the pages have a .php extension in their URLs. What's the best practice for removing the .php extension across all pages?
-
The client wishes to drop the .php extension on all their pages (they've got around 2k pages). I assured them that wasn't necessary. However, in the event that I do end up doing this, what's the best-practice (and easiest) way to do it? This is also a WordPress site.
Thanks.
-
Awesome! Thanks for clarifying that.
I'm assuming I'd use a rule similar to the one below. Is that right?
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}\.html -f
RewriteRule ^(.*)$
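Note that the snippet above targets .html and is missing the rule's target. A complete version adapted for .php might look something like the sketch below; it assumes Apache with mod_rewrite enabled, that the rules sit in the site's .htaccess, and that each clean URL maps to a real file of the same name ending in .php.

# If the request isn't a directory but a matching .php file exists on disk,
# serve that file without changing the URL shown in the browser
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}\.php -f
RewriteRule ^(.*)$ $1.php [L]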
-
If it's for cosmetic reasons, then they will likely be causing themselves SEO issues (there is some PageRank leakage with 301 redirects). My suggestion would be to wait until the next site redevelopment and implement extensionless filenames, so that the technology doesn't affect the file names in the future. At that time you can implement 301 redirects from the old names with the extensions to the new ones. It's fairly simple to do in the .htaccess file as long as you keep the filename the same (e.g. /some-filename.php to /some-filename).
The only valid reason to change at this time is if you are experiencing indexation issues and since that is not the case...
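To illustrate the .htaccess change mentioned above (redirecting /some-filename.php to /some-filename), a rough sketch, assuming the site is still on Apache at that point and the extensionless URLs are already being served; it only covers the case where the filename itself stays the same:

# 301-redirect any direct request for /something.php to /something
# (matching THE_REQUEST avoids a loop when an internal rewrite maps the clean URL back to the .php file)
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/([^?\s]+)\.php[\s?]
RewriteRule ^ /%1 [R=301,L]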
-
They are just seeing sites without them and would also like not to have them.
That's about it, really. Nothing is broken or having trouble getting indexed.
-
Search engines read .php files just fine, so this change probably isn't necessary. If you did go ahead, you would set up rewrite rules in .htaccess and add 301 redirects. But again, why mess with the file extensions? Are they having indexation issues?
Related Questions
-
Multiple Markups on The Same Page - Best Solution?
Hi there! I have a website that is built in React (JavaScript), and I'm trying to use markup on my pages. They are mostly articles about general topics with common questions (about the topic), and for most articles I would like to use two markups:
article markup + FAQ markup (for the questions in the article)
article markup + how-to markup
Can I do this or will Google get confused, since I have two @type at the same time, for example "@type": "FAQPage" and "@type": "Article"? How should I think about it? I'm using https://schema.dev/ right now. Thanks!
Intermediate & Advanced SEO | Leowa
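For what it's worth, one common way to combine the two is a single JSON-LD block with an @graph array holding both nodes. The sketch below uses placeholder URLs, names and text rather than anything from the site in question:

{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Article",
      "@id": "https://www.example.com/my-article#article",
      "headline": "Example article headline",
      "author": { "@type": "Person", "name": "Jane Doe" }
    },
    {
      "@type": "FAQPage",
      "@id": "https://www.example.com/my-article#faq",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "An example question from the article?",
          "acceptedAnswer": { "@type": "Answer", "text": "An example answer." }
        }
      ]
    }
  ]
}
-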
Why has my website been removed from Bing?
I have a website that has recently been removed from Bing's index, but I can't figure out why. The website isn't new, and it is indexed just fine on Google. These are the steps I've tried:
The website is verified in Bing Webmaster Tools and I successfully submitted the sitemap.
I tested the URL to ensure that Bingbot is allowed to crawl the site.
I submitted URLs to Bing via the URL Submission tool.
There isn't a "noindex" on the site preventing it from being indexed.
When I do a URL Inspection, an error message comes up saying "The inspected URL is known to Bing but has some issues which are preventing us from serving it to our users. We recommend you to follow Bing Webmaster Guidelines."
I contacted Bing to ask whether the website was removed in error, but received a reply that the website doesn't comply with Bing's quality guidelines; they wouldn't go into detail as to which guidelines the website isn't meeting.
The website URL is https://www.pardeehospital.org. Can anyone offer any advice or insight as to why Bing won't index our site? Thank you!
Intermediate & Advanced SEO | lindsey.steinkamp
-
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know what the best practice for doing that is. For reference, the reason we are looking for methods that could help us speed it up is that we have about 500,000 URLs we want deindexed because of mis-formatted HTML code, and unfortunately Google indexed them much faster than it is taking to deindex them. We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years with Google's current crawl rate of our site.
Intermediate & Advanced SEO | teddef
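For reference, the noindex being discussed can be applied either in the page head or as an HTTP header, and the affected URLs have to remain crawlable (not blocked in robots.txt) for the directive to be seen. A minimal sketch of both forms, with a hypothetical /broken-pages/ path standing in for the affected section:

<!-- in the <head> of each page to be dropped from the index -->
<meta name="robots" content="noindex">

# or, in the Apache server/vhost config (requires mod_headers), sent for a whole URL pattern
<LocationMatch "^/broken-pages/">
  Header set X-Robots-Tag "noindex"
</LocationMatch>
-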
What's the best possible URL structure for a local search engine?
Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e. if you're standing somewhere and looking for the pizza joints nearby, we pick up your current location and share the list of pizza outlets nearby along with ratings, reviews etc. about these outlets. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for the city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (or not mapped to a category, in which case we 301 redirect you to the category page); it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor is it mapped to any category. We're also dealing in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to make for a better user experience and a one-stop shop for our customers. Now we're working on a URL restructure project, and my question to you all SEO rockstars is: what is the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure at the backend.
Intermediate & Advanced SEO | _nitman
-
Help! The website ranks fine but one of my web pages simply won't rank on Google!!!
One of our web pages will not rank on Google. The website as a whole ranks fine, except just one section... We have tested it and it looks fine... Google can crawl the page no problem. There are no spurious redirects in place. The content is fine. There is no duplicate page content issue. The page has a dozen product images (photos), but the load time of the page is absolutely fine. We have submitted the page via Webmaster Tools and it's fine. It gets listed, but then a few hours later disappears!!! The site has not been penalised, as we get good rankings with other pages. Can anyone help? Know about this problem?
Intermediate & Advanced SEO | CayenneRed89
-
Putting "noindex" on a page that's in an iframe... what will that mean for the parent page?
If I've got a page that is being called in an iframe on my homepage, and I don't want that called page to be indexed, so I put a noindex tag on the called page (but not on the homepage), what might that mean for the homepage? Nothing? Will Google, Bing, Yahoo, or anyone else potentially see that as a noindex tag on my homepage?
Intermediate & Advanced SEO | Philip-DiPatrizio
-
Do 404 pages pass link juice? And best practices...
Last year Google said bad links to 404 pages wouldn't hurt your site. Could that still be the case in light of recent Google updates to try and combat spammy links and negative SEO? Can links to 404 pages benefit a website and pass link juice? I'd assume at the very least that any link juice will pass through links FROM the 404 page? Many websites have great 404 pages that get linked to: http://www.opensiteexplorer.org/links?site=http%3A%2F%2Fretardzone.com%2F404 - that was the first of four I checked from the "60 Really Cool...404 Pages" that actually returned the 404 HTTP Status! So apologies if you find the word 'retard' offensive. According to Open Site Explorer it has a decent Page Authority and number of backlinks - but it doesn't show in Google's SERPs. I'd never do it, but if you have a particularly well-linked-to 404 page, is there an argument for giving it 200 OK Status? Finally, what are the best practices regarding 404s and address bar links? For example, if www.examplesite.com/3rwdfs returns a 404 error, should I make that redirect to www.examplesite.com/404 or leave it as is? Redirecting to www.examplesite.com/404 might not be user-friendly as people won't be able to correct the URL in the address bar. But if I have a great 404 page that people link to, I don't want links going to loads of random pages, do I? Is either way considered best practice? If I did a 301 redirect I guess it would send the wrong signal to the crawlers? Should I use a 302 redirect, or even a 304 Not Modified redirect?
Intermediate & Advanced SEO | Alex-Harford
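On the address-bar point, the usual approach is to serve the custom 404 template at the requested URL with a genuine 404 status rather than redirecting anywhere. In Apache that is roughly a one-liner (the path is a placeholder):

# Serve the custom 404 page at the original URL, keeping the 404 status;
# a full http:// URL here would turn it into a redirect instead
ErrorDocument 404 /custom-404.php
-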
Best practice for removing indexed internal search pages from Google?
Hi Mozzers,
I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Bad user experience
The search pages are (probably) stealing rankings from our real landing pages
Webmaster Notification: "Googlebot found an extremely high number of URLs on your site" with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using the robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I'm looking forward to your answer!
Edit: Google have currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
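For reference, the two options being weighed look like this in practice. The meta tag only works while the pages remain crawlable, whereas a robots.txt block stops crawling but doesn't by itself remove URLs that are already indexed (the /search/ path is a placeholder):

<!-- on each internal search results page -->
<meta name="robots" content="noindex, follow">

# robots.txt alternative: prevents crawling of the search pages
User-agent: *
Disallow: /search/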