Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be possible to view), we have locked both new posts and new replies.
Is it bad to have /index.php at the end of a URI?
-
Is it bad for SEO if traffic is directed to "http://www.example.com/someuri/index.php" instead of "http://www.example.com/someuri/", and would it work to set up a redirect rule at the .htaccess level?
-
Yes, it's bad for both. You now have the name of a file acting as the name of a folder.
As mentioned above - kill the use of index.php as your index "file" - just end in a slash.
I know PHP treats these as routes/queries that then produce a page, but it can get things all messed up real quick.
-
Oops, thanks for all your answers, but what I should have said is: is having "/index.php/" halfway through the URI, like so
"http://www.example.com/someuri/index.php/more_uri/"
bad for SEO/UX?
To clarify: if one searched on Google for more_uri, and everything else was equal, would the index.php in the middle of that URI be damaging to the ranking?
Sorry about that.
-
Whilst I don't think the index.php will have a direct impact on the SEO of your website, it could easily have an indirect impact.
As CleverPhD rightly points out, it is a pain in the *** to remember and type that sort of URL.
Not only for yourself, but also for other websites and customers.
The impact this has is hard to quantify... If I'm a site in your niche and want to link to you, does this put me off? What if I link to the wrong site?
Beyond that, ending in index.php isn't as nice a user experience as just ending at the page name, and ultimately it's my belief that if you do what's best for the user, you'll get good results from Google.
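If the site is a front-controller PHP app (the usual reason /index.php/ turns up mid-URL), one common fix is to 301 the old URLs to clean ones and quietly route the clean ones back to the front controller. A rough .htaccess sketch, assuming Apache with mod_rewrite, an app that reads the trailing path from PATH_INFO, and reusing the example someuri path from the question:

    RewriteEngine On

    # Externally 301-redirect URLs that still carry index.php in the middle,
    # e.g. /someuri/index.php/more_uri/ -> /someuri/more_uri/
    RewriteCond %{THE_REQUEST} \s/+someuri/index\.php/([^?\s]*)
    RewriteRule ^ /someuri/%1 [R=301,L]

    # Internally hand the clean URL back to index.php, but only when no real
    # file or directory matches, so the visitor never sees the script name
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^someuri/(.+)$ /someuri/index.php/$1 [L]

This is the same pattern CodeIgniter-era apps used to drop index.php from their URLs; whether the internal rewrite works as-is depends on how your particular app routes requests, so treat it as a starting point rather than a drop-in rule.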
-
Correct - the duplicate issue is what will hurt you. Whatever you go with, make sure the other variants 301 redirect to the "true" page.
-
OK, thanks. So index.php won't affect the SEO results. But not redirecting it, as both /index.php and / work correctly and go to the same PHP file, will result in the same content being registered twice by Google, I'm guessing?
-
It is not "bad", although typical style would be that you can drop it as the extra characters are not needed and nobody likes extra typing - just ask Mr. Twitter. He used brevity to become a billionaire! Hmm .. I wish we could get Moz points for alliteration.
What is key is that you are consistent in your use of it. If you want to use /index.php then go for it. Just make sure every time you link to that URL in your menus or in articles when you Twiddle it of Farcebook it, you include the /index.php at the end as you do not want to have duplicate URLs for the same page. I would also setup 301 redirects so that the / only version redirects to the index.php version.
All of that said, you are going to find that after the 104th time of Twiddling that URL, you will say, "Gee, it sure is a pain to type all those extra characters." You will also find that when people are going to share your URLs they may have a tendency to drop the index.php as again, it is extra work. If you have the redirect in place, you will be ok, but I say, why have you and everyone else do all that extra work to start with. Just start with the URL ending in the slash and stay with that. Have all other versions of the index page (index.php, index.htm or even a non slashed version, etc) 301 redirect to the URL that ends in a /.
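A minimal .htaccess sketch of that last step, assuming Apache with mod_rewrite. Matching against THE_REQUEST means the rule only fires on the URL the visitor actually typed, not on Apache's own internal DirectoryIndex rewrite to index.php, which would otherwise cause a redirect loop:

    RewriteEngine On

    # 301-redirect /index.php or /index.htm in any folder to the
    # slash-terminated URL, e.g. /someuri/index.php -> /someuri/
    RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.(?:php|html?)[?\s]
    RewriteRule ^ /%1 [R=301,L]

After adding it, a quick curl -I http://www.example.com/someuri/index.php should answer with a 301 and a Location header pointing at http://www.example.com/someuri/.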