Contact form on home page.
-
I am looking to add a contact form onto my home page and I was wondering if it made sense to change my index.html to an index.php.
If I do make this change, would it have any impact on my search rankings?
-
If you don't mind doing a little coding, you could try this simple contact form:
http://css-tricks.com/nice-and-simple-contact-form/
There are other options out there for a full blown contact form from MailChimp et al, although many seem to be trying to get money for what is essentially a really basic thing.
MailChimp or Constant Contact is good if all you want is basic email capture; MailChimp offers a free plan for under 2,000 subscribers, although it does include a MailChimp link or small logo.
If you Google "html contact form" you'll get lots of options. Personally I would go with Chris Coyier's option above and include his spam options.
If you don't need to change from HTML to PHP, I wouldn't do it. 301s are great, but when I changed a site from HTML with a 301 it didn't pass all the link juice and there was a slight drop in traffic. Plus, skipping the change stops you from needlessly complicating things.
If you need help setting it up, let me know.
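For illustration, here's a minimal sketch of what the PHP side of such a form might look like. The field names, the honeypot trick, and the you@example.com address are all placeholders — adapt them to your own site, and note that mail() only works if your host has a mail transport configured:

```php
<?php
// Minimal contact-form handler sketch for index.php (hypothetical field
// names: name, email, message; you@example.com is a placeholder).

function validate_contact(array $post): array {
    $errors = [];
    if (trim($post['name'] ?? '') === '') {
        $errors[] = 'Name is required.';
    }
    if (!filter_var($post['email'] ?? '', FILTER_VALIDATE_EMAIL)) {
        $errors[] = 'A valid email address is required.';
    }
    if (trim($post['message'] ?? '') === '') {
        $errors[] = 'Message is required.';
    }
    // Honeypot: a hidden "website" field that real visitors leave empty,
    // but spam bots tend to fill in.
    if (($post['website'] ?? '') !== '') {
        $errors[] = 'Submission looks like spam.';
    }
    return $errors;
}

if (($_SERVER['REQUEST_METHOD'] ?? '') === 'POST') {
    $errors = validate_contact($_POST);
    if ($errors === []) {
        // Sanitize the reply-to address before putting it in a header.
        $from = filter_var($_POST['email'], FILTER_SANITIZE_EMAIL);
        mail('you@example.com', 'Contact form submission',
             $_POST['message'], 'From: ' . $from);
        echo 'Thanks, your message has been sent.';
    }
}
```

The HTML form itself just POSTs name, email, message (plus a hidden "website" honeypot field) back to index.php — the css-tricks link above shows the markup side.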
-
You don't believe it increases the chance a person will fill in the contact form? I'm trying to increase my conversion rates.
-
It's a basic site.
-
I would not put a contact form on the homepage; maybe just a link to the contact page.
-
Depending on the platform and on the contact form, you either don't need PHP, or you can just output HTML files with an SEO/SEF plugin.
Is it a basic site or one with a CMS?
-
Would it also help if Joel put a canonical on the new index.php page?
-
I'm a big fan of putting some kind of contact form, or at least a call to action, on every single page of your site, so great idea. Anyway, that's not your question: if you need to switch to index.php in order for your form to work, then go ahead and do so. But make sure you 301 redirect index.html to index.php, since search engines will consider them to be two different pages. Assuming your site is hosted on a Linux server with access to .htaccess, you should be able to accomplish this by adding the following lines to the .htaccess:
RewriteEngine on
RewriteRule ^index\.html$ index.php [NC,R=301,L]