Google WMT Missing Meta Title on Ajax Pages... Weird!!
-
Hey Mozers,
I was looking through my Google Webmaster Tools under HTML Improvements. It looks like I have 2,200 pages missing meta titles, and I was about to lose it thinking HOW COULD THIS HAPPEN! Then I realized the pages were "Ajax pages". This is specifically a check-price pop-up, and I don't want this page crawled by Google. So to Google it looks like I have over 2k pages missing meta titles, and they are all check-price pop-ups. How would you suggest I block this? I thought about going the easy route and disallowing the subfolder in robots.txt, but I'm scared of that because we use Ajax for a bunch of calls. I'm also scared of putting `<meta name="robots" content="noindex,nofollow">` in the head because it requires hard coding.
I know I'm not the first to come across this issue. Any ideas?
-
Hi Rodrigo,
OK - so I was able to find the page functionality in question on your actual site just to double check what was going on.
Since this content isn't important to the page from an SEO standpoint, it makes sense to just remove these pages from the index.
To do that, your best bet is probably robots.txt. There's a good Stack Exchange thread with John Mueller confirming that.
I believe any one of the following entries should do the trick, but I highly recommend you double-check my work and test before you implement. This is also a good time for me to add a liability disclaimer in case you accidentally noindex your whole site.
- Disallow: /AjaxPages/PopUp/CheckPrice
- Disallow: /AjaxPages/PopUp/CheckPrice.aspx
- Disallow: /CheckPrice
- (choose one of these, you don't need to use all of them)
Pretty sure robots.txt is case-sensitive, so /checkprice may not work.
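For reference, a complete robots.txt using the first entry might look like this (just a sketch; the path is copied from the entries above, so verify it matches your actual URLs before deploying):

```
User-agent: *
Disallow: /AjaxPages/PopUp/CheckPrice
```

You can test a rule like this with the robots.txt tester in Webmaster Tools before going live.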
There is also something called an X-Robots-Tag that could do this instead of your robots.txt file. Instructions on that at https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?hl=en
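For example, on an Apache server the X-Robots-Tag header could be set via mod_headers in .htaccess, roughly like this (a sketch only; your .aspx URLs suggest you're on IIS, where the equivalent would go in web.config, so treat this as illustrative):

```
<FilesMatch "CheckPrice\.aspx$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```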
Finally, consider testing this on your staging server before rolling it out live.
-
Hey Kane,
Thank you for answering. According to Webmaster Tools there are over 2,200 of these check-price pop-ups. They happen when a customer clicks "Add to cart", which activates the Ajax popup asking the customer to enter a zip code to check the price.
The source code of this popup is very small; I almost feel like there is no
**Example URL:**
I was thinking of putting "Disallow: /checkprice" in our robots.txt.
-
Hey Rodrigo, some additional questions to try and help here.
- Are these the only files getting loaded within their subfolder?
- What do the price files look like? A full HTML file from `<html>` to `</html>`? Just a section of code, like a div?
- Why are the prices stored in static files? Are the prices in a database? Could this popup be accomplished with a single file template that loads a price from the database? If so, then the hard-coding problem becomes much simpler.
Feel free to share URLs if you are able, or DM them.
Also, if hard-coded, you'd want to use `<meta name="robots" content="noindex">`. There's no reason to use nofollow in the meta robots tag in the majority of cases.
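In other words, the popup template's head would carry something like this (a minimal sketch; the title text is a placeholder):

```html
<head>
  <title>Check Price</title>
  <!-- keep this popup out of the index; nofollow is usually unnecessary -->
  <meta name="robots" content="noindex">
</head>
```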
Related Questions
-
Does Google ignore duplicate meta descriptions?
Hi there SEO mozzers, I am dealing with a website that has duplicate meta descriptions (we know this is bad). As a punishment, Google totally ignores the meta descriptions and picks content from the website to display in the SERP. I already read https://moz.com/blog/why-wont-google-use-my-meta-description but I was wondering if there is more information/knowledge out there. Any tips are appreciated!
Intermediate & Advanced SEO | Europarl_SEO_Team
-
On one of our sites we have our company name in the H1; on our other site we have the page title in the H1. Does anyone have any advice about the best information to have in the H1, H2, and page title?
We have two sites that have been set up slightly differently. On one site we have the company name in the H1 and the product name in the page title and H2. On the other site we have the product name in the H1 and no H2. Does anyone have any advice about the best information to have in the H1 and H2?
Intermediate & Advanced SEO | CostumeD
-
Why isn't the Google change of address tool working for me?
Last night I switched my site from HTTP to HTTPS. Both versions are verified in Webmaster Tools, but when I try to use the change of address tool it says: "Your account doesn't contain any sites we can use for a change of address. Add and verify the new site, then try again." How do I fix this?
Intermediate & Advanced SEO | EcommerceSite
-
HTTPS pages - To meta no-index or not to meta no-index?
I am working on a client's site at the moment and I noticed that both HTTP and HTTPS versions of certain pages are indexed by Google and both show in the SERPS when you search for the content of these pages. I just wanted to get various opinions on whether HTTPS pages should have a meta no-index tag through an htaccess rule or whether they should be left as is.
Intermediate & Advanced SEO | Jamie.Stevens
-
Why Google is not showing right title tags of my website inner pages?
Hello Everyone, I have the same problem on 3 of my websites: Google is not showing the right title tags for the inner pages. goldcoast-plumbers.com: http://screencast.com/t/2AEzDcoTkWF accountants-goldcoast.com.au: metalrecyclers-brisbane.com.au One thing common to all these websites is the All in One SEO Pack plugin. Is it the problem? Thanks in advance for your help! Regards
Intermediate & Advanced SEO | Asjad
-
Page loads fine for users but returns a 404 for Google & Moz
I have an e-commerce website built with WordPress and the WP e-Commerce plugin. The products have always worked fine: the pages load fine in a browser and people can purchase products with no problems. However, in the Google Merchant feed and in the Moz crawl diagnostics, certain product pages return a 404 error, and I can't work out why, especially as the pages load fine in the browser. I had a look at the page headers and can see that when the page loads, the initial request returns a 404, and then every other request goes through and loads fine. Can anyone help me figure out why this is happening? The product I have been using to test is: http://earthkindoriginals.co.uk/organic-clothing/lounge-wear/organic-tunic-top/ Here is part of the header dump that I did:
Intermediate & Advanced SEO | leapSEO
GET /organic-clothing/lounge-wear/organic-tunic-top/ HTTP/1.1
Host: earthkindoriginals.co.uk
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:21.0) Gecko/20100101 Firefox/21.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8
Accept-Language: en-gb,en;q=0.5
Accept-Encoding: gzip, deflate
Cookie: __utma=159840937.1804930013.1369831087.1373619597.1373622660.4; __utmz=159840937.1369831087.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); wp-settings-1=imgsize%3Dmedium%26hidetb%3D1%26editor%3Dhtml%26urlbutton%3Dnone%26mfold%3Do%26align%3Dcenter%26ed_size%3D160%26libraryContent%3Dbrowse; wp-settings-time-1=1370438004; __utmb=159840937.3.10.1373622660; PHPSESSID=e6f3b379d54c1471a8c662bf52c24543; __utmc=159840937
Connection: keep-alive
HTTP/1.1 404 Not Found
Date: Fri, 12 Jul 2013 09:58:33 GMT
Server: Apache
X-Powered-By: PHP/5.2.17
X-Pingback: http://earthkindoriginals.co.uk/xmlrpc.php
Expires: Wed, 11 Jan 1984 05:00:00 GMT
Cache-Control: no-cache, must-revalidate, max-age=0
Pragma: no-cache
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 6653
Connection: close
Content-Type: text/html; charset=UTF-8
-
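As a quick sanity check, the status code can be pulled out of a raw header dump like the one above with a small script (a sketch; `first_status_code` is just a hypothetical helper name, and `dump` is a shortened stand-in for the captured text):

```python
def first_status_code(dump: str) -> int:
    """Return the status code from the first HTTP status line in a raw header dump."""
    for line in dump.splitlines():
        parts = line.split()
        # A status line looks like "HTTP/1.1 404 Not Found";
        # request lines like "GET /path HTTP/1.1" are skipped.
        if parts and parts[0].startswith("HTTP/") and len(parts) >= 2 and parts[1].isdigit():
            return int(parts[1])
    raise ValueError("no HTTP status line found")

dump = """GET /organic-clothing/lounge-wear/organic-tunic-top/ HTTP/1.1
Host: earthkindoriginals.co.uk
HTTP/1.1 404 Not Found
Date: Fri, 12 Jul 2013 09:58:33 GMT"""

print(first_status_code(dump))  # prints 404
```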
Google Ranking Wrong Page
The company I work for started with a website targeting one city. Soon after I started SEO for them, they expanded to two cities. Optimization was challenging, but we managed to rank highly in both cities for our keywords. A year or so later, the company expanded to two new locations, so now 4 total. At the time, we realized it was going to be tough to rank any one page for four different cities, so our new SEO strategy was to break the website into 5 sections or minisites consisting of 4 city-targeted sites, and our original site which will now be branded as more of a national website. Our URL structures now look something like this:
Intermediate & Advanced SEO | cpapciak
www.company.com
www.company.com/city-1
www.company.com/city-2
www.company.com/city-3
www.company.com/city-4
Now, in the present time, all is going well except for our original targeted city. The problem is that Google keeps ranking our original site (which is now national) instead of the new city-specific site we created. I realize that this is probably due to all of the past SEO we did optimizing for that city. My thought is that Google is confused as to which page to actually rank for this city's keyword terms, and I was wondering if canonical tags would be a possible solution here, since the pages are about 95% identical. Anyone have any insight? I'd really appreciate it!
-
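For reference, a canonical tag consolidating duplicate city content would sit in the head of the page you don't want ranked, pointing at the one you do (a sketch using the placeholder URLs from the question):

```html
<link rel="canonical" href="http://www.company.com/city-1/">
```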
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404's). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | nicole.healthline