Help! Pages not being indexed
-
Hi Mozzers, I need your help.
Our website (www.barnettcapitaladvisors.com) stopped being indexed in search engines following a round of major changes to URLs and content. There were a number of dead links for a few days before 301 redirects were properly put in place. Now, only 3 pages show up in Bing when I search "site:barnettcapitaladvisors.com". A bunch of pages show up in Google for that search, but they're not any of the pages we want to rank. Our home page and most important services pages are nowhere in search results.
What's going on here?
Our sitemap is at http://www.barnettcapitaladvisors.com/sites/default/files/users/AndrewCarrillo/sitemap/sitemap.xml
Robots.txt is at: http://www.barnettcapitaladvisors.com/robots.txt
Thanks!
-
Thanks for the help! The Screaming Frog site crawler really makes it much easier to understand the robots.txt file. It looks like it was set to "disallow" robots.
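For anyone hitting the same thing: a robots.txt that blocks every crawler from the whole site, which is the kind of setting described above, looks something like this (an illustrative fragment, not necessarily the site's actual file):

```
User-agent: *
Disallow: /
```

Changing `Disallow: /` to an empty `Disallow:` (or removing the rule) lets crawlers back in, though re-indexing can take a while after the fix.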
-
Nothing really to "resolve" here.
Those tools simply give you cues and signs to figure out what went wrong and when. Traffic and referral data on those pages can also indicate exactly when they stopped appearing in search; you can then correlate that with your dev actions, if you have some memory or log of the steps taken, and backtrack to find out what was set wrong.
So you see absolutely no errors or messages in GWT, per this old post?
If you are using Drupal, check this thread about it; it may well be your problem.
Also see the very bottom of this page, which may help you diagnose whether you have a setting wrong in your robots file.
-
Hi Brien
The X-Robots-Tag has set many of your pages to noindex,nofollow. X-Robots-Tag is delivered as part of the HTTP response header, so you won't find it in your HTML code.
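Because the directive lives in the headers rather than the markup, you have to inspect the response itself. A minimal sketch of the check (the sample header below is made up for illustration, not captured from the live site):

```python
# Sketch: spot noindex/nofollow directives in an HTTP response header mapping.
def robots_directives(headers):
    """Return the lowercased X-Robots-Tag directives from a header dict."""
    value = headers.get("X-Robots-Tag", "")
    return {part.strip().lower() for part in value.split(",") if part.strip()}

# Illustrative sample, not a real capture from the site.
sample_headers = {
    "Content-Type": "text/html; charset=utf-8",
    "X-Robots-Tag": "noindex, nofollow",  # this is what keeps a page out of the index
}

print(robots_directives(sample_headers))  # {'noindex', 'nofollow'} (set order may vary)
```

In practice you'd feed this the headers returned by your HTTP client of choice (or just eyeball them in Screaming Frog or your browser's network tab).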
In addition to that, your canonical tags are a little confused: the homepage canonical tag is set to
http://barnettcapitaladvisors.com/miami-financial-advisor-south-florida-financial-planner
when it should be barnettcapitaladvisors.com
I found these things using the Screaming Frog site crawler.
In your Robots file you're not allowing Yandex or Baidu to crawl the site - are you aware of this?
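If you want to sanity-check which bots a robots.txt actually blocks, Python's standard-library robotparser can evaluate the rules for you. The rules below are a hypothetical fragment of the kind described, not the site's real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt of the kind described above: Yandex and Baidu
# blocked outright, everyone else allowed.
rules = """\
User-agent: Yandex
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow:
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Yandex", "http://www.example.com/"))     # False: blocked
print(parser.can_fetch("Googlebot", "http://www.example.com/"))  # True: allowed
```

Whether blocking Yandex and Baidu matters depends on whether you care about traffic from Russia and China; it's worth making sure it was a deliberate choice rather than a leftover.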
Have your developer correct the X-Robots and canonical issues.
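Once the fix is deployed, a quick way to verify the canonical is to pull the homepage HTML and read the rel="canonical" href out of the head. A minimal sketch with the standard-library HTML parser (the markup below is illustrative of what the corrected head should contain):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if self.canonical is None and tag == "link":
            attr_map = dict(attrs)
            if attr_map.get("rel") == "canonical":
                self.canonical = attr_map.get("href")

# Illustrative markup: what the homepage <head> should contain after the fix.
html = '<head><link rel="canonical" href="http://www.barnettcapitaladvisors.com/" /></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # http://www.barnettcapitaladvisors.com/
```

If the printed value still points at the /miami-financial-advisor-south-florida-financial-planner URL, the fix hasn't taken effect.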
-
Yes, I've searched Google Webmaster Tools and see a big dropoff in impressions on 5/4/13.
There are 14 pages in the sitemap that should be showing up. But instead, what's getting indexed (9 pages in Google and 2 of the 3 pages in Bing) are pages that aren't in the sitemap: ones that were created and then taken down, or that we don't want indexed.
Yes, some of the URLs are new (three weeks old) and that may be why they haven't appeared yet, but what about the home page? That isn't a new URL and it isn't appearing. Also, one new URL (which is also one of the important services pages we want to appear) is showing up in Bing, www.barnettcapitaladvisors.com/wealth-management, but the other ones are not.
I use analytics as well, but what in Google Analytics in particular will help me resolve this?
-
Have you checked your Google/Bing webmaster tools yet? They will often give you the best answer in these situations.
Do you use analytics? Have you checked that as well?
If any pages show up when you do a site: search on your domain in Google, then you are being indexed. If the pages you made are too new, it may take some time for them to become available in Google's SERPs.
It all just depends; how long has it been?
I see only 2 links in Bing results for your site, but in Google we have 9 results. How many should there be? 14?