Help, my site isn't being indexed
-
Hello... We have a client who had around 17K visits a month. Last September he hired a company to redesign his website. They needed to create a copy of the site on a subdomain of another root domain, so I told them to block that content so it wouldn't affect my production site, since it was going to be an exact replica of the content with a different design.
The development team did it wrong and blocked the production site instead (using robots.txt), so my site lost all its organic traffic, which was 85-90% of the total, and it now gets only a couple of hundred visits a month. At first I thought we had somehow been penalized; however, when I saw the other site receiving new traffic and being indexed, I realized what had happened, so I fixed the robots.txt and created 301 redirects from the subdomain to the production site.
After resubmitting sitemaps, sharing links on Google+, and many other things, I can't get Google to reindex my site. When I do a site:domain.com search in Google I only get 3 results. It's been almost 2 months now and I honestly don't know what to do...
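For what it's worth, I double-checked the corrected robots.txt rules with Python's urllib.robotparser to confirm crawlers are no longer blocked (example.com here is just a stand-in for the real domains):

```python
from urllib.robotparser import RobotFileParser

# Production site: the corrected robots.txt should block nothing.
prod = RobotFileParser()
prod.parse(["User-agent: *", "Disallow:"])
print(prod.can_fetch("Googlebot", "https://example.com/any-page"))  # True

# Staging copy: this one SHOULD keep blocking everything.
staging = RobotFileParser()
staging.parse(["User-agent: *", "Disallow: /"])
print(staging.can_fetch("Googlebot", "https://example.com/any-page"))  # False
```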
Any help would be greatly appreciated
Thanks
Dan
-
If it makes you feel any better, this comes up about once a month in Q&A. You're not the first, and you certainly won't be the last!
-
This is why I love the SEOmoz community: no matter how stupid the solution to your problem might be, people will let you know.
I feel like an amateur (because I am). I think I put too much trust in Yoast's plugin, because it normally warns you whenever you are blocking robots; this time it didn't, and the site was being blocked through the WordPress config.
I changed it, resubmitted the sitemaps, checked the code, and updated Yoast's great plugin.
Thanks guys... I SEOPromise to always check the code myself.
Dan
-
When I go to your page and look at the source code I see this line:
<meta name='robots' content='noindex,nofollow' />
You are telling the bots not to index the page or follow any links on the page. This is in the source code for your home page.
I'd go back into the WordPress settings (you are using Yoast) and make sure the site is enabled for search engine visibility!
Once you do that, and verify that the tag has changed to content='index,follow', resubmit your sitemaps via Webmaster Tools.
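If it helps, that check is easy to script so you can re-run it after every deploy. A rough sketch (the regex only catches the common attribute order, name before content, so it's not a full HTML parser):

```python
import re

def robots_meta(html: str):
    """Return the content of the robots meta tag, or None if there isn't one.

    Note: only matches name=... before content=..., the usual order.
    """
    m = re.search(
        r"<meta[^>]*name=['\"]robots['\"][^>]*content=['\"]([^'\"]+)['\"]",
        html,
        re.I,
    )
    return m.group(1) if m else None

page = "<html><head><meta name='robots' content='noindex,nofollow' /></head></html>"
print(robots_meta(page))  # noindex,nofollow
```

Feed it the fetched source of your home page; anything containing "noindex" means search engines are still being told to stay away.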
-
Great tool, I'm taking a look right now.
Thanks
Dan
-
I check GWT every day; not even one page has been indexed, nor do we have any manual action flagged by Google.
Thanks
Dan
-
-
A suggestion for the future: use some type of code monitoring service, such as https://polepositionweb.com/roi/codemonitor/index.php (no relationship with the company; it's just what I use), and have it alert you to any changes in the robots.txt file on both the live and staging environments.
I was in a situation at a previous employer where the development team wasn't the best at SEO. I had seen the robots.txt from the dev site get put on the live site, and the other way around, and also things being added to or removed from the robots.txt without our request or knowledge. The verification files for Google and Bing Webmaster Tools would sometimes go missing, too.
I used that code monitor to check once a day and email me if there were changes to the robots.txt or verification files on the live site, or to the robots.txt of any of our dev and staging sites (to make sure they weren't accidentally indexed). It was a huge help!
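If you'd rather not rely on a third-party service, the core of such a monitor is only a few lines. A rough Python sketch; the scheduled fetching and the actual alert (email, Slack, whatever) are left out, and the URLs would be your own:

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Hash the robots.txt body so any edit, however small, is detectable."""
    return hashlib.sha256(content).hexdigest()

def has_changed(known_hash: str, content: bytes) -> bool:
    """Compare a freshly fetched robots.txt against the stored baseline hash."""
    return fingerprint(content) != known_hash

# Store the hash of the known-good file, re-fetch on a schedule
# (cron, etc.), and alert whenever has_changed() returns True.
baseline = fingerprint(b"User-agent: *\nDisallow:\n")
print(has_changed(baseline, b"User-agent: *\nDisallow: /\n"))  # True
```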
-
Yes, take another look at that robots.txt file for sure. If you provided us with the domain we might be able to help you better.
Also, go into Webmaster Tools and poke around: check how many pages are being indexed, look at your sitemaps, do a Fetch as Google, etc.
-
Hi Dan
It sounds like your robots.txt is still blocking your site despite the redirects. You might be best off deleting the robots.txt and starting again, making sure nothing is blocked that shouldn't be.
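For reference, a rebuilt robots.txt that blocks nothing at all is just two lines (it's the staging copy that should carry Disallow: /):

```
User-agent: *
Disallow:
```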
Regards, David
Related Questions
-
How do I prevent 404s from hurting my site?
I manage a real estate broker's site on which the individual MLS listing pages continually create 404 pages as properties are sold. So, on a site with 2200 pages indexed, roughly half are 404s at any given time. What can I do to mitigate any potential harm from this?
Intermediate & Advanced SEO | | kimmiedawn0 -
Can't get auto-generated content de-indexed
Hello and thanks in advance for any help you can offer me! Customgia.com, a costume jewelry e-commerce site, has two types of product pages - public pages that are internally linked, and private pages that are only accessible by accessing the URL directly. Every item on Customgia is created online using an online design tool. Users can register for a free account and save the designs they create, even if they don't purchase them. Prior to saving their design, the user is required to enter a product name and choose "public" or "private" for that design. The page title and product description are auto-generated.
Since launching in October '11, the number of products grew and grew as more users designed jewelry items. Most users chose to show their designs publicly, so the number of products in the store swelled to nearly 3000. I realized many of these designs were similar to each other and occasionally exact duplicates. So over the past 8 months, I've made 2300 of these designs "private" - no longer accessible unless the designer logs into their account (these pages can also be linked to directly).
When I realized that Google had indexed nearly all 3000 products, I entered URL removal requests in Webmaster Tools for the designs that I had changed to "private". I did this starting about 4 months ago. At the time, I did not have NOINDEX meta tags on these product pages (obviously a mistake), so it appears that most of these product pages were never removed from the index - or if they were removed, they were added back in after the 90 days were up.
Of the 716 products currently showing (the ones I want Google to know about), 466 have unique, informative descriptions written by humans. The remaining 250 have auto-generated descriptions that read coherently but are somewhat similar to one another. I don't think these 250 descriptions are the big problem right now, but these product pages can be hidden if necessary.
I think the big problem is the 2000 product pages that are still in the Google index but shouldn't be. The following Google query tells me roughly how many product pages are in the index: site:Customgia.com inurl:shop-for. Ideally, it should return just over 716 results, but instead it's returning 2650 results. Most of these 1900 product pages have bad product names and highly similar, auto-generated descriptions and page titles. I wish Google had never crawled them. Last week, NOINDEX tags were added to all 1900 "private" designs, so currently the only product pages that should be indexed are the 716 showing on the site. Unfortunately, over the past ten days the number of product pages in the Google index hasn't changed.
One solution I initially thought might work is to re-enter the removal requests, because now, with the NOINDEX tags, these pages should be removed permanently. But I can't determine which product pages need to be removed because Google doesn't let me see that deep into the search results. If I look at the removal request history it says "Expired" or "Removed", but these labels don't seem to correspond in any way to whether or not a page is currently indexed. Additionally, Google is unlikely to crawl these "private" pages because they are orphaned - no longer linked from any public pages of the site (and no external links either).
Currently, Customgia.com averages 25 organic visits per month (branded and non-branded) and close to zero sales. Does anyone think de-indexing the entire site would be appropriate here? Start with a clean slate and then let Google re-crawl and index only the public pages - would that be easier than battling with Webmaster Tools for months on end? Back in August, I posted a similar problem that was solved using NOINDEX tags (de-indexing a different set of pages on Customgia): http://moz.com/community/q/does-this-site-have-a-duplicate-content-issue#reply_176813 Thanks for reading through all this!
Intermediate & Advanced SEO | | rja2140 -
Was my site hit by Panda or Penguin? Looking for diagnosis help
My URL is: www.westlakedermatology.com
Hello Mozers, I'm looking for some help or guidance as to why my site fell off the "rankings cliff" on 9/5. In the forums I hear a lot of others with a similar issue, and some speculation that it is due to a Panda refresh. However, looking at our site, we have unique content, with each page having over 300-400 words (so it's not thin or duplicate content). We get a lot of leads who verbally tell us our content helped answer some of their questions, so I'm pretty confident it's good for users. Can anyone see an issue with the content on our site?
In terms of Penguin, I think our backlink profile is clean; our physicians do take part in providing content to various high-quality and relevant websites/blogs, but we do not buy links or do anything in violation of Google's guidelines. In terms of brand, we are the biggest dermatology and plastic surgery group in the Austin area, so any brand implications for search should be on our side. Just looking for some sort of guidance or help; any suggestions would be great!
Thanks,
Adam Paddock
Intermediate & Advanced SEO | | iderma0 -
Huge Google index on E-commerce site
Hi Guys, Referring back to my original post, I would first like to thank you all for the advice. We implemented canonical URLs all over the site and noindexed some URLs with robots.txt, and the site has already gone from 100,000+ URLs indexed to 87,000 URLs indexed in GWT. My question: is there a way to speed this up? I do know about the way to remove URLs from the index (with a noindex or robots.txt condition), but this is a very labor-intensive way to do so. I was hoping you guys might have a solution for this.. 🙂
Intermediate & Advanced SEO | | ssiebn70 -
Site Indexed by Google but not Bing or Yahoo
Hi, I have a site that is indexed (and ranking very well) in Google, but when I do a "site:www.domain.com" search in Bing and Yahoo it does not show up. The team that purchased the domain a while back has no idea if it was indexed by Bing or Yahoo at the time of purchase. Just wondering if there is anything that might be preventing it from being indexed? Also, I'm going to submit an index request; are there any other things I can do to get it picked up?
Intermediate & Advanced SEO | | dbfrench0 -
Dynamic 301's causing duplicate content
Hi, wonder if anyone can help? We have just changed our site, which was hosted on IIS, and the page URLs were like this (example.co.uk/Default.aspx?pagename=About-Us). The new page URL is example.co.uk/About-Us/ and the site now uses Apache. The 301s our developer told us to use were in this format:
RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]
This seemed to work from a 301 point of view; however, it also seemed to allow both of the URLs below to serve the same page! example.co.uk/About-Us/?pagename=About-Us example.co.uk/About-Us/ Webmaster Tools has now picked up on this and is seeing it as duplicate content. Can anyone help with why it would be doing this, please? I'm not totally clued up and our host/developer can't understand it either. Many thanks
Intermediate & Advanced SEO | | GoGroup510 -
XML Sitemap Index Percentage (Large Sites)
Hi all, I'm wanting to find out from those who have experience dealing with large sites (10s/100s of millions of pages): what's a typical (or highest) percentage of indexed pages vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows you the pages submitted and indexed for each of your sitemaps. I'm trying to figure out:
- what the average index % out there is
- whether there is a ceiling (i.e. it will never reach 100%)
- whether it's possible to improve the indexing percentage further
Just to give you some background, sitemap index files (according to schema.org) have been implemented to improve crawl efficiency, and I'm wanting to find out other ways to improve this further. I've been thinking about looking at the URL parameters to exclude, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and use the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure yet whether this is the best path to take or if I'm just flogging a dead horse - whether there is such a ceiling, or whether I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | | danng0 -
A Noob's SEO Plan of attack... can you critique it for me?
I've been digging my teeth into SEO for a solid 1.5 weeks or so now and I've learned a tremendous amount. However, I realize I have still only scratched the surface. One of the hardest things I've struggled with is the sheer amount of information and feeling overwhelmed. I finally think I've found a decent path. Please critique and offer input; it would be much appreciated.
Step One: Site Architecture
I run an online proofreading & editing service. That being said, there are lots of different segments we would eventually like to rank for other than the catch-all phrases like 'proofreading service'. For example, 'essay editing', 'resume editing', 'book editing', or even 'law school personal statement editing'. I feel that my first step is to understand how my site is built to handle this plan now, and into the future. Right now we simply have the homepage and one segment: kibin.com/essay-editing. Eventually, we will have a services page that serves almost like a site map, showing all of our different services and linking to them.
Step Two: Page Anatomy
I know it is important to have a well defined anatomy to these services pages. For example, we've done a decent job with 'above the fold' content, but now understand the importance of putting the same type of care in below the fold. The plan here is to have a section for recent blog posts that pertain to that subject, in a section titled "Essay Editing and Essay Writing Tips & Advice" or something to that effect. Also in the plan: including some social sharing options, other resources, and an 'about us' section to assist with keyword optimization.
Step Three: Page Optimization
Once we're done with Step Two, I feel that we'll finally be ready to truly optimize each of our pages. We've done some of this already, but probably less than 50%. You can see evidence of this on our essay editing page and proofreading rates page. So, the goal here is to find the most relevant keywords for each page and optimize for those to the point we have A grades on our on-page optimization reports.
Step Four: Content/Passive Link Building
The bones for our content strategy are in place. We have sharing links on blog posts already and a slight social media presence. I admit, the blog needs some tightening up, and we can do a lot more on our social channels. However, I feel we need to start by creating content that our audience is interested in and interacting with them on a consistent basis. I do not feel like I should be chasing link building strategies or guest blog posts at this time. PLEASE correct me if I'm off base here, but only after reading step five:
Step Five: Active Link Building
My bias is to get some solid months of creating content and building a good social media presence where people are obviously interacting with our posts and sharing our content. My reasoning is that it will make it much easier for me to reach out to bloggers for guest posts, as we'll be much more reputable after spending time doing step 4. Is this poor thinking? Should I try to get some guest blog posts in during step 4 instead?
Step Six: Test, Measure, Refine
I'll admit, I have yet to really dive into learning about the different ways to measure our SEO efforts. Besides being set up with our first campaign as an SEOPro Member and having 100 or so keywords and phrases we're tracking... I'm really not sure what else to do at this point. However, I feel we'll be able to measure the popularity of each blog post by number of comments, shares, new links, etc. once I reach step 6.
Is there something vital I'm missing or have forgotten here? I'm sorry for the long-winded post, but I'm trying to get my thoughts straight before we start cranking on this plan. Thank you so much!
Intermediate & Advanced SEO | | TBiz2