Help, my site isn't being indexed
-
Hello... We have a client that had around 17K visits a month. Last September he hired a company to redesign his website. They needed to create a copy of the site on a subdomain of another root domain, so I told them to block that content so it wouldn't affect my production site, since it was going to be an exact replica of the content with a different design.
The development team did it backwards and blocked the production site (via robots.txt), so my site lost all of its organic traffic, which was 85-90% of the total, and now gets only a couple of hundred visits a month. At first I thought we had somehow been penalized, but when I saw the other site receiving new traffic and getting indexed I realized what had happened, so I fixed the robots.txt and created a 301 redirect from the subdomain to the production site.
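To illustrate (with example.com standing in for the real domains): the staging copy's robots.txt should have blocked everything, and the production one nothing.

Staging robots.txt (block the entire copy):
User-agent: *
Disallow: /

Production robots.txt (block nothing):
User-agent: *
Disallow:

The redirect I set up is a blanket 301 from the subdomain to production, e.g. with Apache's mod_alias in the subdomain's config:
Redirect 301 / http://www.example.com/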
After resubmitting sitemaps, posting links to Google+, and trying many other things, I still can't get Google to reindex my site. When I do a site:domain.com search in Google I only get 3 results. It's been almost two months now and I honestly don't know what to do...
Any help would be greatly appreciated
Thanks
Dan
-
If it makes you feel any better, this turns out to be the solution to someone's problem about once a month in Q&A. You're not the first, and you certainly won't be the last!
-
This is why I love the SEOmoz community: no matter how stupid the solution to your problem might be, people will let you know.
I feel like an amateur (because I am). I think I put too much trust in Yoast's plugin, because it normally warns you whenever you're blocking robots; this time it didn't, and the site was being blocked through the WordPress configuration itself.
I changed it, resubmitted the sitemaps, checked the code, and updated Yoast's great plugin.
Thanks guys... I SEOPromise to always check the code myself
Dan
-
When I go to your page and look at the source code I see this line:
<meta name='robots' content='noindex,nofollow' />
You are telling the bots not to index the page or follow any links on the page. This is in the source code for your home page.
I'd go back into the WordPress settings (you are using Yoast) and make sure the site is enabled for search engine visibility!
Once you do that, and verify that the tag now reads content='index,follow', resubmit your sitemaps via Webmaster Tools.
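For reference, the fixed line should look something like this (or the tag can be removed entirely, since index,follow is what crawlers assume when no robots meta tag is present):
<meta name='robots' content='index,follow' />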
-
Great tool, I'm taking a look right now.
thanks
Dan
-
I check GWT every day, and not even one page has been indexed. Nor do we have any manual action flagged by Google.
Thanks
Dan
-
A suggestion for the future: use some type of code monitoring service, such as https://polepositionweb.com/roi/codemonitor/index.php (no relationship with the company, it's just what I use), and have it alert you to any changes in the robots.txt file on both the live and staging environments.
I was in a situation at a previous employer where the development team wasn't the best at SEO. I saw the robots.txt from the dev site get put on the live site and vice versa, and things added to or removed from the robots.txt without our request or knowledge. The verification files for Google and Bing Webmaster Tools would sometimes go missing, too.
I used that code monitor to check once a day and email me if there were changes to the robots.txt or verification files on the live site, or to the robots.txt of any of our dev and staging sites (to make sure they weren't accidentally indexed). It was a huge help!
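If you'd rather roll your own, here's a rough sketch of the same idea in Python, meant to run once a day from cron. The URLs and state-file name are placeholders, and a real version would send an email instead of printing:

import hashlib
import json
import pathlib
import urllib.request

# Placeholder URLs -- substitute your own live and staging sites.
URLS = [
    "http://www.example.com/robots.txt",
    "http://staging.example.com/robots.txt",
]
STATE = pathlib.Path("robots_hashes.json")

def fetch_hash(url):
    # Download the file and return a SHA-256 hash of its body.
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

old = json.loads(STATE.read_text()) if STATE.exists() else {}
new = {}
for url in URLS:
    new[url] = fetch_hash(url)
    if url in old and old[url] != new[url]:
        print("CHANGED:", url)  # a real monitor would alert you here
STATE.write_text(json.dumps(new))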
-
Yes, take another look at that robots file for sure. If you provided us with the domain we might be able to help you better.
Also, go into Webmaster Tools and poke around. Check how many pages are being indexed, look at your sitemaps, do a Fetch as Google, etc.
-
Hi Dan
It sounds like your robots.txt is still blocking your site despite the redirects. You might be best off getting rid of the robots.txt and starting again, making sure nothing is blocked that shouldn't be.
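As a starting point, a robots.txt that blocks nothing (and points crawlers at your sitemap; the URL here is just an example) looks like this:

User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml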
Regards, David