Re-Launched Website: Developer Forgot to Remove noindex Tags
-
Our company's website has maintained decent rankings for our primary keywords over the 12 years we've been in business. We recently had the site rebuilt from the ground up, and the developers left noindex tags on all 400+ pages when we launched. I didn't catch the error for six days, during which time I used the Fetch feature in Google, submitting a site-wide fetch as well as manual submissions for our top 100 URLs. In addition, every page that was indexed previously had a 301 redirect set up for it, pointing to a destination that carried a noindex tag.
I caught the error today, and the developer removed the tags. Does anyone have experience with a situation like this? At this moment we are still ranking in the SERPs; the old URLs are displaying, and they are 301 redirecting just fine. But what happens now? For six full days, we told Google not to index any of our pages while also using the Fetch feature, contradicting ourselves.
Any words of wisdom or advice as to what I can do at this point to avoid potential fallout? Thanks
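For anyone auditing a relaunch like this, a quick sanity check is to scan each page's HTML for a robots noindex directive before and after the fix goes live. A minimal sketch using only Python's standard library — the page snapshots below are hypothetical stand-ins, not the poster's actual site:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots_directives.append(attrs.get("content", "").lower())

def has_noindex(html: str) -> bool:
    """True if any robots meta tag on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.robots_directives)

# Hypothetical before/after snapshots of one relaunched page:
old_page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
fixed_page = '<html><head><meta name="robots" content="index,follow"></head></html>'

print(has_noindex(old_page))    # True
print(has_noindex(fixed_page))  # False
```

In practice you would fetch each URL and feed the response body through `has_noindex`; checking the `X-Robots-Tag` HTTP header as well would catch noindex directives that never appear in the HTML.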
-
I appreciate everyone's feedback. Very helpful- thank you for taking the time to respond. Heading over to upload a sitemap now!
Thanks again,
Kristin -
One of our competitors, who ranked #1 for a good money term (we were #2), had a developer redo their entire site. He had noindex on every page when the new site went up.
When we saw the new site we sniffed the code, saw the noindex in there and laughed really hard.
A couple days later they dropped completely from the SERPs and we started getting all of their sales.
It took them a couple weeks to figure out what happened. But when they fixed it they popped right back into the SERPs at old rankings a couple days later.
We talk to these guys by phone occasionally. If they had called us, we would have told them how to fix it... but since they hired an expensive developer, we didn't want to stick our noses in.
-
I've dealt with similar issues with robots.txt blocks of the entire site, as well as robots meta noindex tags. You should be fine now that you've taken the noindex tag off, and the old pages are redirecting. It may take longer for Google to update their index with the new URLs, but otherwise I don't think you need to worry too much. Maybe resubmit the sitemap and do another fetch on key pages.
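One way to double-check the redirect side of this is to confirm that every old URL's 301 chain actually ends at a page that is not noindexed. A minimal offline sketch — the redirect map and noindex set below are made-up stand-ins for what a real crawl would produce:

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a chain of redirects; return (final_url, hop_count)."""
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise RuntimeError(f"Redirect loop or overly long chain at {url}")
    return url, hops

# Hypothetical crawl data: old URL -> redirect target, plus pages still noindexed.
redirects = {
    "/old-page": "/new-page",
    "/old-category": "/interim",  # a two-hop chain worth flattening to one 301
    "/interim": "/new-category",
}
noindexed = {"/new-category"}     # still carrying a leftover noindex tag

for old_url in ("/old-page", "/old-category"):
    final, hops = resolve_chain(old_url, redirects)
    status = "BAD: destination is noindexed" if final in noindexed else "ok"
    print(f"{old_url} -> {final} ({hops} hop(s)): {status}")
```

The poster's original setup (301s pointing at noindexed destinations) is exactly the pattern the `noindexed` check flags here.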
Good luck!
-
Make sure you send in a sitemap and all should be well.
I've dealt with cases where certain pages were noindexed and then had the tag removed. As long as you've fixed all your errors, things should get back to normal. Think of a site going down intermittently: rankings don't get affected too much (I believe Matt Cutts confirmed this in a YouTube video).
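Generating the sitemap itself is straightforward. A minimal sketch with Python's standard library, following the sitemaps.org protocol — the page URLs are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml string for the given absolute URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

pages = ["https://www.example.com/", "https://www.example.com/about"]
print(build_sitemap(pages))
```

Write the result to `sitemap.xml` at the site root, then submit it in Google Webmaster Tools so the crawler has an explicit list of the new URLs.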
-
Hi Kristin,
I have no experience of this happening myself, but I would suggest that you create a full sitemap and submit it to Google Webmaster Tools ASAP.
Peter