Robots.txt file issue.
-
Hi,
This is my third thread here, and I have created many like it on other webmaster communities. I know many pros are here, so I badly need help.
Robots.txt blocked 2k important URLs of my blogging site,
http://Muslim-academy.com/, especially in my blog area, which brings a good number of visitors daily. My organic traffic declined from 1k daily to 350.
I have removed the robots.txt file, resubmitted the existing sitemap, used all the Fetch/Submit-to-index options, and used the 50-URL submission option in Bing Webmaster Tools.
What can I do now to get these blocked URLs back into the Google index?
1. Create a NEW sitemap and submit it again in Google Webmaster Tools and Bing Webmaster Tools?
2. Bookmark, build links to, or share the URLs? I have already done a lot of bookmarking for the blocked URLs.
I fetched the list of blocked URLs using Bing Webmaster Tools.
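In case it helps anyone verifying a fix like this: you can test a robots.txt file against a list of URLs with Python's built-in parser. This is just a sketch with made-up rules and URLs, not the site's actual robots.txt:

```python
# Check which URLs a given robots.txt would block for all crawlers.
# The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

urls = [
    "http://example.com/blog/my-post",  # matches Disallow: /blog/
    "http://example.com/about",         # not matched by any rule
]
for url in urls:
    allowed = parser.can_fetch("*", url)
    print(url, "allowed" if allowed else "BLOCKED")
```

To check a live site instead of an inline string, use `parser.set_url("http://example.com/robots.txt")` followed by `parser.read()`.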
-
Robert, some good signs of life. The new sitemap shows 5,080 pages submitted and 4,817 indexed.
The remaining pages are surely the blocked ones, right, Robert? There is also some improvement in impressions and clicks. Thanks a lot for staying with me this long while solving this issue.
-
Christopher,
Have you looked at indexing in GWMT to see if they have indexed, how many pages, etc.?
-
Got your point but I Resubmit and its status is still pending.
I have test it and it was working but when I submit it 2 days ago up till now its status is pending. -
No, when you resubmit or submit a "new" sitemap, it just tells Google this is the sitemap now. There is no content issue with a sitemap.
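For reference, a sitemap is just an XML list of URLs; submitting it again (or submitting a regenerated copy) replaces the old one rather than adding duplicate pages. A minimal sketch (the URL and date are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://muslim-academy.com/example-post/</loc>
    <lastmod>2014-01-15</lastmod>
  </url>
</urlset>
```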
Best,
Robert
-
Just one last question, Robert. Doesn't a duplicate sitemap create duplicate pages in the search results?
Sorry, my question may look crazy to you, but while applying every possible fix I want to be sure I don't mess up and make things even worse.
-
Given the only issue was the robots.txt error, I would resubmit. That said, it would not hurt to generate a new sitemap and submit that, in case there is something you are missing.
Best
-
Robert, the question is: do I need to create a new sitemap, or resubmit the existing one?
-
Hello Christopher,
It appears you have done a good deal to remediate the situation already. I would resubmit a sitemap to Google also. Have you looked in WMT to see what is now indexed? I would look at the graph of indexed and robots.txt and see if you are moving the needle upward again.
This begs a second question: "How did it happen?" You stated, "Robots.txt blocked 2k important URLs of my blogging site," and that sounds like it just occurred out of the ether. I would want to know that I had found the reason, and make sure I have a way to keep it from happening going forward (just a suggestion).
Lastly, the Index Status report in WMT is a great way to learn how effective your fixes were. I like knowing that type of data and storing it somewhere retrievable for the future.
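On the "how did it happen" point: a common cause is a staging-site robots.txt being pushed live. This is a hypothetical illustration, not the site's actual file:

```
# Problematic: blocks ALL crawlers from the ENTIRE site
# (often left over from a development/staging deployment)
User-agent: *
Disallow: /

# Safe replacement: allow everything, block only what you intend to
User-agent: *
Disallow: /wp-admin/
```

Checking the live robots.txt after every deployment is an easy way to catch this before it costs traffic.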
Best to you,
Robert