Website is not indexed in Google, please help with suggestions
-
Our client's website was removed from Google's index. Can anybody recommend how to speed up the re-indexing process? So far we have:
- Webmaster Tools - done
- Social media - done (Twitter, FB)
- sitemap.xml - done
- Backlinks - in process
- PPC - done
- robots.txt is fine
Any recommendations are welcome; the client is very unhappy.
Thank you
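Before chasing re-indexing tactics, it is worth confirming nothing on the pages themselves is still blocking indexing. A minimal sketch (standard library only; the function name is my own) that scans fetched HTML for a `noindex` robots meta tag:

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots/googlebot meta tag with 'noindex'."""
    for tag in re.findall(r"<meta\b[^>]*>", html, flags=re.IGNORECASE):
        # Only meta tags addressed to crawlers matter here
        name = re.search(r'name\s*=\s*["\']?(robots|googlebot)["\']?', tag, re.IGNORECASE)
        content = re.search(r'content\s*=\s*["\']([^"\']*)["\']', tag, re.IGNORECASE)
        if name and content and "noindex" in content.group(1).lower():
            return True
    return False
```

Run it against the HTML of a few key pages (and also check the `X-Robots-Tag` response header, which this sketch does not cover) to rule out a leftover blocking directive.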
-
What do you mean you had no other choice? What forced you to add these tags to your client's site?! Because they were updating it?
...uhh... This thread is bizarre. Anyway, it sounds like all of it was a non-issue. Mark is absolutely correct about your real issue, though. Get some redirects in there ASAP.
-
Just an aside - you're going to have indexation issues - you have both www and non-www versions live on the site, with no canonicals pointing to one version. You also have index.php as a live page linked to from the logo. I'd definitely recommend implementing canonical tags across the site.
Mark
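For reference, a canonicalization setup along the lines Mark describes might look like this .htaccess sketch (assumes Apache with mod_rewrite enabled; example.com is a placeholder for the client's domain):

```apache
RewriteEngine On

# 301 the non-www host to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# 301 direct requests for index.php to the root,
# so the logo link resolves to a single homepage URL
RewriteCond %{THE_REQUEST} \s/index\.php[\s?] [NC]
RewriteRule ^index\.php$ https://www.example.com/ [R=301,L]
```

On top of the redirects, a `<link rel="canonical" href="...">` tag in the head of each page pointing at the preferred URL covers any variants the rewrite rules miss.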
-
Haha - I did not even check. Yeah, you are indexed... recently, though; no cache has been created for some of the pages yet, so probably within the last week?
Homepage was crawled on April 25th
-
When I search new homes developer st modwen in Google.com (no quote marks, and I'm in the UK), this page from your site is at No. 3 and your homepage is at No. 4.
When I search st modwen homes, you're at No. 1. I'm no expert, but that doesn't look like being de-indexed to me.
Or do you simply mean your rankings for the term new homes developer have dropped?
Also, I don't understand this: _but we had no other option, client was changing content on live site, so we had to noindex, nofollow._
-
Thank you, I have already done G+. Regarding noindex, nofollow: I completely agree, but we had no other choice. Thank you again.
-
Getting G+'s on fresh QUALITY content is one of the best ways to get quickly indexed by Google these days, in my opinion.
Just a suggestion: I would NEVER noindex an indexed site just because of content changes.
Make a clone, temporarily point the domain to a subdirectory using a vhost, make your changes, then re-point the domain in the vhost (or, if on cPanel, just use a pointer). This way no one is the wiser to the changes, INCLUDING Google.
Or just make the changes live; the ramifications of noindex are much longer lasting than a content change (unless the site was left in shambles for weeks).
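The clone-and-repoint approach described above could be sketched as a temporary Apache vhost change (domain and paths are placeholders; assumes you can edit the vhost and reload Apache):

```apache
<VirtualHost *:80>
    ServerName example.com
    ServerAlias www.example.com
    # Temporarily serve the untouched clone while the live copy is edited...
    DocumentRoot /var/www/example.com/clone
    # ...then switch back once the edits are done:
    # DocumentRoot /var/www/example.com/live
</VirtualHost>
```

After each DocumentRoot change, reload Apache (e.g. `apachectl graceful`) so the swap takes effect; visitors and crawlers never see the half-edited site, and no noindex is needed.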