2000 pages indexed in Yahoo, 0 in Google, no PR. What is wrong?
-
Hello Everyone,
I have a friend with a blog site that has over 2,000 pages indexed in Yahoo but none in Google, and no PageRank. The site is http://www.livingorganicnews.com/. I know it is not the best site, but I am guessing something is wrong and I just don't see it.
Can you spot it? Does he have some settings wrong? What should he do?
Thank you.
-
The site just looks like part of a blog network. The domain is 5 years old and the home page has a DA and PA of 34, yet it is still not indexed by Google. I searched for site:livingorganicnews.com in Google, which returns no results, so it appears the site has been penalized. Use Google Webmaster Tools for further verification to find the reason.
Most probably it is penalized for being part of a paid blog network.
-
LOL, the fact that there's a tonne of clearly spun content won't help. I gather this is part of a content-scraping or content-sharing network like LinkVine?
Have you tried reading the articles it publishes? It could do with some quality guidelines for what gets accepted, imho.
Even when it gets indexed, it's not going to rank anywhere... this is exactly the kind of site that Panda wanted to stop. Regurgitated, nonsensical, spun, tosh that looks as if it was written by a lunatic and only really exists for the sake of its outgoing links, that point to other rubbish.
I'd tell your friend to give up on this site entirely and start looking at less automated ways of doing things. Google is only going to get tougher and tougher on these sites so he's fighting a losing battle.
I don't mean to be rude but I hope it doesn't get indexed ever, what value does it offer to anyone for anything? Most people don't want stuff like that clogging up the web. I don't mean to sound harsh but tell your friend the problem with the site is.... it's crap.
-
Another of the many not-quite-right things on the site is that some of the older posts, like http://www.livingorganicnews.com/games/2010/panasonic-announced-the-jungle-handheld-gaming-platform/1965/, end with "incoming search terms" followed by several search terms that all hyperlink back to that exact same article. Search engines will not see that as providing any value to the user (users are already on that page; they don't need a link to it), and they will see it as just another attempt to manipulate the engines.
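As a rough illustration of why those footers add nothing, here is a minimal sketch (standard library only, and not how any search engine actually works) that counts anchors on a page pointing straight back to the page itself:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class SelfLinkCounter(HTMLParser):
    """Counts <a> tags whose destination is the page they already sit on."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url.rstrip("/")
        self.self_links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Resolve relative hrefs against the page URL before comparing.
            target = urljoin(self.page_url + "/", href).rstrip("/")
            if target == self.page_url:
                self.self_links += 1

def count_self_links(page_url, html):
    """Return how many links in `html` point back to `page_url` itself."""
    counter = SelfLinkCounter(page_url)
    counter.feed(html)
    return counter.self_links
```

Run over one of those "incoming search terms" footers, every one of the terms would be flagged as a zero-value self-link.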
-
It is interesting to have a new set of eyes here. I had noticed his writing was different, but figured it was because English is not his first language. I will ask if he is actually writing this himself.
-
Keri is absolutely right.
I had not looked at the site's content until now. It couldn't be much worse. It is a 100% spam site that should never be indexed. Clearly the site is under a penalty.
Google's job is to satisfy a user's search query by giving them the content they seek. If you create a site like that, NO ONE will ever want to get that site as the result of a search query. Google correctly recognizes this fact and removes the site from their database.
-
When there are a couple of thousand other pages like this, yes.
http://www.livingorganicnews.com/games/2011/get-cool-with-selected-berber-carpet-tiles-now/3215/
The subject of the article is about berber carpet tiles, yet the text has links (I used bold) that are totally off base and make no sense. For example:
"The berber carpet tiles might also be renowned for the durability and stain resistance at extended stay motel rates."
"To get rid of the difficult to vacuum Provillus scam dust particles..."
"An important benefit in using berber carpet tiles is a likelihood to eliminate the damaged location alone and replace it with a new carpet tile, a comparatively low-cost way of capatrex scam damage control, to make your ground look just like new."
-
Absolutely.
It is entirely possible he has been removed from Google's index as the result of a penalty. If he links to sites that have received a penalty (mobile casinos would be a very bad choice of sites to link to), then his site could receive a penalty as well.
My suggestion is not to jump to the conclusion that the site is under a penalty. Start by checking WMT; if nothing is discovered, submit the sitemap. If you don't see any results after a few days, then proceed to inquiring with Google about whether the site is under a penalty.
-
The text doesn't really seem like a human wrote it. The current most recent article has the title "Religious Credit card debt Enable Provides You With the Meaningful and Economical You Need". Other posts are about acne treatment reviews, alcoholism, and other seemingly random things.
It really looks like it's been through an article spinner. The article about alcoholism ends with "So, Think before you Beverage." Uh..really? Or what about "As emission safety glasses are put on in the office, they need to provide ease and comfort, safe healthy and crystal clear eyesight to make sure they are usually not golf clubs to the wearer." An article I found that wasn't spun is instead indexed 94 other times on the web.
I would say the content is why Google has not indexed it. They can't find the value to the user for returning this in a search result. Is this truly the content that your friend has put up, or has the site gotten hacked?
-
Hello Bryce,
That sounds like it could lose him credibility, but could it be the reason for not being indexed at all?
-
Thank you Ryan,
I will ask him about GWT. Perhaps it is just a sitemap issue, but I wonder why Yahoo would spot it and Google would totally miss it. I often see a difference in the number of pages they index, but this is the first time I have seen thousands versus zero.
-
I'm thinking that by linking out to mobile casinos and Polish rock bands, he's probably losing credibility.
-
I didn't notice any obvious problem with your site. Have you logged into Google Webmaster Tools and looked at the site? That would be the logical next step.
The robots.txt file looks fine, there is no "noindex" tag on the home page, a GA code is present on the page, etc. I would suggest reviewing the site in Google's WMT and looking for any issues.
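If you want to double-check the noindex point programmatically, here is a minimal sketch (standard library only; the file name in the comment is just a placeholder) that scans a page's HTML for a robots meta directive containing "noindex":

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())

def find_noindex(html):
    """Return True if any robots meta tag on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# Example: save the home page's HTML locally and test it.
# find_noindex(open("homepage.html").read())
```

If it returns True on the home page, a stray noindex directive would explain the missing pages on its own.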
If none are present, the next step would be to submit a sitemap. If your friend does not have a sitemap already set up, you can use http://www.xml-sitemaps.com/ I think the free version only maps 500 pages, but that is enough to get you started.
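If your friend would rather generate the sitemap himself than use a hosted tool, a basic urlset file is easy to build. A minimal sketch (the URL in the comment is just a placeholder):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal sitemap.xml urlset from a list of page URLs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
    return tostring(urlset, encoding="unicode")

# Write the result to sitemap.xml at the site root, then submit it in WMT:
# open("sitemap.xml", "w").write(build_sitemap(["http://www.livingorganicnews.com/"]))
```

Unlike the free hosted generators, a script like this has no 500-page cap, which matters for a 2,000-page site.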