My website hasn't been cached for over a month. Can anyone tell me why?
-
I have been working on an eCommerce site www.fuchia.co.uk.
I asked an earlier question about how to get it working and ranking, I took on board what people said (such as optimising product pages), and I think I'm getting there.
The problem I have now is that Google hasn't indexed my site in over a month, and the homepage cache 404s when I check it on Google. At the moment the site is live on both the WWW and non-WWW versions; I have told Google in Webmaster Tools which preferred domain to use, and I will also be getting the developers to 301 redirect to the preferred domain. Would this be the problem stopping Google from properly indexing me? Also, only around 30 of my 137 pages were indexed from the last crawl.
Can anyone tell me or suggest why my site hasn't been indexed in such a long time?
Thanks
-
Fair point about the Sitemap. Thanks a lot, I'll take these on board and see what happens from there.
Thanks,
-
Cache won't be built or updated overnight, so sometimes the first few caches are a waiting game. How long has this site been live? If it's fairly new, what you're experiencing is common. If it's an older site and you recently started changing a lot of the technical stuff (redirects, canonicals, etc.), it may just take a little while to settle in.
The other major recommendation I would give you is to change your sitemap "change frequency" to be slightly more accurate. Does this page http://www.fuchia.co.uk/products/clothing/dresses/dog-tooth-print-dress.aspx really change "daily"? By having daily on every page you aren't helping Google prioritize their crawl, which means you may get a cache for your dog tooth print dress before you get a new cache for your main page.
So I would fix that, resubmit the sitemap, and then it's a waiting game. Could be a week, could be two; I've seen it take almost a month, though not if you use Google+.
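To illustrate the changefreq point, a sitemap entry pair might look something like this (the values below are illustrative, not a specific recommendation for this site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: stock and promotions genuinely rotate often -->
  <url>
    <loc>http://www.fuchia.co.uk/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- Individual product page: rarely edited after launch -->
  <url>
    <loc>http://www.fuchia.co.uk/products/clothing/dresses/dog-tooth-print-dress.aspx</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The point is simply that changefreq should reflect how often a page actually changes, so Google can spend its crawl budget where it matters.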
-
Hi Matt,
I used PingDevice and it's pinging fine.
I will work on the Google+ suggestion.
I have resubmitted a Sitemap for both fuchia.co.uk and www.fuchia.co.uk, as I verified ownership of both to allow me to set the preferred domain. I submitted one this morning, so maybe that will help. But we will see.
It seems like the main priority at the moment is getting everything redirected and canonicalised, and seeing if that helps anything.
-
Hi Sanket,
The site has been live for around 3 months I would say.
-
I've found that if you manually ping Google, they often update their cache at the same time.
Google doesn't have a cache for either cache:www.fuchia.co.uk or cache:fuchia.co.uk, so I don't think it's a canonical issue.
I would suggest a few things:
-
Use PingDevice http://www.pingdevice.com/
-
Put your main domain in a Google Plus post every now and then.
-
Resubmit a sitemap. Usually this gets you crawled fairly quickly and possibly updates your cache.
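On the resubmission point, Google has historically exposed a simple HTTP ping endpoint that accepts a sitemap location as a URL-encoded query parameter. A minimal sketch of building that request URL (the sitemap path for this site is an assumption):

```python
from urllib.parse import urlencode

# Assumed sitemap location for illustration; confirm the real path first.
SITEMAP_URL = "http://www.fuchia.co.uk/sitemap.xml"

def build_ping_url(sitemap_url):
    # Google's sitemap ping endpoint takes the sitemap location
    # as a URL-encoded "sitemap" query parameter.
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

ping_url = build_ping_url(SITEMAP_URL)
print(ping_url)
# http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.fuchia.co.uk%2Fsitemap.xml
```

Fetching that URL (with curl, a browser, or a tool like PingDevice doing it for you) notifies Google that the sitemap has been updated.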
-
-
Hi,
Your site resolves both with and without WWW, which is a major problem; you need to set up a proper 301 redirect in your .htaccess file. You also need to implement rel=canonical on your site; I did not find that code. I see 243 pages of your site indexed by Google. Can I ask about the age of your domain, and when you took this site live?
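As a rough sketch of the non-WWW to WWW redirect suggested above, on an Apache server with mod_rewrite enabled the .htaccess rule might look like this (note the product URLs end in .aspx, which suggests an IIS server where the equivalent rule would instead go in web.config, so treat this as illustrative):

```apache
RewriteEngine On
# Send any request for the bare domain to the www version with a permanent redirect
RewriteCond %{HTTP_HOST} ^fuchia\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.fuchia.co.uk/$1 [R=301,L]
```

Either way, the goal is the same: one canonical hostname, with the other answering only via a 301.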
Related Questions
-
What's the best way to A/B test new version of your website having different URL structure?
Hi Mozzers, hope you're doing well. We have a website that has been up and running for a decent tenure, with millions of pages indexed in search engines. We're planning to go live with a new version of it, i.e. a new experience for our users and some changes in site architecture, which include a change in URL structure for existing URLs and the introduction of some new URLs as well.
My question is: what's the best way to A/B test the new version? We can't launch it for only a part of our users (say, 50% see the new site and the remaining 50% see the old/existing site) because the URL structure has changed, and bots will get confused if they start landing on different versions. Will this work if I reduce the crawl rate to zero during the A/B tenure? How will this impact us from an SEO perspective? How will the old-to-new 301 redirects affect our users?
Have you ever faced/handled this kind of scenario? If yes, please share how you handled it along with the impact. If this is something new to you, I would love to know your recommendations before taking the final call.
Note: We're taking care of all existing URLs, properly 301 redirecting them to their newer versions, but there are some new URLs which are supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible and can't be redirected to a valid URL on the old version.
Intermediate & Advanced SEO | _nitman
-
Can content from dead websites be reused?
I have two websites. One website's links come from spammy techniques (wrong guy hired) and it still has massive links, so I started a new website with a fresh domain. Now that the new website (only white hat methods used) has started to show positive movement, I feel it's the right time to shut the other website down. Since I have a lot of content on my first site (the one with spammy links), can I reuse that content on my new site after I shut the first site down?
Intermediate & Advanced SEO | welcomecure
-
Hreflang targeted website using the root directory's description & title
Hi there, Recently I applied the hreflang tags like so: Unfortunately, the Australian site uses the same description and title as the US site (which was the root directory initially). Am I doing something wrong? Would appreciate any response, thanks!
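For context, a typical hreflang pair for a US root site and an AU subfolder might look like the following (the example.com paths are hypothetical, since the original tags didn't survive the post):

```html
<!-- Every regional version of the page lists all of its alternates -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com/au/page/" />
```

Note that hreflang only tells Google which regional URL to show searchers; each regional page still needs its own unique title and meta description, since the annotation doesn't rewrite on-page metadata.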
Intermediate & Advanced SEO | oliverkuchies
-
My landing pages don't show up in the SERPs, only my frontpage does.
I am having some trouble getting the landing pages for a client's website to show up in the SERPs. As far as I can see, the pages are optimized well, and they also get indexed by Google. The website is a Danish webshop that sells wine, www.vindanmark.com. Take for instance this landing page: http://www.vindanmark.com/vinhandel/ It is optimized for the keywords "Vinhandel Århus" ("Vinhandel" means "wine store" and "Århus" is a Danish city). As you can see, I manage to get the site onto page 1 (#10), but it's the front page that ranks for the keyword, and this goes for all the other landing pages as well. I can't figure out why the front page keeps outranking the landing pages on every keyword. What am I doing wrong here?
Intermediate & Advanced SEO | InmediaDK
-
Is a Rel Canonical Sufficient or Should I 'NoIndex'
Hey everyone, I know there is literature about this, but I'm always frustrated by technical questions and prefer a direct answer or opinion. Right now, we've got rel canonicals set up to deal with parameters caused by filters on our ticketing site. An example is that this: http://www.charged.fm/billy-joel-tickets?location=il&time=day rel-canonicals to... http://www.charged.fm/billy-joel-tickets My question is whether this is good enough to deal with the duplicate content, or if those pages should be de-indexed. Assuming so, is the best way to do this via robots.txt, or do you have to individually 'noindex' these pages? This site has 650k indexed pages and I'm thinking the majority of these are caused by URL parameters; while they're all canonicaled to the proper place, I think it would be best to have these de-indexed to clean things up a bit. Thanks for any input.
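To make the setup concrete, the arrangement described puts a tag like this in the head of the filtered URL (URLs taken from the question's own example):

```html
<!-- Served on http://www.charged.fm/billy-joel-tickets?location=il&time=day -->
<link rel="canonical" href="http://www.charged.fm/billy-joel-tickets" />
```

One caution worth checking against Google's documentation: blocking these parameter URLs in robots.txt would stop Google from crawling them, which means it would never see the canonical tag or a noindex directive on them. So robots.txt blocking and noindex shouldn't be combined for pages you want cleanly dropped from the index.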
Intermediate & Advanced SEO | keL.A.xT.o
-
Can anyone explain this?
On Sunday 26th May, for about 40 minutes, we had about 25-30 direct visits from San Jose (we are a UK site). During this time our rankings increased dramatically and then as soon as the direct visits disappeared, our rankings went back to how they were prior to them visiting the site.
Intermediate & Advanced SEO | Jonnygeeuk
-
Preferred domain can't be set in Webmaster Tools
I have put my domain name as xxxxxtours.com, without www, in Webmaster Tools. I have redirected to the www version using the .htaccess file, so I want to set the preferred domain to display URLs as www.xxxxtours.com. When I try, it gives the error shown in the attached image, even though I have verified the site. Waiting for expert help. Ar5qx.png
Intermediate & Advanced SEO | innofidelity
-
Any idea why I can't add a Panoramio image link to my Google Places page?
Hey guys & gals! Last week, I watched one of the Pro Webinars on here related to Google Places. Since then, I have begun to help one of my friends with his GP page to get my feet wet. One of the tips from the webinar was to geotag images in Panoramio to use for your images on the Places page. However, when I try to do this, I just get an error that says they can't upload it at this time. I tried searching online for answers, but the G support pages that I have found where someone asks the same question, there is no resolution. Can anyone help? PS - I would prefer not to post publicly the business name, URL, etc. So, if that info is needed, I can PM. Thanks a lot!
Intermediate & Advanced SEO | strong11