Best posts made by smarties954
-
RE: How to determine which pages are not indexed
You could use Xenu's Link Sleuth to crawl your site. It will tell you how many pages it found; you can then compare that total against the number of pages Google has indexed.
-
RE: Meta description & Meta keywords
Meta keywords are no longer read by Google; the tag was abused in the past and is no longer relevant.
Drop the meta keywords; they also hand your competition valuable information about which keywords you target.
The meta description, on the other hand, is still used and relevant, and should be part of your SEO strategy.
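For example (a minimal sketch; the description text and page are placeholders):

```html
<head>
  <!-- Worth keeping: Google may show this text as the search snippet -->
  <meta name="description" content="Short, unique summary of this page, ideally under ~160 characters.">
  <!-- Safe to drop: ignored by Google, and readable by your competitors -->
  <!-- <meta name="keywords" content="keyword1, keyword2"> -->
</head>
```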
-
RE: Organic SEO for Local Towns
This might work, but it could be seen as a set of doorway pages and could be penalized down the road.
After all, you are creating pages for the sole purpose of targeting a keyword/area.
If you submit your company address, I expect Google will show your results in or near your area, so if you are competitive it might not be necessary to create a page for each nearby city.
-
RE: Google insists robots.txt is blocking... but it isn't.
24 hours is a short time; Google probably has not re-crawled, or even looked at, your new robots.txt yet.
Google Webmaster Tools is much slower than Bing's tools, so be patient.
As a rule of thumb, I wait at least a week with Google before worrying (my 2 cents).
-
RE: Duplicate content or not ?
The .mx and .es domains will be treated as two separate websites, so as far as Google is concerned you can use each site to target a specific country.
I am not 100% sure, but if both sites are exactly the same, I think one of the two might be discounted because of the duplicate content.
-
RE: For a mobile website, is it better to use a 301 vs. a 302 redirect?
I'd stay away from 302 redirects as much as I can, and simply serve the right version of the site based on the user agent.
Or you could offer your visitors the choice between the two versions using some kind of modal popup.
This video might help answer your question:
http://support.dudamobile.com/entries/20308913-does-redirecting-to-a-mobile-site-create-seo-issues
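The user-agent approach can be sketched like this (a rough illustration, not a complete detection library; the token list and template names are assumptions):

```python
# Pick the site version from the User-Agent header instead of
# issuing a 302 redirect. Real-world detection is fuzzier; these
# tokens only cover the most common mobile browsers.
MOBILE_TOKENS = ("mobile", "android", "iphone", "ipad")

def is_mobile(user_agent):
    """Rough mobile detection based on common User-Agent substrings."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in MOBILE_TOKENS)

def pick_template(user_agent):
    """Serve the mobile or desktop template for the same URL."""
    return "mobile.html" if is_mobile(user_agent) else "desktop.html"
```

Because both versions live at the same URL, search engines see one page while each visitor gets the appropriate layout.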
-
RE: Language Selection Splash Page- Impact on SEO
The 302 redirect after the language is selected is not great; do you really need to do that?
You could use a modal popup (jQuery UI has one), which avoids the back-and-forth to your splash page.
-
RE: Duplicate content or not ?
I don't think you can...
Your case is valid in a way: you only intend to deliver each site to a specific country, and it should not matter that both sites are the same, since users from Spain should not end up on the .mx site and vice versa.
However, Google is not human and might not see it the same way.
I would not take a chance on this; I would make both sites at least somewhat unique.
For example, you could allow people to comment on your pages; this creates uniqueness, since comments from Spain should differ from comments from Mexico.
Even very large websites like Walmart, Staples, etc. are slightly different from country to country.
-
RE: Reducing Booking Engine Indexation
-
You could use rel="nofollow" on links pointing to the page variations.
-
If you can, you could also dynamically add a noindex, nofollow robots meta tag whenever a variant of the initial page is generated.
-
You could also add a link rel="canonical" pointing to the initial page; this tells bots which page is the original.
In other words, you have to tell crawlers which pages are variants so that they don't index them.
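Putting the three options together, a generated page variant could carry something like this (a sketch; the URLs are placeholders):

```html
<!-- On the page linking to a variant: keep bots from following it -->
<a href="/rooms/?checkin=2014-05-01" rel="nofollow">Check availability</a>

<!-- In the <head> of the generated variant itself: -->
<head>
  <!-- Keep the variant out of the index -->
  <meta name="robots" content="noindex, nofollow">
  <!-- Point bots back at the original page -->
  <link rel="canonical" href="http://www.example.com/rooms/">
</head>
```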
-
-
RE: SEO without SEM, Social networking and advertisements
This is tricky, since you are limited.
You should focus on content (quality & quantity) and try to share as much knowledge as possible about your product or service; maybe offer expert advice.
You could set up a blog to talk about what you do.
Try to get some good links in, maybe even from recognized directories like DMOZ, Yahoo, the BBB, the yellow pages, etc. (I don't think that counts as advertisement.)
With good content and some links you could pick up some spots in the results, and other sites may start to talk about you, which will feed your PageRank.
That's all I have so far
-
RE: Duplicate Content?
The best way to go is to put all your newsletters in one folder and disallow that folder in your robots.txt.
rel="nofollow" and robots.txt are only read by crawlers; your visitors won't be affected and will be able to navigate and search the archives without any problem.
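A minimal robots.txt for this, assuming the archive lives in a /newsletters/ folder (the folder name is an example):

```
User-agent: *
Disallow: /newsletters/
```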
-
RE: Broken sitemaps vs no sitemaps at all?
I think you can remove the sitemap since it returns so many warnings.
I don't think sitemaps have much direct SEO benefit; rather, they help Google find pages that are hard to reach or not accessible through regular links.
So make sure your site has a good structure and that every page can be found by browsing the site (clicking from page to page), and you will be fine, sitemap or not.
Use Xenu's Link Sleuth to crawl your site if you are not sure all pages are reachable.
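If you do rebuild the sitemap later, even a minimal one is valid as long as it follows the protocol (the URL below is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawlers to discover -->
  <url>
    <loc>http://www.example.com/hard-to-find-page/</loc>
  </url>
</urlset>
```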
-
RE: Writing url query strings to be SEO friendly
It's not too hard to do in .NET. Since .NET 3.5, routing has been available in Web Forms, and it is native to MVC.
Basically, this allows you to "rewrite" and thus create friendly URLs such as
www.123.com/florida/miami/city-detail/
You can map your URL so your pages understand which segment is the city, which is the state, etc., and query your database accordingly.
Here is an article about routing in .net:
Note: keep the URL in lower case and replace spaces with a dash (Cape Cod > cape-cod).
In our case, this page actually uses routing. I left the .aspx in because I don't care much about it, but this way our entire catalog is served by 2 pages that pull the category and subcategory from the URL itself:
http://www.smartresolution.com/printing/envelopes/10-envelopes.aspx
Here envelopes = category and 10-envelopes = subcategory. The .aspx can be dropped too if you are picky.
All you have to do in your application is find a pattern, and you should be able to handle the rewrite in no time and with little code.
Note: don't forget to redirect (301) the old pages to the new URLs as well.
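The two pieces (slug creation and pattern matching) are not .NET-specific; here is the same idea sketched in Python, with the route pattern and function names as illustrative assumptions:

```python
import re

def slugify(text):
    """Lower-case and dash-separate a label, e.g. "Cape Cod" -> "cape-cod"."""
    text = text.strip().lower()
    # Collapse any run of non-alphanumeric characters into one dash
    text = re.sub(r"[^a-z0-9]+", "-", text)
    return text.strip("-")

def parse_route(path):
    """Map a friendly URL like /florida/miami/city-detail/ onto the
    assumed {state}/{city}/{page} pattern, so the application knows
    which values to query the database with."""
    parts = [p for p in path.strip("/").split("/") if p]
    if len(parts) != 3:
        return None  # path does not match the pattern
    state, city, page = parts
    return {"state": state, "city": city, "page": page}
```

One catch-all handler plus a pattern like this is what lets a whole catalog be served from a couple of pages.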