How do I find out which pages are being indexed on my site and which are not?
-
Hi,
I'm doing my first technical audit on my site. I'm learning how to do an audit as I go and am a bit lost. I know some pages won't be indexed, but how do I:
1. Get a list of all pages on the site, both indexed and not indexed
2. Run a report showing indexed pages only (I'm presuming I can do this via Screaming Frog or Webmaster Tools)
3. Compare the two lists and work out which pages are not being indexed
I'll then need to figure out why. I'll cross that bridge once I get to it.
Thanks Ben
-
Hi Ben,
I'd echo what Patrick has said, and his first suggestion is the one I'd recommend most. Google Webmaster Tools is a good way of checking indexation, and if you have a large site with lots of categories, you can even break the sitemaps down by category so that you can see if certain areas are having problems.
Here is an old but still relevant post on the topic:
http://www.branded3.com/blogs/using-multiple-sitemaps-to-analyse-indexation-on-large-sites/
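In practice, that just means submitting a sitemap index file that points to one sitemap per category. A rough sketch of what that looks like is below; the example.com URLs are placeholders, not real files on your site:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-category-a.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-category-b.xml</loc>
  </sitemap>
</sitemapindex>
Webmaster Tools then reports submitted versus indexed URL counts for each child sitemap separately, which is what lets you spot the sections with indexation problems.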
In terms of creating the sitemap, Screaming Frog has an option under Advanced Export for creating an XML sitemap file for you, which works very well. You just need to make sure you only include pages that you want indexed.
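If you haven't seen one before, the file it generates is simply a list of <url> entries, roughly like this (example.com is a placeholder and the optional tags can vary):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/some-category/some-page/</loc>
  </url>
</urlset>
Anything you don't want indexed (noindexed pages, pages blocked in robots.txt, parameter URLs and so on) should be left out of that list.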
Cheers.
Paddy
-
Hi Patrick,
Thanks for replying.
Can you recommend any tools for creating the sitemap? I've had a look around and the few I've found seem to deliver different results. One has been submitted previously, but I need to go through the process myself so I can understand these basics.
I've read up on robots.txt, so I understand what is happening there from an exclusion perspective, and once I understand how the XML sitemap works I'll be able to do the audit mentioned above.
Ben
-
Ben,
You can check a couple things:
- Have you submitted your XML sitemap to Google? If not, create one and get it submitted so you tell Google which pages you want indexed.
- Submit your domain and all pages through Google Webmaster Tools as well (log in > left sidebar > Crawl > Fetch as Google).
- Screaming Frog is awesome software, so yes, if you have it, use it to crawl your pages.
- Try a simple "site:domainname.com" search in Google to see what is being indexed from your domain.
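Once you have those lists, a quick script can do the cross-referencing for you. This is just a sketch, not a polished tool: it assumes you've saved your sitemap locally as sitemap.xml and exported the crawled URLs to a plain text file called crawled_urls.txt, one URL per line (both file names are placeholders, so adjust them to whatever you actually export).
# Rough Python sketch for comparing sitemap URLs against crawled URLs.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# URLs you are asking Google to index (from the XML sitemap)
tree = ET.parse("sitemap.xml")
sitemap_urls = {loc.text.strip() for loc in tree.iter(NS + "loc")}

# URLs your crawler actually found (e.g. a Screaming Frog export)
with open("crawled_urls.txt") as f:
    crawled_urls = {line.strip() for line in f if line.strip()}

print("In the sitemap but not found by the crawl:")
for url in sorted(sitemap_urls - crawled_urls):
    print("  " + url)

print("Found by the crawl but missing from the sitemap:")
for url in sorted(crawled_urls - sitemap_urls):
    print("  " + url)
You can run the same kind of comparison against the URLs a "site:" search shows as indexed; anything that only appears on one side is worth a closer look.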
Cross-reference it all and you will then have a much better understanding. I do believe your sitemap is crucial in telling Google exactly which pages you do and do not want indexed, and they will follow that. You're on the right track, and I hope my input was helpful! - Patrick