I've hit a wall. What's next?
-
I've been working on creating content and optimizing my website for the past 18 months and have seen decent gains in organic traffic. Recently, though, my keyword ranking gains have slowed dramatically despite continuing to add content. I'm hoping a fresh set of eyes can spot something that may be holding me back. I appreciate any feedback the Moz community is able to provide.
One area of concern/confusion is the number of pages indexed according to GSC. The count tends to go up and down, and I can't figure out why. Of the roughly 8,000 pages submitted through the sitemap, the Sitemaps report says only around 1,100 are indexed, while the Index Status report says I've got 7,500 pages indexed. Kind of confused on that one.
Anyway, my website is MetroAtlantaHome dot com. Thanks everyone!
-
Matt,
Thank you for taking a look at my site. I really appreciate the feedback.
I had noticed the search results with parameter tags being indexed as well. I tried limiting Google's crawler to non-parameter pages about 8 months ago, but it didn't seem to do much. I was afraid one of my settings was making things worse, so I deleted them and selected "Let Googlebot decide" for all of them.

I'm also not sure why the paginated pages are being indexed. I thought I'd read that the rel="next"/rel="prev" pagination tags were similar to the canonical tag and that Google would treat them as a signal not to index. I would think that if Googlebot can crawl and index the paginated search pages, it could also crawl the listing results on all those pages. I do have a large portion of the listing pages on my site noindexed, since they fall outside my service area and I used to get a ton of calls about them. Maybe that is what's keeping all those listing results pages out of the index.
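For anyone reading along: the usual way to keep a page out of the index while still letting Googlebot follow its links is a meta robots tag in the page head. This is a generic sketch, not the site's actual markup — and note that it only works if the page is NOT blocked in robots.txt, because Google must be able to crawl the page to see the directive:

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

If the same URLs were also disallowed in robots.txt, Google could never fetch the page to read the noindex, which is one common reason noindexed pages linger in the index.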
As for Screaming Frog, which settings did you use? I use the preset "Googlebot Regular" user agent, and it crawls thousands of pages for me. I wish I could replicate your scan to see what might be holding things up.
Regarding the Wistia sitemap, I put it in the robots.txt file to try to get the videos embedded on my site indexed. I didn't realize it was hurting more than helping.
Thanks again for your feedback and taking the time to review my site.
-
As for the discrepancy in the numbers: Google is indexing a lot of ?search= parameter results (as well as /listing/ results), but the search ones are not in your sitemap. Your site also doesn't crawl very well - I put it through Screaming Frog twice and only came up with about 400 pages: some listing pages, but not the thousands Google has in the index.
You have a number of paginated pages:
- http://www.metroatlantahome.com/contemporary-style-homes/300000-400000/?p=5
- http://www.metroatlantahome.com/contemporary-style-homes/300000-400000/?p=3
Your canonicals take care of this issue for the search index, but your search results pages seem to be "indexed" even though they aren't in your sitemap. See the attachment.
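To illustrate what "your canonicals take care of this" means here: each paginated URL above would carry a canonical link pointing at the base category URL, consolidating the ?p= variants. A sketch of the pattern, using one of the URLs above (not the page's exact markup):

```html
<!-- On .../contemporary-style-homes/300000-400000/?p=5 -->
<head>
  <link rel="canonical"
        href="http://www.metroatlantahome.com/contemporary-style-homes/300000-400000/">
</head>
```

Google treats this as a strong hint (not a directive) to index the base URL rather than the parameterized copies.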
So I think you're gaining indexed pages from the search parameter but losing the /listing/ pages, which aren't indexed and are harder to crawl. I would suggest submitting sitemap_0.xml directly to Search Console as a start, then trying to figure out why those pages don't crawl very well.
The Wistia sitemap in your robots.txt is also NOT something I'd ever want Google to see, and listing it in robots.txt ensures they do. I'd remove that line from robots.txt.
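Concretely, the fix is just deleting one Sitemap: line. A sketch of the before/after - the exact Wistia sitemap URL here is made up, since I don't have the site's real robots.txt in front of me, but sitemap_0.xml is the file mentioned above:

```
# robots.txt - sketch only; the real Wistia URL on the site will differ

# Remove a third-party sitemap line like this:
# Sitemap: https://example-account.wistia.com/sitemap.xml

# Keep only the sitemap you actually want search engines to discover:
Sitemap: http://www.metroatlantahome.com/sitemap_0.xml

User-agent: *
Disallow:
```

Any Sitemap: directive in robots.txt is an open invitation for every crawler to fetch and index those URLs, so only your own sitemap belongs there.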