Site Audit: Indexed Pages Issue
-
Over the last couple of months I've been working through some issues with a client site. One of my starting points was a site audit, following a post written by Geoff Kenyon: https://moz.com/blog/technical-site-audit-for-2015 .
One of the main issues the audit surfaced is that when I run a "site:domain.com" query in Google, my homepage isn't the first page listed. In fact, it isn't listed at all when I go through every result. I understand that your homepage isn't required to appear first for this type of query, but I would prefer it.
Here are some things I've done:
- I ran another query, "info:domain.com", and confirmed the home page is indexed by Google.
- When I run a branded search for the company name the home page does come up first.
- The current page that is showing up first in the "site:domain.com" listing is my blog index page.
- Several months back I redirected the index.php page to the root of the domain. Not sure if this is helping or hurting.
- In the sitemap I removed the index.php and left only the root domain as the page to index.
- Also, all interior links point to the root; index.php has been eliminated from all internal links, so everything links to the root.
- The main site navigation does not refer to the "Home" page, but instead my logo is the link to the Home page.
- Should I noindex my blog/index.php page? This page is only a compilation of posts with no original content; in fact, it actually throws up duplicate content warnings.
Any help would be much appreciated. I apologize if this is a silly question, but I'm getting frustrated and annoyed at the whole situation.
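For reference, the index.php-to-root redirect described above is usually done with a 301 in .htaccess. This is only a sketch of the common pattern (it assumes an Apache server and that index.php is the site's only duplicate entry point, neither of which is confirmed in the thread):

```apache
# Hypothetical .htaccess sketch: 301-redirect external requests for
# /index.php to the root URL, without breaking Apache's internal
# DirectoryIndex lookup (which also serves index.php behind the scenes).
RewriteEngine On
# Only match when index.php appears in the raw request line, so
# internal DirectoryIndex rewrites are left alone.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\s]
RewriteRule ^index\.php$ / [R=301,L]
```

Using a permanent 301 (rather than a temporary 302) is what tells Google to consolidate the two URLs onto the root.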
-
Thanks Seoman,
That was why I was wondering if I should noindex the blog index page. It is purely a listing of blog entries and not original content. It seems to throw up duplicate content issues and Google seems to give it the most page power on the site even though it is not my most important page.
I would want Google to still follow all of the links, because those lead to the blog posts and the original content. I don't know if noindex is the best choice, but I think it would at least tell Google, "Hey guys, the blog page is not my most important page; in fact, it is just a compilation of posts."
I haven't pulled the trigger on it yet, because I don't know if it will hurt more than it helps. If anyone has other thoughts on noindexing the blog index page (which is not my home page), feel free to drop me a line.
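If you do go the noindex route while still letting Google follow the links through to the individual posts, the usual mechanism is a robots meta tag in the blog index template rather than a robots.txt rule (robots.txt blocks crawling entirely, which would also stop Google from discovering the post links). A minimal sketch, assuming the blog index is a template you can edit:

```html
<!-- Hypothetical tag for blog/index.php: keep this page out of the
     index, but still let crawlers follow links to the posts. -->
<meta name="robots" content="noindex, follow">
```

Note that `follow` is the default behavior, so `noindex` alone has the same effect; it's spelled out here only to make the intent explicit.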
-
Apologies, I'd slightly misunderstood your question; I see exactly what you mean now. I think this is purely down to the way Google associates search intent and tries to deliver the most appropriate result.
The site: parameter is obviously intended to help users find a specific item on the specified site. Therefore, if the blog has more content than the other pages, there is a better chance it will have what the user is looking for, hence Google delivers that page out of preference.
I don't know for sure; that's just an assumption. As you said, branded searches are fine, and there certainly don't look to be any issues as far as I can see, although I haven't done a full audit.
I'd be interested to see what anyone else says, but my gut feeling is there is nothing to be worried about; the main thing is that you come up for your company name and the search terms you want.
Sorry, I hope that helps somewhat.
All the best
-
Feel free to take a look www.denverilluminations.com & www.denverilluminations.com/_blog/ .
Also, the domain authority for the site is 19; I was looking at the individual page authorities. Thanks again, Seoman.
-
Any way you could let me have the two links so I can give them a quick look over?
Also bear in mind that DA isn't everything.
-
Seoman,
Thanks for the response. I appreciate any and all suggestions.
- The blog page has a page authority of 1 out of 100; the home page has a page authority of 33 out of 100.
- I looked at Google's cache for the pages and reviewed the text-only version; everything is showing.
- I checked robots.txt. I'm disallowing certain directories that I don't want crawled or indexed, but those are all in order; I tested the robots.txt just to make sure it was written properly, and it came back clean.
I don't believe noindexing my blog page is absolutely necessary, but I'm wondering if Google thinks it is my home page instead of my actual root directory. I know it sounds a little weird, but I'm wondering if something is confusing the spiders. Thanks again for your time and thoughts.
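As a quick sanity check on the robots.txt point, the rules can also be verified offline with Python's standard-library `urllib.robotparser`. The rules and URLs below are made-up examples to illustrate the check, not the actual site's file:

```python
from urllib import robotparser

# Hypothetical robots.txt content mirroring the setup described above:
# a few disallowed directories, everything else crawlable.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The homepage and blog index should be crawlable...
print(rp.can_fetch("Googlebot", "https://www.example.com/"))        # True
print(rp.can_fetch("Googlebot", "https://www.example.com/_blog/"))  # True
# ...while the disallowed directories are not.
print(rp.can_fetch("Googlebot", "https://www.example.com/admin/"))  # False
```

This catches the "homepage accidentally blocked" case mentioned later in the thread without waiting on Google's own testing tool.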
-
A few quick thoughts come to mind (in order of priority):
- Blog page may have more authority than the homepage
- Could be a technical issue with the homepage (maybe Google can't see anything there)
- Check your robots.txt to make sure the homepage isn't blocked (sounds crazy, but it can happen)
I would strongly advise against noindexing unless it is absolutely necessary.
Personally I wouldn't be too worried about the homepage not showing, although I agree it's a good idea to know why. After all, no customers are going to be using Google search operators like site: or info:. They are going to be searching for what they want and expecting an answer on the page that Google provides them with.
Not sure if that helps or not but just a few thoughts.