What more can be done to get Google to change the landing pages it uses for certain search terms?
-
For one of my SEO campaigns, Google is using the website's home page as the landing page for the majority of the search terms being tracked. The website splits its products by region, so we want the specific region pages to rank for search terms related to each region, rather than the home page. We have optimised each regional page to a reasonably high standard and ensured there is a good amount of internal linking and signposting to those region pages; however, Google is still using the home page. The only complication is that for the first few months these pages had canonical tags pointing to the home page. Those were removed around three months ago, and we've checked that the region pages are now indexed properly.
Is there anything we are missing?
Has anyone had any success in getting Google to change its landing pages?
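Given the canonical-tag history described above, one quick sanity check is to confirm each region page now carries either no canonical tag or a self-referencing one. A minimal sketch in Python using only the standard library (the URLs and markup below are illustrative, not from the actual site):

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            d = dict(attrs)
            if d.get("rel", "").lower() == "canonical":
                self.canonical = d.get("href")

def canonical_points_to_self(page_url, page_html):
    """True if the page's canonical points at the page itself
    (or is absent, which search engines treat as self-referencing)."""
    parser = CanonicalParser()
    parser.feed(page_html)
    return parser.canonical is None or parser.canonical.rstrip("/") == page_url.rstrip("/")

# Illustrative: a region page whose canonical still points at the home page
html_doc = '<html><head><link rel="canonical" href="https://example.com/"></head></html>'
print(canonical_points_to_self("https://example.com/regions/north", html_doc))  # False
```

Running this over each region page's fetched HTML would flag any page where the old home-page canonical survived a deploy or cache.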
-
That's partly a question of how soon the linking pages get indexed/recrawled, but in general it happens quickly. On our last project with an issue like this, it only took ~4 days before our pages were showing up correctly in the search results. We were targeting two pages specifically and built around 8 links per page.
Keep in mind though that the homepage is almost certainly going to be stronger than any of your other pages, so rankings might slip a bit at first when the engines start to attribute the pages correctly.
-
Hi Jared,
That's very helpful and your response is greatly appreciated! In your experience, what sort of time frame would you expect between implementing these signals, the pages being reindexed, and seeing an effect on the rankings?
Many thanks
-
Hi Harry,
Logan is right, this happens a lot when new projects get started, especially when a site is newer.
The easiest way that I've found to combat this (in addition to what you've already done) is to build some almost "over-optimized" links to each of the pages in question. When our team does this, we make sure to 1) include exact-match anchor text in the links that are built, 2) keep the links and the linking domain itself extremely relevant to the destination page, and obviously, 3) point them at the very specific pages you're trying to get ranked properly.
I certainly wouldn't be this blatant all the time, but when trying to "separate" these pages in the search results I would definitely make sure this is on your checklist.
The search engines are incredibly "intelligent" but they're still machines and can't necessarily infer what page needs to be ranked without the proper signals. So making sure the on and off page signals are there to provide as much context as possible is really important.
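As a rough way of keeping an eye on point 1) above, you can tally the anchor text of the links built to each target page and see what share is exact match. A minimal sketch (the link data here is made up for illustration):

```python
from collections import Counter

def anchor_text_profile(links, target_url):
    """Share of each anchor text among links pointing at target_url.
    links: iterable of (anchor_text, href) pairs."""
    counts = Counter(
        anchor.strip().lower()
        for anchor, href in links
        if href.rstrip("/") == target_url.rstrip("/")
    )
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()} if total else {}

# Hypothetical built links pointing at a region page
links = [
    ("widgets in seattle", "https://example.com/regions/seattle"),
    ("widgets in seattle", "https://example.com/regions/seattle/"),
    ("click here", "https://example.com/regions/seattle"),
    ("home", "https://example.com/"),
]
print(anchor_text_profile(links, "https://example.com/regions/seattle"))
```

If the exact-match share creeps too high across the whole profile, that's the "blatant" territory mentioned below, so this is worth tracking while the campaign runs.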
Hope that helps Harry!
-
Thanks Logan, I'll check it out!
-
Hi Harry,
I've done this a number of times when taking over campaigns from other 'agencies'. It's a pretty common task for most SEOs. It usually involves some de-optimization of the ranking page in order to shift that emphasis over to your preferred page. Check out Moz's On-Page Grader; it might give you some insight into why the homepage is overpowering the interior pages.
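To get a rough sense of which page a term is more "emphasized" on before and after de-optimizing, you can compare simple term frequencies. This is a crude heuristic, not how Google scores pages, and the text below is invented for illustration:

```python
import re

def term_frequency(text, term):
    """Occurrences of term per 100 words — a crude emphasis heuristic."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(term.lower()), " ".join(words)))
    return 100.0 * hits / len(words)

home_text = "Acme widgets. We sell widgets across every region. Widgets for all."
region_text = "Seattle widgets from Acme. Our Seattle widget range serves the Seattle area."

print(term_frequency(home_text, "seattle"))    # 0.0 — the homepage barely mentions the region
print(term_frequency(region_text, "seattle"))  # 25.0 — the region page carries the emphasis
```

If the homepage scores higher than the interior page for a region term, that's a candidate spot to de-optimize the homepage copy.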
Rand did a really good Whiteboard Friday on this topic about a month ago. It's definitely worth watching if you haven't already, you might find the key to what you're looking for: https://moz.com/blog/wrong-page-ranks-for-keywords-whiteboard-friday