Page A Best for Users, but B Ranks
-
This question relates to real estate MLS listings. I have a page "B" with lots of unique content (MLS thumbnails mixed with a written guide overview, pictures, etc.) that outranks page "A", which simply shows MLS thumbnails with a map feature. I link from "B" to "A" with the anchor text "KEYWORD for sale" to signal to search engines that "A" is the page I want to rank, even though "B" has more unique content. It hasn't worked so far.
Questions:
- Should I avoid linking from "B" to "A", as that could impact how well "B" ranks?
- Should I leave this setup as-is and hope that, over time, search engines will give "A" a chance to rank?
- Should I include some unique content on "A", mostly hidden behind a "Read more" link? I don't foresee many users clicking "Read more", as they are really just looking for properties for sale and rarely care about written material when searching for "KEYWORD for sale".
- Should I set "A" to "noindex, follow", since it has little to no unique content, and could that improve the chances of "B" ranking better?
- When I write blog posts that include "KEYWORD for sale", should I link to "A" (best for users), or to "B", since that page has more potential to rank really well and is still fairly good for users?
Ranking "B" is not creating a high bounce rate; it's just that "A" is even better for users.
Thank you,
Kristian -
Hi Gregory,
A and B are quite different. B has a lot of written content, pictures, a few MLS thumbnails, and a "see all condos for sale" type link leading to "A".
A shows 10 thumbnails per page and has a large integrated Google map, an H1, and breadcrumbs. Basically, A offers exactly what users want: a quick view of property details in the thumbnails, and when they hover they can see the location on a map. If they click a thumbnail, they are taken to the specific property. Perfect for users.
B is OK too, but the written content is mostly behind a "Read more" link so users have the option to read it, plus a few thumbnails (not many, to keep the page as original and unique as possible). That is the structure, and I can see search engines prefer these pages with more written content.
If I interlink to B, I imagine I will rank well more quickly, because search engines like that page to start with. If I instead interlink to A, I risk creating a scenario where A and B are both doing OK, but neither is great. That is a concern I have. I am basically looking for a solution where search engines can see that all the "greatness" of B should be transferred as power to A. -
So is "B" just the "More Details" page for "A"? And is there a different A/B pair for each individual property listing (A1/B1, A2/B2, A3/B3, etc.)?
If so, pick a few and run a little experiment where B has a canonical tag pointing to A. I would also put a self-referential canonical tag on A pointing to itself, just in case there is extra junk on the end of link URLs that point to A.
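For what it's worth, a minimal sketch of what those tags might look like (the URLs here are hypothetical placeholders for your actual A and B pages):

```html
<!-- In the <head> of page B (the guide/overview page): -->
<!-- asks search engines to consolidate B's signals onto A -->
<link rel="canonical" href="https://example.com/condos-for-sale/" />

<!-- In the <head> of page A (the listings page), a self-referential -->
<!-- canonical guards against tracking parameters or other URL junk: -->
<link rel="canonical" href="https://example.com/condos-for-sale/" />
```

Keep in mind the canonical tag is a hint rather than a directive, so Google may take some time to honor it, or ignore it if the pages look too different.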
It will take a while for Google to re-index. To speed it up, you can use GWMT to "Fetch as Google" the experimental B pages and then "Submit to Index". A few days after that, do the same thing for the experimental A pages.
Let us know if it works...
-
Great, thanks Keri. Appreciated.
-
An older post, but Rand has some feedback on this at http://moz.com/blog/wrong-page-ranking-in-the-results-6-common-causes-5-solutions.
-
Hi,
1) Simply linking B to A with anchor text won't impact B's ranking; you are only interlinking.
2) Internally linking B to A will help visitors explore more of your content and does help search engines find page A. However, to increase A's ranking, you should also build more backlinks to it.
4) If page A is not harming your site and is not causing any duplicate content, I don't think you should noindex/nofollow it.
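For reference only, the "noindex, follow" setup mentioned in the question would look like this in page A's `<head>` (shown just for illustration; as noted above, it's probably not the right move here):

```html
<!-- Keeps the page itself out of the index while still letting -->
<!-- crawlers follow the links on it: -->
<meta name="robots" content="noindex, follow" />
```

Note that a noindexed page can't rank at all, which would defeat the stated goal of getting A to rank.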
I think there is a bit of confusion. First of all, are pages A and B duplicates? If they are, interlinking won't solve the problem, and I think what you are really describing is using the canonical tag to tell search engines which page to rank. If that's the case, it answers most of your questions: if you tell search engines to favor B over A, then you should definitely improve page B, and vice versa.
Hope this answers your question.