Interlinking from unique content page to limited content page
-
I have a page (page 1) with a lot of unique content which may rank for "Example for sale". On this page I interlink, with the anchor "See all Example for sale", to a page (page 2) that has very limited unique content but that I believe is better for the user. In other words, page 1 is more like a guide with items for sale mixed in, whereas page 2 is purely a "for sale" page with almost no unique content, but very engaging for users.
Questions:
-
Is it risky to interlink with "Example for sale" to a page with limited unique content? Do I risk being unable to rank either of these two pages?
-
Would it make sense to apply "noindex, follow" to page 2, given its limited unique content, and given that the same content exists across the web on other websites in different formats (it is real estate MLS listings)? Could I then still keep the "Example for sale" link leading to page 2 without risking page 1's ranking for the "Example for sale" keyword phrase?
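For reference, the "noindex, follow" directive mentioned above is set with a standard robots meta tag in the page's head. A minimal sketch of what page 2's markup would contain (the title here is a placeholder):

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
  <title>Example for sale</title>
</head>
```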
I am basically trying to work out the best solution to rank for "Keyword for sale". The dilemma is that page 2 is best for users but is not a very unique page, while page 1 is very unique and OK for users, but mixes up writing, pictures and more with the properties for sale.
-
As a new website, I think the safe bet is to include all content on one page. Keyword variation means the content stays separated; better to combine and make one powerful page. Thanks for the insight.
-
Both of those are good solutions - so choose one or the other. If you choose the keyword variation route, then make sure you go through and edit the content on each page to properly reflect the new focus.
-
Page 1 has great statistics. Page 2 is more of a guide with videos, pics and more. If page 1 tried to rank for "NEIGHBORHOOD Homes for Sale" and page 2 for "NEIGHBORHOOD Real Estate", would this be different enough? Or do you feel I should move all that unique content (pics, videos, etc.) to the lower part of page 1, use a 301 redirect, and shut down page 2?
-
The issue is two pages on the same site trying to rank for the same keyword. They're going to be fighting against each other and confusing search engines.
It's better to either combine the pages (option 1) or give each a separate target keyword (option 2). Option 2 is probably easier, but you'll still need to make the user-friendly page more search-friendly, and vice versa. Option 1 is probably better if you can add search-friendly text content beneath the map portion of the page and ditch page 2.
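If option 1 is taken and page 2 is retired, its old URL should permanently redirect to the combined page so any existing links and rankings carry over. A minimal sketch, assuming an Apache server with .htaccess support (the paths and domain are hypothetical placeholders, not the thread's actual URLs):

```apache
# .htaccess sketch: permanently redirect the retired page 2 to the combined page 1
Redirect 301 /old-page-2-url/ https://www.example.com/page-1-url/
```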
-
Sorry, Nakul. I missed your message and just saw it now. Pages 1 and 2 are very different. Please see my response to Kane below.
-
Thanks, Kane. This is the page best for the user: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/ - I have added stats on the lower part of the page and will soon add more unique written content so it stands apart from other, similar MLS result pages. I have noindex, follow on page 2 to avoid looking like duplicate content, as many other real estate sites will have the same listings, just in a different format.
This is the page search engines will like (but it is not ideal for users): http://www.honoluluhi5.com/waikiki-condos-real-estate/ - short-term I will probably rank better with that page, and long-term with the page that is best for the user.
Question: what is the issue with trying to rank two pages for similar keywords? As you can see, my H1, title tag and meta description are different on those two pages, but similar. I am interested in users searching "NEIGHBORHOOD condos for sale", not users searching "Guide to NEIGHBORHOOD". Unless it has a negative impact on my pages' ranking potential, I believe this is the best structure. If you have examples of issues, that would be great.
-
I would make some slight variations between the two pages. For example, make page 1 "Seattle Homes For Sale" and page 2 "Seattle Home Listings". This avoids the issue of having two pages going after the same keyword and allows you to get more granular with the terms you want to rank for.
If both pages have almost identical content, then I would consider canonicals as a solution, but it doesn't sound to me like that's the case here.
-
Would you say page 2 is a subset of page 1? Is there duplicate content between the two pages? If yes, you can consider adding a canonical tag pointing to page 1 on both page 1 and page 2. This way, only page 1 will rank.
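For reference, the canonical tag suggested here is a link element in the head of both pages, pointing at page 1. A sketch with a hypothetical placeholder URL (not the thread's actual page 1 address):

```html
<!-- On both page 1 and page 2: declare page 1 as the canonical version -->
<link rel="canonical" href="https://www.example.com/page-1-url/">
```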