Unique Content Below Fold - Better Move Above Fold?
-
I have a page with a Google Map taking up 80% of the space above the fold (the rest is content which is not unique to my site), and all unique written content and copyrighted pictures are, from a visual standpoint, right below the fold. I am considering reducing the Google map to 1/4 of its size so I can move my unique content higher. Questions:
- Do we have any evidence or sound reasoning why I should / should not make this move?
- Is the content really considered below the fold or will Google see that it is simply a large map I have on the site and therefore will actually consider the content to be above the fold?
Thank you
-
Thanks. I am going to make non-unique pages "noindex, nofollow" and I am going to get rid of "rel=next/prev". Keeping "follow" on noindex pages is such a minor benefit, and I think it might hurt my site since it allows Google to read what is on those pages (non-unique, duplicate-looking content).
I will update in a few months with the results.
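For reference, a minimal sketch of the tags being discussed here, with placeholder URLs (the site's actual URL structure is not shown in this thread):

```html
<!-- Pagination annotations being removed (hypothetical page 2 of a listing): -->
<link rel="prev" href="https://example.com/listings/">
<link rel="next" href="https://example.com/listings/page/3/">

<!-- Robots meta tag being added to the non-unique pages: -->
<meta name="robots" content="noindex, nofollow">
```

These go in the `<head>` of each affected page; "noindex, nofollow" tells crawlers not to index the page and not to follow its links.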
-
Does my logic make sense?
I don't want to guess.
If this were my site, I would want that answer to come from someone who has seen a lot of real estate sites and knows how the most successful players in highly competitive real estate markets handle this problem.
-
"A lot of people think that noindexing duplicate content is 'best practice'... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... 'Oh no! Another one!'"
If I add pages 2 to n and the specific property pages to robots.txt, that would send a stronger signal to Google, and Google may not say "oh no, another one"?
Does my logic make sense?
-
Thanks a lot. You really know your stuff. Maybe I should add those noindex pages to robots.txt instead and get rid of the "rel=next/prev" signals. Basically, isolate page 1 as a standalone page so that search engines do not see pages 2 to n with the "noindex, follow" tag.
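A hedged sketch of what that robots.txt approach might look like. The path is hypothetical, since the site's actual pagination URLs aren't shown in this thread. One caveat worth noting: pages blocked in robots.txt cannot be crawled at all, so any "noindex" tag on them would never be read.

```
# Hypothetical robots.txt sketch: block crawling of paginated pages 2..n
# (assumes pagination lives under a /page/ path segment)
User-agent: *
Disallow: /oahu/honolulu-homes/page/
```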
-
Now that I see the site, I might understand why Google does not like it.
The community pages like the one that you gave as an example are signposts for a large number of noindex pages that mostly contain content that can be seen verbatim on many other websites. A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
After seeing it, I agree with you that the map is way oversized. But I don't think that changing it is going to solve your problem.
I can't tell you how to solve your problem. I think that real estate is tough because the content changes rapidly and lots and lots of websites are publishing the same stuff. So, if this site belonged to me I would find an SEO consultant with deep experience in working on successful real estate sites in highly competitive markets who can study the site and give me advice.
Good luck.
-
I have added "noindex, follow" on pages 2 to n as well. A view-all page is not possible. I only index pages where I have added unique, quality content. Therefore, I also have many similar pages where page 1 is "noindex, follow" and all property pages are "noindex, follow".
I have basically done everything to index only high-quality pages and none of the MLS pages that look the same as on 100+ other real estate websites...
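As a side note on auditing this kind of setup: a small helper like the following (my own sketch, not something from this thread) can parse a page's HTML and report which robots directives it carries, which makes it easy to verify that the right pages actually got the "noindex, follow" tag. It uses only the Python standard library.

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)  # html.parser lowercases attribute names
        if attr_map.get("name", "").lower() == "robots":
            for token in attr_map.get("content", "").split(","):
                self.directives.add(token.strip().lower())


def robots_directives(html: str) -> set:
    """Return the set of robots directives found in an HTML document."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives
```

Feeding it the raw HTML of a paginated page should return `{"noindex", "follow"}` if the tag is in place, or an empty set if the page carries no robots meta tag at all.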
-
Thanks very much. Example: http://www.honoluluhi5.com/oahu/honolulu-homes/
As you will see, I have lots of quality unique content below the fold, and all pictures in the slideshow below the fold are my original photos. My pages are higher quality than any competitor's but do not rank. I suspect one reason is that the unique content is below the fold. I do understand the link profile still isn't strong (it's a 9-month-old site), but it is still strong relative to many competitors.
The idea I am playing with is to reduce the map to 1/4 of its size (chop 75% off) and place the unique content higher.
Your opinion would be highly appreciated.
-
I have LOTS of pages with a nice Google map, wonderful photo, interesting graph hogging the above-the-fold space.
I am not changin' anything.
If you have great stuff above the fold, one of the best presentations of your subject, and people are responding well to it, then don't let kibitzers spreadin' rumors about "above the fold" content tanking your rankings scare you away from it.
I am out every day looking for content, spending lots of money, and consulting with my photographer to get great face-slapping content to post above the fold and impress the Hell out of my visitors when they land.
When that stops working, I will be in here complaining.
One thing concerns me about your post and that is.....
"(rest is content which is not unique to my site)"
Note the word NOT. If the rest of your page is duplicate content, then I think Google will probably discover that eventually and your page will be treated poorly.
If you have a little bit of content from elsewhere on this page, then just take the time to rewrite it, or put it in an image if you are allowed to use it.
One more note: I am quite confident that Google can figure out when images are reused from other websites. I am not sure that can reduce your rankings at this time, but it might in the future.