Unique Content Below Fold - Better Move Above Fold?
-
I have a page with a Google Map taking up 80% of the space above the fold (the rest is content which is not unique to my site), and all unique written content and copyrighted pictures are, from a visual standpoint, right below the fold. I am considering making the Google map 1/4 of its current size so I can get my unique content up higher. Questions:
- Do we have any evidence or sound reasoning why I should / should not make this move?
- Is the content really considered below the fold, or will Google see that it is simply a large map I have on the site and therefore actually consider the content to be above the fold?
Thank you
-
Thx. I am going to make non-unique pages "noindex, nofollow" and I am going to get rid of "rel=next prev". Keeping "follow" on noindexed pages is such a minor signal, and I think it might hurt my site since it allows Google to read what is on those pages... (non-unique, duplicate-looking content).
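For reference, the two changes described here would look something like this in the page <head> (a sketch only; the pagination URLs are hypothetical, based on the example domain from the thread):

```html
<!-- On each non-unique page: keep it out of the index and don't pass link equity -->
<meta name="robots" content="noindex, nofollow">

<!-- Pagination hints to remove from the templates, e.g.: -->
<!-- <link rel="prev" href="http://www.honoluluhi5.com/oahu/honolulu-homes/?page=1"> -->
<!-- <link rel="next" href="http://www.honoluluhi5.com/oahu/honolulu-homes/?page=3"> -->
```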
I will update in a few months with the results.
-
Does my logic make sense?
I don't want to guess.
If this was my site, I would want that answer coming from someone who has seen a lot of real estate sites and knows how the most successful in highly competitive real estate markets handle this problem.
-
"A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!""
If I add pages 2 to n and specific property pages to robots.txt, that would send a stronger signal to Google, and Google may not say "oh no, another one"?
Does my logic make sense?
-
Thx a lot. You really know your stuff. Maybe I should add those noindexed pages to robots.txt instead and get rid of the "rel=next prev" signals. Basically, isolate page 1 as a standalone page so that search engines do not see pages 2 to n with the "noindex, follow" tag.
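A minimal robots.txt sketch of that idea (the URL patterns here are assumptions; they depend entirely on the site's actual URL structure):

```text
User-agent: *
# Block paginated listing pages (pages 2 to n) — hypothetical pattern:
Disallow: /*?page=
# Block individual MLS property pages — hypothetical path:
Disallow: /property/

# Caveat: a URL disallowed here is never crawled, so any "noindex" meta tag
# on that page will not be seen by Googlebot. Blocked URLs can still appear
# in results as URL-only listings if they are linked from elsewhere.
```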
-
Now that I see the site, I might understand why Google does not like it.
The community pages like the one that you gave as an example are signposts for a large number of noindex pages that mostly contain content that can be seen verbatim on many other websites. A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
After seeing it, I agree with you that the map is way oversized. But I don't think that changing it is going to solve your problem.
I can't tell you how to solve your problem. I think that real estate is tough because the content changes rapidly and lots and lots of websites are publishing the same stuff. So, if this site belonged to me I would find an SEO consultant with deep experience in working on successful real estate sites in highly competitive markets who can study the site and give me advice.
Good luck.
-
I have added "noindex, follow" on pages 2 to n as well. A "view all" page is not possible. I only index pages where I have unique, quality content added. Therefore, I also have many similar pages where page 1 is "noindex, follow" and all property pages are "noindex, follow".
I have basically done everything to only index high-quality pages and none of the MLS pages that look the same as on 100+ other real estate websites...
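The tag described above, as it would appear on pages 2 to n and on the property pages (a sketch; actual placement depends on the site's templates — note the contrast with "noindex, nofollow", since "follow" still lets link equity flow through these pages):

```html
<!-- Paginated and property pages: excluded from the index, but their links are still crawled -->
<meta name="robots" content="noindex, follow">
```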
-
Thx very much. Ex: http://www.honoluluhi5.com/oahu/honolulu-homes/
As you will see, I have lots of quality unique content below the fold, and all pictures in the slideshow below the fold are my original photos. My pages are higher quality than any competitor but do not rank. I suspect one reason is that the unique content is below the fold. I do understand the link profile still isn't strong (9-month-old site), but the link profile is still strong relative to many competitors.
Idea I am playing with is to reduce map to 1/4 the size (chop 75% off) and place unique content higher.
Your opinion would be highly appreciated.
-
I have LOTS of pages with a nice Google map, wonderful photo, interesting graph hogging the above-the-fold space.
I am not changin' anything.
If you have great stuff, one of the best presentations of your subject, above the fold and people are responding well to it then don't let kibitzers spreadin' rumors about "above the fold" content tanking your rankings scare you away from it.
I am out every day looking for, spending lots of money on, consulting with my photographer.... to get great face-slapping content to post above the fold to impress the Hell out of my visitors when they land.
When that stops working, I will be in here complaining.
One thing concerns me about your post and that is.....
"(rest is content which is not unique to my site)"
Note the word NOT. If the rest of your page is duplicate content, then I think Google will probably discover that eventually and your page will be treated poorly.
If you have a little bit of content from elsewhere on this page, then just take the time to rewrite it, or put it in an image if you are allowed to use it.
One more note. I am quite confident that Google can figure out when images are reused from other websites. I am not sure that it can reduce your rankings at this time, but it might in the future.