Is Content Location Determined by Source Code or Visual Location in a Search Engine's Mind?
-
I have a page with 2 scrolling areas. The first 1/3 of the page (from the left) has thumbnail pictures (not original content) with a vertical scrollbar next to it. The remaining 2/3 of the page has a lot of unique content, also with its own vertical scrollbar.
Question: Visually on a computer, the unique content is right next to the thumbnails, but in the source code the original content comes after the thumbnails. Does that mean search engines will see this content as "below the fold"? And would actually placing this content below the thumbnails (requiring a lot of scrolling to get to the original content) be, in a search engine's mind, the exact same location, since the source code shows the content in the same place either way?
I am trying to understand whether search engines base their analysis on the source code alone, or also on the visual location of content? thx
-
That sounds like a reasonable approach. If you wanted to be extra careful, you could also add a robots noindex,follow meta tag to the head of the paginated pages, since they all have very little unique content to add.
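For reference, a noindex,follow directive is just an ordinary meta tag in each paginated page's head. A minimal sketch (the surrounding markup is illustrative):

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```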
A third option, which I would only use if people are linking into those paginated pages (very rare), is to rel canonical the paginated pages to the first page.
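The rel canonical option is a single link element on each paginated page. A hedged sketch, with a placeholder URL standing in for the real first page:

```html
<!-- On /results?page=2 and deeper, pointing back to the first page -->
<link rel="canonical" href="http://www.example.com/results/">
```

As noted above, this is only worth doing in the rare case where people are linking into those paginated pages.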
-
thx, again. That is my big concern: should I put in the effort to move the content higher on the page? It is the year 2014 and Google does not give real estate websites or e-commerce sites any clue as to how they want us to deal with duplicate issues (content appearing across a bunch of other websites). I am using "noindex, follow" for the "MLS result pages" where I do not have unique content added, and when I have unique content on Page 1, I keep the entire series of paginated pages (sometimes Pages 1-100) indexed but add rel=next/prev.
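For anyone following along, rel=next/prev is a pair of link elements in the head of each page in the series. A minimal sketch with placeholder URLs:

```html
<!-- On page 2 of the series; page 1 omits rel="prev", the last page omits rel="next" -->
<link rel="prev" href="http://www.example.com/results?page=1">
<link rel="next" href="http://www.example.com/results?page=3">
```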
Any thoughts on that?
-
I think Google is looking for more extreme situations than the one you have. The content is well-written, useful and isn't so far down the page that someone isn't going to see it. However, I don't have to tell you that it's going to take a LOT to compete in that niche.
Good luck.
-
thx, Everett. Appreciate the input. Take a look here: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/ - if I move all my "unique content" (currently below the thumbnails and the large map) up to where the map is and get rid of that map, are you saying that it will most likely be seen as being located more "above the fold"?
-
Hello Khi5,
I can't say with 100% certainty, but I feel confident that Google looks at both. I'm not sure about other search engines. Specifically, the "page layout" algorithm needs to render the HTML/CSS (and increasingly JavaScript) in order to determine if there are too many ads "above the fold". Google also used to render the page to provide "instant previews" of each website in the SERPs.
In other words, the all-seeing eye of Google knows if your "unique content" shows up above or below the fold, or even 6,000 pixels off-screen to the left.
Related Questions
-
Magento 1.9 SEO. I have product pages with identical on-page SEO scores in the 90's. Some pull up on Google page 1; some won't pull up at all. I am searching for the exact title on that page.
I have a website built on Magento 1.9. There are approximately 290,000 part numbers on the site. I am sampling Google SERP results. About 20% of the keywords show up on page 1, positions 5 thru 10. 80% don't show up at all. When I do a Moz page score I get high 80's to 90's. A page score of 89 on one part # may show up on page one; an identical page score on a different part # can't be found on Google. I am searching for the exact part # in the page title. Any thoughts on what may be going on? This seems to me like a Magento SEO issue.
Intermediate & Advanced SEO | CTOPDS
-
Duplicate Content That Isn't Duplicated
In Moz, I am receiving multiple messages saying that there is duplicate page content on my website. For example, these pages are being highlighted as duplicated: https://www.ohpopsi.com/photo-wallpaper/made-to-measure/pop-art-graffiti/farm-with-barn-and-animals-wall-mural-3824 and https://www.ohpopsi.com/photo-wallpaper/made-to-measure/animals-wildlife/little-elephants-garden-seamless-pattern-wall-mural-3614. As you can see, both pages are different products, therefore I can't apply a 301 redirect or canonical tag. What do you suggest?
Intermediate & Advanced SEO | e3creative
-
Of the two examples of markup (microdata, schema) code below, which of the two is better designed for its purpose of Q&A, and what might be suggested to improve upon these lines of code (context: questions and answers within article content)?
ANSWER SEEN 'WITHIN THE QUESTION' BRACKET
So you ask, why is the sky blue? Well, the answer is not so simple; in the day-time, when it's clear and cloudless, the sky is blue because molecules in the air scatter blue light from the sun more than they scatter red light. When we look towards the sun at sunset, we see red and orange colours because the blue light has been scattered out and away from the line of sight. See Structured Data Testing Results.
'QUESTION' AND 'ANSWER' IN 2 SEPARATE BRACKETS
Why Is The Sky Blue? Well, the answer is not so simple; in the day-time, when it's clear and cloudless, the sky is blue because molecules in the air scatter blue light from the sun more than they scatter red light. When we look towards the sun at sunset, we see red and orange colours because the blue light has been scattered out and away from the line of sight. See Structured Data Testing Results.
Thanks, Mark
Intermediate & Advanced SEO | RedFrog
-
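Both variants map onto schema.org's Question and Answer types. As a point of comparison, here is a minimal sketch of the same content expressed as JSON-LD (the nesting and trimmed answer text are illustrative, not taken from the original markup):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Question",
  "name": "Why is the sky blue?",
  "acceptedAnswer": {
    "@type": "Answer",
    "text": "Molecules in the air scatter blue light from the sun more than they scatter red light."
  }
}
</script>
```

Keeping each Answer nested inside its Question, rather than in two unrelated blocks, makes the question-answer relationship explicit to parsers.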
Partial Match or RegEx in Search Console's URL Parameters Tool?
So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them. Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789= All the indexed URLs follow that same kind of format, but I only want to index the URLs that have a par1 of ABC (which could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, using regex maybe? Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
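The URL Parameters tool itself does not accept regex, but the partial match described here is easy to express as a pattern. A quick sketch in Python, using URLs and parameter names that mirror the hypothetical example above:

```python
import re

# Keep URLs whose par1 value starts with "ABC" (ABC123, ABC456, ...)
KEEP = re.compile(r"[?&]par1=ABC[^&]*")

urls = [
    "http://www.example.com/page.php?par1=ABC123=&par2=DEF456=",
    "http://www.example.com/page.php?par1=XYZ999=&par2=DEF456=",
]

to_index = [u for u in urls if KEEP.search(u)]
print(to_index)  # only the par1=ABC... URL survives
```

Note that robots.txt matching is prefix-based with only limited * and $ wildcards for Googlebot, so a rule set there would be coarser than a true regex.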
Intermediate & Advanced SEO | Ria_
-
Impact of simplifying website and removing 80% of site's content
We're thinking of simplifying our website, which has grown to a very large size, by removing all the content which hardly ever gets visited. The plan is to remove this content / make changes over time in small chunks so that we can monitor the impact on SEO. My gut feeling is that this is okay if we make sure to redirect old pages and make sure that the pages we remove aren't getting any traffic. From my research online it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability. Could I get people's thoughts on this please? Are there any risks that we should look out for, or any alternatives to this approach? At the moment I'm struggling to combine the needs of SEO with making the website more effective.
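Redirecting the removed pages is typically a one-line rule per page. A minimal Apache .htaccess sketch (the paths are placeholders, not from the site in question):

```
# Send visitors and link equity from a removed page to its closest surviving page
Redirect 301 /old-section/rarely-visited-page/ /surviving-section/
```

Mapping each removed URL to its most relevant surviving page, rather than everything to the homepage, preserves the most relevance.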
Intermediate & Advanced SEO | RG_SEO
-
Reverse Proxies - Lost On Their Purpose To Help SEO
Reverse proxies - I'm lost on their purpose in helping SEO. I read an article on SEOmoz (link below). When should we use a reverse proxy, and is it really worth the trouble at all? Why create subdomains vs subfolders when organizing different sections of the website? http://www.seomoz.org/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo
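The use case in the linked article is serving content hosted elsewhere under a subfolder of the main domain, so it inherits the main domain's authority instead of living on a subdomain. A minimal nginx sketch (the hostnames are illustrative):

```
# Serve blog.example-host.com under www.example.com/blog/
location /blog/ {
    proxy_pass https://blog.example-host.com/;
    proxy_set_header Host blog.example-host.com;
}
```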
Intermediate & Advanced SEO | helpwanted
-
To subnav or NOT to subnav... that's my question.... :)
We are working on a new website that is golf related and wondering about whether or not we should set up a subnavigation dropdown menu from the main menu. For example:
GOLF PACKAGES
>> 2 Round Packages
>> 3 Round Packages
>> 4 Round Packages
>> 5 Round Packages
GOLF COURSES
>> North End Courses
>> Central Courses
>> South End Courses
This would actually be very beneficial to our users from a usability standpoint, BUT what about from an SEO standpoint? Is diverting all the link juice to these inner pages from the main site navigation harmful? Should we just create a page for GOLF PACKAGES and break it down on that page?
Intermediate & Advanced SEO | JamesO
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us...
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc, but we have yet to find 'the fix'... Thoughts?
Kurus
Intermediate & Advanced SEO | kurus
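The kind of restriction described above usually amounts to a handful of Disallow patterns. A hypothetical sketch of what such rules might look like (the paths and parameter names are invented for illustration):

```
User-agent: *
# Block paginated and re-sorted variants of search results
Disallow: /search/*?page=
Disallow: /search/*?sort=
```

Note that Disallow only blocks crawling; it does not stop the blocked URLs from accumulating (and trapping) link equity, which is exactly the concern raised in part a).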