Websites that scroll... forever...
-
I'm seeing more and more websites that put all their content on one page. I came across www.otbthink.com today and am wondering if I'm missing something here.
I ask because I'm trying to figure out how to link to the copywriting section of their website. I have a client in their area who needs someone to do some copywriting work. All the content there is shown and hidden via CSS. The search engines will see it, but how do you optimize for this sort of site? How do you link to a particular section of it? Am I missing something obvious here?
Thanks.
-
Let's just play that out for a second.
If I were going to link to their services page, I wouldn't use "services" as the anchor text. I would probably use the company's name (their brand name), because it's their services page. This sort of website is perfect for that, because it will accumulate tons of links to the home URL with the brand name as anchor text, some with "services", some with "contact", etc. It wouldn't be hit for over-optimization, because you have to link to the home page rather than a sub-page for these queries. Also, blog posts on specific topics will have their own sub-pages; those can be handled with SEF URLs and should rank on their own links, too.
I'm playing devil's advocate here, but I'm wondering if we pure-SEO guys are missing something.
-
Since these rank so poorly, if more companies want to put up long single-page sites like this, it makes our job on more focused sites easier in the end, doesn't it?
-
Oh god, I couldn't agree more. Infinite-scroll pages and sites look and feel awful to me from a user standpoint, and they hurt sites from an SEO standpoint.
-
These sites are silly; I don't understand their popularity. I'd assume the fad will die off sooner or later, as it's terrible from an SEO standpoint.
-
Thanks for the tip! That works.
The people who built his website are supposed to be pretty good, and they're telling him this is the way to go for a website. I'm just puzzled, I guess.
The only thing I can think of that may be in play here is the ability to rank through co-citation or something...
They do have a blog, and theoretically, if it's good, people will link to those sub-pages. I can also sort of see how people don't really link to your services or other company-related pages, so building all the links to one URL would be okay... I guess. I'm just starting to see this more and more, but I don't see how it helps the user or the search engines.
-
You can link to: http://www.otbthink.com/#section2
That will bring you to the page and jump to the relevant section.
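One caveat worth noting: the `#section2` fragment is handled entirely by the browser and is never sent to the server, so to a search engine every fragment link resolves to the same URL — which is exactly why all the links to a one-page site consolidate on the home URL. A quick sketch with Python's standard library shows the split (the URL is the one from this thread):

```python
from urllib.parse import urldefrag

# The fragment is stripped client-side; only the base URL reaches the server.
base, fragment = urldefrag("http://www.otbthink.com/#section2")
print(base)      # http://www.otbthink.com/
print(fragment)  # section2
```
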
I agree that I don't think the site layout is optimal.
-
Hey Darin, (great name...mine too)
Yes, I could do that if I owned the site, but I don't even work with them. I just wanted to share their information with a client of mine. I can't see how this can be good for SEO or for earning backlinks.
-
With so much copy on one page, I'd be concerned that it would naturally end up with more than 100 links.
As far as linking to a particular section of the site, you'd want to create named anchors for each section. See the following page for examples:
http://www.thesitewizard.com/html-tutorial/link-to-specific-line-or-paragraph.shtml
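As a rough sketch (the `id` values here are hypothetical — use whatever the page actually defines, or add your own if you control the markup):

```html
<!-- On the long page: give each section an id (the modern form of a named anchor) -->
<section id="copywriting">
  <h2>Copywriting</h2>
  <p>Copywriting services content here.</p>
</section>

<!-- From any other page or site: link straight to that section -->
<a href="http://www.example.com/#copywriting">Copywriting services</a>
```

Of course, this only works if you control the markup or the site already defines usable anchors; on someone else's site you can only link to whatever fragments they happen to expose.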