How can I avoid being penalized in SEO for having a Single Page Interface (SPI)?
-
Guys, I run a real estate website where my clients pay me to advertise their properties.
The thing is, from the beginning I had this idea of a user interface that stays entirely on one page. On my site the user filters the properties in the left panel, and the listings (four properties at a time) are refreshed on the right side, where there is pagination.
So when the user clicks on a property ad, the ad is loaded by Ajax below the search panel on the same page; there's a "back up" button the user clicks to return to the search panel and open another property.
People are loving our implementation and the user experience, so I simply can't let go of this UI "innovation" just for SEO, because it really is something that makes us stand out from our competitors.
My question, then, is: how do I avoid being penalized in SEO for having this Single Page Interface, given that in Google's eyes users might not appear to be browsing my site deeply enough?
-
Hi,
Google and Bing can see how much time your users spend on the page, and since they can also see that a large amount of information is accessible through that page, I don't think you need to worry about the "single page" factor as much as you otherwise might.
That said, just because your main user interface lives within a single page, there is no reason you cannot link other pages from it. In fact, there are a number of other pages that should be included in your site, for example: Contact, About, Terms, Privacy Policy and (if relevant) Disclosure and/or Disclaimer. They do not have to be right up front or part of your main UI, but they should at least be available as text links at the bottom of the page, in a sidebar, or somewhere similar. If you don't include them, you reduce the site's appearance of transparency, which works against trust and makes people less confident about doing business through your site. Given that you are in real estate, these things should be a major consideration.
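As a rough sketch (the URLs and page names below are just placeholders), a simple footer block like this is all it takes:

<!-- Minimal footer sketch; URLs and page names are placeholders -->
<footer>
  <ul>
    <li><a href="/about">About</a></li>
    <li><a href="/contact">Contact</a></li>
    <li><a href="/terms">Terms</a></li>
    <li><a href="/privacy">Privacy Policy</a></li>
    <li><a href="/disclaimer">Disclaimer</a></li>
  </ul>
</footer>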
Also, if you do not have an About page, you are reducing your opportunity to grow your customer base and add more clients.
Hope that helps,
Sha
-
If you have your listings available in an unordered list, that should be fine. If there aren't hundreds and hundreds of listings on your site, I don't think Google will have a problem with your implementation. If there are, you might consider building static pages for each category, and linking to the listings from there.
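For instance, a static category page could be as simple as this (the category and listing URLs are made-up placeholders):

<!-- Static category page, e.g. /apartments/ ; all URLs are placeholders -->
<h1>Apartments for sale</h1>
<ul>
  <li><a href="/listing/123-main-st">123 Main St, 2-bedroom apartment</a></li>
  <li><a href="/listing/45-ocean-ave">45 Ocean Ave, 3-bedroom apartment</a></li>
  <!-- one plain link per listing, so every property page is reachable by a normal crawl -->
</ul>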
-
John, thanks for the quick reply.
I had already read the "make your Ajax pages indexable" documentation, but unfortunately it was too late in product development, and our programmers convinced us that supporting it would mean redoing the entire backend.
So we already have a workaround in place so crawlers can reach all these listings. Below the search panel (which has Ajax pagination and loads the ads on the same page with JavaScript) we have a standard HTML block of plain links to each property's individual page.
So the crawlers can reach the individual property pages. In other words, we comply with the rule "make each of your pages reachable by at least one internal link".
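Roughly, the markup looks something like this (simplified; element names and URLs are placeholders):

<!-- Simplified page sketch; IDs and URLs are placeholders -->
<div id="search-panel">
  <!-- filters and Ajax pagination; results and ads are injected here by JavaScript -->
</div>
<div id="listing-links">
  <a href="/property/seaside-apartment-101">Seaside apartment 101</a>
  <a href="/property/downtown-loft-42">Downtown loft 42</a>
  <!-- plain <a href> links to every listing, readable without JavaScript -->
</div>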
But my question was more focused on how Google "sees" the navigation pattern of my users. I know the crawler is reaching those pages, but since the majority of users use the search panel (which loads the properties via JavaScript/Ajax) rather than the static links below it, it might appear that users only viewed one page on our site.
-
Is there some alternate navigation to reach all of these listings without using your AJAX search? Or are the listings included in a sitemap? Is there some way for Google to find them already?
I'd recommend reading http://code.google.com/web/ajaxcrawling/ to learn more about how to make your AJAX pages indexable. You may also want to take a look at http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html if you have prev and next pagination. If you have a view-all page and want to make that the canonical form, you'll want to look at http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html.
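As a rough illustration of those two recommendations (the URLs below are placeholders), page 2 of a paginated category could declare its neighbours and, if you prefer, point at a view-all version:

<!-- In the <head> of page 2 of a paginated category; URLs are placeholders -->
<link rel="prev" href="http://www.example.com/apartments?page=1">
<link rel="next" href="http://www.example.com/apartments?page=3">
<!-- Only if you want the view-all page treated as the canonical version: -->
<link rel="canonical" href="http://www.example.com/apartments/view-all">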
Also, in Bing Webmaster Tools, you can go to the Crawl > Crawl Settings tab and enable the "Configure your site to have bingbot crawl escaped fragmented URLs containing #!." option if that's applicable to you.
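For context, the #! scheme that the Google AJAX-crawling guide and that Bing setting both refer to works roughly like this (example URLs only): the crawler rewrites a hashbang URL into an _escaped_fragment_ request and expects the server to return an HTML snapshot of that state, and a page without a #! in its URL can opt in with a meta tag.

<!-- URL the user's browser shows (placeholder): -->
http://www.example.com/properties#!listing=seaside-apartment-101
<!-- URL the crawler requests instead, expecting an HTML snapshot: -->
http://www.example.com/properties?_escaped_fragment_=listing=seaside-apartment-101
<!-- A page without #! in its URL can opt in to the same scheme via: -->
<meta name="fragment" content="!">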