How do I avoid being penalized for having a Single Page Interface (SPI)?
-
Guys, I run a real estate website where my clients pay me to advertise their properties.
The thing is, from the beginning I had this idea of a user interface that stays entirely on the same page. On my site the user can filter the properties in the left panel, and the listings (4 properties at a time) are refreshed on the right side, where there is pagination.
So when the user clicks on a property ad, the ad is loaded by AJAX below the search panel on the same page. There's a "back up" button the user clicks to return to the search panel and pick another property.
People are loving our implementation and the user experience, so I simply can't let go of this UI "innovation" just for SEO, because it really is something that makes us stand out from our competitors.
My question, then, is: how do I avoid being penalized in SEO for having this Single Page Interface, given that in the eyes of Google my users might not appear to be browsing my site deeply enough?
-
Hi,
Google and Bing can see how much time your users spend on the page, and since they can also see that there is a large amount of information accessible through that page, I don't think you need to be as worried about the "single page" factor as you otherwise might be.
That said, just because your main user interface lives within a single page, there is no reason that you cannot have other pages linked to it. In fact there are a number of other pages which should be included in your site. For example: Contact, About, Terms, Privacy Policy and (if relevant) Disclosure and/or Disclaimer. They do not have to be right up front or included in your main UI, but they should at least be available for users as text links at the bottom of the page, in a sidebar or somewhere. If you don’t include them you are reducing the appearance of transparency for the site. This works against trust and will make people less confident about doing business through your site. Given that you are in real estate, these things should be a major consideration.
Also, if you do not have an About page, you are reducing your opportunity to grow your customer base and add more clients.
Hope that helps,
Sha
-
If you have your listings available in an unordered list, that should be fine. If there aren't hundreds and hundreds of listings on your site, I don't think Google will have a problem with your implementation. If there are, you might consider building static pages for each category, and linking to the listings from there.
-
John, thanks for the quick reply.
I had already read the "make your AJAX pages indexable" documentation, but unfortunately it came too late in product development, and our programmers convinced us it would mean redoing the entire backend to make it work.
So we already have a workaround in place so crawlers can reach all of the listings. Below the search panel (which has AJAX pagination and loads the ads on the same page with JavaScript) we have a standard HTML list of plain links to the listings.
So the crawlers can reach the properties' individual pages. In other words, we comply with the rule "make each of your pages reachable by at least one internal link".
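A minimal sketch of that kind of crawlable fallback (the paths and listing names here are hypothetical, not from the actual site): plain anchor tags below the search panel, each pointing at a property's standalone page:

```html
<!-- Static fallback below the AJAX search panel; paths are hypothetical.
     Each href is a normal, crawlable URL for one property's own page. -->
<ul class="all-listings">
  <li><a href="/listings/ocean-view-condo">Ocean view condo</a></li>
  <li><a href="/listings/downtown-loft">Downtown loft</a></li>
  <li><a href="/listings/suburban-family-home">Suburban family home</a></li>
</ul>
```

Because these are ordinary links, crawlers can follow them even if every real user only ever touches the JavaScript search panel.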
But my question was more focused on how Google "sees" the navigation pattern of my users... I know the crawler is reaching those pages, but since the majority of users use the search panel (which loads the properties by JavaScript/AJAX) and not the static links below it, it might appear that users only viewed one page on our site.
-
Is there some alternate navigation to reach all of these listings without using your AJAX search? Or are the listings included in a sitemap? Is there some way for Google to find them already?
I'd recommend reading http://code.google.com/web/ajaxcrawling/ to learn more about how to make your AJAXy pages indexable. You may also want to take a look at http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html if you have prev and next pagination. If you have a view all, and want to make that the canonical form, you'll want to look at http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
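For reference, the rel="prev"/rel="next" markup that pagination post describes looks roughly like this (URLs hypothetical), placed in the head of a middle page in the paginated series:

```html
<!-- Hypothetical <head> markup for page 2 of a paginated listing series -->
<link rel="prev" href="https://example.com/listings?page=1">
<link rel="next" href="https://example.com/listings?page=3">
<!-- If a view-all page exists and should be the canonical version: -->
<link rel="canonical" href="https://example.com/listings/view-all">
```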
Also, in Bing Webmaster Tools, you can go to the Crawl > Crawl Settings tab and enable the "Configure your site to have bingbot crawl escaped fragmented URLs containing #!." option if that's applicable to you.
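As a rough illustration of the escaped-fragment scheme those settings refer to (URL hypothetical): the crawler rewrites a #! URL into a query-string request, and the server is expected to answer it with a static HTML snapshot of that AJAX state:

```html
<!-- URL users see, with the state after #! handled by JavaScript:
       https://example.com/#!listing=42
     URL the crawler requests instead; the server should return a
     prerendered HTML snapshot of that same state:
       https://example.com/?_escaped_fragment_=listing=42
     Pages without #! URLs can opt in with this tag in <head>: -->
<meta name="fragment" content="!">
```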