How do I avoid being penalized for having a Single Page Interface (SPI)?
-
Guys, I run a real estate website where my clients pay me to advertise their properties.
The thing is, from the beginning I had this idea of a user interface that stays entirely on the same page. On my site the user can filter the properties in the left panel, and the listings (four properties at a time) are refreshed on the right side, where there is pagination.
When the user clicks on a property ad, the ad is loaded by AJAX below the search panel on the same page. There's a "back up" button the user clicks to return to the search panel and click on another property.
People are loving our implementation and the user experience, so I simply can't let go of this UI "innovation" just for SEO; it really is something that makes us stand out from our competitors.
My question, then, is: how do I avoid an SEO penalty for having this Single Page Interface, given that in Google's eyes users might not appear to be browsing my site deeply enough?
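For context, the click-to-load behaviour described above amounts to something like the following. This is a minimal sketch, not the site's actual code; the element id and the `/properties/<id>` URL scheme are hypothetical:

```javascript
// Minimal sketch of the click-to-load pattern described above.
// The "ad-panel" id and the /properties/<id> URL scheme are hypothetical.
function listingPath(id) {
  return '/properties/' + encodeURIComponent(id);
}

function showListing(id) {
  fetch(listingPath(id))
    .then(function (res) { return res.text(); })
    .then(function (html) {
      // Load the ad below the search panel, on the same page.
      document.getElementById('ad-panel').innerHTML = html;
      // Optional enhancement: give each ad a real URL so the visit
      // registers as a distinct pageview (HTML5 History API).
      // history.pushState({ id: id }, '', listingPath(id));
    });
}
```

The commented-out `pushState` line is one way to keep this exact UI while still giving every ad an address of its own.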
-
Hi,
Google and Bing can see how much time your users spend on the page, and since they can also see that a large amount of information is accessible through that page, I don't think you need to be as worried about the "single page" factor as you might expect.
That said, just because your main user interface lives within a single page, there is no reason you cannot have other pages linked from it. In fact, there are a number of other pages that should be included in your site, for example: Contact, About, Terms, Privacy Policy and (if relevant) Disclosure and/or Disclaimer. They do not have to be right up front or included in your main UI, but they should at least be available to users as text links at the bottom of the page, in a sidebar, or somewhere similar. If you don't include them, you reduce the site's appearance of transparency, which works against trust and will make people less confident about doing business through your site. Given that you are in real estate, these things should be a major consideration.
Also, if you do not have an About page, you are reducing your opportunity to grow your customer base and add more clients.
Hope that helps,
Sha
-
If you have your listings available in an unordered list, that should be fine. If there aren't hundreds and hundreds of listings on your site, I don't think Google will have a problem with your implementation. If there are, you might consider building static pages for each category, and linking to the listings from there.
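A crawlable list like that can be plain markup on the page, independent of the AJAX UI. A sketch, with hypothetical listing URLs:

```html
<!-- Static, crawlable fallback below the AJAX search panel -->
<ul class="all-listings">
  <li><a href="/properties/seaside-apartment-123">Seaside apartment, 2 bedrooms</a></li>
  <li><a href="/properties/downtown-loft-124">Downtown loft, 1 bedroom</a></li>
  <li><a href="/properties/garden-house-125">Garden house, 3 bedrooms</a></li>
</ul>
```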
-
John, thanks for the quick reply.
I had already read the "make your AJAX pages indexable" guidance, but unfortunately it came too late in product development, and our programmers convinced us it would mean redoing the entire backend to make it work.
So we already have in place a workaround for crawlers to reach all these listings. Below the search panel (which has AJAX pagination and loads the ads on the same page with JavaScript) we have a standard HTML list of links, so the crawlers can reach the properties' individual pages. In other words, we comply with the rule "make each of your pages reachable by at least one internal link".
But my question was more focused on how Google "sees" my users' navigation patterns. I know the crawler is reaching those pages, but since the majority of users use the search panel (which loads the properties via JavaScript/AJAX) rather than the static links below it, it might appear that users only viewed one page on our site.
-
Is there some alternate navigation to reach all of these listings without using your AJAX search? Or are the listings included in a sitemap? Is there some way for Google to find them already?
I'd recommend reading http://code.google.com/web/ajaxcrawling/ to learn more about how to make your AJAXy pages indexable. You may also want to take a look at http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html if you have prev and next pagination. If you have a view all, and want to make that the canonical form, you'll want to look at http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
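For the pagination case, those posts boil down to link elements in the head of each paginated results page. A sketch, assuming a hypothetical /listings URL:

```html
<!-- On the hypothetical page http://www.example.com/listings?page=2 -->
<link rel="prev" href="http://www.example.com/listings?page=1">
<link rel="next" href="http://www.example.com/listings?page=3">
<!-- Or, if a view-all page exists and should be the canonical form: -->
<link rel="canonical" href="http://www.example.com/listings/view-all">
```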
Also, in Bing Webmaster Tools, you can go to the Crawl > Crawl Settings tab and enable the "Configure your site to have bingbot crawl escaped fragmented URLs containing #!." option if that's applicable to you.
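For reference, the "#!" scheme mentioned above works by having the crawler rewrite hash-bang URLs into a `_escaped_fragment_` query parameter that the server can answer with rendered HTML. A simplified sketch of that mapping (treat the exact escaping rules as an assumption; the scheme percent-escapes a small set of special characters in the fragment):

```javascript
// Sketch of the "#!" to "_escaped_fragment_" URL mapping used by the
// AJAX crawling scheme. The crawler requests the rewritten URL and the
// server responds with the rendered HTML for that fragment.
function toEscapedFragmentUrl(url) {
  const idx = url.indexOf('#!');
  if (idx === -1) return url; // no hash-bang: nothing to map
  const base = url.slice(0, idx);
  // Assumed escaping: percent-encode %, #, & and + in the fragment.
  const fragment = url.slice(idx + 2)
    .replace(/%/g, '%25')
    .replace(/#/g, '%23')
    .replace(/&/g, '%26')
    .replace(/\+/g, '%2B');
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + fragment;
}
```

So a pretty URL like `http://www.example.com/ajax.html#!key1=value1&key2=value2` is fetched by the crawler as `http://www.example.com/ajax.html?_escaped_fragment_=key1=value1%26key2=value2`.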
Related Questions
-
Multiple Similar Product Variations - Page Layout, Title and SEO Best Practice?
I'm doing some research into SEO for our new web design. I sell designer eyewear, prescription and sunglasses. Take a Ray-Ban Wayfarer sunglass: it comes in 30 colours and 3 sizes for each model. Up till now I was of the impression that for best-practice SEO I would need each individual variation on its own page, which would also help with things like Google Shopping. So, for example, here is one colour product in 3 sizes, out of 30 colour variations for this particular model, the Ray Ban Wayfarer RB2140:
Colour: Black 901
Sizes: 47, 50, 54
Currently my URLs look like this, with a new page and the size changing at the end for each variation:
Ray Ban Wayfarer RB2140 - Black 901 - 47: www.mywebsite.com/ray-ban-wayfarer-rb2140.html?colour=Black+901&size=47
Ray Ban Wayfarer RB2140 - Black 901 - 50: www.mywebsite.com/ray-ban-wayfarer-rb2140.html?colour=Black+901&size=50
Ray Ban Wayfarer RB2140 - Black 901 - 54: www.mywebsite.com/ray-ban-wayfarer-rb2140.html?colour=Black+901&size=54
This is very time-consuming, and I'm not sure it adds any SEO benefit; in fact I'm worried it's actually a) slowing my site down (content heavy) and b) looking like duplicate content. I'm thinking about moving towards a page more like this, where it would just be one model with variations (not affecting the title or creating a new page per variation): http://demoleotheme.com/vigoss/index.php/atomic-endurance-running-tee-crew-neck.html
I'm not sure of the pros and cons of doing it this way versus my current approach; all I know is my site is ranking horribly. Lastly, I'm currently running a Magento 1.9 store, which is renowned for duplicate content and slow site speeds, and have been told that moving to WooCommerce would benefit me for both site performance and SEO. But I'm sceptical: with my current structure of one page per SKU I'll be up to 8,000+ products with multiple variations, so can it handle my needs? Anyone with experience on the WooCommerce platform? (This might be a separate question, apologies.) This is absolutely frying my brain, so any advice is appreciated. I'm prepared to put every dying second into this; I just need some solid advice on which direction to go!
Are pages not included in navigation given less "weight"?
Hi, we recently updated our website and our main navigation was dramatically slimmed down to just three pages, with no drop-downs under them. Yet we have many more important pages, each linked to once from one of those main three pages. Will this hurt those other pages because they are not included in the navigation (some of which were starting to get good traction in rankings)?
Thanks!
Homepage and category pages rank for article/post titles after HTML5 redesign
My site's URL (web address) is: http://bit.ly/g2fhhC
Timeline: At the end of March we released a site redesign in HTML5. As part of the redesign we used multiple H1s, both for nested articles on the homepage and for non-article content sections on a page. In summary, our pages have many, many H1s compared to other notable sites that use HTML5 with only one H1 (some of them the biggest sites on the web). Yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1-H6 tags. We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch (https://github.com/h5bp/html5-boilerplate/issues/1086); I'm not sure whether this is driving the indexing issues below.
Situation: Since the redesign, when we query an article title, Google lists the homepage, category page, or tag page the article resides on. Most of the time the homepage ranks for the article query. Linking directly to the article pages from a relevant internal page does not help Google index the correct page, and neither does linking to an article from an external site. Here are images of some example query results for our article titles:
Homepage ranks for article title aged 5 hours: http://imgur.com/yNVU2
Homepage ranks for article title aged 36 min.: http://imgur.com/5RZgB
Homepage and uncategorized page listed instead of article for exact-match article query: http://imgur.com/MddcE
Article aged over 10 days indexing correctly (so yes, it is possible for Google to index our article pages): http://imgur.com/mZhmd
What we have done so far:
- Removed the H1 tag from the site-wide domain link
- Made the article title a link, replicating how it was on the old version
- Applying the Modernizr patch today to correct the blank-caching issue
We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25) and let us know if this may be sending a confusing signal to Google, or if you see something else we're missing. All HTML5 and Google documentation makes clear that Google can parse multiple H1s, understands headers and sub-headers, and that multiple H1s are okay, but it seems possible that algorithmic weighting has not caught up with HTML5. Look forward to your thoughts. Thanks.
Why is this page removed from the Google & Bing indices?
This page has been removed from the indices at Bing and Google, and I can't figure out why: http://www.pingg.com/occasion/weddings
- This page used to be in those indices
- There are plenty of internal links to it
- The rest of the site is fine
- It's not blocked by meta robots, robots.txt, or a canonical URL
- There's nothing else to suggest that the page is being penalized
Do on-page links have an effect on SERP rankings with Panda?
I have been doing some competitive analysis, benchmarking my company against others, and have noticed a pattern: very high-ranking sites seem to limit the internal and external on-page links on their subdomains to under 100. My site has a lot of links, but all are relevant and lead to unique content. I'm interested to know whether anyone else has noticed this pattern in the SERP results. Is Google now penalizing pages with too many on-site nav links? If so, is a full site restructure needed to let Google index and rank these pages, or is it a non-issue that doesn't need to be addressed? Panda confuses me! Help!
How do you get rid of the .html and .php extensions at the end of URLs?
What is the white-hat way to properly remove the .html and .php extensions at the end of URLs? Example: http://www.seomoz.org/learn-seo.php should be (and is) http://www.seomoz.org/learn-seo
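One common approach (on Apache; a hedged sketch, not the only way) is a mod_rewrite rule in .htaccess that internally serves extensionless URLs from the .php files and 301-redirects the old extensioned URLs:

```apache
RewriteEngine On
# Serve /learn-seo from learn-seo.php when no such file exists as-is
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]
# 301-redirect old .php URLs to the extensionless form
RewriteCond %{THE_REQUEST} \s/([^.]+)\.php[\s?]
RewriteRule ^ /%1 [R=301,L]
```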
What is the optimal URL structure for internal pages?
Is it more SEO-friendly to have an internal-page URL structure that reads like www.smithlawfirm.com/personal-injury/car-accidents or www.smithlawfirm.com/personal-injury-car-accidents? The former structure has the benefit of showing Google all the subcategories under personal injury; the latter, the benefit of a flatter structure. Thanks
Dynamic pages and code within content
Hi all, I'm considering creating a dynamic table on my site that highlights rows, columns, and cells depending on buttons that users can click. Each cell in the table links to a separate page that is created dynamically, pulling information from a database. Now, I'm aware of the Google guidelines: "If you decide to use dynamic pages (i.e., the URL contains a '?' character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few." So we wondered whether we could put the dynamic pages in our sitemap so that Google could index them; the pages can be seen with JavaScript off, and JavaScript is only used to manipulate them to make them dynamic. Could anyone give us an overview of the dangers here? I also wondered whether you still need to separate content from code on a page; my developer still seems very keen to use inline CSS and JavaScript! Thanks a bundle.