Is this tabbed implementation of SEO copy correct (i.e. good for getting indexed, and in an OK spot in the HTML as viewed by search bots)?
-
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, such that all tabs (only two right now) are viewed as one document by the search engines.
I am pretty sure we have this working for the most part, but would love some quick feedback from you all, as I have never worked with this approach before and these pages are some of our most important.
Resources:
http://www.ericpender.com/blog/tabs-and-seo
http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en
http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643
Sample in use: http://www.seomoz.org/article/search-ranking-factors
**Old Version:**
http://screencast.com/t/BWn0OgZsXt
http://seatgeek.com/boston-celtics-tickets/
**New Version with tabs:**
http://screencast.com/t/VW6QzDaGt
http://screencast.com/t/RPvYv8sT2
http://seatgeek.com/miami-heat-tickets/
Notes:
- Content is not displayed stacked in the browser when JavaScript is turned off, but it is present in the source code.
- Content shows up in the text-only version of Google's cache of the new page.
- In our implementation the JS currently prevents the default behavior of appending #about (in this case) to the URL string; this can be changed. Should it be?
- Related to this, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not go to the tab with the copy, which I imagine could be read as spammy from a human-review perspective (this wasn't intentional).
- This portion of the code falls below the truncated view in Fetch as Googlebot, so we couldn't use that resource to verify it.
- Are there any issues with hidden text, or is this content too far down in the HTML?
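On the #about deep-link behavior, a minimal sketch (hypothetical, not our actual code) of resolving which tab to open from the URL hash, so that typing http://seatgeek.com/miami-heat-tickets/#about lands on the About tab instead of being ignored:

```javascript
// Hypothetical sketch: pick the tab to open from the URL hash on page load,
// so deep links like /miami-heat-tickets/#about open the right tab.
// `fallback` is the tab shown when there is no usable hash.
function tabFromHash(hash, fallback) {
  var id = (hash || '').replace(/^#/, '');
  return id.length > 0 ? id : fallback;
}
```

With something like this wired to page load (and to the hashchange event), the deep link behaves the way a human reviewer would expect rather than silently dropping the hash.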
Any/all feedback appreciated. I know our copy is old; we are in the process of updating it for this season.
-
Cool. When we launched them separately, we overvalued the potential for "ticket prices" rankings, and we had so little authority with the engines that ranking both pages was hard. Also, I wasn't as on my game with SEO back then.
I think merging is the way to go; I'm filing it in our dev queue for the coming weeks.
-
I'd probably agree with that merge decision. The topic is basically the same; the primary difference is the inclusion of "price" in the keyword targeting, from what I see, and that can likely be achieved with one master page.
Furthermore, integrating awesome data like that will attract links, because it's better than what most crappy ticket sites offer. The resulting boost in PA (Page Authority) should lead to better rankings than splitting across the two pages, IMO.
-
Thanks for the helpful response. And I'm definitely with you on the idea of having better data on all our pages. I initially set them up separately, but I have been leaning toward merging the ticket-price pages into the tickets pages and killing off the former (301ing the price pages to the tickets pages). Make sense?
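For the mechanics of killing off the price pages, the mapping is just a path lookup; a hypothetical sketch (the URL pairs are illustrative, and in practice the 301s would usually live in server config such as .htaccess or nginx rather than application code):

```javascript
// Hypothetical sketch of the 301 mapping from retired price pages to the
// merged tickets pages. The pairs below are illustrative, not a real list.
var redirects = {
  '/miami-heat-ticket-prices': '/miami-heat-tickets/',
  '/boston-celtics-ticket-prices': '/boston-celtics-tickets/'
};

// Return the 301 target for a retired price page, or null to fall through
// to normal routing.
function resolveRedirect(path) {
  return redirects[path] || null;
}
```

The key point either way is a permanent (301) redirect, so the price pages' link equity consolidates onto the tickets pages.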
-
My general rule of thumb is that as long as all of the content is delivered via HTML (which it appears to be), and the switching of the tabs is done via JavaScript (which it is), then you're mostly OK.
You do have one issue, though: the current code on http://seatgeek.com/miami-heat-tickets/ doesn't degrade gracefully. You recognized this in your notes, but if a user doesn't have JavaScript enabled, they can't access the text. That's a usability problem, and you could argue it might be bad for SEO, but either way I believe it should be fixed. When JavaScript isn't enabled, the content should still appear below the event listings. Typically that means the content loads in the page by default, and JavaScript then hides the tabbed sections when the page loads and shows each one when its tab is clicked.
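That hide-on-load pattern can be sketched roughly like this (a hypothetical illustration, assuming each tab panel is an element with an `id` and a `hidden` property; real code would select the panels from your actual markup):

```javascript
// Progressive-enhancement sketch (hypothetical): all tab panels are present
// and visible in the HTML. This function runs only when JavaScript is
// available, hiding every panel except the active one. Without JavaScript it
// never runs, so all panels stay visible, stacked below the listings.
function initTabs(panels, activeId) {
  // Fall back to the first panel if activeId doesn't match any panel.
  var match = panels.some(function (p) { return p.id === activeId; });
  var showId = match ? activeId : (panels[0] && panels[0].id);
  panels.forEach(function (p) {
    p.hidden = (p.id !== showId); // hide everything except the active panel
  });
  return showId;
}
```

Because the hiding happens only after this script runs, users (and bots) without JavaScript still see all of the copy in the document, which is exactly the graceful fallback described above.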
Ideally the content would also be easy to find (currently the tabs aren't as intuitive as they are on a Facebook page, for example). Putting them above the photo might help with that.
Also, from a user perspective, the written content is mostly there for SEO purposes right now. Stuff like the price stats is cool information that I would find interesting while shopping for tickets; maybe there's a way to present that graphically on the page in a more engaging way than plain text?
Update: I just noticed that those stats are displayed on http://seatgeek.com/miami-heat-ticket-prices in an awesome way. Do stuff like that for all of your pages!
On the same tabs topic, but separate from your implementation: I've seen companies load tab content from an XML file via JavaScript. That approach is definitely not SEO-friendly and can cause indexation issues.