AJAX & jQuery Tabs: Indexation & Navigation
-
Hi
I have two questions about indexing tabs.
1. Let's say I have tabs, or an accordion, that are triggered with jQuery. That means all the HTML is accessible and indexed by search engines. But say a search query is relevant to the content in Tab #3, while Tab #1 is the one that's open by default. Is there any way Tab #3 could be opened directly when it's the more relevant one for the query?
2. AJAX tabs: We have pages with tabs triggered by AJAX (example: http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.html). I'm wondering about the current best practice. Google recommends HTML snapshots; a newer SEOmoz article talks about pushState(). What's the way to go here?
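For question 1, a common pattern is to deep-link tabs via the URL fragment, so a visitor can land directly on Tab #3. A minimal sketch, assuming jQuery UI tabs and hypothetical panel ids like "tab-1" through "tab-3":

```javascript
// Map the URL fragment (e.g. /page#tab-3) to a tab index.
// The panel ids here are hypothetical; adjust to your markup.
function tabIndexFromHash(hash, tabIds) {
  var id = (hash || "").replace(/^#/, "");
  var index = tabIds.indexOf(id);
  return index >= 0 ? index : 0; // fall back to the first tab
}

// Browser-only wiring (skipped when no DOM/jQuery is present):
if (typeof window !== "undefined" && typeof jQuery !== "undefined") {
  jQuery(function ($) {
    var ids = ["tab-1", "tab-2", "tab-3"];
    $("#tabs").tabs({ active: tabIndexFromHash(window.location.hash, ids) });
  });
}
```

Because all panels are plain HTML in the page, the content stays indexable; the fragment only controls which panel opens first.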
Or in other words: how do we get tab and accordion content indexed, and let users navigate directly to it?
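For question 2, the pushState() route means each tab state gets a real URL that the server can also render, instead of a bare fragment. A sketch under that assumption — the "/tab/" path convention and the data-tab-id attribute are made up for illustration:

```javascript
// Build a crawlable per-tab URL; the "/tab/" convention is hypothetical.
function tabUrl(basePath, tabId) {
  return basePath.replace(/\/$/, "") + "/tab/" + encodeURIComponent(tabId);
}

// Browser-only wiring (skipped outside a browser): on tab clicks, update the
// address bar with history.pushState() so the state is linkable and indexable,
// provided the server can also render that URL directly.
if (typeof window !== "undefined" && window.history && window.history.pushState) {
  document.addEventListener("click", function (e) {
    var tab = e.target.closest ? e.target.closest("[data-tab-id]") : null;
    if (tab) {
      var id = tab.getAttribute("data-tab-id");
      window.history.pushState({ tab: id }, "", tabUrl(window.location.pathname, id));
    }
  });
}
```

The key requirement is that a direct request for the per-tab URL returns the tab's content as HTML, not an empty shell.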
-
Hello Philipp,
The robots.txt file lets you tell the web bots that crawl your site which links you want to show to the world and to Google, and which you don't.
Most search engines will analyze and follow a link only if it contains three query string parameters or fewer.
The link you posted has five segments, which are what come after the first /, as shown below. You can block off certain parameters with robots.txt:
/en/residential/help/loesung/entfernen-sie-sim-lock
http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.htm
For some reason, whenever I go to the link you posted, I get this error:
The requested URL /system/sling/cqform/defaultlogin.html was not found on this server.
http://msdn.microsoft.com/en-us/library/ff723936(v=expression.40).aspx
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
Please see the links above regarding URL parameters; Microsoft and Google agree that with too many parameters they will not crawl the link.
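For illustration, this is roughly what blocking parameterized URLs in robots.txt looks like — the parameter names here are hypothetical, not taken from the Swisscom site:

```
User-agent: *
# Keep crawlers out of faceted/sorted URL variants (hypothetical parameters)
Disallow: /*?sort=
Disallow: /*&filter=
```

Note that this only stops crawling of those variants; it doesn't help the tab content itself get indexed.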
I hope this has been helpful.
Sincerely,
Thomas
-
Hi Thomas
Thanks for the resources. I'll have to check with IT which solution seems most practical.
Though I don't understand two points:
- Where does robots.txt come into play?
- How does http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.htm have too many parameters? Can you clarify?
Thanks!
/Philipp
-
I forgot to add this, but it is very relevant: there is software that can help make your script SEO-friendly.
-
I would do some serious decoding.
If you want to see what Google sees, this is a great tool: http://www.screamingfrog.co.uk/seo-spider/
See what Screaming Frog tells you, then try to fix the encoding (UTF-8).
You can fix some of this by going to the pages below and changing your robots.txt:
https://developers.google.com/webmasters/ajax-crawling/docs/learn-more
https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
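As context for the first link above: under Google's AJAX crawling scheme, the crawler rewrites hash-bang URLs into an _escaped_fragment_ query parameter and expects the server to answer that URL with an HTML snapshot. A sketch of the URL mapping (the example URL is made up):

```javascript
// Rewrite a #! URL the way Google's AJAX crawling scheme describes:
// http://host/page#!state  ->  http://host/page?_escaped_fragment_=state
function escapedFragmentUrl(url) {
  var i = url.indexOf("#!");
  if (i === -1) return url; // not an AJAX-crawlable URL; leave unchanged
  var base = url.slice(0, i);
  var sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(url.slice(i + 2));
}
```

The server then has to serve the rendered tab content at the _escaped_fragment_ URL; per the scheme, pages without a #! in the URL can opt in with `<meta name="fragment" content="!">`.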
You also have too many parameters in http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.html
In my opinion, you have some serious code changes to do.
If you want to use one of the tools Google recommends, you'll need iframe crawling, and you can do that with this (along with the second URL):
https://github.com/crawljax/crawljax/blob/master/CHANGELOG.md
http://code.google.com/p/selenium/issues/detail?id=387
http://www.unicode.org/faq/utf_bom.html
Because it is Java-based, there are some great tools found here as well.
I wish you the best and hope that this is helpful,
Thomas