Navigation for Users vs Spiders
-
We're creating a new global site nav that provides a great user experience, but may be less than ideal for the search engines. The user selects an item from category A, and is then presented options to choose from in category B, and then chooses a specific product. The user does not encounter any actual "links" until they choose the specific product.
The search engines won't see this navigation path due to the way that the navigation is coded. They're unable to choose an item from A, so they can't get to B, and therefore cannot get to C, which is the actual product page.
We'd like to create an alternative nav for the search engines, so that they can crawl the category pages for A and B, as well as the specific product pages (C).
This alternative nav would be displayed if the user does not have JavaScript enabled; otherwise, the navigation described above will be shown to the user.
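A minimal sketch of that fallback approach (element names and URLs are hypothetical): the crawlable nav ships in the markup as plain links, and a small script hides it when JavaScript is available, revealing the interactive nav instead:

```html
<!-- Crawlable fallback nav: plain anchor links that spiders can follow.
     All ids, classes, and URLs here are illustrative placeholders. -->
<nav id="fallback-nav">
  <a href="/category-a/">Category A</a>
  <a href="/category-a/category-b/">Category B</a>
  <a href="/category-a/category-b/product-c/">Product C</a>
</nav>

<div id="interactive-nav" hidden>
  <!-- AJAX-driven picker rendered here for users with JavaScript -->
</div>

<script>
  // With JavaScript enabled, swap the fallback for the interactive nav.
  document.getElementById('fallback-nav').hidden = true;
  document.getElementById('interactive-nav').hidden = false;
</script>
```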
Moving forward, the navigation that the user sees may differ from what is shown to the search engine, based on user preferences (i.e., a user may only see some of the categories in the nav, while the search engines will see links to all category/product pages).
I know that, as a general rule, it's important that the search engines see the same thing that the user sees. Does the strategy outlined above put us at risk for penalties?
-
Here are Google's guidelines for developers on how to make AJAX code crawlable: https://support.google.com/webmasters/answer/174992?hl=en
I'd suggest focusing primarily on your users' experience; Google's crawlers can generally crawl AJAX and JavaScript-driven content.
Hope this helps!
-
Same response: AJAX is a JavaScript technique for fetching content from another page, and crawlers have no issues indexing that. Nowadays, most big sites use AJAX, such as the ones with infinite scroll.
The way they do it is: they put a link to the next page in the markup (which users don't see, since the "Next" link is hidden via CSS), so both crawlers and users can navigate the site just fine. In your case, you can put links into each submenu option too; that way you will help both users and crawlers.
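That pattern might look something like this (class names and URLs are hypothetical): the listing that users see is loaded via AJAX, while a plain "Next" link sits in the markup for crawlers, hidden from users with CSS:

```html
<!-- Hypothetical infinite-scroll listing. The "Next" anchor exists in the
     source for crawlers to follow, but is hidden from users via CSS. -->
<style>
  .crawl-only { display: none; }
</style>

<ul id="product-list">
  <li><a href="/products/widget-1/">Widget 1</a></li>
  <li><a href="/products/widget-2/">Widget 2</a></li>
  <!-- further items are appended via AJAX as the user scrolls -->
</ul>

<a class="crawl-only" href="/products/?page=2">Next</a>
```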
-
Sorry, I should have clarified: the navigation utilizes AJAX, so the links don't actually appear anywhere in the source. We do have breadcrumbs on the product pages. Thanks!
-
Search engines are already good at executing JavaScript, so they WILL see those links too. I would suggest keeping only the "user" navigation and adding breadcrumbs to each product page (the path the user followed to reach that product), so both crawlers and users can also navigate the site by category.
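A breadcrumb trail like that can also be exposed to search engines with schema.org `BreadcrumbList` structured data. A sketch, assuming hypothetical category names and URLs:

```html
<!-- Breadcrumb trail on a product page (all URLs are placeholders) -->
<nav>
  <a href="/category-a/">Category A</a> &gt;
  <a href="/category-a/category-b/">Category B</a> &gt;
  Product C
</nav>

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Category A",
      "item": "https://example.com/category-a/" },
    { "@type": "ListItem", "position": 2, "name": "Category B",
      "item": "https://example.com/category-a/category-b/" }
  ]
}
</script>
```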