Would using javascript onclick functions to override href target be ok?
-
Hi all,
I am currently working on a new search facility for my ecommerce site... it has very quickly dawned on me that this new facility is far better than my standard product pages from a user point of view, i.e. lots of product attributes so customers can find what they need faster, the ability to compare products, etc. All in all just better. BUT it has NO SEO VALUE!!!
I want to use this search facility instead of my category/product pages... however, as they are search pages, I have a robots "noindex" on them and don't think it's wise to change that...
I have spoken to the developers of this software and they suggested I could use some JavaScript in the navigation, changing the onclick function to take the user to the search equivalent of the page...
They said this way my normal pages are the ones that are still indexed by Google etc., but the user gets the benefit of the improved search pages...
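If I understand the developers right, the idea is something like the sketch below. The URL patterns and the `searchEquivalent` helper are made up purely to illustrate; this is not their actual code.

```javascript
// Hypothetical mapping from a static category URL to its search-page
// equivalent -- the URL patterns here are invented for illustration.
function searchEquivalent(categoryHref) {
  // e.g. "/category/widgets.html" -> "/search.html?cat=widgets"
  const match = categoryHref.match(/^\/category\/([^/.]+)\.html$/);
  return match ? '/search.html?cat=' + match[1] : categoryHref;
}

// The navigation links would keep their normal href (which Google reads),
// but an onclick would send real users to the search page instead, e.g.:
//
//   <a href="/category/widgets.html"
//      onclick="location.href = searchEquivalent(this.getAttribute('href')); return false;">
//      Widgets</a>
```

So the href stays pointed at the indexable static page while the click handler quietly routes visitors to the search version.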
This sounds perfect, however it also sounds a little deceptive... and I know Google has loads of rules about these kinds of things. The last thing I want is any kind of penalty or negative reaction from an SEO point of view... I am only considering this because it will improve the user experience on my website...
Can anyone advise if this is OK, or a "no no"?
P.S. For those wondering, I use an "off the shelf" cart system and it would cost me an arm and a leg to have these features built into my actual category/product pages.
-
Hello James,
Why do these pages have "no SEO value"? Is it because they are AJAX pages or because you have them noindexed? Or both?
To answer your original question, using an onclick JavaScript event to send a user to a page other than the URL listed in the href attribute is borderline. It goes beyond the risk level I would feel comfortable with on an eCommerce site, but a lot of affiliate sites do this. For instance, all of their links out to merchant sites may go through a directory called /outlink/, so the href might look like .../outlink/link1234 and appear to send the user to another page on their domain, when actually the user gets redirected to the merchant's (e.g. Amazon.com, Best Buy...) website. Sometimes the user is redirected from the /outlink/... URL, and sometimes they never even get that far because the JavaScript sends them to the merchant's URL first.
It is not cloaking unless you are specifically treating Google differently. If Google doesn't understand your site that is their problem. If you have code that essentially says "IF Google, THEN do this. ELSE do that" it is your problem because you are cloaking. Make sense? There is a very distinct line there.
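In code terms, the line looks something like this. The user-agent check and file names are illustrative only; this is the pattern to avoid, not something to implement.

```javascript
// The pattern that IS cloaking: branching on whether the visitor looks
// like Googlebot and serving it different content. Shown only to make
// the "IF Google, THEN..." distinction concrete -- do NOT do this.
function pageToServe(userAgent) {
  if (/Googlebot/i.test(userAgent)) {
    return 'static-category.html';  // version shown only to Google
  }
  return 'search-results.html';     // version shown to everyone else
}
```

If no such branch exists anywhere in your stack, you are on the safe side of that line, even if Google happens to understand your pages less well than users do.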
The bottom line is if you want to show users a certain page then you should be showing that page to Google as well. If the problem is the content on that page doesn't appear for Google (e.g. AJAX) then you should look into optimizing that type of content to the best of your ability. For example, look into the use of hashbangs (#!) as in:
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
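For reference, that scheme works by letting the crawler rewrite a hashbang URL into one with an `_escaped_fragment_` query parameter, which your server answers with a static HTML snapshot. A rough sketch of the mapping (the example URLs are invented):

```javascript
// Sketch of the hashbang ("#!") crawling scheme: the crawler requests a
// "pretty" AJAX URL with the fragment moved into an _escaped_fragment_
// query parameter, which the server can answer with an HTML snapshot.
function escapedFragmentUrl(prettyUrl) {
  const i = prettyUrl.indexOf('#!');
  if (i === -1) return prettyUrl;               // no hashbang: crawled as-is
  const base = prettyUrl.slice(0, i);
  const fragment = prettyUrl.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

So `http://example.com/products#!category=widgets` would be fetched by the crawler as `http://example.com/products?_escaped_fragment_=category%3Dwidgets`, and it is up to your server to return the full rendered content at that URL.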
-
1. Google understands simple JS that is inline with your HTML, so it understands that a link whose inline onclick sends the user to domain.com is a link to domain.com.
2. You can obfuscate this further and Google might not understand it. I've not seen Google try to parse or execute JS, but that doesn't mean they can't, or won't in the future.
3. Google is very unlikely to spider AJAX. Many AJAX pages don't return any user-readable content (most of mine return things like JSON, which is not for end-user consumption) and, as such, are beyond the scope of indexation. Again, as in #2, you might want this content to be shown elsewhere if you want it indexed. https://developers.google.com/webmasters/ajax-crawling/
-
OK, I am not keen on this approach. The developers have offered an alternative... but again, I'm not sure about it. They have said they can use AJAX to force their search results/navigation over my current navigation/products on my category/product pages...
This gets rid of having to use JavaScript to send users to a different URL... but up above, Alan mentions cloaking, which to my understanding is basically serving anything different to a search engine vs. a person... and that's what this will do... it serves up a different navigation to people... and the products could be listed in a different order, etc. Search engines do not see the AJAX...
Is this any better? Or just as negative?
-
Are they identical? You say it is the search equivalent. I just wouldn't treat search engines any differently.
-
Even though the content is identical?
It is only the way that content can then be navigated that is different...
-
Well then, yes, I would be concerned. You are serving up different content to users; that is cloaking.
-
Hi Alan,
I think I may have explained this incorrectly. My search page does have the meta tag noindex,follow, and it also has a canonical link back to the main search page (i.e. search.html), so I do not think any of the search results will be indexed. My concern is not duplicate content; that should not happen...
My concern is the fact that I am using JavaScript to literally divert customers from one page to another... it's almost like the static pages are there only for the benefit of Google... and that's what concerns me...
-
Google can follow JavaScript links, unless you are very good at hiding them.
I would not worry too much about the duplicate content. Don't expect the duplicates to rank, but you're not likely to be penalized for them. You can use a canonical tag to point all search results back to the one page.
I would be careful noindexing pages: any links pointed at a noindexed page are pouring their link juice away. If you do want to noindex a page, use the meta tag noindex,follow; that way the search engine will still follow the links and the juice flows back out to your site.
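Taken together, a search results page set up along those lines would carry something like this in its head (the canonical URL is a placeholder; substitute your own main search page):

```html
<!-- keep the page out of the index but let link juice flow onward -->
<meta name="robots" content="noindex,follow">
<!-- consolidate all search-result variants onto the main search page -->
<link rel="canonical" href="http://www.example.com/search.html">
```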
Read about PageRank and how link juice flows:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank