Fetch data for users with AJAX but show it without AJAX for Google
-
Hi,
We have a thematic footer that shows links to similar pages, relevant to the search criteria used on a page.
We want to fetch those footer links through AJAX when users search on the site, but serve the links without AJAX when Google fetches those pages. We want to do this to improve our page load time.
The link content and count will be exactly the same in both cases, whether Google or a user fetches the search pages. Will Google treat this as negative? Can it have any negative effect on our rankings or traffic?
Regards,
-
I'm with Alan on the server side: 1 second is not really good for a simple request for some links.
-
1 second is a lot for a few links, or even a lot of links; maybe your server technology has problems.
But you still have the load-time problem no matter who you are downloading the links for, the search engine or the user: the links still have to be downloaded.
-
Hi Martijn,
Thanks for a quick reply.
This will reduce page load time by around 1 second.
-
If you are going to load the data for Google on page load, then you will still have the load time, so loading the links again using AJAX is not solving anything.
-
Cloaking is very dangerous, and the most common reason for Google to use its axe.
If your code has anything resembling "if googlebot then ...", you are at risk.
But in this case you do have a solution which, in theory, should have no negative effect. Google has endorsed the technique of serving static content on first load and updating it with AJAX.
But let me stress what that means: serve static content on first load, and update it with AJAX. That means no cloaking. Don't serve different content (nor different code with the same-looking content) to visitors than to Googlebot.
Additionally, it is very important to please visitors and serve content to them fast, but at the same time it's important to serve content fast to Googlebot, since speed is a ranking factor.
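To make the distinction concrete, here is a minimal sketch of the safe pattern in plain Node-style JavaScript. The function names (`relatedLinksFor`, `renderFooterLinks`) and the URL shapes are hypothetical, purely for illustration; the point is that the render path takes no user-agent input at all, so bots and visitors necessarily get the same first-load HTML, and any AJAX only refreshes what was already rendered.

```javascript
// Stand-in for a lookup of related pages for a search query (hypothetical data).
function relatedLinksFor(query) {
  return [
    { href: `/search/${query}/guides`, text: `${query} guides` },
    { href: `/search/${query}/reviews`, text: `${query} reviews` },
  ];
}

// Server-side render, called for ALL requests, bots and humans alike.
// Deliberately no userAgent parameter: the output cannot depend on who is
// asking, which is what keeps this out of cloaking territory.
function renderFooterLinks(query) {
  const items = relatedLinksFor(query)
    .map(({ href, text }) => `<li><a href="${href}">${text}</a></li>`)
    .join("");
  return `<ul class="thematic-footer">${items}</ul>`;
}

// On the client you may later *refresh* the already-rendered list via AJAX
// (e.g. fetch("/api/related?q=...")), but the first paint a user and a bot
// receive is the same server-rendered HTML produced above.
console.log(renderFooterLinks("plumbing"));
```

The risky variant is the mirror image: branching on the user-agent string ("if googlebot, render inline; otherwise, ship an empty footer and fill it with AJAX") produces different code for the same-looking content, which is exactly the cloaking pattern described above.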
-
How many seconds would the impact be for Google if you still loaded it via AJAX? It feels a bit like you're trying to fix something that ain't broken.