Does Google distinguish between core content and accessory third-party widgets when considering how slow or fast a site is?
-
Our site's Facebook plugin is really slowing page load down. As far as users are concerned, the page loads fast enough, and they can start interacting with it before the last sidebar widget has loaded. But the FB widget is very slow to load and drags performance down in Google Analytics' page speed reports, for example. Any thoughts on whether this should be an SEO concern, and on whether Google differentiates between different elements of the page when deciding whether a page is a bad user experience?
Thanks!
-
Hi Chris, thanks a lot for taking the time to respond. The widget almost doubles our page load time (again, 95% of the page is loaded, and then the FB friends widget takes just as long again). Our programmers blame our slow page speed on the FB widget. Given Google's goal of rewarding good user experiences, one would think it cares more about the core content than about a widget that appears on every page of the site, but I'm not sure whether it accounts for that. Our page speeds are bad (often 5-7 seconds or more), but it's hard to know whether that is having an SEO impact, and I couldn't find any concrete answers online. We're going to opt for a simpler version of the widget and hope that has a positive effect.
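For what it's worth, a common mitigation (a sketch of the general pattern, not Facebook's official embed code; the function name and URL here are illustrative) is to inject the widget's script only after the window's load event, so the core content never waits on it:

```javascript
// Hedged sketch: inject a third-party widget script only after the page's
// own "load" event has fired, so the slow widget never delays core content.
// `doc` and `win` are passed in to keep the helper easy to test; in a real
// page you would call deferThirdPartyScript(document, window, src).
function deferThirdPartyScript(doc, win, src) {
  win.addEventListener('load', function () {
    var script = doc.createElement('script');
    script.src = src;
    script.async = true; // don't block parsing once it is injected
    doc.body.appendChild(script);
  });
}
```

With this pattern the widget still appears, just after the rest of the page is interactive, which also tends to improve the load-time numbers that analytics tools report.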
-
Just like your browser knows when all the components of a page have finished loading, it's my understanding that the bot knows too. How long is it taking, and are there any other widgets you could use in its place? Has anyone else commented on the slowness of that widget?
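To put a number on "how long is it taking", the browser's Resource Timing API can break load time down per resource. A small helper (a sketch; the host fragment 'facebook' is just an example) that finds the slowest resource from a given third-party host might look like:

```javascript
// Hedged sketch: given the entries from performance.getEntriesByType('resource'),
// return the longest load duration (ms) among resources whose URL contains
// hostFragment. In a browser console you would run something like:
//   slowestThirdParty(performance.getEntriesByType('resource'), 'facebook')
function slowestThirdParty(entries, hostFragment) {
  return entries
    .filter(function (e) { return e.name.indexOf(hostFragment) !== -1; })
    .reduce(function (max, e) { return Math.max(max, e.duration); }, 0);
}
```

Comparing that figure against the page's overall load time makes it easy to show exactly how much of the total the widget is responsible for.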
Related Questions
-
Content from Another Site
Hi there - I have a client that says they'll be "serving content by retrieving it from another URL using loadHTMLFile, performing some manipulations on it, and then pushing the result to the page using saveHTML()." Just wondering what the SEO implications of this will be. Will search engines be able to crawl the retrieved content? Is there a downside (I'm assuming we'll have some duplicate content issues)? Thanks for the help!!
Technical SEO | NetStrategies
How can I get Google to forget an https version of one page on my site?
Google mysteriously decided to index the broken https version of one page on my company's site (we have a cert for the site, but this page is not designed to be served over https, and the CSS doesn't load). The page already has many incoming links to the http version, and it has a canonical URL pointing to http. I resubmitted it as http with Webmaster Tools. Is there anything else I could do?
Technical SEO | BostonWright
Linking out to authoritative sites from my ecommerce site
Good afternoon, SEOmoz community. I was looking for specific advice or opinions about linking out to other sites. My site, www.tacticalbootstore.com, has been undergoing a complete content rewrite. In the process we have been told, and have read, that it can be good to link out to other authoritative sites. One of the pages we have rewritten is here: http://www.tacticalbootstore.com/belleville-boots-sizing-chart-a-97.html. We have not added the graphics yet, as they are being built now. This is just an informational page about sizing for a particular manufacturer's boots. At the bottom of the text we have added a link to the actual manufacturer's page. Is this helpful for us in the SERPs or not? Thank you for your time. Chris
Technical SEO | scamper
Can anyone help me understand why Google marks a large number of my web pages as "Not Selected" when crawling my site?
When looking through my Google Webmaster Tools, I clicked into the advanced settings under Index Status and was surprised to see that Google has marked around 90% of the pages on my site as "Not Selected" when crawling. Please take a look and offer any suggestions. www.luxuryhomehunt.com
Technical SEO | Jdubin
Are aggregate sites penalised for duplicate page content?
Hi all, we're running a used car search engine (http://autouncle.dk/en/) in Denmark, Sweden and soon Germany. The site works like a conventional search engine, with a search form and pages of search results (car adverts). The nature of car searching entails that the same advert exists on a large number of different URLs (because of the many different search criteria and pagination). From my understanding this is problematic, because Google will penalize the site for duplicated content. Since the order of search results is mixed, I assume SEOmoz cannot always identify almost-identical pages, so the problem is perhaps bigger than what SEOmoz can tell us. In your opinion, what is the best strategy to solve this? We currently use a very simple canonical solution. For the record, besides collecting car adverts, AutoUncle provides a lot of value to our large user base (including valuations on all cars). We're not just another leech AdWords site; in fact, we don't have a single banner. Thanks in advance!
Technical SEO | JonasNielsen
Does duplicate content on WordPress work against the site's rank? (not page rank)
I noticed in the crawl that there seems to be some duplicate content on my WordPress blog. I installed an SEO plugin, Yoast's WordPress SEO plugin, and set it to keep the archives from being crawled. This might solve the problem, but my main question is: can the blog drag my site down?
Technical SEO | tommr1
Does Google use the Wayback Machine to determine the age of a site?
I have a site that I had removed from the Wayback Machine because I didn't want old versions to show. However, I noticed that many SEO tools now always show a domain age of zero instead of the six years since I registered it. My question is: what do the actual search engines use to determine age when they factor it into the ranking algorithm? By having the site removed from the Wayback Machine, do I make the search engines think it is brand new? Thanks
Technical SEO | FastLearner
Google Shopping Australia/Google Merchant Centre
So Google Shopping has finally landed in Australia, so we've got some work to do hooking it up to our clients' ecommerce sites. Right now we have a handful of clients who are set up; the feed is getting in there OK, but all products are sitting in "disapproved" status in the dashboard, and clicking into each individual product, the status says "awaiting review". I logged a support ticket with Google to get more info on this, as it doesn't look right to me (i.e. the disapproved status in the dashboard), and got a useless templated answer. It seems that if I switch the country destination to US, the products are approved and live in google.com Shopping search within the hour; switch back to Australia and they go back to disapproved status. Is anyone having the same issue, or has anyone seen this before? I simply don't trust Google support and wonder if there are other factors at play here.
Technical SEO | Brendo