Social Signals...I Need More
-
Hi,
Having used the Open Site Explorer (page specific metrics) tool I've realized I'm lacking in the Social Signals area...I have only 1 tweet and ZERO for everything else (Facebook shares, Likes, Google +1 etc).
So I have 2 questions:
How can I get more social signals (is there a service I can use to do it)?
How long will the new social signals take to appear in the Open Site Explorer information?
I've just paid someone on Fiverr.com to share my website link on his Facebook account, to tweet it, and also to use PingFM...
Any other ideas would be GREAT - I can see this is a real weakness for my websites...
Thanks,
James
-
Hi
As far as Facebook goes, make sure you have the Open Graph meta tags in place on your website. If not, any clicks of the Like button on your website might not count as a "share".
-Dan
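As a sketch of the Open Graph tags Dan mentions: the `og:*` property names below are the standard ones from the Open Graph protocol, but the URLs and values are placeholders you would swap for your own.

```html
<!-- Open Graph meta tags go in the page <head>; values here are placeholders -->
<meta property="og:title" content="Page title as it should appear when shared" />
<meta property="og:type" content="website" />
<meta property="og:url" content="http://www.example.com/page/" />
<meta property="og:image" content="http://www.example.com/share-image.jpg" />
<meta property="og:description" content="Short description shown in the share snippet" />
```

You can verify how Facebook reads these tags by running the page URL through Facebook's debugger tool.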
-
Your social media sharing buttons are also call-to-action buttons. I have not seen your site, but you could try changing the location of these buttons to make them more obvious.
-
Try to get good content on your site that people will want to tweet, etc. Also, you can implement share, like, and tweet buttons to make it easy for people to share the content of the website. You should also, if you have not done so yet, set up your own Facebook page and Twitter account.
I don't think it is a great idea to pay someone to tweet to their "50,000 real followers" or to share a message with their "140,000 fans"; this is just the same as buying links - blackhat stuff that you should avoid.
Also, by creating good content, you will be able to get not only social shares, but also good backlinks.
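For illustration, the stock share-button embeds the answer above refers to look roughly like this. These follow the vendors' standard widget embed patterns; the `data-*` values are placeholders, and the Facebook button additionally requires the Facebook JS SDK to be loaded on the page.

```html
<!-- Twitter share button (loads Twitter's widgets.js) -->
<a href="https://twitter.com/share" class="twitter-share-button"
   data-url="http://www.example.com/page/" data-text="Check this out">Tweet</a>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

<!-- Facebook Like/Share button (requires the Facebook JS SDK on the page) -->
<div class="fb-like" data-href="http://www.example.com/page/"
     data-layout="button_count" data-action="like" data-share="true"></div>
```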
Related Questions
-
Need help with best practices on eliminating old thin content blogs.
We have about 100 really old blog posts that are nothing more than a short trip review with images. Consequently these pages are poor quality. Would best practice be to combine them into one "review page" per trip, reducing from 100 to about 10 better pages, and implement redirects? Or is having more pages better, with fewer redirects? We only have about 700 pages total. Thanks for any input!
Intermediate & Advanced SEO | KarenElaine0
-
301 redirection help needed!
Hi all, So if we used to have a domain (let's say olddomain.com) and we had a new site created at newdomain.com, how do we properly set up redirects page to page? Caveat: the URLs have changed, so for instance the old page olddomain.com/service is now newdomain.com/our-services on the new site. Do we need to have hosting on the old site? Do we need to set up individual 301s for each page corresponding to the new page? Just looking for the easiest way to do this CORRECTLY. Thanks, Ricky
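A minimal sketch of what those page-to-page 301s could look like in an Apache .htaccess file on the old domain. The domain names and paths are the examples from the question; this assumes the old domain still resolves to a server you control, since a redirect can only be served from live hosting (or DNS-level forwarding).

```apache
# .htaccess on olddomain.com
RewriteEngine On

# One explicit 301 per page whose path changed on the new site
RewriteRule ^service/?$ https://newdomain.com/our-services [R=301,L]

# Pages whose paths did not change can fall through to a catch-all
RewriteRule ^(.*)$ https://newdomain.com/$1 [R=301,L]
```

Order matters here: the explicit rules must come before the catch-all, because mod_rewrite stops at the first matching rule with the L flag.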
Intermediate & Advanced SEO | RickyShockley3
-
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is with serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments. A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image in our case). It seems complicated, but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
When the crawler finds this markup, it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will do:

```csharp
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
```

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side). To make it perfect, we have to give the user a chance to bookmark the current gallery image. 90% comes for free: we only have to parse the fragment on the client side and show the requested image.

```javascript
if (window.location.hash)
{
    // NOTE: remove initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
```

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we are removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂
Intermediate & Advanced SEO | JBGlobalSEO0
-
Does Unique Content Need to be Located Higher on my webpages?
I have 1 page that ranks well with unique written content located high up on the page (http://www.honoluluhi5.com/new-condos-in-honolulu/). I struggle to rank for 200+ other pages where the unique content requires scrolling (ex: http://www.honoluluhi5.com/oahu/honolulu-homes/). I am thinking to do as follows:
1. Change the layout of all my pages to have unique content higher on the page.
2. When users are on my site (not coming from search engines) and use my search filters, they will land on pages where the unique content is lower on the page (so keep this layout: http://www.honoluluhi5.com/oahu/honolulu-homes/). I will then add these pages to my robots.txt file so they do not show in Google's index. Reason: unique content lower on the page offers the best user experience.
With unique content higher on the page, I expect bounce rate to increase about 10% (based on the 1 page I have with unique content higher), but I think it is worthwhile, as I am sure search engines will start having my pages rank higher.
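For reference, a robots.txt sketch for the plan above. The URL pattern is hypothetical, since the question doesn't say what the filtered URLs look like; also note that Disallow only blocks crawling, and a meta robots noindex tag (on a page that remains crawlable) is the more reliable way to keep pages out of Google's index.

```
# robots.txt - hypothetical pattern; assumes filter pages carry a query string
User-agent: *
Disallow: /*?
```

Google honors the `*` wildcard here, so this rule blocks crawling of any URL containing a `?`, which would cover typical search-filter URLs while leaving the clean listing pages crawlable.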
Intermediate & Advanced SEO | khi50
-
Need suggestion for link-building
Link Building Question: I want to rank in Google for www.topnotchlawsuitloans.com, so I have to build backlinks with the "lawsuit loans" alt tag. My main question is this: do I have to build or gain backlinks for the main domain only, or also for subpages? One of my website's subpages, www.topnotchlawsuitloans.com/lawsuit-funding-philadelphia.html, is on page #6, so do I have to build backlinks for that URL too? What are effective strategies to gain backlinks: for the main page only, or do all subpages need backlinks? And how many backlinks per keyword and per page are good for a website?
Intermediate & Advanced SEO | JulieWhite0
-
How related to your industry do your links need to be?
Hello, Some of the hottest link building techniques right now are guest posting, viral content, and link baiting. But I often see SEOs produce content that has very little relevance to the actual industry they are in. For instance, a dentist might build links by guest posting on a tech site, an attorney might create an infographic on color psychology, and an accountant might venture into celebrity gossip. While more advanced SEOs try to make sure that the content they produce has some relevance to their industry (even if it's marginal), where is the line drawn?
Intermediate & Advanced SEO | lezal0
-
Need some urgent Panda advice. Open discussion about recovering from the Panda algorithm.
I have a site that has been affected by Panda, and I think I have finally found the problem. When I created this site in 2006, I bought content without checking it. Recently, when I went through the site, I found out that this content had many duplicates around the web. Not 100% exact, but close to it.
The first thing I did was ask my best writer to rewrite these topics, as they are a must on my site. She is a very experienced writer, and she will make the categories and subpages outstanding. The second thing I did was put a NOINDEX, FOLLOW robots meta tag in place for the pages I determined to be bad. They haven't been de-indexed yet. Another thing I recently did was separate the other languages and move them over to other domains (with 301s redirecting the old locations to the new). This means that the site now has an /en/ directory in the URL which is no longer used.
With this in mind, I was thinking to relocate the NEW content and 301 the old (to preserve the juice for a while). For example:
http://www.mysite.com/en/this-is-a-pandalized-page/ 301 to http://www.mysite.com/this-is-the-rewritten-page/
The benefits of doing this are:
- decreasing the number of directories in the URL
- getting rid of pages that are possibly causing trouble
- getting fresh pages added to the site
Now, the advice I am looking for is basically this: Do you agree with the above? Or don't you agree? If you don't, please be so kind as to include a reason with your answer. If you do, and have any additional information, or would like to discuss, please go ahead 🙂
Thanks, Giorgio
PS: Is it proven that Panda is now a running update? Or is it still periodically executed?
Intermediate & Advanced SEO | VisualSense1
-
I need help with htaccess redirect
Hi guys, we have the domain cheats.co.uk; it has always displayed as cheats.co.uk, without the www. However, it is now showing 2 versions of the site, both the www. and the non-www. version. I know how to add a rule to the .htaccess file to send the non-www. version to the www. version, but I am worried about doing this because the non-www. version has always been the one indexed in Google and has a PageRank of 3. Should I in fact be redirecting the www. version to the non-www. version to keep PageRank etc.? Or will PageRank be passed over if I redirect to the www. version? I hope that's clear. Thanks guys, Jon
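Since the non-www version is the one Google has indexed, a minimal .htaccess sketch (Apache mod_rewrite, using the domain from the question) would redirect the www host to the bare domain rather than the other way around; a 301 passes most link equity, so the existing PageRank should follow the redirect over time.

```apache
# .htaccess on cheats.co.uk - send www traffic to the indexed non-www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.cheats\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://cheats.co.uk/$1 [R=301,L]
```

This keeps the already-indexed non-www URLs canonical, so nothing in Google's index needs to change; only the duplicate www host gets consolidated.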
Intermediate & Advanced SEO | imrubbish0