How to get powerful tweets & likes for social signals
-
Hi,
Just been looking into social signals a little deeper.
From what I have read, a tweet from one page is not the same as a tweet from another; the authority and influence of the account are also a big part.
So a tweet from CNN does a lot more than a tweet from a random account.
So how do you find these authoritative and influential pages/users?
I have come across Klout.com, which gives a score out of 100. That's one way, I guess, BUT I have also noticed that Mozbar stats change for different Facebook pages.
Q: Can you use the Mozbar on Facebook & Twitter pages to work out who will generate the best social signals?
Cheers
-
Hi Jen,
Thanks for the reply.
That would be useful; social seems like a big area for SEO now, and I think some SEOmoz input on the subject would be great. For example:
1. Best practices for getting the most out of your social pages (Facebook, Twitter, +1s) and these types of articles.
-
Hi! Unfortunately, to answer your last question, I don't think that using the Mozbar on specific pages will be very helpful for identifying those high-authority accounts. While it will show you backlinks and overall page/domain authority, it doesn't show you the number of followers, how often they tweet, how many retweets they get, etc.
Klout and similar services are an interesting way of finding influential/authoritative users, but they shouldn't be your only strategy. Each of the social sites has different ways to go about this, and different tools exist for each. There are a number of great posts out there about how to do this on the different platforms. Let me do some searches and I'll add them here!
Thanks,
Jen
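To make the follower/retweet point above concrete, here is a rough sketch of how one might combine those metrics into a single influence score. The field names, weights, and sample accounts are purely illustrative assumptions, not any platform's real formula; real numbers would come from each site's API.

```python
# Rough sketch: ranking accounts by a simple engagement-weighted
# influence score. Weights and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    followers: int
    tweets_per_week: int
    avg_retweets: float

def influence_score(a: Account) -> float:
    # Weight engagement (retweets per tweet) more heavily than raw
    # follower count, since a large but passive audience signals less.
    engagement = a.avg_retweets / max(a.tweets_per_week, 1)
    return 0.3 * (a.followers ** 0.5) + 0.7 * engagement * 100

accounts = [
    Account("@bignews", followers=1_000_000, tweets_per_week=200, avg_retweets=50),
    Account("@niche_expert", followers=8_000, tweets_per_week=10, avg_retweets=40),
]
ranked = sorted(accounts, key=influence_score, reverse=True)
for a in ranked:
    print(a.handle, round(influence_score(a), 1))
```

The interesting property of a score like this is that a small, highly engaged account can come close to a huge passive one, which matches the intuition that influence is more than follower count.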
-
Building relationships with people in your field is key, and Google+ is very good for this: http://www.seomoz.org/blog/5-tips-for-managing-community-on-google-plus. Mention people in your niche, comment on their posts, and give them value. Then ask them to share your content, tweets, etc.
Related Questions
-
What will SEO be like in the 2020s?
Hey guys, I would love to hear your thoughts on how you think SEO will change in the 2020s. The 2010s saw some pretty cool stuff like Panda, Penguin, and penalties for non-mobile-friendly, non-secure, and slow-loading sites. What will be more or less important for SEOs in the 2020s than today? How will machine learning and AI change SEO?
Intermediate & Advanced SEO | GreenHatWeb
Mass Referencing Supplier Product Info & SEO
Hi, I have a mass referencing project which will mean taking between 1,000 and 2,000 SKUs from a supplier, taking all their content, and loading it onto our site. I need to make a case against doing this from an SEO perspective, as these are pages I want to rank.

I'm going to push for optimising titles/meta titles before they're loaded in. However, if that doesn't happen, I may be forced to load in the products as they are and go back to optimise everything afterwards. Does anyone see a real issue with this?

I know there are so many 'similar' descriptions of products on ecommerce sites and across the web, so how does Google deal with these? The pages won't be identical as the templates are different, but maybe 100-200 words of the descriptions could be until we work through them. Although this isn't ideal, what are the implications?

The problem for me is that the managers just want the products on the site, without much thought initially for organic traffic or categorisation. Thanks!
Intermediate & Advanced SEO | BeckyKey
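On the question of how "similar" duplicated supplier descriptions really are: one common way to reason about near-duplicate content is word-shingle Jaccard similarity. A minimal sketch, where the sample descriptions and any notion of a "too similar" threshold are illustrative assumptions, not anything Google has published:

```python
# Illustrative sketch: estimating how similar two product descriptions
# are via word-shingle Jaccard similarity, a common near-duplicate check.
def shingles(text: str, k: int = 3) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

supplier = "Durable stainless steel water bottle with leak proof lid"
ours = "Durable stainless steel water bottle with a secure leak proof lid"
# A score near 1.0 suggests the page adds little unique text.
print(f"similarity: {jaccard(supplier, ours):.2f}")
```

Running something like this across all 1,000-2,000 SKUs would at least quantify how much of the catalogue is effectively boilerplate before and after rewriting.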
Fetch as Google does not result in pages getting indexed
I run an exotic pet website which currently has several species of reptiles. It has done well in SERPs for the first couple of types of reptiles, but I am continuing to add new species, and with each of these comes the task of getting ranked, so I need to figure out the best process.

We just released our 4th species, reticulated pythons, about 2 weeks ago. I made these pages public, and in Webmaster Tools did a "Fetch as Google" and submitted the index page and child pages for this page: http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index

While Google immediately indexed the index page, it did not really index the couple of dozen pages linked from this page, despite me checking the option to crawl child pages. I know this in two ways: first, in Google Webmaster Tools, if I look at Search Analytics and Pages filtered by "retic", there are only 2 listed. This at least tells me it's not showing these pages to users. More directly, if I look at the Google search for "site:morphmarket.com/c/reptiles/pythons/reticulated-pythons", there are only 7 pages indexed.

More details: I've tested at least one of these URLs with the robots checker and they are not blocked. The canonical values look right. I have not monkeyed with Crawl URL Parameters. I do NOT have these pages listed in my sitemap, but in my experience Google didn't care much about that; I previously had about 100 pages there, and Google didn't index some of them for more than a year. Google has indexed "105k" pages from my site, so it is very happy to index pages, apparently just not the ones I want (this large number is due to permutations of search parameters, something I think I've since improved with canonicals, robots, etc.). I may have some nofollow links to the same URLs, but NOT on this page, so assuming nofollow has only local effects, this shouldn't matter.

Any advice on what could be going wrong here?
I really want Google to index the top couple of links on this page (home, index, stores, calculator) as well as the couple dozen gene/tag links below.
Intermediate & Advanced SEO | jplehmann
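Two of the checks described above (robots meta tags and canonical values) can be automated for each child URL. A minimal stdlib-only sketch that scans a page's HTML for a robots "noindex" meta tag and a canonical URL; the sample HTML is illustrative, and a real audit would fetch each child page first:

```python
# Sketch: detect the two most common silent indexing blockers in a
# page's HTML -- a robots "noindex" meta tag and the canonical URL.
from html.parser import HTMLParser

class IndexabilityCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.noindex = "noindex" in a.get("content", "").lower()
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

# Illustrative sample markup, not from the site in the question.
sample = ('<head><meta name="robots" content="noindex,follow">'
          '<link rel="canonical" href="http://www.example.com/page"></head>')
checker = IndexabilityCheck()
checker.feed(sample)
print("noindex:", checker.noindex, "| canonical:", checker.canonical)
```

Run over every child URL, this kind of check quickly separates "Google chose not to index" from "a tag told Google not to index".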
Linking from & to in domains and sub-domains
What's the best way to optimise linking between sub-domains and domains? We always include the website link at the top with the logo; do we also need to link the sub-domain from all of its pages? If example.com is the domain and example.com/blog is a sub-domain or sub-folder: Do we need to link to example.com from /blog? Do we need to include the /blog link on all pages of /blog? Is there any difference between connecting domains with sub-domains versus sub-folders?
Intermediate & Advanced SEO | vtmoz
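When auditing how a /blog section links back to the main site, it helps to classify each outgoing link relative to the root domain first. A small sketch, with illustrative domain names:

```python
# Sketch: classify outgoing links as internal, subdomain, or external
# relative to a root domain. Domain names here are illustrative.
from urllib.parse import urlparse

def classify(link: str, root: str = "example.com") -> str:
    host = urlparse(link).netloc.lower()
    if host == root or host == "www." + root:
        return "internal"
    if host.endswith("." + root):
        return "subdomain"
    return "external"

print(classify("https://example.com/blog/post"))  # internal
print(classify("https://blog.example.com/post"))  # subdomain
print(classify("https://other.com/page"))         # external
```

Note the distinction this makes concrete: example.com/blog is a sub-folder (same host, so "internal"), while blog.example.com is a true sub-domain, which is exactly the difference the question is asking about.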
Hiring SEO Management Firm. Budget & References
Hello, I need advice on outsourcing SEO. What would be a likely budget to hire a reputable international company on a full-time outsourcing contract for 6 months, with the option to extend? We are primarily looking for social & off-page reputation and, of course, higher SERP positions. Which would be the best renowned companies, and what kind of spend should be expected? Please advise only companies known for organic & ethical work with a strong reputation.
Intermediate & Advanced SEO | Modi
URL Parameter Being Improperly Crawled & Indexed by Google
Hi All, We just discovered that Google is indexing a subset of our URLs with our analytics tracking parameter embedded. For the search "dresses" we are appearing in position 11 (page 2, rank 1) with the following URL: www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp?cm_mmc=Email--Anthro_12--070612_Dress_Anthro-_-shop

You'll note that "cm_mmc=Email" is appended. This is causing our analytics (CoreMetrics) to mis-attribute this traffic and revenue to Email vs. SEO. A few questions:

1) Why is this happening? This is an email from June 2012, and we don't have an email-specific landing page embedded with this parameter. Somehow Google found and indexed this page with these tracking parameters. Has anyone else seen something similar happening?
Intermediate & Advanced SEO | kevin_reyes
2) What is the recommended method of “politely” telling Google to index the version without the tracking parameters? Some thoughts on this:
a. Implement a self-referencing canonical on the page.
- This is done, but we have some technical issues with the canonical due to our ecommerce platform (ATG). Even though page source code looks correct, Googlebot is seeing the canonical with a JSession ID.
b. Resubmit both URL’s in WMT Fetch feature hoping that Google recognizes the canonical.
- We did this, but given the canonical issue it won’t be effective until we can fix it.
c. URL handling change in WMT
- We made this change, but it didn't seem to fix the problem.
d. 301 or noindex the version with the email tracking parameters
- This seems drastic, and I'm concerned that we'd lose ranking on this very strategic keyword.

Thoughts? Thanks in advance, Kevin
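On the clean-URL side of the options above, one approach is to normalize URLs by stripping known tracking parameters before comparing or submitting them. "cm_mmc" comes from the question itself; the utm_* names are common examples of tracking parameters, not a definitive list:

```python
# Sketch: strip known tracking parameters from a URL so the clean
# version can be used for canonicals and comparisons. The parameter
# list is illustrative, not exhaustive.
from urllib.parse import urlparse, urlencode, parse_qsl, urlunparse

TRACKING_PARAMS = {"cm_mmc", "utm_source", "utm_medium", "utm_campaign"}

def strip_tracking(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

url = ("http://www.anthropologie.com/anthro/category/dresses/"
       "clothes-dresses.jsp?cm_mmc=Email--Anthro_12")
print(strip_tracking(url))
```

The same normalization logic is what a correct canonical tag effectively expresses to Google, which is why option (a) is usually the preferred fix once the JSession ID issue is resolved.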
How to get traffic from a particular Geographical region?
Our company is based in India and has a website with a .in domain; however, our target customers are in North America and Australia.
Intermediate & Advanced SEO | TPS2013
The problem is that we get as much as 70% of our organic traffic from India.
This 70% of traffic from India is of little use to us. Possibly because we have a ".in" domain, Google local search is active.
How can we reverse this situation? I mean, we are looking for more traffic from across the globe, excluding India.
Any suggestions? P.S. Changing the domain from .in to .com is not an option, as it's part of our brand, advertised for the last 7 years.
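Beyond changing the domain, hreflang annotations (alongside Search Console's international targeting settings) are one way to signal which audience each page version targets, which can offset a ccTLD's default geo association. A small sketch that generates the tags; the URLs, paths, and region codes are illustrative assumptions:

```python
# Sketch: generate hreflang <link> tags for region-targeted page
# versions. Domains, paths, and codes here are illustrative.
def hreflang_tags(page: str, targets: dict) -> list:
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}{page}" />'
        for code, url in targets.items()
    ]

targets = {
    "en-us": "https://www.example.in/us",
    "en-au": "https://www.example.in/au",
    "x-default": "https://www.example.in",
}
for tag in hreflang_tags("/services", targets):
    print(tag)
```

Each regional version must carry the full set of tags (including a self-reference) for the annotations to be valid, so generating them programmatically like this helps keep them consistent.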
SEO - I just do not seem to get it
Hello All, I came to the forum two weeks ago, and prior to that I studied SEO until my brain almost melted. I've read some great articles here on SEOmoz which have been fantastic, mainly this one: http://www.seomoz.org/beginners-guide-to-seo. Anyway, I genuinely thought I got it about a week ago. Here's what I did:

First I decided on my keyword: "PC Repair Sheffield". Then I made a website and on-page optimized it as best as I possibly could (graded A on the SEOmoz on-page tool).

I decided firstly to add myself to the Independent newspaper's business directory, because it's free and has a great domain authority. I then went to Yahoo Answers, found a question I knew the answer to, and made a whole article on the question. I provided a really useful answer and put a link to my full answer in the 'Source' section. This was a nofollow link.

Every week, I write a new article and put it on my 'Blog' page. In the articles, I like to cover some problem that I've encountered during the week; if, for example, I write about a hard drive I replaced in Sheffield, I write about that and link it to my 'PC Repair Sheffield' page. Every article contains a video from YouTube, which is another nofollow link.

Past that, I just find forums, preferably ones whose links are not nofollow, and try to help people by answering their questions, putting more details on my site for them to see, with a link.

That is pretty much the extent of what I 'get' so far. But I read a lot of the posts on here, and I'm always seeing the SEO experts criticise these things and say they're bad without actually explaining why, or how to improve, or what to do instead in some kind of simple way. I mean, this blog I'm writing: is there really any point? It's unlikely anybody is going to see it, and expecting shares seems like a ridiculous assumption. Nobody shares a page on how to fix a hard drive from a local Sheffield site.

I don't think that nofollow links are a waste of time, but that's my personal assumption; I think they make a link profile more natural, in a way. 'Write fresh content and get natural links' is a crazy suggestion; nobody is ever going to share it. It's just not the way the world works for small businesses. No task is easy, but none are impossible either. I know that I could do SEO; I'm just not entirely sure what to do.
Intermediate & Advanced SEO | Paul_Tovey