How to get power tweets & Likes for social signals!
-
Hi,
Just been looking into social signals a little deeper.
From what I've read, a tweet from one page is not the same as a tweet from another; the authority and influence of the account are a big part of it.
So a tweet from CNN does a lot more than a tweet from a random account.
So how do you find these authoritative and influential pages/users?
I have come across Klout.com, which gives a score out of 100, which is one way I guess, BUT I have also noticed that MozBar stats change for different Facebook pages.
Q: Can you use the MozBar on Facebook & Twitter pages to work out who will generate the best social signals?
Cheers
-
Hi Jen,
Thanks for the reply.
That would be useful; social seems like a big area for SEO now, and I think some SEOmoz input on the subject would be great.
1. Best practices for getting the most out of your social pages (Facebook, Twitter, Google +1) and those types of articles.
-
Hi! Unfortunately, to answer your last question, I don't think that using the MozBar on specific pages will be very helpful for identifying those high-authority accounts. While it will show you backlinks and overall page/domain authority, it doesn't show you the number of followers, how often they tweet, how many retweets they get, etc.
Klout and similar services are an interesting way of finding influential/authoritative users, but they shouldn't be your only strategy. Each of the social sites has different ways to go about this, and different tools exist. There are a number of great posts out there about how to do this on the different platforms. Let me do some searches and I'll add them here!
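As a rough sketch of how those account metrics could be combined into a single score — a toy heuristic with made-up weights and caps, not Klout's actual formula:

```python
import math

def influence_score(followers: int, tweets: int, retweets: int) -> float:
    """Toy 0-100 influence score: audience reach plus engagement.
    The weights and caps here are arbitrary illustrations."""
    if tweets == 0:
        return 0.0
    reach = math.log10(followers + 1) / 7 * 50         # ~50 points at 10M followers
    engagement = min(retweets / tweets, 2.0) / 2 * 50  # retweets per tweet, capped
    return round(reach + engagement, 1)

print(influence_score(followers=5_000_000, tweets=1_000, retweets=3_000))
```

The point is only that follower count and engagement rate measure different things, so a single page-authority number (like the MozBar's) can't stand in for both.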
Thanks,
Jen
-
Building relationships with people in your field is key, and Google+ is very good for this: http://www.seomoz.org/blog/5-tips-for-managing-community-on-google-plus. Mention people in your niche, comment on their posts, and give them value. Then ask them to share your content/tweets, etc.
Related Questions
-
Canonical Issue On AMP
Hi everyone,
I have an issue with canonicals; kindly guide me on it. I have a site, example.com/abc, and I converted it to AMP, and now its URL is example.com/abc=?amp. But Search Console tells me to add the proper canonical URL, even though both pages are the same. What should I do?
Intermediate & Advanced SEO | MuhammadQasimAttari
-
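For reference, the pairing Google documents for AMP is two-way: the non-AMP page declares its AMP version with `rel="amphtml"`, and the AMP page points back with `rel="canonical"`. A minimal sketch of the two tags (the URLs are hypothetical placeholders, not the asker's real site):

```python
def amp_link_tags(canonical_url: str, amp_url: str) -> dict:
    """Head tags for the two-way AMP/canonical pairing."""
    return {
        # On the original (non-AMP) page: declare the AMP version.
        "original": f'<link rel="amphtml" href="{amp_url}">',
        # On the AMP page: point back at the canonical original.
        "amp": f'<link rel="canonical" href="{canonical_url}">',
    }

tags = amp_link_tags("https://example.com/abc", "https://example.com/abc?amp=1")
print(tags["amp"])
```

So the AMP page should not be its own canonical; it should name the original page, which is what Search Console is asking for.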
Getting SEO Juice back after Redirect
Hi, on my website, many product pages were redirected over time to their product category because the product became unavailable. I understand that with a 301 redirect, the final URL would have lost about 15% of the link juice. However, if after some time (e.g. 2 months, or 1 year) I remove the redirect, will the original page still have any SEO juice, or has it already lost all of it? Thanks,
Intermediate & Advanced SEO | viatrading1
-
AMP pages for a responsive Ecommerce website?
Howdy guys, I'm wondering if AMP is worth integrating into a responsive e-commerce site? I'm under the impression that the benefits of AMP are focused on speed; however, it may come at the cost of conversion rate if it were delivered for product pages, etc. I'm presuming that even if AMP were on every page across a responsive e-commerce site, Google would only display AMP pages in the carousel for news articles, such as on the integrated blog? Any advice would be awesome! Thanks guys 🙂
Intermediate & Advanced SEO | JAR897
-
Product Schema & Google Guidelines
Hi, we have product markup on our site, data-vocabulary rather than schema.org. I can't see it showing in Google SERPs, but when testing it appears to be correct. Is Google still selective about which schema it shows for a site? Thanks
Intermediate & Advanced SEO | BeckyKey
-
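Google is indeed selective about showing rich results even for valid markup, and its structured-data examples are written against schema.org, with JSON-LD as the generally recommended format. As an illustration (all product values are hypothetical), a minimal Product block could be generated like this:

```python
import json

# Minimal schema.org Product markup as JSON-LD; all values are hypothetical.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock",
    },
}
snippet = f'<script type="application/ld+json">{json.dumps(product, indent=2)}</script>'
print(snippet)
```

The resulting `<script>` block goes in the page's `<head>` or `<body>` and can be checked with Google's structured-data testing tools.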
Pros & Cons of Switching Your Main Domain to Mask Links & Combat EMDs
Hello Mozzers, I'd love to receive some advice for a client of mine, and any insights you may have regarding the pros and cons of changing your main domain to mask links.
Within a competitive niche there are about four different sites that routinely rank 1-4. Our site crushes the other three on just about all metrics, except we have a high volume of nofollow links and our site remains at #4. Our site is much older, so we have significantly more links than these smaller sites, including pre-Penguin spammy links (like blog comments that make up 50+ nofollow links from one comment per domain).
Obviously we are attempting to remove any toxic links and disavow, but the blog-comment nofollow links skew our anchor-text ratio pretty intensely, and we are worried that we aren't going to make a dent in removing this type of link. Disavowing them alone hasn't worked, so if we are unable to remove the bulk of these poor-quality links (nofollow, off-topic anchor text, etc.), we are considering 301 redirecting the current domain to a new one. We've seen success with this in a couple of scenarios, but wanted to get other insights as to whether masking links with a 301 could send fresh signals and positively affect rankings.
Also worth mentioning: two of the three competitors that outrank us have EMDs for the primary keywords. I appreciate your time, insights, and advice on this matter.
Intermediate & Advanced SEO | Leadhub
-
Product Pages & Panda 4.0
Greetings, Moz Community:
I operate a real estate website in New York City (www.nyc-officespace-leader.com). Of the 600 pages, about 350 of the URLs are product pages written about specific listings. The content on these pages is quite short, sometimes only 20 words. My ranking has dropped sharply since mid-May, around the time of the new Panda update. I suspect it has something to do with the very short product pages, the 350 listing pages. What is the best way to deal with these pages so as to recover ranking? I am considering these options:
1. Setting them to "noindex". But I am concerned that removing product pages sends the wrong message to Google.
2. Enhancing the content and making certain that each page has at least 150-200 words. Rewriting 350 listings would be a real project, but if it's necessary to recover, I will bite the bullet.
What is the best way to address this issue? I am very surprised that Google does not understand that product URLs can be very brief and yet have useful content. Information about a potential office rental that lists location, size, and price per square foot is valuable to the visitor but can be very brief, especially for listings that change frequently. So I am surprised by the penalty. Would I be better off not having separate URLs for the listings and, for instance, adding them as posts within building pages? Is having separate URLs for product pages with minimal content a bad idea from an SEO perspective? Does anyone have suggestions as to how I can recover from this latest Panda penalty?
Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
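As a side note, before choosing between noindex and a rewrite, a simple word-count audit can tell you exactly which pages fall under a chosen threshold. A sketch with hypothetical page data:

```python
def thin_pages(pages: dict, min_words: int = 150) -> list:
    """Return URLs whose body copy falls below min_words."""
    return [url for url, text in pages.items() if len(text.split()) < min_words]

listings = {
    "/listing-1": "Corner office, 1200 sq ft, $55 per sq ft, available now.",  # thin
    "/listing-2": "word " * 200,  # 200 words of copy, over the threshold
}
print(thin_pages(listings))
```

Running this over an export of the 350 listing pages would show how many genuinely need rewriting versus how many already clear the bar.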
Is there a way to keep sitemap.xml files from getting indexed?
Wow, I should know the answer to this question. Sitemap.xml files have to be accessible to the bots for indexing; they can't be disallowed in robots.txt, and you can't block the folder at the server level. So how can you allow the bots to crawl these XML files but keep them from showing up in Google's index when doing a site: search? Is that even possible? Hmmm
Intermediate & Advanced SEO | irvingw
-
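For anyone finding this later: it is possible, because Google supports an `X-Robots-Tag: noindex` HTTP response header, which keeps a crawlable file out of the index without blocking the crawl. A minimal, framework-agnostic header-selection sketch (paths hypothetical):

```python
def response_headers(path: str) -> list:
    """Pick response headers; sitemaps stay crawlable but unindexed."""
    content_type = "application/xml" if path.endswith(".xml") else "text/html"
    headers = [("Content-Type", content_type)]
    if path.endswith("sitemap.xml"):
        # Bots can still fetch and parse the sitemap, but won't index it.
        headers.append(("X-Robots-Tag", "noindex"))
    return headers

print(response_headers("/sitemap.xml"))
```

In practice you'd set the same header in your web server config (e.g. Apache or nginx) rather than in application code, but the effect is identical: the bots read the sitemap and it stays out of site: results.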
Get-targeted homepage for users vs crawlers
Hello there! This is my first post here on SEOmoz. I'll get right into it then... My website is housingblock.com, and the homepage runs entirely off geo-targeting the user's IP address to display the most relevant results to them immediately. That can potentially save them a search or three, and it works great. However, when crawlers visit the site, they are obviously being geo-targeted by their IP addresses too. Google has come to the site via several different IP addresses, resulting in several different locations being displayed for it on the homepage (Mountain View, CA and Clearwater, MI are a couple). Now, this poses an issue because I'm worried that crawlers will not be able to properly index the homepage, because the location, and ultimately all the content, keeps changing. And/or we will be indexed for a specific location when we are in fact a national website (I do not want my homepage indexed/ranked under Mountain View, CA, or even worse, Clearwater, MI [no offence to any Clearwaterians out there]). Of course, my initial instinct was to create a separate landing page for the crawlers, but for obvious reasons I am not going to do that (I did at one point, but quickly reverted back because I figured that was definitely not the route to go long-term). Any ideas on the best way to approach this while maintaining the geo-targeted approach for my users? I mean, isn't that what we're supposed to do? Give our users the most relevant content in the least amount of time? It seems that in doing so, I am improperly ranking my website in the eyes of the search engines. Thanks everybody! Marc
Intermediate & Advanced SEO | THB
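One hedged approach (the bot list and default city below are illustrative, not exhaustive, and user-agent strings can be spoofed): serve known crawlers the same neutral national view you would show any visitor whose IP can't be resolved. Every bot then sees one stable homepage, and because it's the same fallback real users get, it isn't a separate crawler-only page:

```python
KNOWN_BOTS = ("googlebot", "bingbot", "slurp", "duckduckbot")

def homepage_location(user_agent, geoip_city):
    """Return the location to render, falling back to a neutral national
    view for known crawlers or unresolvable IPs."""
    ua = user_agent.lower()
    if geoip_city is None or any(bot in ua for bot in KNOWN_BOTS):
        return "United States"  # stable default, identical on every bot visit
    return geoip_city

print(homepage_location("Mozilla/5.0 (compatible; Googlebot/2.1)", "Mountain View"))
```

Regular visitors still get the geo-targeted experience, while the indexed version of the homepage stops flapping between cities.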