How can I export from Followerwonk into a Twitter List?
-
I use Hootsuite, but I can't see anywhere in there, or find any other tool, to import a text file of Twitter usernames into a Twitter list. Does anyone know how?
-
I see. I don't think it's possible to do what you're after, I'm afraid.
-
I don't think so, really, as I can get a CSV or XLS from Followerwonk. I need to get that data into a List. Thanks.
-
Hi Mark,
Yes, a List of contacts. It seems like one of the obvious things to do once you've used Followerwonk to find a group you want to communicate with.
-
I know how to copy a full page of people's Twitter names and profile URLs to the clipboard with one click of the mouse.
Would this be useful for you?
-
Hi there,
Do you mean importing a list of Twitter usernames to add as contacts?
Unfortunately, I don't believe it can be done, from what I can see.
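For anyone finding this thread later: Twitter's API does support adding list members in bulk, so the Followerwonk export can be scripted into a list. Here is a minimal sketch that reads the usernames out of a CSV export and batches them for the bulk endpoint, which accepts at most 100 screen names per call. The column name "screen_name" is an assumption about the export format, and the tweepy call shown in the final comment assumes you have that library and valid API credentials; adjust both to your setup.

```python
# Sketch: read a Followerwonk CSV export and batch the usernames for
# Twitter's bulk list-member endpoint (max 100 names per call).
# "screen_name" is an assumed column name -- check your actual export.
import csv


def load_screen_names(path, column="screen_name"):
    """Pull one column of Twitter handles out of a CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row[column].lstrip("@")
                for row in csv.DictReader(f)
                if row.get(column)]


def batches(names, size=100):
    """Split the handles into chunks the bulk endpoint will accept."""
    return [names[i:i + size] for i in range(0, len(names), size)]


# With tweepy and credentials in place, each batch could then be pushed
# with something like (hypothetical LIST_ID):
#   api.add_list_members(list_id=LIST_ID, screen_name=",".join(batch))
```

The batching matters because pushing the whole file in one call will silently drop everything past the limit.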
Related Questions
-
How can we efficiently use Fresh Web Explorer and Just Discovered Links?
Love the fresh data sources SEOmoz is building for us. However, I'm frustrated by the lack of scale the tools offer. Let's say I have 30 competitors I want to watch (which is pretty conservative; if we're targeting 100 keywords on a site, we could easily have hundreds of top-20 ranked competitors). If I have to run individual reports for each using OSE and Fresh Web Explorer, that would be hours of work every day or week.
Ideally, I'd like to see a campaign feature where you could add 2-200 competitors to view in one report. You could view recent links (from FWE and JDL) for all competitors on one handy report, and sort by various metrics. So, for example, if you wanted to view the top 10 links your competitors have gotten in the past week, you could see that in 30 seconds of work, vs. many hours.
Any others who think this would be useful? Any ideas for how we can use the data in such a way without this feature?
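Until such a campaign feature exists, the merge-and-sort step described above is easy to do yourself once you have per-competitor exports in hand. A small sketch, assuming you have already loaded each competitor's links as a list of dicts (the "page_authority" field name is a placeholder for whatever metric your export contains):

```python
# Sketch: merge per-competitor link exports and surface the top links
# by a chosen metric, tagging each row with the competitor it came from.
def top_links(reports, metric="page_authority", n=10):
    """reports: dict mapping competitor domain -> list of link dicts."""
    merged = [dict(link, competitor=name)
              for name, links in reports.items()
              for link in links]
    merged.sort(key=lambda link: link.get(metric, 0), reverse=True)
    return merged[:n]
```

Run weekly over fresh exports, this gives the "top 10 competitor links in 30 seconds" view the post asks for, minus the manual report-gathering.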
Moz Pro | AdamThompson
-
Can I rely on Keyword Difficulty tool?
I just ran into a problem that I hadn't expected. Testing the Keyword Difficulty tool, I saw the results contained a result for a page that has Domain Authority = 1 and Page Authority = 1. As a result, Keyword Difficulty was reduced (compared to last month), which may actually be reversed if the site is crawled. Sadly, I didn't run the report on the figures, as it was a small project.
Questions:
- Can I rely on results shown by Keyword Difficulty?
- Are results where Domain Authority = 1 used to calculate Keyword Difficulty? If so, why is that?
- Is there any difference between a page that has received no links and a page that OSE/Mozscape has no link data for?
The problem: Using the Keyword Difficulty tool, I found swings of up to 14% in Keyword Difficulty (between Oct and Nov). Dr. Pete may suggest that this is because of changes in Google's index (http://www.seomoz.org/blog/a-week-in-the-life-of-3-keywords). However, it would be helpful to have a figure for Keyword Difficulty that isn't affected by the gaps in the Mozscape data.
The (bad) solution: You can mirror something close to Keyword Difficulty using:
=(Sum of Page Authorities + Sum of Domain Authorities) / 20
Right now, I have resorted to manually calculating keyword difficulty. I use the SEOmoz Page Authority and Domain Authority figures and a quick splash of Excel SUMIF and COUNTIF. I find the results don't look as 'easy' when I can ignore results where the data is unknown (Page Authority = 1 and Domain Authority = 1).
Background info: One result I still have a report on is for the phrase [fixing your business puzzle] using US results on Google. For the specific result, I found the following additional information about the site:
- DNS lookup shows the domain was registered in 2010
- Archive.org shows no records
- OSE shows no data for the site
- Site uses https
- Google shows no links
- Robots.txt file seems fine
- No Sitemap.xml
Moz Pro | Darroch
-
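The Excel workaround in that question translates directly to a few lines of code. A sketch of the same calculation: the post's formula of (sum of Page Authorities + sum of Domain Authorities) / 20, with results where both PA and DA equal 1 (i.e. pages Mozscape has no link data for) filtered out before summing, which is what the SUMIF/COUNTIF approach does:

```python
# Sketch of the manual Keyword Difficulty calculation described above.
# Mirrors the post's formula: (sum of PAs + sum of DAs) / 20, but drops
# results where PA == DA == 1, the post's proxy for "no Mozscape data".
def keyword_difficulty(results):
    """results: list of (page_authority, domain_authority) pairs for a SERP."""
    known = [(pa, da) for pa, da in results if (pa, da) != (1, 1)]
    return sum(pa + da for pa, da in known) / 20.0
```

Note that keeping the fixed /20 divisor (as the post's formula does) means filtered-out results lower the score; dividing by 2 * len(known) instead would give a true average over the known results, which may be closer to what you want.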
Why does the CSV export from OSE only include 25 of the 90+ links?
In OpenSiteExplorer, I clicked "Download CSV" for a report on backlinks from one domain to another. The online visualization in OSE showed 93 external inbound links from site A to site B. When I opened the report, there are only 25 linking pages listed. How do I download the full list?
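If the export turns out to be capped per page rather than per report, one workaround is to download the report page by page and merge the partial CSVs, dropping duplicate rows. A sketch, assuming the export has a "URL" column to de-duplicate on (adjust the key to your actual file):

```python
# Sketch: combine several partial CSV exports into one list of rows,
# keeping the first occurrence of each URL. "URL" is an assumed column
# name -- check the header of your actual export.
import csv


def merge_csv_exports(paths, key="URL"):
    """Merge CSV files, de-duplicating rows on the given key column."""
    seen, rows = set(), []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                value = row.get(key)
                if value not in seen:
                    seen.add(value)
                    rows.append(row)
    return rows
```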
Moz Pro | DanielH
-
SEOmoz directory list, discussing individual directories...
It would be nice if the directory list (http://www.seomoz.org/directories) had an area to discuss each individual directory. Some of the directories like Internet Library have great domain authority, but have a large number of thumbs-down from users. I would like to know why. Personally, I'm very skeptical about directories. I'd love to prove myself wrong, though. The SEOmoz directory list in its current form isn't all that helpful.
Moz Pro | MicahMMG
-
Can you change crawl day of week?
Can I somehow sync the day of the week for each of my campaigns' crawls, so that all campaigns are updated on the same day?
Moz Pro | ATShock
-
Can someone explain why I have been seeing an increase in the number of Linking Page URLs in OSE that link directly to downloads?
Ever since the last couple of Linkscape updates, when doing competitive backlink analysis I have noticed a large increase in the number of Linking Page URLs in OSE that result in an immediate file download. The majority of the time these downloads are not common files, i.e. PDF or DOC files. For example, these were all in a competitor's backlink profile:
- http://download.unesp.br/linux/debian/pool/main/i/isc-dhcp/isc-dhcp-relay-dbg_4.1.1-P1-17_ia64.deb
- http://snow.fmi.fi/data/20090210_eurasia_sd_025grid.mat
- http://www.rose-hulman.edu/class/me/HTML/ES204_0708_S/working model examples/Le25 mad hatter.wm?a=p&id=145880&g=5&p=sia&date=iso&o=ajgrep
These are just a few I came across for a single competitor. Is this sketchy black-hat SEO, some sort of error, actual links, or something else? Any information on this subject would be helpful. Thank you.
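Whatever the cause, links like these are easy to triage in bulk before investigating them by hand. A sketch that flags download-looking URLs in a backlink export, first by file extension and, for ambiguous cases, by inspecting HTTP response headers; the extension list is an assumption and should be extended to match what you see in your own profile:

```python
# Sketch: flag backlink URLs that look like direct file downloads.
# The extension set is an assumption -- extend it for your data.
from urllib.parse import urlparse

DOWNLOAD_EXTENSIONS = {".deb", ".mat", ".wm", ".zip", ".exe",
                       ".dmg", ".tar", ".gz", ".rar", ".msi"}


def looks_like_download(url):
    """True if the URL path ends in a known binary/package extension."""
    path = urlparse(url).path.lower()
    return any(path.endswith(ext) for ext in DOWNLOAD_EXTENSIONS)


def header_says_download(headers):
    """True if HTTP response headers indicate a file download, not a page.

    headers: a dict of response headers, e.g. from a HEAD request.
    """
    ctype = headers.get("Content-Type", "")
    disp = headers.get("Content-Disposition", "")
    return "attachment" in disp or ctype.startswith("application/octet-stream")
```

For URLs the extension check misses, a HEAD request (to avoid downloading the file) plus `header_says_download` on the response headers catches servers that force a download regardless of extension.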
Moz Pro | Gyi
-
How can I clean up my crawl report from duplicate records?
I am viewing my Crawl Diagnostics report, and it is filled with data which really shouldn't be there. For example, I have a page: http://www.terapvp.com/forums/Ghost/
This is a main forum page. It contains a list of many threads, and the list can be sorted on many values. The page is canonicalized, and has been since it was created. Yet my crawl report shows this page listed 15 times:
http://www.terapvp.com/forums/Ghost/?direction=asc
http://www.terapvp.com/forums/Ghost/?direction=desc
http://www.terapvp.com/forums/Ghost/?order=post_date
and so forth. Each of those pages uses the same canonical reference shared above. I have three questions:
1. Why is this data appearing in my crawl report? These pages are properly canonicalized.
2. If these pages are supposed to appear in the report for some reason, how can I remove them? My desire is to focus on any pages which may have an issue that needs to be addressed. This site has about 50 forum pages, and when you add an extra 15 pages per forum, it becomes a lot harder to locate actionable data. To make matters worse, these forum indexes often have many pages, so if I have a "Corvette" forum there that is 10 pages long, there will be 150 extra pages just for that particular forum in my crawl report.
3. Is there anything I am missing? To the best of my knowledge everything is set up according to SEO best practices, but if there are any other opinions, I would like to hear them.
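While waiting for the crawler to respect the canonical tags, the report itself can be cleaned up as a post-processing step: collapse URLs that differ only by query parameters (the sort/direction/order variants of the same canonical forum page) down to one row per base URL. A sketch of that filter:

```python
# Sketch: de-duplicate crawl-report URLs that differ only by query
# string, keeping the first URL seen for each query-stripped base.
from urllib.parse import urlsplit, urlunsplit


def canonical_base(url):
    """Strip the query string and fragment, keeping scheme/host/path."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))


def dedupe_crawl_rows(urls):
    """Keep one representative URL per query-stripped base."""
    seen, kept = set(), []
    for url in urls:
        base = canonical_base(url)
        if base not in seen:
            seen.add(base)
            kept.append(url)
    return kept
```

Applied to the exported crawl CSV's URL column, this would reduce the 15 sort variants per forum down to one actionable row each.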
Moz Pro | RyanKent