What is the best way to scrape SERPs for targeted keyword research?
-
I want to use search operators such as "KEYWORD inurl:blog" to identify potential link targets, then download the target URL, domain, and keyword into an Excel file, and then use SEOTools to evaluate the URLs from the list. I see the link acquisition assistant in the Moz lab, but the listed operators are limited.
Appreciate any suggestions on doing this at scale,
thanks!
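The workflow described above (combine keywords with operators, then export keyword/URL/domain rows) can be sketched with the Python standard library. This is a minimal illustration, not a full scraper: it assumes you obtain the actual result URLs from a SERP API or tool of your choice (scraping Google directly is against its terms of service), and only handles building the operator queries and writing the CSV that Excel can open.

```python
import csv
from urllib.parse import urlparse

# Advanced-search operators to combine with each seed keyword
# (hypothetical examples; swap in whatever footprints you target).
OPERATORS = ['inurl:blog', 'intitle:"write for us"', 'inurl:resources']

def build_queries(keywords):
    """Combine each keyword with each operator into one search query."""
    return [f'"{kw}" {op}' for kw in keywords for op in OPERATORS]

def rows_from_results(keyword, result_urls):
    """Turn one keyword's result URLs into (keyword, url, domain) rows."""
    return [(keyword, url, urlparse(url).netloc) for url in result_urls]

def write_csv(path, rows):
    """Write the collected rows to a CSV file that Excel can open."""
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['keyword', 'url', 'domain'])
        writer.writerows(rows)

if __name__ == '__main__':
    queries = build_queries(['content marketing'])
    print(queries[0])  # "content marketing" inurl:blog
```

From there, each query is fed to whatever SERP source you use, and `rows_from_results` plus `write_csv` produce the keyword/URL/domain spreadsheet for SEOTools.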
-
Thanks a bunch, Lavellester. Just finished Tom's post - exactly what I needed.
I appreciate your help,
H
-
Hi,
If I remember correctly, this post by Tom Critchlow about agile SEO hacks should get you moving in the right direction.
Hope it helps.
L
Related Questions
-
What's the best practice for acquisition?
Hi, my company has just bought out a competitor. We want to dissolve their website and, if possible, steal some of their link juice. The site hasn't got any spammy links or 404s, so I'm not worried in that department. What I am not sure about is which of the following is best practice?
a. Redirect every single page (even pages like /?checkout) to a relevant page on our website.
b. Only redirect important pages (category pages, contact pages, etc.) and leave the other pages to 404.
c. Redirect the important pages to a relevant URL and redirect the less important pages to our homepage.
d. Redirect the entire domain to our home page (I assume this isn't a good idea).
e. Don't redirect any of the pages; just delete the site.
Intermediate & Advanced SEO | DannyHoodless -
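For options (b) or (c) above, the per-page 301s on the old domain could look like the following Apache sketch (all paths and the target domain are hypothetical; the exact mapping depends on the two sites):

```apache
# Map important pages one-to-one to their closest equivalents (301 = permanent).
Redirect 301 /products/widgets https://www.newdomain.example/widgets
Redirect 301 /contact         https://www.newdomain.example/contact

# Optionally send whole sections to a relevant category page rather than
# the homepage; any URL not matched by a rule will simply 404.
RedirectMatch 301 ^/blog/.*$ https://www.newdomain.example/blog/
```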
Best way to do site seals for clients to have on their sites
I am about to help release a product which also gives people a site seal to place on their website, just like GeoTrust, Comodo, Symantec, RapidSSL and other web security providers do.
I have noticed that the site seals from these companies never have nofollow on the links back to their websites. So I am wondering what the best way to do this is: should I put a nofollow on the site seal link back to our domain, or is it safe to leave it followed?
It won't involve any keyword stuffing or anything; the link will probably just contain our domain and that is all. The problem is, too, that we won't have any control over where customers place these site seals. From experience, I would say they will most likely be placed in the footer on every page of the client's website. I would like to hear any and all thoughts on this, as I can't get a proper answer anywhere I have asked.
Intermediate & Advanced SEO | ssltrustpaul -
Keyword ranking versus all other data
Hi there. I have just joined Moz, so I am not sure if I am doing a good job of analysing all the data, but from what I can see I have a few questions:
1. I seem to have a fairly high visibility compared to a few other competitors.
2. All the other competitors I am looking at have a much lower domain authority.
3. I win the link metrics in all categories compared to my competitors.
4. I have a page optimisation score of 94.
5. I don't have any crawl issues (except that I just changed to HTTPS and I believe there is a syncing issue between Moz and Cloudflare).
Yet I barely rank for any of the main keywords in my industry: kitchen, new kitchen, kitchen renovation, etc. I also have a page optimisation score of 94 for related keywords. I feel like I am really missing a big point and was hoping I could get your expert thoughts on this 🙂 Thanks so much! PS: my domain is www.bluetea.com.au
Intermediate & Advanced SEO | bluetea -
Best way to remove a full demo (staging server) website from Google's index
I've recently taken over an in-house role at a property auction company. They have a main site on the top-level domain (TLD) and 400+ agency subdomains: company.com, agency1.company.com, agency2.company.com...
I recently found that the web development team keep a demo domain per site, on a subdomain of the original domain, mirroring it. The problem is that they have all been found and indexed by Google: demo.company.com, demo.agency1.company.com, demo.agency2.company.com...
Obviously this is a problem as it is duplicate content and so on, so my question is: what is the best way to remove the demo domains/subdomains from Google's index? We are taking action to add a noindex tag into the header of all pages on the individual domains, but this isn't going to get them removed any time soon! Or is it?
I was also going to add a robots.txt file into the root of each domain, just as a precaution; within this file I had intended to disallow all. The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain/subdomain into Google Webmaster Tools and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | iam-sold -
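One caveat on the plan described above: a blanket robots.txt Disallow stops Google from recrawling the pages, so it would never see the new noindex tag; the two measures work against each other. An alternative that avoids editing every page is a response header set only on the demo hosts, sketched here for Apache (hypothetical config, assuming each demo subdomain has its own vhost):

```apache
# On each demo vhost only: tell crawlers not to index any response.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```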
When migrating website platforms but keeping the domain name, how best do we add the new site to Google Webmaster Tools? Best redirect practices?
We are moving from BigCommerce to Shopify but maintaining our domain name, and we need to make sure that all links redirect to their corresponding URLs. We understand the nature of 301s and are fine with that, but when it comes to adding the site to Google Webmaster Tools, not losing link juice, and the Change of Address tool, we are kind of lost. Any advice would be most welcome. Thank you so much in advance!
Intermediate & Advanced SEO | WNL -
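Since the migration above hinges on a complete 301 map, here is a minimal Python sketch (with hypothetical BigCommerce-to-Shopify paths) that sanity-checks such a map before launch: it flags redirects that point to themselves and redirect chains, both common ways to leak link equity.

```python
def validate_redirect_map(redirects):
    """Check an {old_path: new_path} map for common migration mistakes.

    Returns a list of human-readable problems (empty list = looks OK).
    """
    problems = []
    for old, new in redirects.items():
        if old == new:
            # A redirect to itself creates an infinite loop.
            problems.append(f"{old} redirects to itself")
        elif new in redirects:
            # The target is itself redirected: a 301 chain.
            problems.append(f"chain: {old} -> {new} -> {redirects[new]}")
    return problems

# Hypothetical mapping of old BigCommerce paths to new Shopify paths.
redirect_map = {
    "/categories/widgets": "/collections/widgets",
    "/products/blue-widget.html": "/products/blue-widget",
}
print(validate_redirect_map(redirect_map))  # [] -> no problems found
```

After the switch, the same list of old paths can be re-checked with live HTTP requests to confirm each one actually returns a 301 to its mapped target.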
What is the Best Way to Scale RCS Content?
SEO has really moved away from the nitty-gritty analysis of backlinking factors, link wheels, and the like, and has shifted to a more holistic marketing approach. That approach is best described around Moz as "Real Company S#it" (RCS). RCS is a great way to think about what we really do, because it is so much more than just SEO or just social media.
However, our clients and business owners do want to see results and want them quantified in some way. The way most of our clients understand SEO is that by ranking high on specific terms or online avenues, they have a better chance of generating traffic/sales/revenue. They understand this more in the light of traditional marketing, where you pay for a TV ad and then measure how much revenue that ad generated.
In the light of RCS and the need to target a large number of keywords for a given client, how do most pros handle the situation where you have a large number of keywords to target, but with RCS? Many I've asked tend to use the traditional approach of creating a single content piece geared towards a given target keyword. However, that approach can get daunting if a small business wants to target, say, 25 keywords. In that case, isn't it really a matter of scaling down the client's expectations? What if the client wants all of the keywords and has the budget? Do you just ramp up your RCS content creation efforts? It seems that you can overdo it and quickly run out of RCS content to produce.
Intermediate & Advanced SEO | AaronHenry -
Best links to gain?
Hi, just a quick one to see what people's thoughts are regarding links. I have just gained a free link on a .gov website in the UK, on one of their offers pages. Will this provide any link value or domain trust, and what can the SEO benefits be of having a link on a government domain? The link is just the website URL with a few lines of text detailing our address etc., so it hasn't got any anchor text targeting one of our brands.
Is it equally, less, or more important to target anchor-text links to specific brands, or to get good high-quality links from trusted sites such as the .gov one linking to my root domain? The website is a local council website in the UK, and the link was listed by a member of their staff, who only lists you on the offers page if you're offering discounts to council members etc., so it's not a spammed page or anything like that.
What are people's views on anchor-text links vs plain domain URL links? Cheers, Will
Intermediate & Advanced SEO | YNWA -
Keyword Self-Cannibalization
Hi, happy Friday! I was advised to look at the SEO strategy of a UK SEO company and copy a technique they used; however, I have doubts that this technique is any good.
So mydomain.com targets "My Domain". For my main keyword phrase, I was told to place a link to a newly created inner page in my footer, targeting the main keyword, and on this page create unique content which points back to my homepage. Now I also have mydomain.com/my-domain.php, which has a link to mydomain.com with the anchor text "My Domain".
Based on the SEOMoz reports, this now seems to be keyword self-cannibalization, and I think it is diluting link juice and the value of my SEO on my homepage for this term rather than helping. Can you advise if this technique is wrong?
Intermediate & Advanced SEO | tdsnet