Ideal Campaign Setup for Multiple Landing Pages Post-Panda
-
I have a local floor cleaning business in the UK. I offer various discrete services, e.g. carpet cleaning and stone restoration and polishing (a page for each type of stone), delivered across distinct geographical service areas. Each page is set up to target a "town/service" keyword, and in each geographical location I have different competitors.
Should I set up individual campaigns for each landing page, or is one site-wide campaign the way to go?
Many thanks
David Allen
-
Glad it was helpful! Let me know if you have any other questions.
Cheers!
-
Thank you Megan - Just what I wanted - I had not considered the competitor angle.
-
Hi David! This is Megan from SEOmoz. You could really set up your campaigns either individually or as a root domain - it just depends on what you think would work best for your needs. Here are some pros and cons, though, that will hopefully help you reach your decision.
Root Domain -
Pros:
1. It might be easier for you to have everything in one spot rather than clicking between campaigns.
2. If you have the same keyword you want to track for everything, you would only need to enter it once as opposed to adding it to each campaign (and thus using just 1 of your allotted 300 keyword slots instead of several).
Cons:
1. You're only given 3 competitor comparisons, so it would be pretty limiting if you have several competitors you want to track against.
Individual campaigns -
Pros:
1. You can track up to 3 competitors for each campaign, so you would be able to keep an eye on the direct competitors for each service.
2. It might be more convenient to have all of the data separated out, creating clear lines between the services.
Cons:
1. It would take up a lot more campaign slots.
I hope this helps at least a little bit! If you want to send us your specific URLs so we can take a closer look, feel free to send them to help[at]seomoz.org.
Thanks!
-
Thank you for your replies, Sandip and Keri.
Yes, I'm trying to understand how to set up my SEOmoz campaign, considering I will have a number of PPC and SEO landing pages. Will a root URL campaign cover all the pages, or will I need to set up a campaign per page?
-
I'm looking at the tags for this post -- you're wondering about how to set up your SEOmoz campaigns, right?
-
Not sure whether you are talking about a PPC campaign or organic SEO...
Related Questions
-
Aren't domain.com/page and domain.com/page/ the same thing?
Hi All, A recent Moz scan has turned up quite a few duplicate content notifications, all of which have the same issue. For instance: domain.com/page and domain.com/page/ are listed as duplicates, but I was under the impression that these pages would, in fact, be the same page. Is this even something to bother fixing or a fluke scan? If I should fix it does anyone know of an .htaccess modification that might be used? Thanks!
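For the trailing-slash duplicates, a common approach is a sitewide 301 redirect to one canonical form. A minimal .htaccess sketch, assuming Apache with mod_rewrite enabled and that you want the no-trailing-slash version to win (swap the rule around if you prefer the slash):

```apache
RewriteEngine On
# Skip real directories, which legitimately end in a slash
RewriteCond %{REQUEST_FILENAME} !-d
# 301-redirect /page/ to /page
RewriteRule ^(.*)/$ /$1 [R=301,L]
```

Either way, picking one form and redirecting the other should clear the duplicate warnings on the next crawl.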
Moz Pro | | G2W0 -
Moz campaign works around my robots.txt settings
My robots.txt file looks like this: User-agent: * Disallow: /*? Disallow: /search So, it should block (deindex) all dynamic URLs. If I check this URL in Google: site:http://www.webdesign.org/search/page-1.html?author=47 Google tells me: A description for this result is not available because of this site's robots.txt – learn more. So far, so good. Now, I ran a Moz SEO campaign and I got a bunch of duplicate page content errors. One of the links is this one: http://www.webdesign.org/search/page-1.html?author=47 (the same one I tested in Google, which told me that the page is blocked by robots.txt, which is what I want). So it makes me think that Moz campaigns check files regardless of what robots.txt says? It's my understanding that User-agent: * should forbid Rogerbot from crawling as well. Am I missing something?
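If the aim is simply to keep Moz's crawler out of those URLs, one option is to address Rogerbot by name in robots.txt; by robots.txt convention, a crawler that finds a group matching its own user-agent uses that group instead of the `*` group. This is a sketch to test, not a confirmed fix for the behavior described:

```
User-agent: rogerbot
Disallow: /*?
Disallow: /search
```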
Moz Pro | | VinceWicks0 -
Duplicate page report
We ran a CSV spreadsheet of our crawl diagnostics related to duplicate URLs after waiting 5 days with no response on how Rogerbot can be made to filter. My IT lead tells me the label on the spreadsheet is showing "duplicate URLs", and that is – literally – what the spreadsheet is showing: it treats a database ID number as the only valid part of a URL. To replicate, just filter the spreadsheet for any number that you see on the page. For example, filtering for 1793 gives us the following result:
http://truthbook.com/faq/dsp_viewFAQ.cfm?faqID=1793
http://truthbook.com/index.cfm?linkID=1793
http://truthbook.com/index.cfm?linkID=1793&pf=true
http://www.truthbook.com/blogs/dsp_viewBlogEntry.cfm?blogentryID=1793
http://www.truthbook.com/index.cfm?linkID=1793
There are a couple of problems with the above: 1. It gives the www result as well as the non-www result. 2. It sees the print version as a duplicate (&pf=true), but these are blocked from Google via the noindex header tag. 3. It thinks that different sections of the website with the same ID number (faq / blogs / pages) are the same thing. In short: this particular report tells us nothing at all. I am trying to get a perspective from someone at SEOMoz to determine whether he is reading the result correctly or there is something he is missing. Please help. Jim
Moz Pro | | jimmyzig0 -
Settings to crawl entire site
Not sure what happened, but I started a third campaign yesterday and only 1 page was crawled. The other two campaigns have 472 and 10K pages respectively. What is the proper setting to choose at the beginning of campaign setup to have the entire site crawled? Not sure what I did differently; I must be reading the instructions incorrectly. Thanks, Don
Moz Pro | | NicheGuy210 -
Seomoz crawling filtered pages
Hi, I just checked an SEO campaign we started last week, so I opened SEOmoz to see the crawl diagnostics. Lots of duplicate content & duplicate titles are showing up, but that's because Rogerbot is crawling all of the filtered pages as well. How do I exclude these pages from being crawled? /product/brand-x/3969?order=brand&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=DESC&page=10
/product/brand-x/3969?order=popular&sortorder=DESC&page=11
Moz Pro | | nvs.nim0 -
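Beyond robots.txt, another common way to handle filtered and paginated variants is a rel=canonical tag on each variant pointing at the base listing page, so the duplicates consolidate rather than being reported separately. A minimal sketch (the domain here is a placeholder, not the asker's actual site):

```html
<!-- Placed in the <head> of every filtered variant, e.g.
     /product/brand-x/3969?order=popular&sortorder=DESC&page=10 -->
<link rel="canonical" href="http://www.example.com/product/brand-x/3969" />
```

Note that canonicalized pages can still be crawled, so this reduces duplicate reports rather than crawl volume.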
Why did SEOMoz only crawl 1 page?
I have multiple campaigns, and on a few of them SEOMoz has only crawled one page. I think this may have to do with how I set up the campaign. How do I get SEOMoz to crawl more than one page on these campaigns?
Moz Pro | | HermanAdvertising0 -
How do I archive a campaign?
Hi, my account is saying I have reached my limit (5) and I am wanting to archive some of the current campaigns I have there within the 5 (they were trial accounts). Can anyone advise on the best way to do this? At this stage I am not in a position to upgrade my account, so I am hoping that this is still a possibility with my current membership, and the Q&A, beyond saying it is easy, does not explain how to do it 🙂 HELP PLEASE?
Moz Pro | | SueCook_TAOS0 -
SEOMoz Pro and Multiple Niche Sites
Currently I have 15 affiliate marketing sites and three blogs which are monetized. My budget dictates the base Pro package, so may I get some recommendations on how, or whether, I can do analysis over time for all the sites? I ask this because it seems you would set your campaigns up for the long term, but I imagine you delete/modify/add campaigns to analyze different keywords. Thanks for your help.
Moz Pro | | JavaManOne0