How can I set up a campaign to track just directories on a specific subdomain?
-
I am trying to set up a campaign to track a specific subdomain and all its directories. For example, I want to track example.abc.com/11111 and example.abc.com/22222, and so on. I have no interest in tracking abc.com itself. Is this possible?
-
This is done on the first page of Campaign Setup, Chaya. There's a radio button to specify that you want to track a subdomain (it's the first option), and below it a text box where you enter the exact address of the subdomain. The page also shows examples of typical subdomain addresses to guide you.
I've included a screenshot with arrows pointing to the boxes you need to use.
Does that get you set up the way you need?
Paul
Related Questions
-
Confused on setting up my domain in a campaign
I have a website that is www.example.com. When I first set up my campaign in SEOMoz, I missed the instructions stating that I need to enter my domain as it appears in the browser, so I set it up as example.com. When I realized my mistake, I set up a second campaign as www.example.com and am running them side by side to see the differences. My question has to do with the best setup. I have a blog on a subdomain, blog.example.com. It appears that the www.example.com campaign is not crawling the blog pages, but example.com is. Are there any downsides to setting up the campaign as example.com? I have a number of other sites with the same issue, so I would like to get this resolved before I set up the additional sites. Thanks for any assistance or insight into ways I may be "shooting myself in the foot" with the wrong domain structure in my campaign settings.
Moz Pro | | rfwood0 -
Campaigns - crawled
The new crawl shows Pages Crawled: 2. I have many 404 and other errors and wanted to start working on them tomorrow, but the new crawl only covered two pages and doesn't show any errors. What's the problem, and what can I do? Yoseph
Moz Pro | | Joseph-Green-SEO0 -
Issue with Data Updates on Campaign
I'm reviewing a campaign I am running for a client, and the Keywords data has not updated yet even though it's supposed to update every Monday. The competitive analysis shows a date of August 14, which is odd because I'm on a trial account?
Moz Pro | | amerihope0 -
Campaign Setup-GA account
When setting up a campaign, the "select a Google account" step does not show the subdomain account. The drop-down has other accounts we have, but not the one we want.
Moz Pro | | RNK0 -
Tool for tracking actions taken on problem urls
I am looking for tool suggestions to help keep track of problem URLs, the actions taken on them, and the testing of a large number of errors gathered from many sources.

What I want is to be able to export lists of URLs and their problems from my current tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and feed them into a centralized DB that shows all of the actions that still need to be taken on each URL, while removing duplicates, since each tool finds a significant number of the same issues.

Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (it terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.

I would also like to see historical information on each URL: whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed.

Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow to update its issues summary, the tool should recognize when a URL is already in the DB and the issue has been resolved, and skip importing it.

Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Double bonus for any tool that is smart enough to check incoming issues and mark them as resolved on import (for instance, if a URL has a 403 error, it would check on import whether it still resolves as a 403; if it did, it would add it to the issue queue, and if not, mark it as fixed).

Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and with the duplicates created by using multiple tools? Thanks!
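The de-duplication step in the example case above (two tools flagging the same duplicate title, a third finding a broken link) could be sketched roughly like this. This is a minimal in-memory version of what a real tracking DB would do; the tool names, URLs, and issue labels are just placeholders:

```python
from collections import defaultdict

def merge_issue_reports(reports):
    """Merge per-tool issue exports into one queue, collapsing
    (url, issue) pairs that several tools report independently.

    `reports` maps a tool name to a list of (url, issue) tuples.
    Returns {url: {"issues": set_of_issues, "sources": set_of_tools}}.
    """
    merged = defaultdict(lambda: {"issues": set(), "sources": set()})
    for tool, rows in reports.items():
        for url, issue in rows:
            merged[url]["issues"].add(issue)    # sets drop duplicates
            merged[url]["sources"].add(tool)    # but keep provenance
    return dict(merged)

# Mirrors the example case: two tools flag the same duplicate title,
# and one also finds a broken link on the same URL.
reports = {
    "seomoz":         [("example.com/url1", "duplicate title"),
                       ("example.com/url2", "duplicate title")],
    "google_wmt":     [("example.com/url1", "duplicate title")],
    "screaming_frog": [("example.com/url1", "broken link (404)")],
}
merged = merge_issue_reports(reports)
# example.com/url1 ends up with two pending issues, not three entries.
```

A real version would persist `merged` and add a resolved flag per (url, issue) pair so re-imports of stale reports can be skipped.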
Moz Pro | | prima-2535090 -
Campaign data via API
I am currently in the process of building an in-house system for managing our clients' SEO campaigns. One of the features we will be including is a customer login where clients can view their reports and site progress. We plan on providing rankings and traffic stats within this, which we will collect on a weekly basis to store historical changes. While I could integrate with analytics for traffic and create my own web crawler for rankings, this seems a waste since I already have that data in one central location. Looking through the Moz API details, it seems to have lots of functions for links but nothing relating to campaigns. Is there any way, or are there any plans, to open up campaign data via an API? If I could do this, I would certainly upgrade so that each client could have their own campaign.
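Whatever the answer on a campaign API, the weekly historical store the post describes could be sketched with a small local database. This is an illustrative sketch only; the table, client, and keyword names are made up and nothing here is part of any Moz product:

```python
import sqlite3
from datetime import date

# Minimal local store for weekly ranking snapshots, so historical
# changes survive regardless of where the numbers come from.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rankings (
        client   TEXT NOT NULL,
        keyword  TEXT NOT NULL,
        position INTEGER,
        captured TEXT NOT NULL,            -- ISO date of the snapshot
        PRIMARY KEY (client, keyword, captured)
    )
""")

def record_snapshot(conn, client, keyword, position, day=None):
    """Insert one weekly observation; the primary key silently
    rejects accidental duplicate captures for the same day."""
    day = day or date.today().isoformat()
    conn.execute(
        "INSERT OR IGNORE INTO rankings VALUES (?, ?, ?, ?)",
        (client, keyword, position, day),
    )

record_snapshot(conn, "acme", "blue widgets", 7, "2012-06-04")
record_snapshot(conn, "acme", "blue widgets", 5, "2012-06-11")

# The per-client report view is then just an ordered query.
history = conn.execute(
    "SELECT captured, position FROM rankings "
    "WHERE client = ? AND keyword = ? ORDER BY captured",
    ("acme", "blue widgets"),
).fetchall()
```

Sorting by the ISO date string gives chronological order for free, and the `(client, keyword, captured)` key makes the weekly job safely re-runnable.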
Moz Pro | | PPCnSEO0 -
Can I get a list of all links on a given domain?
Sorry, this is actually kind of a tripartite question: I was looking at the Competitive Link Analysis on one of my clients' campaigns. Sometime between June and September, their total links went up by about 120,000. We have no idea where those links came from (although the numbers indicate that they're mostly internal). Question 1: In none of the other tools can I figure out how to list these links at the domain level. Is there a way to get a list of all links for a given domain? I've been playing around with the page-by-page view, and even that doesn't show me everything. For example, I'm looking at OSE for their homepage, and it lists 45 links for a page that it claims has 151 total. Question 2: How did it pick those 45 to display out of the 151? If these are only external links, why do half of them come from one of our subdomains? Also... Question 3: If our client hasn't made any major changes recently, why has the number of internal links gone up so dramatically? Thanks.
Moz Pro | | MackenzieFogelson1 -
Can SEOMOZ do Local SEO ?
Most of our clients are local, but we cannot get local or regional (for example, VA state or Washington, DC) SERP results within SEOMOZ. It would be nice to create custom search profiles based on region (state or county) and track SERPs within SEOMOZ. Ranking reports and even keyword difficulty analysis within SEOMOZ are misleading for local websites. The SEOMOZ default site setup assumes every business doing SEO serves the entire US. I think it would be very beneficial if we had the option to select the state or county of a local website during initial campaign setup in SEOMOZ. What do you think? Do you know any workaround for this problem? Thanks,
Moz Pro | | CertifiedSEO
Lewis2