Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Find Pages with 0 traffic
-
Hi,
We are trying to consolidate the number of landing pages on our site. Is there any way to find landing pages containing a particular URL substring that have had 0 traffic?
The minimum that appears in Google Analytics is 1 visit.
-
This is a really nice solution! Thanks for sharing. It's super quick as well, so a GA export and a few VLOOKUPs/pivots later and you're sorted - nice one!
-
No problem my friend :-))
-
My bad. I misunderstood and misread. Thanks for the update.
-
He is trying to consolidate, or find the total number of, landing pages that have no traffic at all. So Screaming Frog SEO Spider can be used to crawl the entire website (restricted to URLs containing the substring), and then the URLs that have driven at least 1 visit can be subtracted from that list. He is not trying to get hold of his historic analytics data. The question is pretty straightforward unless I missed something.
-
Yes, but how does that help him get the old data he needs? Crawlers shouldn't know your traffic unless you install the code they give you or verify some other way. I find it unlikely that a crawler is the answer here, unless I misunderstood the problem/question. I sure hope they have a Linux host (most are) so they can just check the Apache logs while Google Analytics takes a few days to update.
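If you do have access to the raw Apache access logs, a small script can do the counting for you. This is a minimal sketch, assuming the standard Apache common/combined log format; the `hit_counts` function name and the `substring` filter are my own illustration, not part of any tool mentioned above:

```python
import re
from collections import Counter

# Matches the request line inside an Apache common/combined log entry,
# e.g. ... "GET /landing/foo HTTP/1.1" 200 ...
REQUEST_RE = re.compile(r'"[A-Z]+ (\S+) HTTP/[^"]*"')

def hit_counts(log_lines, substring):
    """Count requests per path, restricted to paths containing `substring`.

    Any crawled URL that never shows up in the result has 0 logged hits.
    """
    counts = Counter()
    for line in log_lines:
        m = REQUEST_RE.search(line)
        if m and substring in m.group(1):
            counts[m.group(1)] += 1
    return counts
```

You would feed it the lines of `access.log` and compare the resulting paths against your full crawl list; anything missing from the counts has had no logged visits at all.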
-
What web host are you using? Most keep analytics software enabled by default, or at least let you turn it on (while you wait for Google). Analytics are a key part of SEO, so I use AWStats (free) and Webalizer. With most hosts, if it's not enabled it's as easy as clicking a button.
Depending on your host, you might also be able to get the raw log files, but many hosts don't offer this unless you pay for an account with root shell access; it differs from host to host.
Google Analytics will only show 1 visit if you are the only visitor, even if you refresh the page or come back; it most likely deduplicates using your IP address and hardware profile. Make sure you set the Google Analytics date range to go as far back as possible.
-
Hi, you can use a crawler like Screaming Frog SEO Spider to get the total list of pages containing the unique string in their URLs, then subtract the URLs that have traffic; the rest will be the ones with no traffic.
You will have to use the paid version of Screaming Frog SEO Spider if you want to crawl more than 500 URLs. Here is the section of the user guide that explains how to restrict a crawl with regex:
http://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/#9
Best regards,
Devanur Rafi