Better to have fewer pages with more related content?
-
I work with a law firm and we are having a hard time breaking into the first page of results for any of our keywords. I am new at SEO and have been trying to analyze how our competitors have an edge over us when, on paper, we are better optimized than their websites. One glaring difference is that they have fewer webpages, which possibly makes each of their pages more keyword rich.
Would it be smarter to condense our many webpages/topics into fewer, more general web pages?
I hope my question is even making sense, thanks for any possible help!
Our site is http://www.utahdefenseattorney.net/
-
You'd want to decide on pages based on how best to optimize for keyword searches. If you find more people search for drug possession than for marijuana or cocaine possession, then it would be better to condense all of that info under one page. It really all depends on what is going to net you the most opportunity. You could have one umbrella page for drug charges and then a couple of subpages with details on each type of possession charge. Beef up your content with info about potential sentencing/outcomes, what to expect during the trial, etc. That will help you capture additional long-tail searches as well.
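If you do consolidate, make sure every retired page 301-redirects to the umbrella page so inbound links and any existing rankings aren't stranded. Here's a minimal sketch of a redirect map; the URLs below are made-up examples, not the firm's actual structure, and you'd translate the output into your server's redirect config.

```python
# Hypothetical consolidation plan: every retired per-drug page 301s to
# one umbrella page. The paths are illustrative examples only.
UMBRELLA = "/criminal-defense/drug-charges/"

OLD_PAGES = [
    "/criminal-defense/drug-charges/possession-of-marijuana/",
    "/criminal-defense/drug-charges/possession-of-cocaine/",
]

def redirect_map(old_pages, target):
    """Map each retired URL to the consolidated target page."""
    return {page: target for page in old_pages}

for src, dest in redirect_map(OLD_PAGES, UMBRELLA).items():
    print(f"301: {src} -> {dest}")
```

The same map doubles as a checklist when you update internal links so nothing keeps pointing at the old URLs.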
-
To be honest, we haven't yet gotten rigorous with our keyword planning. Like I said, I'm pretty new at SEO and everything has been a little overwhelming at first. We are hoping our homepage will rank for "salt lake city criminal defense attorney" and then hope to optimize each of the individual 'charges' pages. I haven't made it through every page on the site yet because there are so many, so thanks for bringing that particular one to my attention.
So I am wondering, since "marijuana possession attorney" isn't a searched-for term, whether it would be better to condense all the drug charge pages into one page instead of keeping them broken up into 11 separate pages, one for each drug charge.
Thanks for your help and patience, definitely a huge learning curve here!
-
How rigorous have you been with your keyword research/planning? Are you sure you're optimizing for the most-searched terms? And then optimizing for those terms across all of your page elements (including body copy)? For example, you've partially optimized for "marijuana possession attorney," but according to Google's keyword planner that term has zero searches in the state of Utah.
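One practical way to make that call is to export volumes from the keyword planner and sort your candidate terms by monthly searches; anything near zero gets folded into a parent page. The CSV columns and the cutoff below are assumptions for illustration, not the planner's actual export format.

```python
# Hypothetical triage of a keyword-volume export: terms with real search
# volume earn their own page; near-zero terms get folded into an umbrella
# page. Column names and the 100/month threshold are assumed examples.
import csv
import io

SAMPLE = """keyword,avg_monthly_searches
drug possession attorney,320
marijuana possession attorney,0
cocaine possession attorney,10
"""

def volumes(csv_text):
    """Parse a keyword,volume CSV into a {keyword: int volume} dict."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["keyword"]: int(row["avg_monthly_searches"]) for row in reader}

for kw, vol in volumes(SAMPLE).items():
    verdict = "own page" if vol >= 100 else "fold into umbrella page"
    print(f"{kw}: {vol}/mo -> {verdict}")
```

Run against your real export, this gives you a defensible answer to the "one page or eleven" question instead of a guess.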
Also, I agree with EGOL that you need to eliminate the extra blank title tag. That's definitely not helping your cause.
-
Give us an example of the keywords that you are concerned about.
I looked at this page....
http://www.utahdefenseattorney.net/criminal-defense/drug-charges/possession-of-marijuana/
It is not optimized well for anything. The title tag is....
<title>Marijuana Possession Attorney in Salt Lake City | Intermountain Legal</title>
.... but the visible text on the page is not optimized for that.
Also, I see an instance of the title tag higher in the code....
.... which might or might not be causing a problem, but which I would definitely eliminate because you can never be sure how a search engine will treat it.
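If you want to audit the whole site for this, stray `<title>` tags are easy to catch programmatically. Here's a minimal sketch using only the standard library; the sample HTML is a made-up illustration, and in practice you'd feed it each page's saved source.

```python
# Count every <title> tag in a page's HTML, wherever it appears in the
# markup. A healthy page should have exactly one. Sketch only; the
# sample HTML below is illustrative.
from html.parser import HTMLParser

class TitleCounter(HTMLParser):
    """Increments a counter each time a <title> start tag is seen."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.count += 1

def count_title_tags(html):
    parser = TitleCounter()
    parser.feed(html)
    return parser.count

sample = ("<html><head><title>Stray</title>"
          "<title>Marijuana Possession Attorney in Salt Lake City</title>"
          "</head><body></body></html>")
print(count_title_tags(sample))  # prints 2: this page has a duplicate
```

Loop it over your page list and flag anything where the count isn't 1.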