Site Not Ranking After a Few Months
-
I have a client site that I've been beating my head against the wall over. Three months into a 100% white hat campaign, we can't get him ranking in the top 150. Here are the CliffsNotes:
- Built a new WordPress website
- All on-page SEO has been done and scores an A+ for his primary keywords
- robots.txt is set up correctly (quick check sketched at the end of this post)
- .htaccess is set up correctly
- New domain
- Multiple DA 95, PA 50 links from reputable, national sites
- Yext local listings
- SSL, CDN, speed optimized
- 19 pages indexed by Google
- Posting one blog post a week for him
Granted, his primary keyword is hyper-competitive, but still, I've been doing this for 8 years and have never seen a site stuck on the 16th page for this long with the sort of links we're building him. I'm genuinely stumped here and could use some help.
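For anyone who wants to run the same kind of sanity check on the robots.txt claim above, here's a rough Python sketch using only the standard library (the domain and paths are placeholders, not the client's real URLs):

```python
# Rough sanity check: confirm robots.txt isn't blocking Googlebot from key pages.
# The domain and paths below are placeholders, not the client's real site.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
KEY_PATHS = ["/", "/services/", "/about/", "/blog/"]

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in KEY_PATHS:
    allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
    status = "OK" if allowed else "BLOCKED"
    print(f"{status:8}{path}")
```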
-
Not a problem Brian.
I'm interested to see what sort of impact these changes make!
-
So David was kind enough to look at the site and found some potential issues. In short: the home page had 57 heading tags, which is excessive; some interior pages are missing heading tags; and there are a few pages that should be noindexed but aren't.
I'm going to fix all of these, give it some time and see if this fixes the issues. I'll keep this thread updated.
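Once the changes are in, I'll re-run a quick audit along these lines. This is just a rough Python sketch (it assumes the requests and beautifulsoup4 packages, and the URLs are placeholders rather than the real site): it counts heading tags per page and checks whether a noindex meta tag is present.

```python
# Rough audit sketch: count heading tags per page and check for a meta robots noindex.
# Assumes the requests and beautifulsoup4 packages; the URLs are placeholders.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/thank-you/",  # example of a page that should be noindexed
]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    headings = soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])
    h1_count = len(soup.find_all("h1"))

    robots_meta = soup.find("meta", attrs={"name": "robots"})
    has_noindex = robots_meta is not None and "noindex" in robots_meta.get("content", "").lower()

    print(url)
    print(f"  headings: {len(headings)} total, {h1_count} h1 tag(s)")
    print(f"  noindex meta: {'yes' if has_noindex else 'no'}")
```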
Thanks David for your time!
-
Can you post what the issue was? (without telling the web)
-
PM replied
-
Thanks David - I PM'd you the info
-
Hi Brian,
This is a really tough one to respond to without knowing the domain and being able to have a look at it.
If you PM me the domain, I'd be happy to take a look and see if I can find anything (I won't mention the domain publicly).
Cheers,
David