Duplicate content, hijacked search console, crawl errors, ACCCK.
-
My company hired a national marketing company to build its site, and the work was obviously outsourced to the lowest bidder. It looks beautiful, but the installation includes a staging site that duplicates all of the content. I am not seeing these issues flagged in Search Console, and I have had no luck getting the staging site removed from the files. How much should I be banging the drum on this? We have hundreds of high-level crawl errors and over a thousand mid-level ones.
Of course, I was not around to manage the build. I also do not have FTP access.
I'm also dealing with major Search Console issues.
The account is owned by a local SEO company under a proprietary setup of their own, and I cannot remove the owner, who is there by delegation.
The site prefers the www version, and Search Console does not report the same traffic for the non-www version.
We also have something like 90,000 backlinks from 13 sites.
And a shit ton of ghost spam.
Help!
-
Yes, thank you so much, I will. What I'm concerned about is how bad this was in the first place. The way this company markets itself is completely out of line with the state of the build and the advice they give my employers. My bosses LOVE these guys because they are supposedly #technology #experts who do national speaking engagements about #success.
What I see from them is mostly paid product endorsements, an outsourced workforce, and #broisms on social media.
They're fast-talking salespeople delivering a product to people who don't understand what they are getting (or not getting) under the hood.
-
I'm glad you were able to sort out part of your issue, and it sounds like there's hope for it all to get fixed! The only thing I would add is to make sure you get a promise in writing that, should you part ways with the company hosting the site, they will transfer the site to a host of your choosing and hand over the keys.
-
Ok, so it's TWO companies.
One is the marketing company that provides the website; the other is a local SEO company that set up only the www version of our Search Console.
We own the domain, but the marketing company has the only access to the website files and hosting. I'm guessing we are on a shared server with their other clients, so we will not get access. They have front-end people on the team, but no understanding of SEO whatsoever. When I came on, there was no sitemap or robots.txt file submitted to Search Console, for example.
As far as the Search Console issue goes, I actually have a friend at that company who told me their setup is proprietary. We no longer have a relationship with that company, and the owner of our company was also an owner on Search Console. I managed to remove all of that company's reps by un-verifying them, so that is no longer a problem! Maybe you helped me in spirit. I had tried to do this a few times before and couldn't find the way, but right after your response I did.
So now, at least, the only issues are the duplicate staging site and the ghost spam. I'd really prefer that the company take the staging files down. My employers paid a ton of money for this site and are paying a large monthly retainer; at the very least we should have a clean build. It's over 4,000 duplicate pages, so I think that is going to have to be on them.
As far as ghost spam, I'll read the articles and get er done.
Thank you so much for your thoughtful response.
-
I have so many questions about this arrangement.
First of all, the third-party ownership of the Search Console (and GA too, maybe?) is a massive red flag. Account ownership should always, ALWAYS, be handled in-house. You need to insist on that, and insist loudly and furiously. It's extremely shady for a third-party SEO to own the accounts, since it lets them hold the site and its data hostage if the relationship sours.

How easy would it be for people who aren't even part of your company to use Search Console to start removing important URLs from the index? What happens to your data if you end the contract? Do they also own your analytics? Could they cut off your access to your own data on a whim? Replace your site with a page telling the world what awful clients you are? Depending on the size and type of company you are, letting an outsider own that access could be a very real threat to your business, with the potential to do significant damage.
Also, what exactly is the local SEO company's role here? Why aren't THEY worrying about referral spam and questionable backlinks? If they're not, then what are they being paid to do?
If you don't have FTP access, who does? Does your company actually own the site? Is there a contract that spells it out?
For the staging site, the main thing you need is to make sure it's blocked via robots.txt (strictly speaking that blocks crawling rather than indexing, so if staging pages are already in the index, a noindex tag or password protection on the staging install is the surer fix). We have had multiple staging sites that, if indexed, would put some crazy dupe content into the world, but that's what robots.txt is for. Set and forget. Well, check on it periodically: you don't seem to have any actual control over what these guys are doing, and the account ownership thing makes me very wary of trusting them to get it right and keep it that way.
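If it helps to picture it, here's a minimal sketch of what that could look like: a staging robots.txt that disallows everything, checked with Python's standard-library parser. The staging.example.com hostname and the sample URL are placeholders, since I don't know how their staging install is actually set up:

```python
import urllib.robotparser

# Hypothetical robots.txt a staging install could serve to keep crawlers out of
# every URL. "staging.example.com" is a placeholder, not the actual setup here.
STAGING_ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(STAGING_ROBOTS_TXT.splitlines())

# Spot-check that a sample staging URL is blocked for all user agents.
print(parser.can_fetch("*", "https://staging.example.com/any-page/"))  # -> False
```

If the staging copy lives in a subdirectory of the live site rather than on its own hostname, the Disallow line would target that directory instead of the whole site, since robots.txt applies per hostname.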
As for the ghost spam, there's been a ton of discussion about it in the community over the last year. On Moz alone, there's this piece from March, and this one from August, plus a bunch of forum discussions. Bottom line is that there isn't much you can do to stop it, but that doesn't mean you're stuck with seeing it muck up your data.
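Much of the advice out there boils down to the same idea: ghost spam is injected straight into Analytics without ever touching your site, so it usually reports a hostname that isn't yours, and an include filter on valid hostnames screens it out. Purely as a sketch of that matching logic (the hostnames and sample rows below are made up, and the real filter is configured in the Analytics admin rather than in code):

```python
import re

# Hypothetical pattern covering your real hostnames; ghost spam typically shows
# up with "(not set)" or a hostname that was never yours, because the hits are faked.
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$")

sessions = [
    {"hostname": "www.example.com", "source": "google"},
    {"hostname": "(not set)", "source": "free-share-buttons.com"},
    {"hostname": "lifehacker.com", "source": "best-seo-offer.com"},
]

# Keep only sessions that actually landed on one of your hostnames.
clean = [s for s in sessions if VALID_HOSTNAME.match(s["hostname"])]
print(clean)  # only the www.example.com row survives
```

One caveat I'd flag: view filters only affect data from the moment they're applied, so a segment built on the same hostname logic is what cleans up the historical reports.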
Related Questions
-
Should Multi-Location Businesses "Local Content Silo" Their Services Pages?
I manage a site for a medical practice that has two locations. We already have a location page for each office location, and we have the NAP for both locations in the footer of every page. I'm considering making a change to the structure of the site to help it rank better for individual services at each of the two locations, which I think will help pages rank in their specific locales by having the city name in the URL. However, I'm concerned about diluting the domain authority that gets passed to the pages by moving them deeper in the site's structure.

For instance, the services URLs are currently structured like this (where the service is offered in each of the two locations):

www.domain.com/services/teeth-whitening

Would it make sense to move to a structure more like:

www.domain.com/city1name/teeth-whitening
www.domain.com/city2name/teeth-whitening

Does anyone have insight from dealing with multi-location brands on the best way to go about this?
Local Website Optimization | formandfunctionagency1
-
I've submitted my site to Google Search Console, and only 6 of 89 images have been indexed in 2 weeks. Should I be worried?
I've submitted my site to Google Search Console, and only 6 of 89 images have been indexed in 2 weeks. Should I be worried? My site is http://bayareahomebirth.org. Images are a pretty big part of this site's content and SEO value. Thanks for your help!
Local Website Optimization | mattchew0
-
Is it possible to rank for street name searches?
I am working with a real estate agency who serves a very small geographical area in Dallas, TX. Many areas with Dallas addresses have proper names (e.g. Uptown, Highland Park, Lake Highlands, etc.), but the area my client wants to target is nameless, so we had the idea of trying to target searches for particular street names instead (e.g. homes for sale on easy street). I have looked around quite a bit, but have not found a website that takes that approach. Any thoughts on whether it's possible?
Local Website Optimization | cbizzle0
-
Content Strategy – Blog Channel Questions
We are currently blogging at a high volume to hit keywords for our 1,500 locations across the country. We are trying to make sure we rank well near each location, and we have been using our blog to create content for that reason. With recent changes at Google, I am seeing that it is more about content topics than hitting all variations of your keywords and including state- and city-specific terms. We are now asking ourselves if the blog channel portion of our content strategy is incorrect. Below are some of the main questions we have; any input that is backed by experience would be helpful.

1. Can it hurt us to blog at a high volume (4 blogs per day) in an effort to include all of our keywords and attach them to state- and city-specific keywords (i.e. "keyword one" with "keyword one city" and "keyword one different city")?
2. Is it more valuable to blog only a couple of times per month with deeper content, or more times per month with thinner content but more keyword involvement?
3. Our customers are forced to use our type of product by the government, and we are one of the vendors that provide this service. Because of this, our customers may not care at all about anything we would blog about. Do we blog for them, or do we blog for the keyword, try to reach partners and others who would read the content, and hope that it also ranks us high when our potential customers search?
4. Is there an advantage or disadvantage, or does it even matter, if we have multiple blog authors?

Big questions for sure, but if you have insight on any one of them, please share it and maybe we can answer them all with a group effort. Thanks to all of you who are taking the time to read this and contribute.
Local Website Optimization | Smart_Start0
-
Migrating to new website with new name and new content
Hi, for the past few years I have been running a personal training company from the following domain name: www.smpt.me. This has done well in the past and so has some authority in Google, as it was ranking well on page 1. Over the last 6 months I have set up a new website with some new business partners using the domain name www.healthbyscience.co.uk. This new website, whilst still a personal training website, has different content from the original. We want to use the new website rather than the old one, and therefore my question is how I can use the old website to assist the new one. Thanks
Local Website Optimization | Health-by-Science0
-
Using geolocation for dynamic content - what's the best practice for SEO?
Hello

We sell a product globally, but I want to use different keywords to describe the product based on location. For this example, let's say in the USA the product is a "bathrobe" and in Canada it's a "housecoat" (same product, just a different name). What this means: I want to show "bathrobe" content in the USA (lots of global searches) and "housecoat" content in Canada (fewer searches).

I know I can show the content using a geolocation plugin (I also found a caching plugin which will get around the issue of people seeing cached versions), using JavaScript, or using HTML5. However, I also want someone in Canada searching for "bathrobe" to be able to find our site through Google search. I want to rank for "bathrobe" in BOTH the USA and Canada.

I have read articles which say Google can read the dynamic content in JavaScript, as well as the geolocation plugin. However, the plugins suggest Google crawls the content based on location too. I don't know about JavaScript. Another option is having two separate pages (one for "bathrobe" and one for "housecoat") and using geolocation for the main menu (if someone finds the other page, i.e. the bathrobe page, through a Canadian search, they will still see it). This may have an SEO impact by splitting the traffic, though.

Any suggestions or recommendations on what to do? What do other websites do? I'm a bit stuck. Thank you so much! Laura

P.S. I don't think we have enough traffic to add subdomains or subdirectories.
Local Website Optimization | LauraFalls0
-
Will hreflang eliminate duplicate content issues for a corporate marketing site on 2 different domains?
Basically, I have 2 company websites running. The first resides on a .com and the second resides on a .co.uk domain. The content is simply localized for the UK audience, not necessarily 100% original for the UK. The main website is the .com website but we expanded into the UK, IE and AU markets. However, the .co.uk domain is targeting UK, IE and AU. I am using the hreflang tag for the pages. Will this prevent duplicate content issues? Or should I use 100% new content for the .co.uk website?
Local Website Optimization | QuickToImpress0
-
Multi-Location Business: Should I 301 redirect duplicate location pages, or alternatively nofollow tag them?
Hello all, I have an eCommerce site and we operate out of multiple locations. We currently have individual location pages for these locations in each of our many categories. On the flip side, however, this creates a lot of duplicate content. All of our location pages, whether unique or duplicated, have a unique title tag, H1, H2, and NAP, and they all include the city name. The content on the duplicated pages also includes the city name.

We have been going through our categories and writing unique content for our most popular locations to help us rank in local search. Currently I've been setting up 301 redirects for the locations in the categories with duplicated content, pointing back to the category page. I am wondering whether the increase in the number of 301s will do more harm than having many duplicate location pages. I am sure my site is affected by the Panda algorithm penalty (because of the duplicated content issues); a couple of years ago this didn't matter and we ranked top 3 for pretty much every location, but now we are ranking between 8th and 20th depending on the keyword.

An alternative I thought of, instead of 301ing those location pages with duplicate content, is to put nofollow tags on them instead. What do you think? It's not economically viable to write unique content for every location in every category; it would not only take years but would cost us far too much money. Our site is currently approx. 10,000 pages.

Any thoughts on this greatly appreciated?

Thanks
Pete
Local Website Optimization | PeteC120