[Moderator deleted question.]
-
Andy is correct, we do not allow job postings in this forum. Otherwise, it would likely look much more like an SEO job board than an SEO discussion forum.
Thanks for your understanding, Missionunpossible. You may have some luck posting on the job board at Inbound.org. We also maintain a list of Moz-recommended SEO consultants here, and we wish you the best of luck in your search!
-
Oops, sorry, I didn't know.
-
Moz doesn't allow these sorts of posts, I'm afraid.
-Andy
Related Questions
-
Homepage organization schema question: the logo lives on an Amazon server; can I reference it in the structured data?
Basically, the homepage organization schema calls out the logo, but the file lives on an Amazon S3 server. We're having issues with Google rendering the correct logo in the Knowledge Graph. The URL for the Amazon asset looks something like this: <brandname>-assets.s3-us-west-2.amazonaws.com/<logo>.png. Is it okay to reference that URL as the logo in the Organization structured data?
Intermediate & Advanced SEO | | imjonny1230 -
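A minimal sketch of the Organization markup the question describes, built in Python so the JSON-LD can be validated before embedding. The brand name, domain, and bucket URL below are placeholders, not the asker's real values; the key point is that the logo URL only needs to be crawlable, so a cross-domain S3 URL is generally fine:

```python
import json

# Hypothetical values standing in for the asker's real brand and bucket.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "ExampleBrand",
    "url": "https://www.examplebrand.com/",
    # The logo may live on any crawlable host, including an S3 bucket.
    "logo": "https://examplebrand-assets.s3-us-west-2.amazonaws.com/logo.png",
}

# Serialize for embedding in a <script type="application/ld+json"> tag.
json_ld = json.dumps(organization, indent=2)
print(json_ld)
```

Round-tripping through `json.dumps`/`json.loads` is a cheap sanity check before pasting the markup into a template.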
Google Search Console Site Property Questions
I have a few questions regarding Google Search Console, which tells you to add all versions of your website: https, http, www, and non-www.
1. Do I then add ALL the information (sitemaps, preferred site, etc.) for ALL versions?
2. If yes, when I add sitemaps to each version, do I add the sitemap URL of the site version I'm on, or of my preferred version? For instance, when adding a sitemap to the non-www version of the site, do I use the non-www version of the sitemap? Or, since I prefer https://www.domain.com/sitemap.xml, do I use it there?
3. When setting my preferred site (www or non-www), do I use my preferred site on all site versions (https, http, www, and non-www)?
Thanks in advance. Answers vary throughout Google!
Intermediate & Advanced SEO | | Mike.Bean0 -
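One way to reason about the four property versions is that, site-wide, every variant should resolve (via 301) to a single preferred origin, and the sitemap you submit should use that origin's URLs. A small sketch of that normalization, assuming https://www.domain.com is the preferred version (the domain is a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_SCHEME = "https"
PREFERRED_HOST = "www.domain.com"  # placeholder for the real preferred host

def to_preferred(url: str) -> str:
    """Map any http/https, www/non-www variant onto the preferred origin."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host in ("domain.com", "www.domain.com"):
        host = PREFERRED_HOST
    return urlunsplit((PREFERRED_SCHEME, host, parts.path, parts.query, parts.fragment))

# All four property variants collapse to one canonical sitemap URL.
variants = [
    "http://domain.com/sitemap.xml",
    "http://www.domain.com/sitemap.xml",
    "https://domain.com/sitemap.xml",
    "https://www.domain.com/sitemap.xml",
]
canonical = {to_preferred(u) for u in variants}
print(canonical)  # one entry: https://www.domain.com/sitemap.xml
```

If the redirects implement this mapping, submitting the preferred-version sitemap in the preferred property is the part that matters; the other properties mainly exist so their reports are visible.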
Deleting Outdated News Pages??
Hi everyone, I'm currently doing a full content audit for my company in preparation for a website redesign. I've discovered thousands of pages (dating all the way back to 2009) with thin, outdated, and irrelevant content, e.g. real estate news and predictions that are now super old news. According to analytics, these older pages aren't receiving any traffic, so I think the best course of action is to delete these pages and let them return 404s (or 410s).
In my opinion, this should be a big priority, because these pages are likely already hurting our domain authority to some extent, and it's just a matter of time before we're really penalized by Google. Some members of my team have a different opinion: they worry that deleting 1,000 pages could hurt our rankings, and they want to wait and discuss the issue further in Q3 or Q4 (once the site redesign is completed and we have time to focus on it).
Am I wrong to think that waiting is a very bad idea? Google will notice that we've done a major site redesign (we've written all new copy, optimized the UX and content organization to make info easier to find, created new lead magnets, optimized images, etc.), but we didn't bother to update 1,000 pages of outdated content that no one is looking at... won't that look bad? Do you agree that we should delete/merge all outdated content now, rather than waiting until after the site redesign? Or am I overreacting? Thanks so much for your help!
Intermediate & Advanced SEO | | JCon7110 -
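The audit step described above, flagging old pages that receive no traffic, can be sketched as a simple filter over an analytics export. The rows, field names, and cutoff date here are illustrative assumptions, not the asker's real data:

```python
from datetime import date

# Hypothetical analytics rows: (URL, publish date, sessions in the last 12 months).
pages = [
    ("/news/2009-market-predictions", date(2009, 3, 1), 0),
    ("/news/2016-rate-update", date(2016, 6, 1), 42),
    ("/guides/buying-a-home", date(2012, 1, 1), 1800),
]

CUTOFF = date(2014, 1, 1)  # illustrative "outdated" threshold

def should_remove(published: date, sessions: int) -> bool:
    """Flag old pages with zero traffic for removal (then serve 404/410 for them)."""
    return published < CUTOFF and sessions == 0

to_remove = [url for url, published, sessions in pages if should_remove(published, sessions)]
print(to_remove)  # ['/news/2009-market-predictions']
```

A filter like this keeps the deletion list objective: old pages that still draw traffic (like the 2012 guide above) survive the cut even though they predate the cutoff.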
Google PR & OSE DA/PA Question
Hey Moz Community, can anyone explain why a website would have a PR4 home page and mostly PR3 inner pages, with only a DA of 12 and PA of 14 from OSE? The website in question is my Rotary club's: http://carymacgregorrotary.org. Thank you.
Intermediate & Advanced SEO | | WhiteboardCreations
Patrick0 -
Technical Site Questions
When I do a Google cache of our site, I see 2 menus. Our developers say that's because the 2nd is the mobile menu; is that correct? When I look at other sites with mobile rendering, they only have one menu visible. Also, GWT shows the number of internal links per page at at least 2x what it should be; are the two issues connected? Secondly, when I run a spider test through http://tools.seobook.com/general/spider-test/ it shows all the "behind the scenes" text, e.g. font names, portals, sliders, margins; "font size px" is shown 17 times with a density of 2.15%. Surely this isn't correct, as Google will think these are my keywords!? My site is www.over50choices.co.uk. Thanks, Ash
Intermediate & Advanced SEO | | AshShep10 -
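The spider-test output described above usually means CSS or JavaScript sits inline in the HTML rather than in external files, so a naive text extractor counts it as page copy; real search crawlers skip style and script content. A small stdlib sketch of an extractor that does the same (the sample HTML is invented to mirror the two-menu situation):

```python
from html.parser import HTMLParser

class VisibleTextParser(HTMLParser):
    """Collect text content while ignoring <style> and <script> blocks."""
    SKIP = {"style", "script"}

    def __init__(self):
        super().__init__()
        self.depth = 0    # nesting depth inside skipped tags
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

html = """<html><head><style>.menu { font-size: 17px; }</style></head>
<body><nav>Main menu</nav><nav>Mobile menu</nav><p>Over 50s choices.</p></body></html>"""

parser = VisibleTextParser()
parser.feed(html)
print(parser.chunks)  # style rules excluded; both menus still count as text
```

Note the second effect the sketch shows: even with styles excluded, a desktop menu and a mobile menu both contribute their link text, which is consistent with GWT reporting roughly double the internal links per page.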
XML sitemaps questions
Hi All, my developer has asked me some questions that I do not know the answer to. We have both searched for an answer but can't find one... so I was hoping the clever folk on Moz can help! Here are a couple of questions it would be nice to clarify: 1. What is the actual address/filename for a news XML sitemap? 2. Can XML sitemaps be generated on request? Consider the following scenario: a spider requests http://mypage.com/sitemap.xml, which permanently redirects to the extensionless MVC 4 page http://mypage.com/sitemapxml/. This page generates the XML. Thank you, Amelia
Intermediate & Advanced SEO | | CommT0 -
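On the second question: a sitemap generated on request just needs to return well-formed XML in the sitemaps.org namespace; whether it comes from a static file or an extensionless route is invisible to crawlers, provided the redirect is a 301. A minimal sketch of building the document on demand, as an MVC route handler might (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Generate a <urlset> document on the fly from a list of page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages standing in for whatever the MVC page enumerates.
xml_out = build_sitemap([
    "http://mypage.com/",
    "http://mypage.com/about/",
])
print(xml_out)
```

The handler would serve this string with a `Content-Type: application/xml` header; for news specifically, Google's news sitemap is an extension of this same format with an additional namespace rather than a fixed filename.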
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot, so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content and confirm it was indeed removed (as opposed to just recrawling the site and not finding the content anywhere).
This really made lots of sense to me and also struck a personal chord... Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the steps below:
Intermediate & Advanced SEO | | Eric_R
1. We cut the pages
2. We set up permanent 301 redirects for all of them immediately.
3. And at the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages). When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way...
I see that this is basically the exact opposite of Dr. Pete's advice, and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of it? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point, because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage?
Thank you in advance for your help,
Eric1 -
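The distinction the thread turns on, 301 versus 410, can be sketched as a routing decision: URLs removed with no replacement return an explicit 410 Gone, so a recrawl can confirm the removal, while only pages with a genuine successor get a 301. A minimal illustration with invented paths:

```python
# Hypothetical sets standing in for a real site's URL inventory.
REMOVED = {"/thin-article-1", "/thin-article-2"}   # cut with no replacement
MERGED = {"/old-guide": "/new-guide"}              # cut, content moved elsewhere

def status_for(path: str):
    """Return the (status code, redirect target) the server should emit for a path."""
    if path in REMOVED:
        return 410, None          # explicit "gone": a recrawl confirms the removal
    if path in MERGED:
        return 301, MERGED[path]  # permanent redirect: points to the successor page
    return 200, None

print(status_for("/thin-article-1"))  # (410, None)
print(status_for("/old-guide"))       # (301, '/new-guide')
```

Under this scheme, blanket-301ing every cut page (the approach described in the question) blurs the two cases, which is exactly why the thread's advice was to let truly removed pages 410.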
Diversifying anchor text question
Hi, I've seen a new article by Dr. Pete on diversifying links for 2013 (http://www.seomoz.org/blog/top-1-seo-tips-for-2013). Now my question is this: Dr. Pete talks about mixing up the anchor text for links. Is that just so we don't get caught out by Google, or does mixing it actually have a better impact? For example: 1. 20 anchor text links targeting just the target term. 2. 20 anchor text links targeting 4 variations of the target term. Is number 2 recommended only so things look natural, or does it actually have a better impact on SEO? Thanks
Intermediate & Advanced SEO | | activitysuper0