
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not removing the content completely (many posts will still be viewable), we have locked both new posts and new replies. More details here.


  • google listing seo website submission

    I have a domain that was unused for a while and displayed the "parked by Gandi" message. Recently I set up a fully working website on that domain, but I'm struggling to get it to rank correctly on Google. Google isn't updating http://domain.com to the correct https://www.domain.com, and I have tried everything. The http://domain.com URL redirects to https://www.domain.com if you click on it. I managed to get the correct version, https://www.domain.com, to show up, but after a couple of days it reverted to the old version and has stayed that way. This has been going on for a few weeks now. Additionally, I had an old version of the site on the .com, which I forwarded to the .co.uk, but the old .com will not be removed from Google; it was just one page without much content. At the moment Google's search results look very messy and out of date. I have submitted the URL in Google Search Console using the URL inspection tool and requested indexing and reindexing. I have also submitted sitemaps, but nothing is really working. Any ideas? (A redirect-check sketch follows this post.)

    SEO Tactics | | Roycd
    0
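    A minimal verification sketch for the post above, assuming Python with the requests library and using domain.com as a stand-in for the real domain: it prints every hop in the redirect chain so you can confirm that the bare http:// URL reaches https://www.domain.com/ in a single 301.

    ```python
    # Trace the redirect chain Google would follow from the bare http:// URL.
    # Ideally the output shows exactly one 301 hop to https://www.domain.com/.
    # "domain.com" is a placeholder for the real domain in the post.
    import requests

    def trace_redirects(url):
        response = requests.get(url, allow_redirects=True, timeout=10)
        for hop in response.history:
            print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
        print(response.status_code, response.url)

    trace_redirects("http://domain.com/")
    ```

    If the chain shows more than one hop (for example http -> https -> https://www.), collapsing it into a single 301 to the https://www. URL can make it easier for Google to settle on that version as the canonical one.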

  • google http https indexation crawl

    The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
    - The robots file is correct (it simply allows all and refers to the https://www. sitemap).
    - The sitemap references the https://www. pages, including the homepage.
    - The hosting provider has confirmed the server is correctly configured to support HTTP/2 and has provided evidence of HTTP/2 access working.
    - 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version.
    - We are not using a CDN or proxy.
    - GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section.
    - GSC also reports the homepage as being crawled every day or so.
    We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go through HTTP/1.1 only, not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERP ... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiple times higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated! (A protocol-check sketch follows this post.)

    Technical SEO | | AKCAC
    1
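    For the post above, a minimal sketch, assuming Python with the httpx library and www.domain.com as a placeholder for the real homepage, that confirms which protocol the server actually negotiates, independently of what the hosting provider reports. As far as Google has documented, Googlebot decides on its own when to crawl a site over HTTP/2 and the protocol used has no ranking effect, so HTTP/1.1 entries in the logs are unlikely to explain the home page's performance.

    ```python
    # Check whether the homepage actually negotiates HTTP/2.
    # Requires: pip install "httpx[http2]"
    # "www.domain.com" is a placeholder for the real homepage in the post.
    import httpx

    with httpx.Client(http2=True) as client:
        response = client.get("https://www.domain.com/")
        print(response.http_version)  # "HTTP/2" if negotiated, otherwise "HTTP/1.1"
        print(response.status_code)
    ```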

  • seo consulting google

    This is my personal website. I wonder: will articles written about artificial intelligence rank on Google, or will the site not rank at all? https://withpositivity.com/

    Community | | lowzy
    0
  • google my business google penalty google (Unsolved)

    We respond to almost 100% of our quote requests, yet every quote email that comes in from Google shows a 27% response rate, and that figure never changes. Has anyone else seen this, or have any insight into it?

    Local Listings | | r1200gsa
    0

  • seo seo rankings rankings google

    We recently launched a new design of our website, and for SEO purposes we decided to have the site in both English and Dutch. However, when I look at the rankings in Moz for many of our keywords, it seems the English pages are being preferred over the Dutch ones. That never used to be the case with the old design. It is mainly for pages that have an English keyword attached to them, but even then the Dutch page used to rank. I'm trying to figure out why the English pages are being preferred now and whether that could actually damage our rankings, since search engines would prefer copy in the local language. An example is this page: https://www.bluebillywig.com/nl/html5-video-player/ for the keywords "HTML5 player" and "HTML5 video player". (An hreflang-check sketch follows this post.)

    Local SEO | | Billywig
    0
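    For the post above, one thing worth checking is whether the English and Dutch URLs declare each other as hreflang alternates, since missing or inconsistent hreflang annotations are a common reason the wrong language version ranks. A minimal standard-library sketch in Python (the URL is the one from the post; whether the site uses hreflang link elements at all is an assumption):

    ```python
    # List the <link rel="alternate" hreflang="..."> annotations a page declares.
    # Run it against both the /nl/ page and its English counterpart and confirm
    # they reference each other with the expected language codes.
    from html.parser import HTMLParser
    from urllib.request import Request, urlopen

    class HreflangParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.alternates = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and attrs.get("rel") == "alternate" and "hreflang" in attrs:
                self.alternates.append((attrs["hreflang"], attrs.get("href")))

    url = "https://www.bluebillywig.com/nl/html5-video-player/"
    request = Request(url, headers={"User-Agent": "Mozilla/5.0"})  # some hosts block the default urllib UA
    parser = HreflangParser()
    parser.feed(urlopen(request).read().decode("utf-8", errors="replace"))
    for hreflang, href in parser.alternates:
        print(hreflang, href)
    ```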

  • duplicate titles multiple titles breadcrumbs google moz

    Moz has been flagging a lot of our articles for duplicate titles. I noticed that the website automatically generates, above each article and its main image, the same words that are used for the title; but those are breadcrumbs. Does anyone know if Google also considers those duplicate titles? I'm not sure if the screenshot I'm uploading will appear, but this is one of the blog posts the extra wording shows up on: https://hometowneautorepairandtireofwoodbridge.com/blog/when-why-replace-spark-plugs/ (A title-check sketch follows this post.)

    Content Development | | ifixcars
    1
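    For the post above, the distinction that matters is that Moz's duplicate-title issue refers to the <title> element, not to visible breadcrumb text on the page. A minimal standard-library sketch in Python (the URL is the blog post from the question) that prints the <title>, <h1> and <h2> texts, which makes it easy to see which element is actually repeating the wording:

    ```python
    # Print the <title>, <h1> and <h2> texts of the page so any overlap between
    # the page title and the breadcrumb-style heading above the article is visible.
    from html.parser import HTMLParser
    from urllib.request import Request, urlopen

    class TitleCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.capture = None  # tag currently being captured
            self.found = []      # [tag, text] pairs in document order

        def handle_starttag(self, tag, attrs):
            if tag in ("title", "h1", "h2"):
                self.capture = tag
                self.found.append([tag, ""])

        def handle_data(self, data):
            if self.capture:
                self.found[-1][1] += data

        def handle_endtag(self, tag):
            if tag == self.capture:
                self.capture = None

    url = "https://hometowneautorepairandtireofwoodbridge.com/blog/when-why-replace-spark-plugs/"
    request = Request(url, headers={"User-Agent": "Mozilla/5.0"})  # some hosts block the default urllib UA
    collector = TitleCollector()
    collector.feed(urlopen(request).read().decode("utf-8", errors="replace"))
    for tag, text in collector.found:
        print(tag, "->", " ".join(text.split()))
    ```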

  • seo page treatment page rank google rankings

    Re: escortdirectory-uk.com, escortdirectory-usa.com, escortdirectory-oz.com.au
    Hi, we are an escort directory with a 10-year history. We have multiple locations within the following countries: UK, USA, AUS. Although many of our location pages (towns and cities) index on page one of Google, just as many do not. Can anyone give us a clue as to why this may be?

    Technical SEO | | ZuricoDrexia
    0
