Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Significant "Average Position" dips in Search Console each time I post on Google My Business
-
Hi everyone,
Several weeks ago I noticed that each Wednesday my site's Average Position in Search Console dipped significantly.
I quickly identified that this was the day my colleague published and built links to a blog post, so we spent the next few weeks testing and monitoring everything we did.
We discovered that it was ONLY when we created a Google My Business post that the Average Position dipped, and on the 1st July we tested it one more time. The results were the same (please see attached image).
I am 100% confident that Google My Business is the cause of the issue, but can't identify why. The image I upload belongs to me, the text isn't spammy or stuffed with keywords, the Learn More links to my own website, and I never receive any warnings from Google about the content.
I would love to hear the community's thoughts on this and how I can stop the issue from continuing. I should note that my Google My Business insights are generally positive, i.e. no dips in search results, etc.
My URL is https://www.photographybymatthewjames.com/
Thanks in advance
Matthew
-
No worries. You often get these weekly dips when viewing the global data; no idea why. However, when you add a country filter it all flattens out. If your question is answered, please mark it accordingly. Hope that helps.
-
Actually, no. Those big dips each Wednesday don't appear when I select Denmark. So I guess this makes much more sense given that I am only targeting a local market such as Denmark.
Thanks for making this clear...
-
Ta, but do you get the big dip now when Denmark is applied?
-
Thanks again.
No, the country filter wasn't set, so it was showing an average position for all countries.
The site is optimised as well as possible, as far as I am aware.
-
Apologies, my question was unclear: when you extracted your position data from Search Console, did you limit it to Denmark? There is a filter in the navigation that lets you limit results to a country. Can you let me know whether the positional changes we are discussing come from that data set?
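For reference, if you pull this data via the Search Console API rather than the UI, the same country restriction is applied with a dimension filter. A minimal sketch of the request body (the dates are placeholders; `dnk` is the ISO 3166-1 alpha-3 code for Denmark):

```json
{
  "startDate": "2019-06-01",
  "endDate": "2019-07-01",
  "dimensions": ["query"],
  "dimensionFilterGroups": [
    {
      "filters": [
        {
          "dimension": "country",
          "operator": "equals",
          "expression": "dnk"
        }
      ]
    }
  ]
}
```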
A position of 0 is worse than 196: 0 means you're not ranking at all, whereas 196 means you are ranking, just at position 196.
Those are very low rankings. Is the site optimised from a title tag, meta description, and H1 perspective?
Check out the URL structure component:
https://moz.com/learn/seo/on-page-factors
Also see excerpt below:
An Ideally Optimized Web Page
An ideal web page should do all of the following:
- Be hyper-relevant to a specific topic (usually a product or single object)
- Include subject in title tag
- Include subject in URL
- Include subject in image alt text
- Specify subject several times throughout text content
- Provide unique content about a given subject
- Link back to its category page
- Link back to its subcategory page (If applicable)
- Link back to its homepage (normally accomplished with an image link showing the website logo on the top left of a page)
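As a rough illustration, a page ticking those boxes might be skeletoned like this (all names, URLs, and copy below are hypothetical):

```html
<!-- Hypothetical page for the topic "event photography copenhagen" -->
<html>
<head>
  <title>Event Photography in Copenhagen | Example Studio</title>
  <meta name="description" content="Professional event photography in Copenhagen for conferences, parties and corporate events.">
</head>
<body>
  <!-- Logo image link back to the homepage, top left -->
  <a href="https://www.example.com/"><img src="/logo.png" alt="Example Studio logo"></a>
  <h1>Event Photography in Copenhagen</h1>
  <!-- Subject in image alt text -->
  <img src="/event-photo.jpg" alt="Event photography at a Copenhagen conference">
  <p>Unique content about event photography, mentioning the subject several times...</p>
  <!-- Links back to category and subcategory pages -->
  <a href="https://www.example.com/photography/">Photography</a>
  <a href="https://www.example.com/photography/events/">Event Photography</a>
</body>
</html>
```

The subject would also appear in the URL itself, e.g. `/photography/events/event-photography-copenhagen`.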
Hope that helps.
-
Hi, and thanks for the fast response.
Yes, I have specifically been targeting Denmark for the past 3-4 years (I set this up in the old Webmaster Tools and it still appears the same). So in short, I am targeting one country.
As for queries: yes, there are significant drops for certain customer queries. The left-hand column with the highest number shows the average position on the day I posted on Google My Business vs. the day before, e.g. "event photographer" went from an average position of 0 to 196.
Does this make things clearer in any way?
| Query | Avg. position (day of GMB post) | Avg. position (day before) | Change |
|---|---|---|---|
| event photographer | 196 | 0 | 196 |
| event photography | 193 | 0 | 193 |
| sports photographer | 188.3 | 0 | 188.3 |
| sports photography | 182 | 0 | 182 |
| james harrison | 101 | 0 | 101 |
| photoshoot københavn | 98 | 0 | 98 |
| copenhagen photos | 97 | 0 | 97 |
| danish courses in copenhagen | 96 | 0 | 96 |
| dhl stafet københavn | 92 | 0 | 92 |
| cph business | 91.5 | 0 | 91.5 |
-
Hi
On Search Console, have you limited that to a country, or is it "countryless"? If countryless, weekly dips like that are not unusual.
Can you clarify? Ideally, limit it to the country you are targeting.
Also, on top of the overall position dips, are there any corresponding dips for major customer queries you are tracking?
Regards
Related Questions
-
Should I Report an SEO Agency to Google?
Our competitor has employed the services of a spammy SEO agency that sends spammy links to our site. Though our rankings were affected, we have taken the necessary steps. Is it possible to send evidence to Google so that they can take down the site? I want to take this action so that other sites will not be affected by them again.
White Hat / Black Hat SEO | Halmblogmusic
-
What to do with internal spam URLs Google has indexed?
I have been in SEO for years but have never encountered this problem. A client's web page was hacked, and hundreds of links were posted on it. These links have been indexed by Google. These links are not in comments but are normal external URLs (see picture). What is the best way to remove them: use the Google disavow tool, or just redirect them to some page? The site is new, but it ranks well on Google and has a domain authority of 24. I suspect these spam URLs even improved rankings 🙂 What would be the best strategy to solve this? Thanks.
White Hat / Black Hat SEO | AndrisZigurs
-
Do Google and other search engines crawl meta tags if we set them with React.js?
We have a site with only one URL; all other pages are components of it, not different pages. Whichever page we click, React.js renders it, and the meta title and meta description change accordingly. Will using React.js this way be good or bad for SEO? Website: http://www.mantistechnologies.com/
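Not an answer on crawlability per se, but to make the setup concrete: in a single-URL React-style app, the meta values are usually derived from the client-side route and swapped in the browser, so a crawler that does not execute JavaScript only ever sees the initial HTML. A minimal sketch (the route table and copy below are hypothetical):

```javascript
// Each client-side route maps to a component plus a title/description pair;
// the values are swapped in the browser as the user navigates.
const ROUTE_META = {
  '/': { title: 'Home', description: 'Welcome to our site' },
  '/services': { title: 'Services', description: 'What we offer' },
};

function metaForRoute(path) {
  // Unknown routes fall back to the home page meta
  return ROUTE_META[path] || ROUTE_META['/'];
}

// In the browser this would run on every route change, e.g.:
//   document.title = metaForRoute(window.location.pathname).title;
```

The key SEO question is whether the crawler executes this JavaScript; if it doesn't, every "page" presents the same initial meta tags.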
White Hat / Black Hat SEO | RobinJA
-
Does Google want contact numbers in the meta description?!
Reading up, it seems like there's complete free rein to enter what you want in the meta description, and descriptions are not considered a direct ranking signal. However, I added contact numbers to the meta descriptions of around 20 reasonably high-ranking pages for my company, and it seems to have had a negative effect (I have taken screen grabs and recorded the previous rankings). More strangely, when you 'inspect' the page, the meta description features the desired number, yet when you find the page in the SERPs, the displayed description does not feature the number (the page has been cached and the description does not carry over). I'm wondering whether such direct changes are seen as spam and are therefore negative for the page?
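For context, the change being described is just the content of the standard description tag; the number and copy below are placeholders:

```html
<meta name="description" content="Example Company products and services. Call us on 01234 567890 for a free quote.">
```

Note that what Google shows in the SERP is its own choice; it may rewrite or truncate the description regardless of what the tag contains.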
White Hat / Black Hat SEO | Jacksons_Fencing
-
How does Google know if rich snippet reviews are fake?
According to https://developers.google.com/structured-data/rich-snippets/reviews, all someone has to do is add some HTML code and write the review. Does Google do any validation on whether these reviews are legitimate?
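For context, the markup in question is ordinary structured data that anyone can add. A minimal JSON-LD sketch of the kind of review markup the linked page describes (product name and values are hypothetical), which illustrates that nothing in the format itself proves the review happened:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "review": {
    "@type": "Review",
    "reviewRating": { "@type": "Rating", "ratingValue": "5" },
    "author": { "@type": "Person", "name": "Jane Doe" }
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "24"
  }
}
```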
White Hat / Black Hat SEO | wlingke
-
Avoiding the "sorry we have no imagery here" G-maps error
Hi there, we recently did a redesign on a big site and added Gmaps locations to almost every page since we are related to Real State, Listings, Details, search results all have a map embedded. While looking at GWT I found that the top keywords on our site (which is in spanish) are the following. have here imagery sorry After a quick search I found out this is a Gmaps bug, when Google Bot accesses the Pages it throws an error out with this text repeated several times. If you do a search for "sorry we have no imagery here" you will see lots of sites with this issue. My question is, Is this affecting the overall SEO since Bots are actually crawling and indexing this hence its being reported by GWT, Should I cloak this to robots? Has anyone noticed this or has been able to fix it? Thanks in advance!
White Hat / Black Hat SEO | makote
-
Website not listed in Google; Screaming Frog shows a 500 error. What could the issue be?
Hey, http://www.interconnect.org.uk/ seems to load fine, but for some reason the site is not getting indexed. I ran the site through Screaming Frog, and it returns a 500 error code, which suggests the crawler can't access the site. I'm guessing Google is having the same problem. Do you have any ideas as to why this may be and how I can rectify it? Thanks, Andrew
White Hat / Black Hat SEO | Heehaw
-
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community,
Our portfolio of around 15 internationalized web pages received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have taken many measures to lift the penalty, including reconsideration requests, without luck, and I am now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try to lift the penalty.
As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports-betting and casino sites positioned on the page.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached; the same behavior can be observed across domains.
Our questions are:
1. Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all the pages registered in Google Webmaster Tools?
2. What is the most likely cause of our penalty, given the background information? Given that the drops started in November 2010, we doubt the Panda updates had any correlation to this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we move to two new domains and forward all content as 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated. SEOmoz rocks. /T
White Hat / Black Hat SEO | tomypro