Posts made by R0bin_L0rd
-
RE: My site is showing indexed in search console but not appearing in Serps
Heya, when you say "not appearing" do you mean you're searching for keywords you hope to show up for? Or are you searching for the URLs of your site specifically? If you write "inurl:the-url-of-your-page" into Google - what comes up?
-
RE: H1 text and Header Image Overlap?
I think seoelevated has your answer - if you can overlay the text on the image as actual text that should tick all the boxes for you
-
RE: H1 text and Header Image Overlap?
Good question! So you're saying that the text is part of the image? Not written on the page? In that case definitely put it in the text of the page - Google can't read text out of images, so you don't need to worry about repeating yourself, and you'd be showing Google important content for the first time!
-
RE: What to do with PDFs that rank well?
Sounds like a smart plan to me!
-
RE: What to do with PDFs that rank well?
Hi there, good question! I've run into a similar situation and had all the same concerns as you. I think my recommendations would be (in order from most preferred to least preferred)
- Convert the PDF into a full HTML page with internal links (keeping the ability to download it as a PDF), then redirect the PDF URL to the new page
- Replace the PDF with an HTML page which has the PDF as the content but also contains things like the internal nav (sometimes this is the only possibility with limited resource)
- Create a summary page and link to the PDF, with a canonical on the PDF pointing back to that page (just as you say)
Making the content of the PDF more easily readable by search engines (by turning it into an HTML page as in solution 1) would hopefully boost ranking potential as well as helping give users a better experience.
The only reason I put your solution as lowest on the list is because if we're creating a page then we may as well create the full page, rather than creating a page which kind of gives the same information as the PDF but still relies on the PDF existing. You could find that the two pages compete and neither ends up ranking.
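On the canonical point in option 3 - a PDF can't hold a canonical tag itself, so the canonical has to be sent as an HTTP header on the PDF response. Just as a rough sketch of what that could look like (Flask-style, with made-up paths - most servers can also do this with a simple header rule):

```python
# Rough sketch: serve the PDF with a rel="canonical" HTTP header pointing at the
# summary/HTML page. The file path and URLs here are invented for illustration.
from flask import Flask, send_file

app = Flask(__name__)

@app.route("/downloads/widget-guide.pdf")
def widget_guide_pdf():
    response = send_file("static/widget-guide.pdf")
    response.headers["Link"] = '<https://www.example.com/guides/widget-guide/>; rel="canonical"'
    return response
```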
Hopefully that helps!
-
RE: Minor languages keyword research
Hi there Luca, unfortunately sometimes search terms just have low search volume, particularly in less-spoken languages, and it can be difficult to get around.
Sometimes that's a sign that the term isn't used in that language (maybe the direct translation doesn't make sense or there's some cultural reason that people in the country don't search for a specific thing). Sometimes it can also be a sign that, even if it's the right term, optimising for that language won't offer much return on investment so I would pay some attention to the fact that search volumes are low.
That said, there are other sources of info (which are more or less dependent on the same data that you're getting from Google Ads); here's the list of tools I try to hit up:
- Search Console if you're already ranking SOMEWHERE for some of these terms
- SEMRush
- AHRefs
- SEOmonitor
Good luck!
-
RE: Moz crawling doesn't show all of my Backlinks
Hi there, unfortunately we can't really rely on any one tool to know all of our backlinks, even Google!
When someone links to your site, they could be doing it from the New York Times, or they could be doing it from a tiny site which they just created. The only way any tool (Google, Moz etc.) can get a list of backlinks is by continually accessing a page, finding every link on that page, going to all the linked pages, finding all the links on those pages etc.
With something like the New York Times, these tools know the site, they know to keep coming back to check every so often, so it's more likely they identify the link going to your site. With a tiny site someone has just created - it's much harder for these tools to even know the site exists, never mind know the specific page on that site which is linking to yours. Even if a tool has a massive database of the internet, there are trillions of web pages and no one database will have a record of them all.
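To make that concrete, this is roughly the loop every one of these tools runs at enormous scale - fetch a page, collect its links, queue up the linked pages, repeat. It's only an illustration (using the requests and BeautifulSoup libraries; real crawlers also handle robots.txt, politeness, rendering and much more):

```python
# Simplified link discovery: fetch a page, record its outgoing links,
# then visit those pages and repeat. Backlink tools do this at huge scale.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def discover_links(start_url, max_pages=50):
    seen, queue = {start_url}, deque([start_url])
    found_links = []  # (linking page, linked page) pairs
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # pages the crawler can't reach never get their links recorded
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            found_links.append((url, link))
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return found_links
```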
The next best thing is for us to use as many tools as we can to get as complete a picture of the issue as we can. I often try to check;
- Moz
- Google Search Console (this will only give you the first 1K)
- AHrefs
- SEMRush
- Majestic SEO.
You'll need to dedupe the different lists but, between them, they should stand a better chance of finding backlinks to your site. If they don't find anything then it's likely that the pages linking to your site are either quite out-of-the-way or they are in some way blocked from crawlers.
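The dedupe step doesn't need to be fancy - something like the sketch below works: export the "linking page" column from each tool as a CSV and combine them. The filenames and column name here are placeholders; adjust them to whatever each export actually uses.

```python
# Rough sketch: combine backlink exports from several tools and dedupe by linking URL.
# The filenames and the "source_url" column are placeholders for the real exports.
import csv

EXPORTS = ["moz.csv", "search_console.csv", "ahrefs.csv", "semrush.csv", "majestic.csv"]

unique_links = set()
for path in EXPORTS:
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = (row.get("source_url") or "").strip().lower().rstrip("/")
            if url:
                unique_links.add(url)

print(f"{len(unique_links)} unique linking pages across all tools")
```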
In terms of tracking keywords, Moz does allow you to include a keyword list which it'll give you ranking information for over time; you can also track keywords in Stat (search "getstat"), and SEMRush and AHRefs will give you ranking information too.
Hope that helps!
-
RE: An immediate and long-term plan for expired Events?
Great! Happy to help
-
RE: An immediate and long-term plan for expired Events?
To be honest it sounds like you already have your plan.
One thing I'd bear in mind is that a crawl you run of your site won't line up with the pages that Google is visiting. For one thing, the tools we use try to approximate Google but won't be exactly the same. More importantly, once Google knows of a page it'll come back and check on it to see if the content has changed - the only way you'll see that is by looking at your log files.
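If you do want to see what Google keeps coming back to, a quick pass over your access logs is usually enough. A rough sketch, assuming a fairly standard log format - you'd also want to verify that hits claiming to be Googlebot really are (Google publishes its crawler IP ranges):

```python
# Rough sketch: count requests per URL from lines that identify as Googlebot,
# assuming a standard combined access log. Verify the hits against Google's IP ranges.
import re
from collections import Counter

pattern = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')
hits = Counter()

with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" in line:
            match = pattern.search(line)
            if match:
                hits[match.group(1)] += 1

for url, count in hits.most_common(20):
    print(count, url)
```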
Yeah, there's no point making it "noindex, follow" - it's not that Google doesn't know what to do with the page, it's just that its attitude to the page will change over time.
In terms of the large number of redirects, there is some risk that Google could see the large number of 301s as spammy but, to be honest, I've never directly seen evidence of that being a problem. The way I see it, the choice is fairly simple - you could:
- 404/410: that's the way the internet is meant to work when something no longer exists, but you'll lose link equity.
- 301: preserves link equity, but you're essentially misusing the status code.
- Do a monthly check: 301 any expired pages with discovered backlinks and 410 the rest. This is the best of both worlds but is much more time consuming.
I think you can probably get away with the 301s but it all comes down to your appetite for risk.
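If you did go the monthly-check route, it doesn't need to be much more than the sketch below: take the list of expired event URLs, check them against whichever backlink export you trust, and split them into a 301 list and a 410 list. The filenames and column name here are invented for illustration.

```python
# Hypothetical monthly check: 301 expired event pages that have backlinks, 410 the rest.
# expired_events.txt (one URL per line) and backlinks.csv are placeholder inputs.
import csv

with open("expired_events.txt", encoding="utf-8") as f:
    expired = {line.strip() for line in f if line.strip()}

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    linked_to = {row["target_url"].strip() for row in csv.DictReader(f)}

redirect_301 = sorted(expired & linked_to)  # worth keeping the equity
gone_410 = sorted(expired - linked_to)      # let these drop out cleanly

print(f"{len(redirect_301)} URLs to 301, {len(gone_410)} URLs to 410")
```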
Good luck!
-
RE: An immediate and long-term plan for expired Events?
Hi there, thanks for posting!
I think my main question here is around the decision to not 404 or 301 these pages. I totally understand that you want to reduce the number of indexed pages which aren't providing value but also don't want to lose equity. I know you mention you're not super technical so I'm going to break down how I expect link equity to be passed around a site and therefore how I expect each of these techniques to impact the page.
Equity is passed from page to page via links, so these events pages will pass equity to other pages on your site. That works by Google having a record of the page and the equity of that page, then distributing that equity through the links it can follow. Google representatives have said recently that, after a period of time, noindex pages are treated as noindex nofollow, at which point we can't rely on equity being passed along any of the outbound links from these pages.
- noindex: removes the page from the index; after a period of time no equity will be passed from the noindexed page. Initially Google will continue to crawl the page but that will reduce over time.
- 404: the page doesn't exist so will be removed from the index after a period of time. No equity will be passed from the page. Google should stop crawling the page fairly quickly.
- 410: more definitive than 404. The page should drop out of the index more quickly. No equity will be passed from the page. Google should stop crawling the page fairly quickly.
- 301: we're telling Google that this address is no good any more and it should instead look at a different address. Again, the redirected page should drop out of the index and some proportion of the redirected page's equity should be transferred to the target page. Google should stop crawling the page more quickly than the noindexed version but probably not as quickly as the 404/410.
Based on all that I don't think noindex is necessarily your best option. You'll still have a bunch of defunct pages, which Google may still spend time crawling, and you can't rely on them passing equity.
A custom 404/410 page explaining to users that the event has passed is probably a pretty good user experience and would be the most expected behaviour for a situation where content isn't there any more, but won't help you with equity.
I think what you could do is automatically 301 redirect to a relevant category page with a pop-up message that explains to users what's happened. Doesn't sound like you expect the event pages to pop in and out of existence so the logic should be fairly simple.
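As a very rough sketch of that logic (Flask-style, with a made-up event lookup - your CMS will expose events differently): once the event's end date has passed, return a 301 to the category page with a flag the category template can use to show the explanatory message.

```python
# Sketch only: once an event has finished, permanently redirect its URL to the
# relevant category page, passing a flag the template can use to explain why.
# The EVENTS lookup stands in for however your CMS actually stores events.
from datetime import date
from flask import Flask, redirect

app = Flask(__name__)

EVENTS = {"summer-fair-2019": (date(2019, 7, 14), "fairs")}  # slug -> (end date, category)

@app.route("/events/<slug>")
def event_page(slug):
    if slug not in EVENTS:
        return ("Not found", 404)
    end_date, category = EVENTS[slug]
    if end_date < date.today():
        return redirect(f"/categories/{category}?expired={slug}", code=301)
    return f"Live event page for {slug}"  # normal rendering while the event is still on
```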
Hope that helps!
-
RE: Why is this site ranked #1 in Google with such a low DA (is DA not important anymore?)
I agree with the other answers on the following points;
- Moz DA is not a Google metric
- In territories like Thailand and spaces related to drugs, Google tends to have less of an idea of what is spammy or not
- On-page optimisation across your site is a great way to improve your rankings
- Giving your users the best possible experience is a great way to get long-term success.
However, I wouldn't go as far as to say that Google pays no attention whatsoever to links. I think you can treat these findings as frustrating but informative - from what you are seeing, that site having a worse backlink profile is not preventing them from ranking. So for those searches, backlinks don't seem to be as big a deciding factor. From there I'd start identifying what factors are correlated with rankings and, provided those things are above-board and won't come back to bite you (here I mean: don't buy links etc.), try emulating that stuff while you continue to build and improve your own site.
My new favourite phrase is "You are the Igniterman"
-
RE: Query for paginated URLs - Shopify
I have a slightly different perspective here, based on one core assumption so feel free to tell me if this is off the mark - **I am assuming you want the products you are linking to on deeper paginated pages to still be found by Google so that they can rank.**
Google has said that noindexed URLs are, over time, treated as noindex nofollow. Likewise, if all of the deeper paginated pages are canonicalised to the first page, Google may not pass authority down to each of them. Pagination is common across the web; unless you are seeing massive conflict problems (which would be unusual) I would not robots-block them, noindex them, or canonicalise them. I'd just leave them as they are and trust Google to figure it out until you have evidence that it is causing problems on your site specifically.
Hope that helps!
-
RE: SEO - New URL structure
Hi there! There seems to be a bit of confusion in this thread between URL structure and Information Architecture. Having more folders in a URL doesn't reduce authority, but pages with more folders in the URL tend to be deeper in the site's linking architecture, which means they tend to have less authority because they aren't as close to the surface. The difference between internal links and URL format is an important one. There's a blog post here which explains in more depth.
From my perspective, here are the benefits of having pages within folders;
- There is an opportunity to put more relevant keywords in the URL without stuffing
- Easier folder-level reporting in Google Analytics, Search Console etc.
- Some increased understanding for Google of how pages hang together - there is some evidence, for example, that Google uses folder structure for ranking before it knows much about the page.
In terms of managing authority for pages and signals of relevance I'd be looking much more towards the internal linking to those pages. I wouldn't rely on Google intuitively understanding the topical connection between two pages unless both of those pages target that topic or have relevant links between them. So for example, say you have two pages: /widgets and /doodads.
If those pages are both subcategories of trinkets you could reformat them to be /trinkets/widgets and /trinkets/doodads.
Having "trinkets" in the URL might help both pages rank for "trinkets" type keywords, like "doodad trinkets" for example. However, I wouldn't rely on this change to help Google understand that widgets are related to doodads - you can handle that much more effectively with relevant internal links between /widgets and /doodads that make the relation clear.
In terms of whether there is a risk to making this change - this is essentially a migration and definitely comes with associated risks, even if all of your redirects are 1:1 and direct. It'll take time for Google to find the redirects and new pages, and, as a rule of thumb, link equity isn't passed perfectly along a 301 redirect, so I wouldn't expect these new pages to just inherit the strength of the old ones.
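If you do go ahead, most of the work is generating a clean 1:1 mapping up front and turning it into redirect rules. A minimal sketch with invented paths (the output could feed nginx/Apache rules or your platform's redirect manager):

```python
# Sketch: build a 1:1 redirect map for moving pages under their new parent folder.
# The paths are invented for illustration.
OLD_TO_PARENT = {
    "/widgets": "/trinkets",
    "/doodads": "/trinkets",
}

redirect_map = {old: f"{parent}{old}" for old, parent in OLD_TO_PARENT.items()}
# e.g. {"/widgets": "/trinkets/widgets", "/doodads": "/trinkets/doodads"}

for old, new in redirect_map.items():
    print(f"{old} -> {new}  (301, single hop)")
```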
I think it comes down to weighing up whether the benefits I listed above outweigh the risk of an in-site migration. If you think the keyword targeting opportunities will make enough of a difference then great but I wouldn't rely on url structure as a way to get Google to understand your site differently - the impact of internal links is going to be a far greater factor.
-
RE: Traffic inconsistency
I agree with Effectdigital - the best method is to go to the Acquisition section and look at the data by source and medium. As well as confirming whether you are getting organic traffic, it means you can confirm where you are getting traffic from if it isn't from Google.
In terms of your keywords question, I couldn't say for certain why those tools aren't returning keywords, but what do you see if you load your site with JavaScript switched off? Sometimes JavaScript-reliant sites can mean that tools like the ones you describe can't quickly pull content to get suggestions. Couple that with not ranking for terms that these tools may have already picked up and that could lead to what you're seeing. For what it's worth, if that is the cause I'd consider server-side rendering - the easier you can make it for machines to read your content, the better.
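One quick way to see what a non-JavaScript visitor (or a simple tool) gets is just to fetch the raw HTML and look for the content you care about. A rough check, with a placeholder URL and phrase:

```python
# Rough check: fetch the raw HTML (no JavaScript executed) and see whether
# key content is present. The URL and phrase are placeholders.
import requests

url = "https://www.example.com/some-page/"
phrase = "a headline you expect to see on the page"

html = requests.get(url, timeout=10).text
if phrase.lower() in html.lower():
    print("Content is in the raw HTML - simple tools should be able to read it.")
else:
    print("Content isn't in the raw HTML - it's probably injected by JavaScript.")
```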
Hope that helps.
-
RE: Where are these "phantom visitors" and are they dangerous?
Yep, I agree with Martijn Scheijbeler, I'm marking this question as "Answered" but not closing it so you can continue to discuss if needed
-
RE: What do you do with product pages that are no longer used ? Delete/redirect to category/404 etc
No worries, glad to help. Good luck!
-
RE: Can you rank for copyrighted/trademarked words that became generic terms?
Hi there, interesting question!
So in terms of whether you are allowed to try to rank for brand name/trademarked keywords the answer is yes, absolutely. Google makes decisions about which sites it thinks are most relevant for a search and you don't have any responsibility to shy away from that attempt.
In terms of whether it's possible for you to rank for those keywords, that's actually kind of related to the point above. Google decides what should rank based on the best user experience. If Google has really strong evidence that whenever someone searches a particular term they are looking for a specific brand, it'll be very hard for you to break into that. However, as you've mentioned, there comes a time when a term becomes generic enough that users aren't necessarily searching for the brand - that's when you'll have more and more of a chance with pages using the term generically. You can fairly quickly check by just Googling the terms and seeing what comes up. For example, when I search "spinning" the fourth text result is "Boom Cycle" - so it sounds like a page doesn't have to be about the brand called "Spinning" to rank for that term. If, on the other hand, you Google "Apple", it's pretty clear Google thinks there's only one topic that's relevant as a result.
If it's a term you think your users will be searching for, create some content for it. If it's a stretch to think you'll rank, create something good but not terribly time consuming and go from there. If it looks like the only content showing up is about this brand, consider creating a post about the differences between that and what you offer, as a way to seem a bit more relevant for Google.
Hope that helps