@Anjana9638 Use a crawling tool like Xenu, SiteSucker, or Screaming Frog, to name a few, and then have the tool focus only on reporting external links to minimize crawl overhead.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
RyanPurkey
@RyanPurkey
Job Title: Managing Director
Company: rQuadrant
Website Description
A place to test yourself and join others who are looking to grow their lives–personally, professionally, in partnership–in provable, principled, repeatable ways.
SEO/SEM Marketer with broad and lengthy experience in the online marketing space. Currently working independently via rQuadrant. If you're a fellow Moz user, feel free to add me to your network via LinkedIn or other social channels at left.
Favorite Thing about SEO
Either finding what I'm looking for or making it.
Latest posts made by RyanPurkey
- RE: How To Check Outbound Links Of website?
- RE: Why My site pages getting video index viewport issue?
@mitty27 Your iframe at the top of the page is hard-set to 1200px width, which could cause problems. Please provide some specific URLs that GSC has identified with the viewport issue so we can give specific answers. Thanks and good luck!
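Without the specific URLs it's hard to be exact, but a common fix for this class of viewport issue is to let the embed scale down with the screen instead of pinning it at 1200px. A minimal sketch, assuming a video iframe (the class name and markup here are hypothetical, not taken from the actual page):

```html
<!-- Hypothetical markup: replace the fixed width="1200" attribute
     with a wrapper that lets the iframe shrink on small screens. -->
<style>
  .video-wrap iframe {
    width: 100%;          /* scale to the container width */
    max-width: 1200px;    /* never exceed the original design width */
    aspect-ratio: 16 / 9; /* keep the video's proportions as it scales */
    border: 0;
  }
</style>
<div class="video-wrap">
  <iframe src="https://www.example.com/embed/video"></iframe>
</div>
```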
- RE: The Bad effect of Submitting Sitemap frequently?
You can submit sitemaps as often as you like to Google Search Console. Many large-scale sites do so multiple times a day as they publish new articles, add or remove products, and so on. For example, news sites have a high-touch need for fast crawling (hence AMP and specialized sitemap methodologies). Here's Google's help article on this: https://support.google.com/news/publisher/answer/40392
Here's the more generalized version of Sitemap recommendations: https://support.google.com/webmasters/answer/75712
- RE: Resubmit sitemaps on every change?
Great follow-up! Thanks for that. :^)
- RE: Resubmit sitemaps on every change?
Hello. You can check the Submitted vs Indexed count within Search Console to see whether or not your regenerated sitemap is being picked up already, but resubmitting a sitemap isn't an issue, and fairly easy to do, per Google:
Resubmit your sitemap
- Open the Sitemaps report
- Select the sitemap(s) you want to resubmit from the table
- Click the Resubmit sitemap button.
You can also resubmit a sitemap by sending an HTTP GET request to the following URL, specifying your own sitemap URL: http://google.com/ping?sitemap=http://www.example.com/my_sitemap.xml
Via: https://support.google.com/webmasters/answer/183669
Also, an FAQ on the Webmasters blog states that "Google does not penalize you for submitting a Sitemap."
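Since the answer above leans on that ping endpoint, here's a minimal sketch (Python, purely illustrative) of building the request URL with the sitemap address properly escaped:

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url):
    # Build the resubmission URL quoted above, URL-encoding the sitemap address.
    return "http://google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("http://www.example.com/my_sitemap.xml"))
# http://google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fmy_sitemap.xml
```

Sending an HTTP GET to that URL is equivalent to clicking the Resubmit button in the report.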
- RE: Should I use sessions or unique visitors to work out my ecommerce conversion rate?
Matthew makes great points. I'd add to this that having conversions tied to membership data makes it all the more person-specific. This is why you'll hear numbers like a 74% conversion rate for Amazon Prime members (see: https://www.internetretailer.com/2015/06/25/amazon-prime-members-convert-74-time). Aside from better tracking, you can begin to see the value for Amazon in having members...
- Similar to Facebook they're collecting user data per person and building a massive user base aside from just sales.
- Better tracking.
- Higher conversion rates.
- Top of mind branding.
- Upselling.
- And so on...
You get the idea. That's why when you go to Amazon.com the only pop-up or animated prompt you'll see on the home page is to "sign-in". Obviously, this could be something out of scope for your project currently, but food for thought down the road.
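To make the sessions-versus-visitors distinction concrete, here's a small illustrative sketch (the figures are made up): the same number of orders produces two different "conversion rates" depending on which denominator you choose.

```python
def conversion_rate(orders, denominator):
    # Conversion rate as a percentage of whichever base you choose.
    return orders * 100 / denominator

orders, sessions, unique_visitors = 30, 1500, 1000
print(conversion_rate(orders, sessions))          # 2.0  (per session)
print(conversion_rate(orders, unique_visitors))   # 3.0  (per unique visitor)
```

Whichever base you pick, use it consistently so the rate is comparable over time.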
- RE: Backlinks in Footer - The good, the bad, the ugly.
Here's my take. Some footer best practices aren't necessarily links.
**What is the good, the bad, and the ugly of backlinks in the footer:**
- **Good:** Links act like a mini sitemap. The footer is a useful addition to the page above it, featuring heavily used sections of the site that benefit from being available in the footer as well. NAP (name, address, phone). Anti-spam measures. If links are external, they point to parent or partner companies within the same overall organization.
- **Bad:** Spammy. Tries to link to every page on the site with keyword-heavy anchor text. Broken links. Obvious paid links to external sites.
- **Ugly:** Low utility. Never updated. Service-provider links.
- RE: One robots.txt file for multiple sites?
Hi Rena. Yes, if both sites are separate domains that you want to use in different ways, then you should place a different robots.txt file in each domain root so that they're accessible at xyz.com/robots.txt and abc.com/robots.txt. Cheers!
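For illustration, a minimal sketch of what the two separate files might look like (the directives below are placeholders, not recommendations for either site):

```
# Served at xyz.com/robots.txt
User-agent: *
Disallow: /admin/

# Served at abc.com/robots.txt
User-agent: *
Disallow:
Sitemap: https://abc.com/sitemap.xml
```

Crawlers only look for robots.txt at the root of the host they're crawling, which is why each domain needs its own file.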
- RE: Replace dynamic parameter URLs with static Landing Page URL - faceted navigation
Hi James! This type of URL rewriting is a best practice when it comes to presenting visitors with easy-to-read page descriptions, so applying it in whatever capacity you can would be great. Here's Moz's guide to URLs, which covers your question along with further details on URL structure: https://moz.com/learn/seo/url
Also note from that page:
In addition to the issues of brevity and clarity, it's also important to keep URLs limited to as few dynamic parameters as possible. A dynamic parameter is a part of the URL that provides data to a database so the proper records can be retrieved, i.e. n=3031001, v=glance, categoryid=145, etc.
Note that in both the Amazon and Canon URLs, the dynamic parameters number three or more. In an ideal site, there should never be more than two. Search engineer representatives have confirmed on numerous occasions that URLs with more than two dynamic parameters may not be spidered unless they are perceived as significantly important (i.e., have many, many other links pointing to them).
Cheers!
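As a quick illustration of the "no more than two dynamic parameters" guideline quoted above, here's a small sketch (Python; the URLs are hypothetical) that counts them:

```python
from urllib.parse import urlparse, parse_qsl

def dynamic_param_count(url):
    # Number of key=value parameters in the URL's query string.
    return len(parse_qsl(urlparse(url).query))

print(dynamic_param_count("http://example.com/item?n=3031001&v=glance&categoryid=145"))  # 3
print(dynamic_param_count("http://example.com/cameras/canon-eos/"))                      # 0
```

The first URL would fail the guideline; the rewritten static path is what faceted navigation should surface for indexable pages.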
- RE: Seeing lots of 0 seconds session duration from AdWords clicks
As Mike requests, seeing the landing page plus keyword combination is pretty key to diagnosing the low session duration, and knowing the ad copy would help as well. Right now, though, you're dealing with far too small a sample: 14 users. Something like 140 users would be more indicative of trends, and 1,400 would be better still.
Aside from the small sample size, low time on site generally happens because people expect something different from what they get, so they leave quickly. Specific reasons it could be happening: slow-loading pages, poor design, poor match between keyword or ad copy and landing-page content, poor user-to-content match, accidental clicks, etc. Cheers!
Best posts made by RyanPurkey
- RE: Dofollow Blog Comments
If you're planning on doing this with a non-disposable, branded website, it's a bad idea, as there can be lots of negative effects.
If you're planning on being black hat, having a network of disposable sites, and are going to be masking (or trying to mask) everything you do, it's just another tool in your toolbox.
As people have already mentioned here, it's better to make worthwhile comments that can be traced back to a reputable-looking source, as those could even have the added benefit of bringing you additional business, not just a boost in the rankings.
- RE: .net or .co ?
You could focus-group the two and see which people prefer. Or you could buy several and test their performance via AdWords split testing before taking the plunge of transferring or creating the new site. Really, this boils down not to what has worked for others, but to what works best for you.
- RE: Robots.txt on subdomains
Mostly no. I say 'mostly' because when a site resolves at both www and non-www, both versions are almost always pulling files from the same location (hence the warnings around duplicate content), so both www.domain.com/robots.txt and domain.com/robots.txt will work. This is the dominant example of a subdomain sharing a robots.txt file. However, subdomains that are set up as their own sites have their own robots.txt files. Take a look at the many differences between subdomain1-1000.wordpress.com/robots.txt vs wordpress.com/robots.txt. If you set up a subdomain that isn't just a reflection of your root domain, then you'll need to create a robots.txt file for it as well. Cheers!
- RE: HUGE spike in Google Analytics Traffic
Thanks for the clarification Lauren. Have you been able to determine if the organic results are news based? Based on the landing pages are you aware of any rankings that might have jumped up recently? If you can get any insights from Google's Search Console you might get tipped off to some new rankings that are accounting for the spike.
- RE: Internal Links - Dofollow or Nofollow and why?
Hi Angelos. Dofollowing internal links is fine, especially in the context of relevant articles, as those links tie together information both for search and for users who want to quickly dig deeper while reading your work.
- RE: Media Kit for SEO?
Why yes there is! In 2013, Kaila Strong wrote an in-depth article here: http://moz.com/ugc/guide-to-using-unlinked-brand-mentions-for-link-acquisition-20981 as one example.
One suggestion I'd add: creating a Media Kit page that provides pre-approved pull quotes, company-related imagery (in various sizes), a timeline, and more can work well in either gaining hotlinks to the images or providing citation samples that make it quick and easy for someone writing an article to cite your company.
- RE: At what point to stop comments on a blog? Do too many comments hurt the page?
Yup! That sounds like a good interaction and a lively page that will keep presenting fresh, meaningful content each time it's updated. The only comments I'd worry about are spam (which you've screened) and really off-topic comments, but neither of those seem to be a problem. Since people are commenting under the same topic(s) as the article, it just adds to the page as a whole.
- RE: Would changing the file name of an image (not the alt attribute) have an effect on seo / ranking of that image and thus the site?
Why not change the file name to 'red-for-truck.jpg'? That would probably have a more positive effect. Much of what dictates image optimization is the same as on page factors, as described here: http://moz.com/blog/is-optimizing-photos-more-important-than-you-think. (On page factors: http://moz.com/learn/seo/on-page-factors) Plus Google has its own guide for image sitemaps here: https://support.google.com/webmasters/answer/178636. Cheers!
- RE: HUGE spike in Google Analytics Traffic
Hello Lauren! You'll want to look at your direct pages and see if they're getting legitimate traffic in your referrals report: Acquisition >> All Traffic >> Referrals
From there you should be able to get an idea about what recent news or articles that have gone live that have been influencing your traffic. You can also use the Fresh Web Explorer here to see what pages might be mentioning you and sending larger amounts of traffic as well as various tools in Google Search Console (formerly Google Webmaster Tools).
I'm not seeing spikes across clientele, so your situation is likely unique to you: some of your pages have recently been referenced on high-traffic sites or have popped into the news cycle. Cheers!
- RE: Do Page Views Matter? (ranking factor?)
Rand recently did a Whiteboard (beard?) Friday on this ~loosely~ under the broader scope of "Engagement," and I think you have to keep page views lumped into that overall scope of engagement, i.e., saying X page views per session = Y ranking boost is likely something no one can define precisely.
However, creating an on-site engagement score is loosely feasible. For example, you could look at time on site and divide it by your GWT average time spent downloading a page to give yourself an engagement rating. Lower the download time and your score rises if time on site stays the same. Increase time on site and the score goes up as well.
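A rough sketch of that ratio (the function name and figures below are made up for illustration):

```python
def engagement_score(avg_time_on_site_s, avg_download_time_s):
    # Time on site divided by average page download time (both in seconds):
    # faster pages or longer visits both raise the score.
    return avg_time_on_site_s / avg_download_time_s

print(engagement_score(120, 1.5))  # 80.0
print(engagement_score(120, 1.0))  # 120.0 -- lower download time, higher score
```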
Does the number of page views equate to engagement? Maybe, although a site set up for getting lots of page views (pop-culture sites with click lists, news articles, etc.) is going to have more than sites that do the bulk of their business via the home page. Perhaps a page-view engagement metric you could create would be derived from your organic bounce rate: http://moz.com/blog/solving-the-pogo-stick-problem-whiteboard-friday
Hopefully this gives you a little direction in what to improve.