Some Issues about my Blog
-
-
I am facing an issue with my blog https://digitalmedialine.com/blog/. Some pages are not ranking in Google yet. Can anyone help me out with how to rank those blog posts and improve my traffic?
-
Thanks in Advance.
-
-
I'm experiencing problems with my blogging site. My website is a travel blog about Kerala. It's not showing up on Google but is appearing on Bing. My blog posts are indexed, and I use low-competition keywords. There are no mobile-friendliness issues. Can you please help me understand the reason and provide some tips on how to improve my site's ranking on Google?
-
@qwaswd said in Some Issues about my Blog:
-
I am facing an issue with my blog https://digitalmedialine.com/blog/. Some pages are not ranking in Google yet. Can anyone help me out with how to rank those blog posts and improve my traffic?
-
Thanks in Advance.
Do you perform any kind of link building? If not, then you should start, but the first task is to submit your blog's XML sitemap so Google can crawl and index your blog posts as well.
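If there isn't a sitemap yet, it's just an XML file listing the URLs you want crawled, and most platforms (WordPress sitemap plugins, for example) will generate one automatically. Purely as an illustration, here is a minimal sketch in Python with placeholder URLs:

```python
# Minimal sketch: build a basic sitemap.xml for a handful of blog URLs.
# The URLs below are placeholders -- swap in your own post URLs.
import xml.etree.ElementTree as ET

post_urls = [
    "https://www.example.com/blog/",
    "https://www.example.com/blog/first-post/",
    "https://www.example.com/blog/second-post/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in post_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(f"Wrote sitemap.xml with {len(post_urls)} URLs")
```

Once the file is live at something like https://yourblog.com/sitemap.xml, submit that address in the Sitemaps section of Google Search Console.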
-
-
Hi qwaswd,
Have you tried creating and submitting an XML sitemap of your blog using Google Search Console?
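Before (or right after) submitting, it's also worth a quick sanity check that the sitemap URL actually resolves and parses as valid XML. A rough sketch, assuming the sitemap lives at /sitemap.xml (adjust to wherever yours actually is) and using the third-party requests library:

```python
# Quick sanity check before submitting a sitemap in Search Console:
# is it reachable, and does it parse as XML with <loc> entries?
# The /sitemap.xml path is an assumption -- adjust to your actual sitemap URL.
import xml.etree.ElementTree as ET
import requests

sitemap_url = "https://digitalmedialine.com/sitemap.xml"  # assumed location

resp = requests.get(sitemap_url, timeout=10)
print("HTTP status:", resp.status_code)

if resp.ok:
    root = ET.fromstring(resp.content)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    locs = [loc.text for loc in root.findall(".//sm:loc", ns)]
    print("URLs listed in the sitemap:", len(locs))
```

If the status isn't 200 or the XML doesn't parse, Search Console will most likely report a fetch error as well, so fix that first.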
Best,
Zack
-
Related Questions
-
Crawler issues on subdomain - Need resolving?
Hey guys, I'm fairly new to the world of SEO and have a ton of crawler issues with a friend's website I'm doing some work on. After Moz did a site crawl I'm getting loads of errors (a total of 100+ for critical crawler, content and metadata issues). Most of these are due to broken social links on a subdomain - so my question is, do I need to resolve all of the errors even if they are on a subdomain? Will it affect the primary website? Thanks, Jack
Technical SEO | Jack11660
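One quick way to triage a pile of broken-link errors like this is to re-check the flagged URLs and see which ones still fail before deciding what to fix. A rough sketch with placeholder URLs (export the real list from your crawl report), using the requests library:

```python
# Re-check a list of URLs flagged as broken by a crawler and report which still fail.
# The URLs below are placeholders -- export the real list from your crawl report.
import requests

flagged_urls = [
    "https://sub.example.com/broken-social-link",
    "https://sub.example.com/another-flagged-page",
]

for url in flagged_urls:
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        status = resp.status_code
    except requests.RequestException as exc:
        status = f"request failed ({type(exc).__name__})"
    print(status, url)
```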
Responsive Code Creating Duplicate Content Issue
Good morning, Our developers have recently created a new site for our agency. The site is responsive for mobile/tablets. I've just put the site through Screaming Frog and I've been informed of duplicate H2s. When I've looked at some of the page sources, there are some instances of duplicated H2s and duplicated content. These duplicates don't actually appear on the site, only in the code. When I asked the development guys about this, they advised this is duplicated because of the code for the responsive site. Will the site be negatively affected because of this? Not everything is duplicated, which leads me to believe it probably could have been designed better... but I'm no developer so don't know for sure. I've checked the code for other responsive sites and no duplicates can be found. Thanks in advance, Lewis
Technical SEO | PeaSoupDigital0
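To see exactly what a crawler like Screaming Frog is flagging, you can pull the page source and count repeated H2s yourself. A rough sketch using Python's standard HTML parser and the requests library (the URL is a placeholder):

```python
# Fetch a page and report H2 headings that appear more than once in the source.
# The URL is a placeholder -- point it at the page Screaming Frog flagged.
from collections import Counter
from html.parser import HTMLParser
import requests

class H2Collector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_h2 = False
        self.parts = []
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_h2 = True
            self.parts = []

    def handle_endtag(self, tag):
        if tag == "h2" and self.in_h2:
            self.in_h2 = False
            self.headings.append("".join(self.parts).strip())

    def handle_data(self, data):
        if self.in_h2:
            self.parts.append(data)

resp = requests.get("https://www.example-agency.com/", timeout=10)  # placeholder URL
collector = H2Collector()
collector.feed(resp.text)

for heading, count in Counter(collector.headings).items():
    if count > 1:
        print(f"{count}x duplicated H2: {heading!r}")
```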
Removal of date archive pages on the blog
I'm currently building a site which has an archive of blog posts by month/year, but from a design perspective I would rather not have these on the new website. Is the correct practice to 301 these to the main blog index page? Allow them to 404? Or actually keep them after all? Many thanks in advance, Andrew
Technical SEO | AndieF0
No follow links on a blog
Hi, on our blog we have a section called 'Tags'. I have just noticed that these links are all "nofollow" links. The tags section appears on every single page on the blog - is it recommended to have them as 'nofollow' links, or should I get our developer to change them? Thanks
Technical SEO | Andy-Halliday0
Duplicate Content Issues
We have a "?src=" parameter in some URLs which is treated as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered to be duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?
Technical SEO | RodrigoVaca0
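For background on why these get flagged: each distinct query string is a separate URL to a crawler, even though ?src=abc and ?src=def return the same page, and the usual fix is a rel="canonical" tag pointing at the parameter-free version. The sketch below just illustrates how the variants collapse to one canonical address (the domain is a placeholder):

```python
# Illustration: strip tracking parameters such as ?src= so URL variants
# collapse to a single canonical address (the URL you'd put in a
# rel="canonical" tag). The example URLs are placeholders.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"src"}

def canonicalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

for variant in ("https://xyz.com/page?src=abc", "https://xyz.com/page?src=def"):
    print(variant, "->", canonicalize(variant))
# Both variants collapse to https://xyz.com/page
```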
XML Sitemap Issue or not?
Hi everyone, I submitted a sitemap within Google Webmaster Tools and I got a warning message of 38 issues. Issue: URL blocked by robots.txt. Description: Sitemap contains URLs which are blocked by robots.txt. Example: the ones that were given were URLs that we don't want indexed: Sitemap: www.example.org/author.xml Value: http://www.example.org/author/admin/ My issue here is that the number of URLs indexed is pretty low, and I know for a fact that robots.txt blocks aren't good, especially if they block URLs that need to be indexed. Apparently the URLs that are blocked seem to be URLs that we don't want indexed, but it doesn't display all the URLs that are blocked. Do you think I'm having a major problem or is everything fine? What should I do? How can I fix it? FYI: WordPress is what we use for our website. Thanks
Technical SEO | Tay19860
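One way to reproduce what that warning is reporting is to check each sitemap URL against robots.txt yourself. A small sketch using Python's standard-library robot parser, reusing the example.org placeholders from the question:

```python
# Check which URLs in a sitemap are disallowed by robots.txt for Googlebot.
# Domain and sitemap path reuse the example.org placeholders from the question.
import urllib.request
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

site = "http://www.example.org"  # placeholder domain

rp = RobotFileParser()
rp.set_url(site + "/robots.txt")
rp.read()

with urllib.request.urlopen(site + "/author.xml") as resp:
    root = ET.fromstring(resp.read())

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for loc in root.findall(".//sm:loc", ns):
    if not rp.can_fetch("Googlebot", loc.text):
        print("Blocked by robots.txt:", loc.text)
```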
Promoting a blog or a blog article
Hi, what is the best way to promote a blog or a blog article? What I want to do is find a site where I can put part of the article and then have a link going to my blog for the full article. Can anyone recommend any sites that do this, please, or the best ways to promote a new article from a blog?
Technical SEO | ClaireH-1848860
Canonicalization - duplicate homepage issues
I'm trying to work out the best way to resolve an issue where Google is seeing duplicate versions of a homepage, i.e. http://www.home.co.uk/Home.aspx and http://www.home.co.uk/. The site runs on Windows servers. I've tried implementing redirects for homepages before (for a different site on a Linux server) and ended up with a loop, so although I know I can read lots of info (as I have been doing) and try again, I am really concerned about getting it wrong. Can anyone give me some advice on the best way to make Google take just one version of the page? Obviously link juice is also being diluted, so I need to get this sorted asap. Thanks.
Technical SEO | travelinnovations0
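For anyone diagnosing a similar setup, the usual fix is a single 301 from /Home.aspx to the root (or a rel="canonical" on both versions pointing at the root), and you can inspect what each variant currently returns without touching production config. A rough sketch with the requests library, using the domain from the question as a placeholder:

```python
# Inspect how the two homepage variants respond: does /Home.aspx 301 to the
# root, and do both end up at the same final URL? The domain is a placeholder.
import requests

variants = ["http://www.home.co.uk/Home.aspx", "http://www.home.co.uk/"]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = [f"{r.status_code} -> {r.headers.get('Location')}" for r in resp.history]
    print(url)
    print("  redirect chain:", chain if chain else "none")
    print("  final URL:", resp.url, "| status:", resp.status_code)
```

If the first variant shows a single 301 hop to the root and both finish on the same final URL, the duplication issue should resolve itself once Google recrawls.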