Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Struggling to get my lyrics website fully indexed
-
Hey guys, I've been a longtime SEOmoz user but I'm only just getting heavily into SEO now, and this is my first query - apologies if it's simple to answer, but I have been doing my research!
My website is http://www.lyricstatus.com - basically it's a lyrics website.
Rightly or wrongly, I'm using Google Custom Search Engine on my website for search, as well as jQuery auto-suggest - please ignore the latter for now.
My problem is that when I launched the site I had a complex AJAX Browse page, so Google couldn't see static links to all my pages and only indexed the pages that did have static links. This made searches on my site via the Google CSE useless, as very few pages were indexed.
I've since dropped the complex AJAX links and replaced them with simple static links. However, that was a few weeks ago and Google still won't fully index my site. Try a search for "Justin Timberlake" (don't use the auto-suggest, just click the "Search" button) and it's clear that the site still hasn't been fully indexed!
I'm really not too sure what else to do, other than wait and hope, which doesn't seem like a very proactive thing to do! My only other suspicion is that Google sees my site as yet more duplicate content, but surely it must be OK with indexing multiple lyrics sites, since plenty of different ones rank in Google.
Any help or advice greatly appreciated guys!
-
You need more unique content. Your site is great - I like it much better than the other lyrics sites - but I can't see any content at all that you have written yourself.
-
I agree with Stephen. Tons of lyrics websites out there.
If you want to make your site more visible, write a couple hundred to a few hundred words about each song and post it on the page, above or beside the lyrics. Then you will have something unique.
Try that on a couple dozen pages to see what happens. Give it a few months.
-
You have exactly the same content as a million other lyrics websites, so why should Google be interested in your PR0, PA18, DA2 website?
I think you're doing pretty well with 15,000 pages indexed via site:http://lyricstatus.com
I think what you need is a USP, not technical SEO responses.
-
Do you have any organization to your site? I can see where some visitors would want to find lyrics by year, singer, music style (jazz, rock, etc.), music type (love songs, happy songs, etc.) and so forth.
Even if users find songs by searching, crawlers move through your site via links. Unless your site is extremely well linked and has a great navigation system, you are only going to see a relatively small percentage of it indexed.
-
Wow, that was a quick response, thanks so much Ryan!
With regards to Google WMT, yep, I did that as soon as I went live, and I did try to make a sitemap using xml-sitemaps.org's tool, but with 700,000+ songs the XML sitemap generator kept stalling due to lack of RAM. I did upload a partial sitemap, though, but to date the "URLs in web index" count is stuck at 363... out of 700,000+!!
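For anyone else hitting the same wall, here's a rough, untested sketch of generating the sitemaps in chunks (plus a sitemap index) straight from the database instead of relying on a generator tool - the get_song_urls() helper and the /song/ URL pattern are just placeholders for however the URLs are actually stored:

```python
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50000  # the sitemap protocol caps each file at 50,000 URLs

def get_song_urls():
    # Hypothetical placeholder: yield every song URL, e.g. from a DB cursor
    # or a flat file, so nothing has to be held in memory all at once.
    for i in range(700000):
        yield "http://www.lyricstatus.com/song/%d" % i

def write_sitemaps(base_url="http://www.lyricstatus.com"):
    file_no = 0
    count = 0
    out = None
    for url in get_song_urls():
        if count % MAX_URLS_PER_SITEMAP == 0:
            # Close the previous sitemap file and start a new one.
            if out:
                out.write("</urlset>\n")
                out.close()
            file_no += 1
            out = open("sitemap-%d.xml" % file_no, "w", encoding="utf-8")
            out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        out.write("  <url><loc>%s</loc></url>\n" % escape(url))
        count += 1
    if out:
        out.write("</urlset>\n")
        out.close()
    # Write a sitemap index pointing at the individual files; this is the
    # single file you would then submit in Webmaster Tools.
    with open("sitemap-index.xml", "w", encoding="utf-8") as idx:
        idx.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        idx.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(1, file_no + 1):
            idx.write("  <sitemap><loc>%s/sitemap-%d.xml</loc></sitemap>\n" % (base_url, i))
        idx.write("</sitemapindex>\n")

if __name__ == "__main__":
    write_sitemaps()
```

With the 50,000-URL-per-file limit, 700,000+ songs would come out to around 14-15 sitemap files plus a single index file to submit in WMT.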
You're right, I don't have a nav as I believe users will just use the search, but there is a "Browse" link in the footer which appears on every page, and this is effectively my Site Map: http://www.lyricstatus.com/browse
So as far as I'm concerned there is a static link path to every page in my website, correct me if I'm wrong?
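If it's useful, here's a quick-and-dirty way to double-check that - a small, standard-library-only Python script (purely illustrative) that fetches the Browse page and counts the plain <a href> links a crawler would actually see:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects the href of every plain <a> tag, i.e. the links a crawler can follow."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = urlopen("http://www.lyricstatus.com/browse").read().decode("utf-8", errors="replace")
collector = LinkCollector()
collector.feed(html)
print("Found %d plain anchor links on the browse page" % len(collector.links))
```

If that count comes back near zero, the links are still being injected by JavaScript and crawlers won't follow them.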
Good point in your last para about a unique couple hundred words on each page - tall order for 700k pages, but could definitely do that for key songs that I want to get ranked for. Thanks again Ryan!
-
Hi Ed.
A few things you can do to help get your pages indexed:
1. If you have not done so already, register with Google and go to the Google Webmaster Tools page http://www.google.com/webmasters
2. If you have not already done so, create an XML sitemap. Ideally it should be located at http://www.lyricstatus.com/sitemap.xml
3. If you want to locate the sitemap anywhere else, you will need to create a robots.txt file and place the sitemap URL in that file. I noticed you didn't have a robots.txt file; you can learn more about them at robotstxt.org, and there's a minimal example after this list.
4. In Google WMT, go ahead and upload your sitemap (Site Configuration > Sitemap). Then check back a day later. What you want to look at is two fields: URLs submitted and URLs in index. Your goal would be to have all your URLs in the index, but that isn't realistic without a lot of work.
5. Another thing you can do is create an HTML sitemap and place a link to it in the footer of your home page. You don't offer site navigation, so an HTML sitemap can also help visitors find their way around your site.
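For reference, a minimal robots.txt for a setup like this could look like the example below - the sitemap-index.xml filename is only an assumption, so swap in whatever the sitemap file is actually called:

```
User-agent: *
Disallow:

Sitemap: http://www.lyricstatus.com/sitemap-index.xml
```

The empty Disallow line leaves everything crawlable, and the Sitemap line simply tells crawlers where to find the sitemap when it isn't at the default location.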
Take these steps for now and you will have a much better idea of where your site stands. You can then match up the URLs in your sitemap with the URLs in Google's index. The URLs without a match are the pages you need to get into the index.
You can try link building or even placing links to these buried pages on your home page to help get them indexed.
One last note concerning duplicate content: you really should consider adding original content to these pages to keep them from being treated as duplicates. Keep in mind that a page is evaluated as a whole, so for each song you probably need to write at least a couple hundred words to differentiate your pages from all the other similar lyrics pages on the web.
Good luck.
Related Questions
-
Page Indexing without content
Hello. I have a problem with pages being indexed without content. I have a website in 3 different languages, and 2 of the language versions are indexing just fine, but one language (the most important one) is being indexed without content. When searching using site: the page comes up, but when searching for unique keywords for which I should rank 100%, nothing comes up. The page was indexing just fine, and the problem arose a couple of days ago after a Google update finished. Looking further, the problem is language related: every newly indexed page in that language has this problem, while pages last crawled around one week ago are fine. Has anyone run into this type of problem?
Technical SEO | | AtuliSulava1 -
Should search pages be indexed?
Hey guys, I've always believed that search pages should be no-indexed but now I'm wondering if there is an argument to index them? Appreciate any thoughts!
Technical SEO | | RebekahVP0 -
Z-indexed content
I have some content on a page that I am not hiding with any CSS techniques, but I am using an image with a higher z-index to prevent the text from being seen until a user clicks a link that scrolls the content down. Are there any negative repercussions for doing this with regard to SEO?
Technical SEO | | cokergroup0 -
Correct linking to the /index of a site and subfolders: what's the best practice? link to: domain.com/ or domain.com/index.html ?
Dear all, starting with my .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]

1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:
A) When linking from a page to my front page (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", should I also link to "http://domain.com/products/" and not include the index.php?
C) When I define the canonical URL, should I define it as just "http://domain.com/products/", or in this case should I point to the actual file, "http://domain.com/products/index.php"?
Is A) and B) the best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | | inlinear -
What to do with 302 redirects being indexed
Hi there, Our site's forums include permalinks that for some reason use an intermediary URL that 302 redirects to the URL with the permalink anchor. For example: http://en.tradimo.com/learn/chart-analysis/time-frames/ In the comments, there is a permalink to the following URL: en.tradimo.com/co/50c450005f2b949e3200001b/ (there is no content here, and never has been). This URL 302 redirects to the following final URL: http://en.tradimo.com/learn/chart-analysis/time-frames/?offset=0&limit=20#50c450005f2b949e3200001b The problem is that Google is indexing the redirect URL (en.tradimo.com/co/50c450005f2b949e3200001b/) and showing duplicate content, even though we are using the nofollow tag on these links. Ideally, we would link directly to the final URL rather than redirecting. Alternatively, I'd say a 301 redirect would be preferable. But if neither is possible, is there a way to get these pages out of the index? Is the canonical tag the best way? I really wish I could just add /co/ to the robots.txt file, but I think they would still be in the index, right? Thanks for your help!
Technical SEO | | etruvian0 -
Unnecessary pages getting indexed in Google for my blog
I have a blog, dapazze.com, and I have been suffering from a problem for a long time. I found out that Google has indexed hundreds of replytocom links and image attachment pages for my blog, and I had to remove these pages manually using the URL removal tool. I had used "Disallow: ?replytocom" in my robots.txt, but Google ignored it. After that, I removed the parameter from my blog completely using the SEO by Yoast plugin. But now I see that Google has again started indexing these links, even though they are no longer present on my blog (I use #comment). Google has also indexed many of my admin and plugin pages, even though they are disallowed in my robots.txt file. Have a look at my robots.txt file here: http://dapazze.com/robots.txt Can anyone help me solve this problem permanently?
Technical SEO | | rahulchowdhury0 -
Why is my website banned?
My website is Costume Machine at www.costumemachine.com. My site has been banned for 1 year now. I have requested that Google reconsider my site 3 times, without luck. The site is dynamic and basically pulls in feeds from affiliate sites. We have added over 1,500 pages of original content. The site had been running great since 2008 without any penalties. I don't think I got hit with any linking penalty; I cleaned up all questionable links last November when the penalty hit. Am I being hit with a "thin site" penalty? If that is the issue, what is the best way to fix the problem?
Technical SEO | | tadden0 -
What is the best method to block a sub-domain, e.g. staging.domain.com/, from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | | fthead9