Struggling to get my lyrics website fully indexed
-
Hey guys, I've been a longtime SEOmoz user, but I'm only just getting heavily into SEO now and this is my first query. Apologies if it's simple to answer, but I have been doing my research!
My website is http://www.lyricstatus.com - basically it's a lyrics website.
Rightly or wrongly, I'm using Google Custom Search Engine on my website for search, as well as jQuery auto-suggest - please ignore the latter for now.
My problem is that when I launched the site I had a complex AJAX Browse page, so Google couldn't see static links to all my pages, thus it only indexed certain pages that did have static links. This led to my searches on my site using the Google CSE being useless as very few pages were indexed.
I've since dropped the complex AJAX links and replaced it with easy static links. However, this was a few weeks ago now and still Google won't fully index my site. Try doing a search for "Justin Timberlake" (don't use the auto-suggest, just click the "Search" button) and it's clear that the site still hasn't been fully indexed!
I'm really not too sure what else to do, other than wait and hope, which doesn't seem like a very proactive thing to do! My only other suspicion is that Google sees my site as duplicate content, but surely it must be OK with indexing multiple lyrics sites, since there are plenty of different ones ranking in Google.
Any help or advice greatly appreciated guys!
-
You need more unique content. Your site is great; I like it much better than the other lyric sites, but I can't see any content at all that you have written yourself.
-
I agree with Stephen. Tons of lyrics websites out there.
If you want to make your site more visible, write a couple hundred to a few hundred words about each song and post it on the page, above or beside the lyrics. Then you will have something unique.
Try that on a couple dozen pages to see what happens. Give it a few months.
-
You have exactly the same content as a million other lyrics websites, so why should Google be interested in your PR0, PA18, DA2 website?
I think you're doing pretty well with 15,000 pages indexed via site:http://lyricstatus.com
I think what you need is a USP, not technical SEO responses.
-
Do you have any organization to your site? I can see where some visitors would desire to find lyrics by year, singer, music style (jazz, rock, etc), music type (love songs, happy songs, etc) and so forth.
Even if users found songs by searching, crawlers move through your site through links. Unless your site is extremely well linked and has a great navigation system, you are only going to see a relatively small percentage of your site indexed.
-
Wow, that was a quick response, thanks so much Ryan!
With regards to Google WMT, yep, I did that as soon as I went live, and I did try to make a sitemap using xml-sitemaps.org's tool, but since I have 700,000+ songs, the XML sitemap generator kept stalling due to lack of RAM. I did upload a partial sitemap though, but to date the "URLs in web index" count is stuck at 363... out of 700,000+!!
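For what it's worth, the RAM problem goes away if the sitemap is written as a stream instead of built in memory. Here's a rough sketch of the idea (this isn't my actual setup; the file names and the `get_urls`-style input are made up, and the 50,000-URLs-per-file split comes from the sitemaps.org protocol limit):

```python
# Sketch: stream a large list of URLs into 50,000-URL sitemap files plus a
# sitemap index file, writing line by line so memory use stays flat no matter
# how many URLs there are. File names here are hypothetical.
import os

MAX_URLS = 50000  # per-file limit from the sitemaps.org protocol


def write_sitemaps(urls, out_dir, base_url):
    """Write sitemap-N.xml files and a sitemap-index.xml; return the part names."""
    os.makedirs(out_dir, exist_ok=True)
    files = []
    f = None
    count = 0
    for url in urls:
        if f is None or count >= MAX_URLS:
            # Close the current part (if any) and start a new one.
            if f is not None:
                f.write("</urlset>\n")
                f.close()
            name = f"sitemap-{len(files) + 1}.xml"
            files.append(name)
            f = open(os.path.join(out_dir, name), "w", encoding="utf-8")
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            count = 0
        f.write(f"  <url><loc>{url}</loc></url>\n")
        count += 1
    if f is not None:
        f.write("</urlset>\n")
        f.close()
    # Write an index file pointing at each sitemap part.
    with open(os.path.join(out_dir, "sitemap-index.xml"), "w", encoding="utf-8") as idx:
        idx.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        idx.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in files:
            idx.write(f"  <sitemap><loc>{base_url}/{name}</loc></sitemap>\n")
        idx.write("</sitemapindex>\n")
    return files
```

You'd feed it a generator that pulls song URLs straight from the database, then submit only sitemap-index.xml to WMT.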
You're right, I don't have a nav as I believe users will just use the search, but there is a "Browse" link in the footer which appears on every page, and this is effectively my Site Map: http://www.lyricstatus.com/browse
So as far as I'm concerned there is a static link path to every page in my website, correct me if I'm wrong?
Good point in your last para about a unique couple hundred words on each page - tall order for 700k pages, but could definitely do that for key songs that I want to get ranked for. Thanks again Ryan!
-
Hi Ed.
A few things you can do to help get your pages indexed:
1. If you have not done so already, register with Google and go to the Google Webmaster Tools page http://www.google.com/webmasters
2. If you have not already done so, create an XML sitemap. Ideally it should be located at http://www.lyricstatus.com/sitemap
3. If you want to locate the sitemap anywhere else, you will need to create a robots.txt file and place the sitemap URL in the file. I noticed you didn't have a robots.txt file. You can learn more about them at robotstxt.org.
4. In Google WMT, go ahead and upload your sitemap (Site Configuration > Sitemap). Then check back a day later. What you want to look at is two fields: URLs submitted and URLs in index. Your goal would be to have all your URLs in the index, but that isn't realistic without a lot of work.
5. Another thing you can do is create an HTML sitemap and place a link to it in the footer of your home page. You don't offer site navigation, so an HTML sitemap can help visitors navigate your site.
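For example, a minimal robots.txt pointing at your sitemap might look like this (I'm assuming the sitemap file ends up named sitemap.xml; adjust the URL to wherever you actually put it):

```
User-agent: *
Disallow:

Sitemap: http://www.lyricstatus.com/sitemap.xml
```

The Sitemap line is how crawlers discover a sitemap that isn't submitted through WMT directly.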
Take these steps for now and then you will have a much better idea of where your site stands. You can then match up the URLs in your sitemap with the URLs in Google's index. The URLs without a match are the pages you need to get into the index.
You can try link building or even placing links to these buried pages on your home page to help get them indexed.
One last note concerning duplicate content: you really should consider adding original content to your pages so they aren't treated as duplicates. Keep in mind that each page is viewed as a whole, so if you have a song, you probably need to write at least a couple hundred words to differentiate your pages from all the other similar pages on the web.
Good luck.