Struggling to get my lyrics website fully indexed
-
Hey guys, been a longtime SEOmoz user, only just getting heavily into SEO now and this is my first query, apologies if it's simple to answer but I have been doing my research!
My website is http://www.lyricstatus.com - basically it's a lyrics website.
Rightly or wrongly, I'm using Google Custom Search Engine on my website for search, as well as jQuery auto-suggest - please ignore the latter for now.
My problem is that when I launched the site I had a complex AJAX Browse page, so Google couldn't see static links to all my pages, thus it only indexed certain pages that did have static links. This led to my searches on my site using the Google CSE being useless as very few pages were indexed.
I've since dropped the complex AJAX links and replaced it with easy static links. However, this was a few weeks ago now and still Google won't fully index my site. Try doing a search for "Justin Timberlake" (don't use the auto-suggest, just click the "Search" button) and it's clear that the site still hasn't been fully indexed!
I'm really not too sure what else to do, other than wait and hope, which doesn't seem like a very proactive thing to do! My only other suspicion is that Google sees my site as just more duplicate content, but surely it must be OK with indexing multiple lyrics sites, since there are plenty of different ones ranking in Google.
Any help or advice greatly appreciated guys!
-
You need more unique content. Your site is great, and I like it much better than the other lyrics sites, but I can't see any content at all that you have written yourself.
-
I agree with Stephen. There are tons of lyrics websites out there.
If you want to make your site more visible, write a couple to a few hundred words about each song and post it above or beside the lyrics. Then you will have something unique.
Try that on a couple dozen pages to see what happens. Give it a few months.
-
You have exactly the same content as a million other lyrics websites, so why should Google be interested in your PR0, PA18, DA2 website?
I think you're doing pretty well with 15,000 pages indexed via site:http://lyricstatus.com
I think what you need is a USP, not technical SEO responses.
-
Do you have any organization to your site? I can see where some visitors would desire to find lyrics by year, singer, music style (jazz, rock, etc), music type (love songs, happy songs, etc) and so forth.
Even if users found songs by searching, crawlers move through your site through links. Unless your site is extremely well linked and has a great navigation system, you are only going to see a relatively small percentage of your site indexed.
-
Wow, that was a quick response, thanks so much Ryan!
With regards to Google WMT, yep, done that as soon as I went live. I did try to make a sitemap using xml-sitemaps.org's tool, but since I have 700,000+ songs, the XML sitemap generator kept stalling due to lack of RAM. I did upload a partial sitemap though, but to date the "URLs in web index" figure is stuck at 363... out of 700,000+!!
You're right, I don't have a nav as I believe users will just use the search, but there is a "Browse" link in the footer which appears on every page, and this is effectively my Site Map: http://www.lyricstatus.com/browse
So as far as I'm concerned there is a static link path to every page in my website, correct me if I'm wrong?
Good point in your last para about a unique couple hundred words on each page - tall order for 700k pages, but could definitely do that for key songs that I want to get ranked for. Thanks again Ryan!
-
Hi Ed.
A few things you can do to help get your pages indexed:
1. If you have not done so already, register with Google and go to the Google Webmaster Tools page http://www.google.com/webmasters
2. If you have not already done so, create an XML sitemap. Conventionally it lives at http://www.lyricstatus.com/sitemap.xml
3. If you want to locate the sitemap anywhere else, you will need to create a robots.txt file and place the sitemap URL in the file. I noticed you didn't have a robots.txt file. You can learn more about them at robotstxt.org.
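For reference, the robots.txt only needs a couple of lines to do this. A minimal sketch, assuming the sitemap is uploaded at /sitemap.xml (adjust the path to wherever yours actually lives):

```
User-agent: *
Disallow:

Sitemap: http://www.lyricstatus.com/sitemap.xml
```

The empty Disallow line allows all crawling; the Sitemap line is what tells crawlers where to find the sitemap.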
4. In Google WMT, go ahead and upload your sitemap (Site Configuration > Sitemap). Then check back a day later. What you want to look at are two fields: URLs submitted and URLs in index. Your goal would be to have all your URLs in the index, but that isn't realistic without a lot of work.
5. Another thing you can do is create an HTML sitemap and place a link to it in the footer of your home page. You don't offer site navigation, so an HTML sitemap can help visitors navigate your site.
Take these steps for now and then you will have a much better idea where your site stands. You can then match up the URLs in your sitemap with the URLs in Google's index. The URLs without a match are the pages you need to get into the index.
You can try link building or even placing links to these buried pages on your home page to help get them indexed.
One last note concerning duplicate content. You really should consider adding original content to your pages so they aren't treated as duplicates. Keep in mind that each page is viewed as a whole, so alongside a song's lyrics you probably need to write at least a couple hundred words to differentiate your pages from all the other similar pages on the web.
Good luck.