Struggling to get my lyrics website fully indexed
-
Hey guys, I've been a longtime SEOmoz user but am only just getting heavily into SEO now, and this is my first query. Apologies if it's simple to answer, but I have been doing my research!
My website is http://www.lyricstatus.com - basically it's a lyrics website.
Rightly or wrongly, I'm using Google Custom Search Engine on my website for search, as well as jQuery auto-suggest - please ignore the latter for now.
My problem is that when I launched the site I had a complex AJAX Browse page, so Google couldn't see static links to all my pages, thus it only indexed certain pages that did have static links. This led to my searches on my site using the Google CSE being useless as very few pages were indexed.
I've since dropped the complex AJAX links and replaced it with easy static links. However, this was a few weeks ago now and still Google won't fully index my site. Try doing a search for "Justin Timberlake" (don't use the auto-suggest, just click the "Search" button) and it's clear that the site still hasn't been fully indexed!
I'm really not too sure what else to do, other than wait and hope, which doesn't seem like a very proactive thing to do! My only other suspicion is that Google sees my site as just more duplicate content, but surely it must be OK with indexing multiple lyrics sites, since there are plenty of different ones ranking in Google.
Any help or advice greatly appreciated guys!
-
You need more unique content. Your site is great, and I like it much better than the other lyrics sites, but I can't see any content at all that you have written yourself.
-
I agree with Stephen. Tons of lyrics websites out there.
If you want to get your site more visible write a couple to a few hundred words about each song and post it on the pages above or beside the lyrics. Then you will have something unique.
Try that on a couple dozen pages to see what happens. Give it a few months.
-
You have exactly the same content as a million other lyrics websites, so why should Google be interested in your PR0, PA18, DA2 website?
I think you're doing pretty well with 15,000 pages indexed via site:http://lyricstatus.com
I think what you need is a USP, not technical SEO responses.
-
Do you have any organization to your site? I can see where some visitors would desire to find lyrics by year, singer, music style (jazz, rock, etc), music type (love songs, happy songs, etc) and so forth.
Even if users found songs by searching, crawlers move through your site through links. Unless your site is extremely well linked and has a great navigation system, you are only going to see a relatively small percentage of your site indexed.
-
Wow, that was a quick response, thanks so much Ryan!
With regards to Google WMT, yep, I did that as soon as I went live, and I did try to make a sitemap using xml-sitemaps.org's tool, but since I have 700,000+ songs, the XML sitemap generator kept stalling due to lack of RAM. I did upload a partial sitemap though, but to date the "URLs in web index" count is stuck at 363... out of 700,000+!!
You're right, I don't have a nav as I believe users will just use the search, but there is a "Browse" link in the footer which appears on every page, and this is effectively my Site Map: http://www.lyricstatus.com/browse
So as far as I'm concerned there is a static link path to every page on my website; correct me if I'm wrong?
Good point in your last para about a unique couple hundred words on each page - tall order for 700k pages, but could definitely do that for key songs that I want to get ranked for. Thanks again Ryan!
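Since the off-the-shelf generator ran out of RAM, one workaround is to generate the sitemaps yourself, streaming the URLs straight to disk instead of holding 700k of them in memory. This is just a sketch, not Ed's actual stack: it assumes you can iterate over your song URLs from your own database, and the filenames are illustrative. It also splits the output into 50,000-URL chunks with a sitemap index file, since the sitemap protocol caps each file at 50,000 URLs.

```python
# Sketch: stream 700k+ URLs into 50,000-URL sitemap chunks plus a
# sitemap index, so memory use stays flat regardless of site size.
# The URL source is assumed to be any iterable (e.g. a DB cursor).
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
CHUNK_SIZE = 50000  # sitemap protocol limit per file

def write_sitemaps(urls, base_url, out_dir="."):
    """urls: iterable of page URLs; returns the list of chunk filenames."""
    filenames = []
    f = None
    count = 0
    for url in urls:
        if count % CHUNK_SIZE == 0:
            # Close the previous chunk and start a new one.
            if f:
                f.write("</urlset>\n")
                f.close()
            name = "sitemap-%d.xml" % (len(filenames) + 1)
            filenames.append(name)
            f = open("%s/%s" % (out_dir, name), "w", encoding="utf-8")
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="%s">\n' % SITEMAP_NS)
        f.write("  <url><loc>%s</loc></url>\n" % escape(url))
        count += 1
    if f:
        f.write("</urlset>\n")
        f.close()
    # Write a sitemap index pointing at each chunk.
    with open("%s/sitemap-index.xml" % out_dir, "w", encoding="utf-8") as idx:
        idx.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        idx.write('<sitemapindex xmlns="%s">\n' % SITEMAP_NS)
        for name in filenames:
            idx.write("  <sitemap><loc>%s/%s</loc></sitemap>\n" % (base_url, name))
        idx.write("</sitemapindex>\n")
    return filenames
```

You would then submit the index file to Webmaster Tools once and regenerate the chunks from a cron job as new songs are added.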
-
Hi Ed.
A few things you can do to help get your pages indexed:
1. If you have not done so already, register with Google and go to the Google Webmaster Tools page http://www.google.com/webmasters
2. If you have not already done so, create an XML sitemap. Ideally it should be located at http://www.lyricstatus.com/sitemap
3. If you want to locate the sitemap anywhere else, you will need to create a robots.txt file and place the sitemap URL in the file. I noticed you didn't have a robots.txt file. You can learn more about them at robotstxt.org.
4. In Google WMT, go ahead and upload your sitemap (Site Configuration > Sitemap). Then check back a day later. What you want to look at are two fields: URLs submitted and URLs in index. Your goal would be to have all your URLs in the index, but that isn't realistic without a lot of work.
5. Another thing you can do is create an HTML sitemap and place a link to it in the footer of your home page. You don't offer site navigation, so an HTML sitemap can help visitors navigate your site.
Take these steps for now and then you will have a much better idea where your site stands. You can then match up the URLs in your sitemap with the URLs in Google's index. The URLs without a match are the pages you need to get into the index.
You can try link building or even placing links to these buried pages on your home page to help get them indexed.
One last note concerning duplicate content. You really should consider adding original content to the pages to help them not be considered duplicate content. Keep in mind the page is viewed as a whole so if you have a song, you probably need to write at least a couple hundred words to differentiate your pages from all the other similar pages on the web.
Good luck.
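To illustrate step 3, a minimal robots.txt for the site could look something like the following. The filename after the Sitemap directive is illustrative; note that the sitemap protocol caps each file at 50,000 URLs, so a 700k-page site would need to point this at a sitemap index rather than a single sitemap file.

```text
# robots.txt at http://www.lyricstatus.com/robots.txt
User-agent: *
Disallow:

Sitemap: http://www.lyricstatus.com/sitemap-index.xml
```

The empty Disallow line allows all crawling; the Sitemap line is what lets crawlers discover a sitemap that isn't at a default location.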