2000 pages indexed in Yahoo, 0 in Google. NO PR, What is wrong?
-
Hello Everyone,
I have a friend with a blog site that has over 2000 pages indexed in Yahoo but none in Google and no page rank. The web site is http://www.livingorganicnews.com/ I know it is not the best site but I am guessing something is wrong and I don't see it.
Can you spot it? Does he have some settings wrong? What should he do?
Thank you.
-
The site looks like part of a blog network. The domain is five years old and the home page has a DA and PA of 34, yet it is still not indexed by Google. I searched for site:livingorganicnews.com in Google, which returns no results, so it appears the site has been penalized by Google. Use Google Webmaster Tools for further verification to find the reason.
Most probably it was penalized for being part of a paid blog network.
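Before assuming a penalty, it's worth ruling out a simple crawl block. A quick sketch using Python's standard-library robots.txt parser can test whether Googlebot is even allowed to fetch a given URL; the rules below are hypothetical, not the site's actual file:

```python
from urllib import robotparser

# Hypothetical robots.txt rules -- substitute the site's real file,
# e.g. the contents of http://www.livingorganicnews.com/robots.txt
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot falls back to the "*" group here, so /private/ is blocked
print(rp.can_fetch("Googlebot", "http://www.livingorganicnews.com/private/page/"))
print(rp.can_fetch("Googlebot", "http://www.livingorganicnews.com/some-article/"))
```

If the article URLs come back as fetchable, the robots.txt file isn't the problem and you can move on to checking for a penalty in GWT.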
-
LOL, the fact that there's a tonne of clearly spun content won't help. I gather this is part of a content scraping or sharing network like LinkVine?
Have you tried reading the articles published? It could do with some quality guidelines for what gets accepted, imho.
Even if it gets indexed, it's not going to rank anywhere... this is exactly the kind of site that Panda wanted to stop. Regurgitated, nonsensical, spun tosh that looks as if it was written by a lunatic and only really exists for the sake of its outgoing links, which point to other rubbish.
I'd tell your friend to give up on this site entirely and start looking at less automated ways of doing things. Google is only going to get tougher and tougher on these sites so he's fighting a losing battle.
I don't mean to be rude but I hope it doesn't get indexed ever, what value does it offer to anyone for anything? Most people don't want stuff like that clogging up the web. I don't mean to sound harsh but tell your friend the problem with the site is.... it's crap.
-
Another of the many not-quite-right things on the site is that some of the older posts, like http://www.livingorganicnews.com/games/2010/panasonic-announced-the-jungle-handheld-gaming-platform/1965/, end with "incoming search terms" and several search terms that all hyperlink back to that exact same article. Search engines will not see that as providing any value to the user (users are already on that page; they don't need a link to it), and they will see it as just another attempt to manipulate the engines.
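If you wanted to audit pages for those self-referencing "incoming search terms" links at scale, a quick pass with Python's standard-library HTML parser would flag them. This is only a sketch run against a made-up snippet (the example.com URL and markup are hypothetical):

```python
from html.parser import HTMLParser

class SelfLinkCounter(HTMLParser):
    """Counts <a> tags whose href points back at the page's own URL."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url.rstrip("/")
        self.self_links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.rstrip("/") == self.page_url:
                self.self_links += 1

# Hypothetical snippet mimicking an "incoming search terms" block
html = '''
<p>Incoming search terms:</p>
<a href="http://example.com/post/1965/">panasonic jungle</a>
<a href="http://example.com/post/1965/">jungle handheld</a>
<a href="http://example.com/other/">unrelated link</a>
'''

counter = SelfLinkCounter("http://example.com/post/1965/")
counter.feed(html)
print(counter.self_links)  # 2 self-referencing links found
```

Any page where this count is high is a candidate for cleanup before asking Google for reconsideration.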
-
It is interesting to have a new set of eyes here. I had noticed the different writing style but figured it was because English is not his first language. I will ask whether he is actually writing this.
-
Keri is absolutely right.
I just took a look at the site's content. It couldn't be much worse. It is a 100% spam site which should never be indexed. Clearly the site is under a penalty.
Google's job is to satisfy a user's search query by giving them the content they seek. If you create a site like that, NO ONE will ever want to get that site as the result of a search query. Google correctly recognizes this fact and removes the site from their database.
-
When there are a couple of thousand other pages like this, yes.
http://www.livingorganicnews.com/games/2011/get-cool-with-selected-berber-carpet-tiles-now/3215/
The subject of the article is about berber carpet tiles, yet the text has links (I used bold) that are totally off base and make no sense. For example:
"The berber carpet tiles might also be renowned for the durability and stain resistance at extended stay motel rates."
"To get rid of the difficult to vacuum Provillus scam dust particles..."
"An important benefit in using berber carpet tiles is a likelihood to eliminate the damaged location alone and replace it with a new carpet tile, a comparatively low-cost way of capatrex scam damage control, to make your ground look just like new."
-
Absolutely.
It is entirely possible he has been removed from Google's index as a result of a penalty. If he links to sites which receive a penalty (mobile casinos would be a very bad choice of sites to link to) then his site could receive a penalty as well.
My suggestion is not to jump to the conclusion that the site is under a penalty. Start by checking WMT; if nothing is discovered, submit the sitemap. If you don't see any results after a few days, then proceed to asking Google whether the site is under a penalty.
-
The text doesn't really seem like a human wrote it. The current most recent article has the title "Religious Credit card debt Enable Provides You With the Meaningful and Economical You Need". Other posts are about acne treatment reviews, alcoholism, and other seemingly random things.
It really looks like it's been through an article spinner. The article about alcoholism ends with "So, Think before you Beverage." Uh..really? Or what about "As emission safety glasses are put on in the office, they need to provide ease and comfort, safe healthy and crystal clear eyesight to make sure they are usually not golf clubs to the wearer." An article I found that wasn't spun is instead indexed 94 other times on the web.
I would say the content is why Google has not indexed it. They can't find the value to the user for returning this in a search result. Is this truly the content that your friend has put up, or has the site gotten hacked?
-
Hello Bryce,
Losing credibility sounds plausible, but could it be the reason for not being indexed at all?
-
Thank you Ryan,
I will ask him about GWT. Perhaps it is just a sitemap issue, but I wonder why Yahoo would spot it and Google would totally miss it. I often see a difference in pages indexed between them, but this is the first time I have seen thousands versus zero.
-
I'm thinking that by linking out to Mobile Casinos and Polish Rock Bands, he's probably losing credibility.
-
I didn't notice any obvious problem with the site. Have you logged into Google Webmaster Tools and looked at it? That would be the logical next step.
The robots.txt file looks fine, there is no "noindex" tag on the home page, a GA code is present on the page, etc. I would suggest reviewing the site in Google's WMT and looking for any issues.
If none are present, the next step would be to submit a sitemap. If your friend does not have a sitemap already set up, you can use http://www.xml-sitemaps.com/. I think the free version only maps 500 pages, but that is enough to get you started.
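If the free generator's 500-page cap becomes a problem, a basic sitemap.xml can also be built by hand from a list of URLs, since the sitemaps.org format is very simple. A minimal sketch (the URLs are placeholders, not the blog's real pages):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal sitemap.xml per the sitemaps.org protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Placeholder URLs -- replace with the blog's actual pages
sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/first-post/",
])
print(sitemap)
```

The resulting file goes in the site root and gets submitted through the Sitemaps section of Webmaster Tools.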