Website is not indexed in Google
-
Hi Guys,
I have a problem with a customer's website. His website is not indexed in Google (except for the homepage), and I could not find anything that could be the cause.
I already checked the robots.txt, the sitemap, and the plugins on the website. In the HTML code I also couldn't find anything that would make indexing harder than usual.
This is the website I am talking about: http://www.xxxx.nl/ (Dutch)
The only thing I can still guess at is the Google sandbox, but even that is quite unlikely.
I hope you guys can spot something I could not find!
Thanks in advance
-
Baldea,
The domain was new indeed.
We are going to try your suggestions and hope for the best!
Fingers crossed indeed
-
Bastiaan,
The domain was new, right? I mean, it wasn't dropped/expired etc.
Try to get a dofollow link from a relevant website that has some traffic (Google tends to index such websites, and the resources they link out to, very quickly).
Also, make sure you have a sitemap and try adding this line to robots.txt:
Sitemap: http://www.wikiboedel.nl/sitemap.xml
Then submit the sitemap in Google's Webmaster Tools.
If you do all of this, it is nearly impossible for the bot not to index you within a few days.
Anyway, fingers crossed.
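For reference, the full robots.txt could end up looking something like this (just a sketch — the `/wp-admin/` exclusion is an assumed typical WordPress rule, and the Sitemap URL is the one mentioned above):

```
User-agent: *
Disallow: /wp-admin/

Sitemap: http://www.wikiboedel.nl/sitemap.xml
```

The Sitemap line can go anywhere in the file; it is not tied to a User-agent block.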
-
I will try adding the line to the robots.txt. I already created an XML sitemap, which has been submitted via Google Webmaster Tools.
Thanks for helping.
-
Hmm, this is strange indeed. Google should follow the links on the home page and index the available subpages, and two months should be plenty of time. Maybe try these two things:
- Add this line to robots.txt:
Allow: /
Even though the current robots.txt looks in order, this explicitly tells search engines they may crawl everything except the /wp- pages you excluded.
- Create an XML sitemap (or just a manual .txt file with one URL per line) and submit it via GWT.
This might speed it up. BTW, the site is correctly indexed in Bing.
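In case it helps, a minimal XML sitemap only needs a `urlset` and one `loc` per page — something like this (the URLs here are just the placeholder domain from this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.xxxx.nl/</loc>
  </url>
  <url>
    <loc>http://www.xxxx.nl/some-subpage/</loc>
  </url>
</urlset>
```

The plain-text alternative is even simpler: one absolute URL per line, saved as e.g. sitemap.txt.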
-
Thanks for the quick answer Baldea.
This website has been online for about two months now, I think. It has been verified in Google Webmaster Tools.
I also did a little link building (about 3 links) on Dutch websites like www.ekudos.nl, which did not seem to help.
-
Hi Bastiaan,
If the site was created recently (1-2 weeks ago) and you used robots.txt to block search engines (when I create a website, I usually block search engines via robots.txt until everything is working fine, and only afterwards change the permissions), then this would be normal.
Verify the website in Google's Webmaster Tools and go to the robots.txt section; there you should see whether that is the reason.
As it is, your robots.txt file looks fine, and you have no NOINDEX meta tag or anything similar.
To "stimulate" the indexing process you could make some use of social media: share a post and wait 1-2 days. Or get a relevant direct link to one of the posts.
I hope this helps.
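If you want to double-check the NOINDEX point programmatically, here is a minimal sketch (`has_noindex` is a made-up helper name, not from any library — you would feed it the HTML you fetched yourself, plus the X-Robots-Tag response header if the server sends one):

```python
# Quick self-check for accidental "noindex" directives: scan raw HTML for a
# robots meta tag, and also accept an optional X-Robots-Tag header value,
# since the directive can be sent either way.
import re

def has_noindex(html: str, x_robots_header: str = "") -> bool:
    """Return True if the page tells search engines not to index it."""
    for tag in re.findall(r"<meta[^>]*>", html, re.IGNORECASE):
        # Attribute order varies, so match name="robots" and "noindex"
        # independently within the same tag.
        if re.search(r'name=["\']robots["\']', tag, re.IGNORECASE) \
                and "noindex" in tag.lower():
            return True
    # The directive can also arrive as an HTTP response header.
    return "noindex" in x_robots_header.lower()

print(has_noindex('<meta name="robots" content="noindex, nofollow">'))  # True
print(has_noindex('<meta name="robots" content="index, follow">'))      # False
```

A page that passes this check can still be unindexed for other reasons (robots.txt blocking, lack of links, a brand-new domain), so treat it as one quick test among several.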