Your search - site:domain.com - did not match any documents.
-
I've recently started work on a new client's website and done some preliminary on-page optimisation; there is still plenty of work to be done and issues to resolve. They are ranking OK on Bing, but they are not getting any ranking on Google at all (except paid). I tried the site:domain.com search and it comes up with no results... so this confirms that something is going on with their Google search ranking!
Can anyone shed light on what can cause this or why this would happen?
My next step is to look at their Webmaster Tools (I haven't had access yet), but if anyone has any tips on how to resolve this or where to look, that would be great!
Thanks!
-
Thanks again for your help! I will give those ideas a go.
I hope to get to the bottom of it, if for nothing else than to learn more!
Cheers.
-
Hey,
It depends on the penalty, if any.
If there are no manual actions listed in Webmaster Tools, that's a hint. However, it could still be an algorithmic penalty.
If a penalty (again, if any) applies to the whole site, then cleaning up the site's content and making sure your entire site (backlinks too) is in compliance with Google's quality guidelines should get the penalty revoked.
If the issue is actually just that Google can't access the site, then find out why, fix it ASAP, and you should be ranking again in no time (run Fetch as Googlebot first to confirm whether that is or isn't the problem).
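Fetch as Googlebot lives inside Webmaster Tools, but while you're waiting for access you can roughly approximate the check locally. A minimal sketch (the user-agent string is Googlebot's published one; the example URL is a placeholder, not the client's real domain):

```python
# Rough local stand-in for "Fetch as Googlebot": request a page with
# Googlebot's user-agent string and see what the server sends back.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def classify_status(code):
    """Map an HTTP status code to a coarse category."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error"
    return "server error"

def fetch_as_googlebot(url, timeout=10):
    """Return (status_code, category) for a URL fetched as Googlebot."""
    req = Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.getcode(), classify_status(resp.getcode())
    except HTTPError as e:
        return e.code, classify_status(e.code)
    except URLError as e:
        return None, "unreachable: %s" % e.reason

# Example (placeholder domain):
# fetch_as_googlebot("http://www.example.com/")
```

If the server answers Googlebot's user-agent with 403s or 5xx errors while serving regular browsers fine, that alone could explain the missing pages.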
To sum up, you should run an extensive analysis on links, content, and server response errors, find the cause of the "penalty", then work on fixing it to start ranking. Once you do, you can continue with the other SEO/design tasks.
As I said before, opening a thread in Google's Webmaster Help forums could be of much help.
All the best!
-
Thanks again for all your helpful suggestions. Here's an update on this...
Access to GWT, analytics, and some more Moz tracking has revealed server connectivity and crawl errors on the site. So I'm thinking the bots are having trouble accessing the site and hence the penalty... Bing is still OK, strangely!
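For what it's worth, once the crawl-error URLs are exported from GWT they can be triaged quickly; a small sketch of grouping them by failure type (the sample URLs and statuses below are made up for illustration):

```python
# Quick triage of crawl errors: given the status each URL returned
# (None = connection failure/timeout), bucket them so it's obvious
# whether the problem is server-side.
from collections import Counter

def crawl_error_summary(results):
    """results: dict of url -> HTTP status code (None if unreachable)."""
    buckets = Counter()
    for url, status in results.items():
        if status is None:
            buckets["unreachable"] += 1
        elif status >= 500:
            buckets["server error"] += 1
        elif status >= 400:
            buckets["not found / client error"] += 1
        else:
            buckets["ok"] += 1
    return dict(buckets)

# Made-up sample data:
sample = {
    "http://example.com/": 200,
    "http://example.com/old-page": 404,
    "http://example.com/catalog": 503,
    "http://example.com/contact": None,
}
print(crawl_error_summary(sample))
```

A pile of "unreachable" and "server error" entries would point at hosting/connectivity rather than anything on-page.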
At this stage there is a hold on resolving this as we are also in the process of developing a new site for this client - so we plan to now just focus on getting this site live and hopefully all the crawl errors etc will be flushed out.
One last question - is a Google penalty linked to the domain or to the site/files? If we launch a new site on the same domain, but on a new server (host) with new files, do you think this will clear any penalties?
Thanks again.
-
If none of the pages are indexed, then yes, I would assume a penalty. One of the more common reasons a site gets penalized is improper linking, either inbound or outbound.
If you do not yet have access to webmaster tools, there are still steps you can take. This is something you are going to have to do anyway, once you figure out what the penalty was for.
First place to start: links.
There are a wide variety of backlink tools out there. Here are a few you can try:
http://raventools.com/marketing-tools/link-manager/
http://moz.com/researchtools/ose
https://ahrefs.com/
Start looking for the spammy or paid links. How can you tell? Simple. If a link comes from a domain like rankmehighingoogle.com or something like that, chances are it's a bad or paid backlink. The example given is a silly domain name, but you will see some like that come up. If you are unsure of a link's quality, manually visit the site to see what it's all about. If the home page has a 0 or a ? for PageRank, chances are the linking site got hit with a penalty and you should disavow that linking domain.
Another way to test is to search for the linking domain in Google. If you search for a web directory or linking domain specifically by name and it is nowhere to be found, Google most likely hammered it for some practice it was using.
Since you don't have access to GWT yet, this would be a good way to see what is going on with the site. You stated that you just started doing the optimization for this client, so you most likely haven't had time to research the domain's history yet. Once you have access to GWT you will be taking a look at links anyway, so while you are waiting for access, be proactive.
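If it helps, a first automated pass over a backlink export from one of those tools could look something like the sketch below. The keyword list is just a guess at common spam patterns (not a definitive test), and every flagged domain still needs the manual look described above:

```python
# Heuristic first pass over an exported backlink list: flag linking
# domains whose names look like paid/spam link networks. The keyword
# list and sample domains are illustrative assumptions only.
SUSPECT_KEYWORDS = ("rank", "seo", "backlink", "linkbuild", "directory")

def looks_suspect(domain):
    """True if the domain name contains a spammy-looking keyword."""
    d = domain.lower()
    return any(k in d for k in SUSPECT_KEYWORDS)

def flag_backlinks(domains):
    """Return the subset of linking domains worth manual review."""
    return [d for d in domains if looks_suspect(d)]

sample = ["rankmehighingoogle.com", "example-news-site.com",
          "cheap-seo-links.net"]
print(flag_backlinks(sample))
# -> ['rankmehighingoogle.com', 'cheap-seo-links.net']
```

Anything this flags is only a candidate; legitimate sites can match a keyword, so verify by visiting the site and searching for it in Google as described.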
-
Thank you Devanur. I will look into this.
-
Hi,
I 100% agree with FedeEinhorn.
roofrackworld.com.au seems to have been penalized sometime around November to December 2013.
Go here: http://www.barracuda-digital.co.uk/panguin-tool/
Grant the tool access to your Google Analytics account, select a date range from September 2013 to date, and look for any traffic drop that coincides with a Google algorithm update.
Please post your observations here so that we can take it from there.
Best regards,
Devanur Rafi
-
Glad someone else thinks it is weird!
Thank you for your help and suggestions... I will get access to webmaster tools and see what I can find.
-
Holy... this IS weird.
Checked the robots.txt and there's nothing blocking the indexing, robots meta tags are present with INDEX.
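For anyone wanting to repeat those two checks programmatically, here's a minimal sketch using only the Python standard library. The robots.txt text is a made-up sample, and the meta-tag scan is a simplified regex that assumes the `name` attribute comes before `content`:

```python
# Check 1: parse a site's robots.txt and ask whether Googlebot may
# crawl a given path. Check 2: scan page HTML for a noindex robots
# meta tag.
import re
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt, path="/"):
    """True if the given robots.txt text lets Googlebot fetch path."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", path)

def has_noindex(html):
    """Simplified scan for <meta name="robots" content="...noindex...">."""
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        re.IGNORECASE)
    return bool(pattern.search(html))

sample_robots = "User-agent: *\nDisallow:\n"   # empty Disallow = allow all
print(googlebot_allowed(sample_robots, "/"))
print(has_noindex('<meta name="robots" content="index, follow">'))
```

In this case both checks come back clean, which is exactly why the missing index looks like a penalty rather than a crawling block.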
You clearly need urgent access to Webmaster Tools. It seems like a penalty for pure spam or something like that, as not a single page is indexed while other sites are linking to it.
What would I do? Before doing any further on-site SEO, get that resolved. Go to Webmaster Tools and check for any manual actions, messages, etc. Try Fetch as Googlebot. Then go to Google's Webmaster Help forums and ask; usually someone from Google jumps in.
-
I was looking for more general advice on this issue initially, to see if others had encountered this problem. But I'm happy to share the domain if it helps... with the disclaimer, as I mentioned above, that there is clearly much more work to be done to get a good rank - but this issue seems to be bigger than on-site optimisation...
Thanks
-
Care to share the real domain?