Homepage refusing to show up in Google (rest of pages fine)
-
Ah, I was wondering since they may have entirely different pricing based upon who you talk to.
-
SiteLock
-
So, on an invoice, do you or the client pay Incapsula or SiteLock?
-
Exactly. I've been told that these problems surfaced around the time the firewall was put up. I've just removed the timthumb file, and I'm working on disavowing the spammy links pointing to us. I'm considering ditching SiteLock in the next few days to see if that helps. We were also looking at Sucuri as a firewall option.
-
All of the header checks I've done come back with Incapsula. I don't really want to get much further into that for a number of reasons, but if you're actually paying SiteLock, that's pretty interesting.
But you're saying the site ranked for its brand term, at least, before implementing either SiteLock or Incapsula?
-
This is a huge help. I spent some time yesterday going through the site and updating my links to HTTPS where possible; those don't all appear to have been indexed yet. The bit about the timthumb exploit is particularly helpful: my theme lets me disable it, and I can get rid of the timthumb PHP file. I'm still concerned that SiteLock could be exaggerating the problem, though, since we started having these issues with Google around the time it was implemented.
-
The site is using Incapsula as a CDN and web application firewall. The site still has a timthumb file, so I wouldn't recommend stepping out from behind that firewall right now.
A wildcard search on the domain yields a lot of spam backlinks. Check ahrefs.
-
The entire site appears to index fine. As Patrick pointed out, some of the pages in the index aren't HTTPS, but I don't know when you made the move, so things may be chugging right along.
The issue is ranking. But I know what you mean.
So what we have is (not all bad, per se - just what I see):
- Previously hacked site
- Timthumb file
- Some very spammy links
- HTTPS implemented on unknown date
- Moved to CDN / WAF
- Redirects
No doubt, you're going to have to disavow the bad links. Takedown requests are nice and all, and you should note them in your disavow submission, but you don't have to manually contact each individual link/domain. It's not a fire-and-forget process, though: you can submit the file more than once and update it as you go.
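For reference, the disavow file Google Search Console accepts is plain text, one entry per line, with `#` lines as comments (a handy place to note your takedown attempts). The domains below are placeholders, not the site's actual backlinks:

```text
# Spam links pointed at the site after the hack.
# Takedown request sent to webmaster, no response.
domain:spammy-links-example.com
domain:another-spam-example.net
# A single URL can be disavowed instead of a whole domain:
http://spam-blog-example.org/fake-post.html
```

Using `domain:` entries is usually the safer bet for spam networks, since the same source tends to link from many pages.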
I would bet a shiny nickel the attack/hack exploited the timthumb file. The site still uses it. Stop using it and find an alternative; all timthumb does is resize images.
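Before deleting the file, it's worth finding every theme file that still references it so nothing breaks. A minimal sketch, assuming a standard WordPress directory layout (adjust the path to your install):

```shell
# List theme files that still reference timthumb before removing it.
# THEME_DIR assumes a standard WordPress layout; adjust to your install.
THEME_DIR="wp-content/themes"
grep -rl --include="*.php" -e "timthumb" -e "thumb.php" "$THEME_DIR" \
  || echo "no timthumb references found"
```

Any file the search turns up needs to be pointed at an alternative (WordPress's built-in image sizes cover most resizing use cases) before timthumb is removed.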
The https migration (redirects... etc.) is just a confounding factor.
After you've removed the timthumb file, request a security review. Also consider that the site may still have issues from the hack, so use Fetch as Google in Webmaster Tools. If you see anything different from the real page, you still have a problem.
Read a little more about recovering from a hacked site here. I think that's more than likely the core of the problem right now.
-
Let me guess - you're using SiteLock after you were hacked to keep them out?
SiteLock creates this issue frequently (we solved it for another Q&A user about a month ago).
Disable SiteLock, check that your settings are all right in Webmaster Tools, and Fetch the page in WMT. Add a link to it on Google+ so it gets recrawled quickly.
I only see 1 backlink to the site from Ahrefs (https://ahrefs.com/site-explorer/overview/subdomains?target=www.newstaradhesives.com) and only 2 in Majestic (https://majestic.com/reports/site-explorer?folder=&q=www.newstaradhesives.com)
Very, very low authority & SiteLock - those would be the two I'd start with.
-
It absolutely was very hacked. I'm currently in the process of manually submitting takedown requests for those spam posts in Google's index. The site has since been cleaned up and relaunched. Could these be harming the indexing of the homepage as well?
-
I think Incapsula is throwing the false noindex tag. But yeah, that's just how Incapsula do. The home page shows just fine with a site: operator.
Judging by the anchor text I see pointed at the site... and the Timthumbs.php file... the site was very very hacked at some point.
Edit: Yep. It was hacked until late last year.
-
Hi Patrick
Thanks for taking a look. If I could ask, where are you seeing this noindex tag, and what are you using to see it? I've set my homepage in the Yoast SEO plugin to index and follow, and I had also previously added a robots meta tag into my header just to make sure. My suspicion is that the SiteLock firewall installed on our site right now is blocking robots. Does this make any sense?
Thanks again
-
I wanted to attach this image - in my crawl, I am getting a "noindex,nofollow" but your code isn't showing it. I would check with your web development team to see what exactly is happening and how this can be fixed.
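When a crawler reports noindex but the page source doesn't show it, the directive is usually being injected at the HTTP layer (an `X-Robots-Tag` response header, which a CDN/WAF can add without touching the HTML). A quick way to check both places at once; the header names and values below are illustrative samples, not captured from the actual site:

```python
import re

def find_noindex(headers, html):
    """Report every place a noindex directive can hide: the
    X-Robots-Tag response header (often added by a CDN/WAF layer,
    so it never appears in the page source) and the
    <meta name="robots"> tag in the HTML itself."""
    findings = []
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            findings.append("header: %s: %s" % (name, value))
    for m in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if "noindex" in m.group(0).lower():
            findings.append("meta tag: " + m.group(0))
    return findings

# Illustrative values only -- the meta tag says index,follow,
# but the header layer still blocks indexing.
headers = {"X-Robots-Tag": "noindex, nofollow", "X-CDN": "Incapsula"}
html = '<head><meta name="robots" content="index,follow"/></head>'
print(find_noindex(headers, html))
```

If the header check fires while the meta tag looks clean, the fix belongs in the firewall/CDN configuration, not the theme.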
-
Hi there
It appears your homepage has a "noindex,nofollow" tag; change this to "index,follow", and make sure it's fixed across the site.
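In markup terms the fix is a one-line change in the page `<head>`; this is a generic illustration, not the site's actual source:

```html
<!-- Before: this tells search engines not to index the page or follow its links -->
<meta name="robots" content="noindex,nofollow">

<!-- After: either remove the tag entirely (index,follow is the default)
     or set it explicitly -->
<meta name="robots" content="index,follow">
```

If the tag isn't in your theme's source, check whether a plugin or the firewall/CDN layer is injecting it.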
If for some reason that doesn't fix it (though it should):
Have you checked to see if you have a manual action?
If you have multiple URLs with the same content, check your canonical tags and do a content audit to see whether that content can be removed, consolidated, or updated. Your SSL also seems to be misconfigured.
I would also do a backlink audit to see if any links can be removed or updated, and check that your local SEO presence is consistent and on point. The same goes for on-site SEO.