Third-party HTTP links in the page source: Social engineering content warning from Google
-
Hi,
We have received "Social engineering content" warning from Google and one of our important page and it's internal pages have been flagged as "Deceptive site ahead". We wonder what's the reason behind this as Google didn't point exactly to the specific part of the page which made us look so to the Google. We don't employ any such content on the page and the content is same for many months. As our site is WP hosted, we used a WordPress plugin for this page's layout which injected 2 http (non-https) links in our page code. We suspect if this is the reason behind this? Any ideas?
Thanks
-
Hi Serge,
Google removed the warning from our pages and they are back to normal. However, Google didn't give any info on what exactly caused the issue. They probably flagged the pages by mistake and then removed the warning. After receiving the warning we removed the meta descriptions that had been added two weeks before the issue, but I don't think they had anything to do with it. Even the third-party HTTP links are not the culprits, as they are still on the pages.
Thanks
-
That's interesting—please do! Interested to hear what the cause is...
-
Hi Serge,
Thanks for your interest in this. It seems our GA tracking code is the culprit, based on the info we have so far. On that basis I have requested a review from Google. I will update you once we hear back from them.
Thanks
-
Hi there,
I haven't seen this warning from Google, but reading their guidelines suggests that perhaps your site has been hacked, or maybe you're running some ads with embedded content that would trigger this warning?
Having a link to a site that doesn't use HTTPS shouldn't cause this warning to appear... maybe others have a different experience? Interested to hear what others say.
Related Questions
-
Unlisted (hidden) pages
I just had a client say they were advised by a friend to use 'a bunch of unlisted (hidden) pages'. Isn't this seriously black hat?
White Hat / Black Hat SEO | muzzmoz
-
Disavow links leading to 404
Looking at the link profile anchor text of a site I'm working on, new links keep popping up in the reports with, let's say, very distasteful anchor text. These links are obviously spam and point to old forum pages on the site that don't exist any more, so the majority trigger the 404 page. I understand that a 404 page (404 header response) does not flow any link power, or damage, but given the nature and volume of the sites linking to the domain, would it be a good idea to completely disassociate from and disavow these domains?
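If you do decide to disavow, Google's documented disavow file format accepts one entry per line, either a full URL or a `domain:` prefix to block every link from a domain, with `#` comment lines. A minimal sketch for building such a file from a list of spam domains (the domain names here are made-up examples):

```python
def build_disavow_file(domains, note="Spam domains linking to removed forum pages"):
    """Build the text of a disavow file using domain: lines,
    which block every link from each listed domain."""
    lines = ["# " + note]
    lines += ["domain:" + d for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

spam = ["spam-forum.example", "bad-links.example", "spam-forum.example"]
print(build_disavow_file(spam))
```

Deduplicating and sorting keeps the file easy to diff when new attacking domains show up in later reports.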
White Hat / Black Hat SEO | MickEdwards
-
Do I need to undo a 301 redirect to disavow links from the source domain?
A client came to me after being hit by Penguin, having already performed a 301 redirect from Site A to Site B. Site B was subsequently hit by the penalty a number of weeks later, and we are planning on performing link removal for Site A. Only the Webmaster Tools account for Site B exists; none is available for Site A. I assume that I cannot disavow links to Site A from Site B's Webmaster Tools account (even though Site A's links show up in the GWT account). So do I need to undo the 301 and then create a new GWT account for Site A in order to disavow the links pointing to Site A, or can I submit from Site B's GWT account since they are 301'd to Site B? Thanks! Chris [edited for formatting]
White Hat / Black Hat SEO | SEOdub
-
Being Link Attacked - Should I worry?
Hey, hope everyone is well. Just a quick question. I hope to get an answer from Google officially (I've asked in their webmaster forums area), but any experience or opinions from the community here would be great. I noticed recently that our site started to get thousands of links from comments in random blogs from all across the web. This is nothing to do with us, as we don't "build links". I can only assume it is a competitor trying to get our site hit by the algorithm for a particular search term, as all the anchor text (I estimate about 1,800 links with this anchor text) points to one page on our site that is ranking for that term. I recently removed the website from Webmaster Tools and re-added it, due to an unrelated issue about a video rich snippet not updating, and all the links have just popped up there today. Is this something I need to worry about? And should I start collecting all these domains and using the disavow tool to block the whole domain of these sites with the comments (some of them seem like genuine sites)? There seem to be new ones every day and it looks to be an ongoing attack as well. Thanks in advance!
White Hat / Black Hat SEO | JonathanRolande
-
Is there such a thing as a link juice viewer?
Hi, I am managing the tech and SEO for an ecommerce site with a big mega menu with over 140 cats/subcats. I know that my link juice is diluted and am thinking of cutting back on the categories, but in the meantime: is there a link juice visualizer? How can I see, in a visual format, how link juice is flowing through the site? Thanks
White Hat / Black Hat SEO | bjs2010
-
Are landing pages making a comeback?
Just recently I have noticed an ever-increasing number of landing pages on websites. The ones I have come across have been in the sports industry, like rugby/football, and their landing pages are sparse but offer the social avenues on a plate. Are landing pages making their way back into the SEO industry?
White Hat / Black Hat SEO | TeamacPaints
-
Switching prices for Google Base
We would like to be able to submit lower prices to Google than we do to other sources. How I see it working is that at the end of each URL we submit to Google Base there is a tracking code (source=googlebase). When a user visits the site via one of these URLs, we would knock 10% off the price of that item and store it in a cookie to ensure that the price of that item, for that user, would stay at the low price for 24 hours. My question is whether Google would have a problem with us doing this? The second part of my question is whether they check the full URL including the query strings. If they just checked the canonical URL they would see a price that's 10% higher than the one we submitted to Base, which, of course, would be bad.
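For concreteness, the mechanism described above could be sketched roughly like this (plain Python, framework-agnostic; the parameter name, cookie name, and discount are taken from the question, everything else is hypothetical). This only illustrates the logic, not whether Google would permit it:

```python
import time

DISCOUNT = 0.10
COOKIE_TTL = 24 * 60 * 60  # pin the lower price for 24 hours

def price_for_request(base_price, query, cookies, now=None):
    """Return (price, cookies) for one request.

    A visitor arriving with ?source=googlebase gets 10% off, and a
    cookie records an expiry timestamp so return visits within 24
    hours see the same lower price.
    """
    now = time.time() if now is None else now
    expiry = cookies.get("googlebase_until", 0)
    if query.get("source") == "googlebase":
        # First visit via the tracked URL: discount and set the cookie.
        cookies = dict(cookies, googlebase_until=now + COOKIE_TTL)
        return round(base_price * (1 - DISCOUNT), 2), cookies
    if now < expiry:
        # Return visit within the window: honour the pinned price.
        return round(base_price * (1 - DISCOUNT), 2), cookies
    return base_price, cookies
```

A visitor landing on `?source=googlebase` pays 90.0 on a 100.0 item and keeps that price on cookie-bearing return visits inside the window; everyone else sees the full price.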
White Hat / Black Hat SEO | supermarketonline
-
When asking for links, what are good incentives to offer?
New to SEO and want to stay clean. What are white hat incentives you can offer in exchange for links? A giveaway for their readers? Giving them helpful advice? Recording a video of me drinking a gallon of milk within 5 minutes?
White Hat / Black Hat SEO | 10JQKAs