Social engineering content detected
-
Hi there,
I have been facing a "Social engineering content detected" warning on www.domain.com for a long time. We have removed all bad JavaScript, unnecessary links, and bad content, and after removing it we requested a review from Google, but we still keep getting this notification in Search Console again and again. Is this harmful to our web traffic? How do I permanently clear this notification?
Can anybody help, please?
Thanks in advance
-
Hi,
We checked almost everything and it all seems fine. Can you tell me if there are any tools we can use to check where that malicious content lies? We have already scanned with Norton Safety; is there any other tool or software like that, apart from Google Search Console?
Thanks in advance
-
OK, thank you so much. Let us check.
-
I agree with Donald Silvernail: it might be a script hiding in the database or somewhere else on the website. You can either ask your hosting provider for help with a thorough scan, or do it yourself if you have sufficient resources and access rights; a rough sketch of that kind of do-it-yourself scan is below.
Regards,
Vijay
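If you do run the scan yourself, a minimal starting point is to walk the web root and look for code patterns that commonly show up in injected malware. This is only a sketch under assumptions: the web root path, the file types, and the tiny signature list are placeholders rather than a complete scanner, so treat any hit as a lead to inspect by hand, not as proof of infection.

```python
import re
from pathlib import Path

# Hypothetical web root -- adjust to your server's actual layout.
WEB_ROOT = Path("/var/www/html")

# A few code patterns that commonly appear in injected PHP/JS malware.
# Illustrative only; nowhere near a complete signature list.
SUSPICIOUS_PATTERNS = {
    "eval(base64_decode(...))": re.compile(rb"eval\s*\(\s*base64_decode"),
    "eval(gzinflate(...))": re.compile(rb"eval\s*\(\s*gzinflate"),
    "document.write(unescape(...))": re.compile(rb"document\.write\s*\(\s*unescape"),
    "hidden iframe": re.compile(rb"<iframe[^>]+display\s*:\s*none", re.IGNORECASE),
}

def scan(root: Path) -> None:
    """Walk the web root and report files containing suspicious code."""
    for path in root.rglob("*"):
        if path.suffix.lower() not in {".php", ".js", ".html", ".htm"}:
            continue
        try:
            data = path.read_bytes()
        except OSError:
            continue
        for label, pattern in SUSPICIOUS_PATTERNS.items():
            if pattern.search(data):
                print(f"{path}: matched '{label}'")

if __name__ == "__main__":
    scan(WEB_ROOT)
```

Remember to check the database as well (WordPress posts and options tables are a common hiding place), which a file scan like this will not cover.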
-
This basically means your website has been hacked in some way, and you must remove all malware or malicious code. Hackers will often set up skimming pages: a page that looks like, for example, a bank login, but is really a scam trying to capture someone's information. After you remove this content, you have to tell Google that you removed it by requesting a review in Search Console. Once they see it is gone, they will lift the warning. One rough way to double-check for injected or cloaked content is sketched below.
Thanks,
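If you cannot find where the injected content is, one rough diagnostic (an extra check, not the official Google review process) is to fetch the flagged pages once with a normal browser User-Agent and once with a Googlebot User-Agent and compare the responses, since hacked sites sometimes cloak the malicious content so that only crawlers see it. A minimal sketch with placeholder URLs:

```python
import difflib
import urllib.request

# Placeholder URLs -- replace with the pages flagged in Search Console.
URLS = [
    "https://www.domain.com/",
    "https://www.domain.com/some-flagged-page",
]

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL with a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read().decode("utf-8", errors="replace")

for url in URLS:
    as_browser = fetch(url, BROWSER_UA)
    as_googlebot = fetch(url, GOOGLEBOT_UA)
    # A low similarity ratio suggests the server returns different content
    # to crawlers, which is worth inspecting by hand.
    ratio = difflib.SequenceMatcher(None, as_browser, as_googlebot).ratio()
    print(f"{url}: similarity between browser and Googlebot views = {ratio:.2f}")
```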
Related Questions
-
How to deal with this duplicate content
Hello, our website offers prayer times in the US and UK. The problem is that we have nearby towns where the prayer times are the same, so the pages (for example https://prayer-times.us/prayer-times-lake-michigan-12258-en and https://prayer-times.us/prayer-times-lake-12147-en) are duplicates. The same issue applies to this page: https://prayer-time.uk/prayer-times-wallsend-411-en. How can we solve this problem?
On-Page Optimization | Zakirou0 -
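For the duplicate prayer-time pages above, one common approach (a suggestion, not a definitive answer) is to pick a single canonical page for each group of towns that share identical times and point the other pages at it with a rel=canonical link. A minimal sketch of that grouping logic, with made-up URLs and times standing in for the site's real database:

```python
from collections import defaultdict

# Made-up data: page URL -> the prayer times it displays.
# On the real site this would come from the database.
pages = {
    "https://prayer-times.us/prayer-times-lake-michigan-12258-en": ("05:12", "13:01", "16:45", "19:30", "21:02"),
    "https://prayer-times.us/prayer-times-lake-12147-en": ("05:12", "13:01", "16:45", "19:30", "21:02"),
    "https://prayer-times.us/prayer-times-other-town-en": ("05:13", "13:01", "16:46", "19:31", "21:03"),
}

# Group pages whose prayer times are identical; the first URL in each
# group is treated as the canonical target for the rest.
groups = defaultdict(list)
for url, times in pages.items():
    groups[times].append(url)

for times, urls in groups.items():
    canonical = urls[0]
    for duplicate in urls[1:]:
        print(f"{duplicate} should declare rel=canonical -> {canonical}")
```

Each reported pair would then be implemented as a canonical link element on the duplicate town's page, so the near-identical pages consolidate their signals onto one URL.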
Unique Pages with Thin Content vs. One Page with Lots of Content
Is there anyone who can give me a definitive answer on which of the following situations is preferable from an SEO standpoint for the services section of a website?
1. Many unique, targeted service pages with the primary keyword in the URL, title tag, and H1, but with the trade-off of thin content on each page (i.e. 100 words of content or less).
2. One large service page listing all services in the content. The primary keyword for the URL, title tag, and H1 would be something like "(company name) services", and each service would get its own H2. In this case there is lots of content on the page.
Yes, the ideal situation would be to beef up the content for each unique page, but we have found that this isn't always an option given the amount of time a client has dedicated to a project.
On-Page Optimization | RCDesign741 -
Duplicate page content
Hi,
Crawl errors are showing 2 pages of duplicate content on my client's WordPress site: /news/ and /category/featured/. Yoast is installed, so what is the best way to resolve this? I see that both pages are canonicalised to themselves, so I presume I just need to change the canonical tag on /category/featured/ to reference /news/ (since /news/ has higher authority and is the main page for showing this information)? Or is there another way in Yoast or WordPress to deal with this and prevent it from happening again?
Cheers, Dan
On-Page Optimization | Dan-Lawrence0 -
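For the /news/ versus /category/featured/ question above, a quick way to confirm what each page's canonical currently points to, and to re-check it after making the change in Yoast, is to fetch both URLs and read the rel="canonical" link element. A rough sketch with a placeholder domain:

```python
import urllib.request
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Record the href of any <link rel="canonical"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

# Placeholder domain -- replace with the client's actual site.
for path in ("/news/", "/category/featured/"):
    url = "https://www.example.com" + path
    with urllib.request.urlopen(url, timeout=15) as resp:
        page_html = resp.read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(page_html)
    print(f"{url} canonicalises to {parser.canonical}")
```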
Fading in content above the fold on window load
Hi,
We'd like to render a font stack from Typekit and paint a large cover image above the fold of our homepage after document completion. Since asynchronously loading anything generally looks choppy, we fade in the affected elements once it's done. This gives a much smoother feel and fast load times, but I have a concern about SEO. While Typekit loads, the h1, the h2, and the page's leading paragraph are sent down the wire with an invisible style (but still technically exist as static HTML). Even though they appear to a user only milliseconds later, I'm concerned that a search engine's initial request is met with a page whose best descriptive assets are marked as invisible. Both UX and SEO have high value to our business model, so we're asking for some perspective to make the right kind of trade-off. Our site has a high domain authority compared to our competition, and sales keyword competition is high. Will this UX improvement damage our on-page SEO? If so, purely from an SEO perspective, roughly how serious will the impact be? We're eager to hear any advice or comments on this. Thanks a lot.
On-Page Optimization | noyelling0 -
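One way to sanity-check the concern in the question above is to request the homepage the way a crawler initially does (a plain HTTP GET with no JavaScript execution) and confirm that the h1, h2, and leading-paragraph text are already present in the raw response, even though they are styled invisible until Typekit finishes loading. A minimal sketch; the URL and the strings to search for are placeholders:

```python
import urllib.request

# Placeholders -- replace with your homepage URL and the exact heading
# and paragraph text you are worried about.
URL = "https://www.example.com/"
CRITICAL_TEXT = [
    "Main headline from the h1",
    "Supporting headline from the h2",
    "First sentence of the leading paragraph",
]

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req, timeout=15) as resp:
    raw_html = resp.read().decode("utf-8", errors="replace")

# If the text is present in the raw response, it is only *styled* invisible
# at load time; the markup itself is still delivered as static HTML.
for text in CRITICAL_TEXT:
    status = "present" if text in raw_html else "MISSING"
    print(f"{status}: {text!r}")
```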
Does Widgetised Content Index The Same As A Regular Page
Hi,
We have a website that was, in my opinion, built bizarrely: the bottom half of the page, where most of the content sits, is a widget. I just wondered whether content inside a widget is indexed any differently. I ask because normal pages seem to index and rank much better than the WordPress template that uses the widget. I hope someone can clarify this. Thanks
On-Page Optimization | denismilton0 -
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting we noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is noindexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, each of which can be marked up as an article or blog posting, with the URL of the main article and all the other useful properties the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances; I'm just interested in whether the search engines can handle this appropriately.
On-Page Optimization | MarkCA0 -
Magento Layered Navigation & Duplicate Content
Hello dear SEOmoz,
I would like to ask for your help with something I am not sure about. Our ecommerce website is built with Magento. I have found many problems so far, and I know there will be many more in the future. Currently, I am trying to find the best way to deal with the duplicate content produced by the layered navigation (size, gender, etc.). I have done a lot of research to understand what the best practice might be, and I found the following options:
1. Block layered navigation URLs in Google Webmaster Tools (apparently this works for Google only).
2. Block these URLs with the robots.txt file.
3. Make the links nofollow.
4. Make the links JavaScript links from the Magento admin.
5. Avoid including these links in the XML sitemap.
6. Avoid including these links in the A-Z product index.
7. Canonical tags.
8. Meta tags (noindex, nofollow).
Question: if I turn the layered navigation links into JavaScript links from the Magento admin, the layered navigation links are still found by the crawlers, but they look like this: http://www.mysite.com/# instead of http://www.mysite.com/girls-basics.html?gender_filte... Can these new URLs (http://www.mysite.com/#) solve the duplicate content problems with the layered navigation, or do I need to implement other practices too to make sure that everything is done right?
Kind regards,
Stefanos Anastasiadis
On-Page Optimization | alexandalexaseo0 -
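If the robots.txt option from the list above is chosen, Python's built-in robots.txt parser gives a quick way to verify that the filtered layered-navigation URLs are blocked for Googlebot while the base category pages stay crawlable. A small sketch; the robots.txt location and the test URLs (including the filter parameter) are hypothetical stand-ins for the real ones:

```python
from urllib import robotparser

# Hypothetical robots.txt and test URLs -- replace with the real ones.
ROBOTS_URL = "https://www.mysite.com/robots.txt"
TEST_URLS = [
    "https://www.mysite.com/girls-basics.html?gender_filter=girls",  # filtered view (should be blocked)
    "https://www.mysite.com/girls-basics.html",                      # base category (should stay crawlable)
]

rp = robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()

for url in TEST_URLS:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict}: {url}")
```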
Content within JavaScript code
I know that it is not good practice to include SEO content within JavaScript, but are there exceptions to what Google can spider, or is it best just to avoid it completely?
On-Page Optimization | mjmorse0