How to stop Googlebot from crawling spammy pages injected by a hacker?
-
Hello,
Please help me. One of our websites is under attack by a hacker once again. They have injected spammy URLs and Google is indexing them, but we cannot find these pages on our website; they all return 404.
Our website is not secured (no HTTPS).
Our website uses the WordPress CMS.
Thanks
-
Hi Shahzad,
Here is the complete process for cleaning your website after a "URL injection" hack, from Google:
https://developers.google.com/webmasters/hacked/docs/clean_site
And here is some more information in case you'd like to read more about it:
https://support.google.com/webmasters/answer/3311329?hl=en
Since those links contain all the information you need, I'll keep the answer as simple as that.
Cheers,
Martin
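One practical first step from that cleanup guide is confirming what the injected URLs actually return. A minimal sketch in Python (the `check_status` helper is illustrative, not from the thread) that you could point at the spammy URLs reported in Search Console:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def check_status(url, opener=urlopen):
    """Return the HTTP status code for a URL.

    404/410 responses tell Google the injected page is gone, which is
    what you want to see before filing removal requests. The opener is
    injectable so the check can be tested without network access.
    """
    try:
        with opener(Request(url, method="HEAD")) as resp:
            return resp.status
    except HTTPError as err:
        return err.code
```

If the injected URLs do not return 404/410 (e.g. they still resolve with spam content), the site is still compromised and the full cleanup process in the first link applies.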
Related Questions
-
Targeted keywords in top half of the page or throughout the page?
I have created content and want to include target keywords, but where do I place them for maximum SEO benefit? I am asking because I have heard Google doesn't give much credit if the keywords are at the end of the page.
White Hat / Black Hat SEO | Sam09schulz
-
Canonical Tag on Each Page With the Same Page URL - Is It Harmful for SEO or Not?
Hi. I have an e-commerce project, and every page carries a canonical tag pointing to its own URL (a self-referencing canonical on the original page; there is no duplicate page). The URL of my website is like this: "https://www.website.com/products/produt1" and the site has canonical code like this: <link rel="canonical" href="https://www.website.com/products/produt1" />. This occurs on each and every product page, as well as every other page of my website. Now, my question is: is this harmful for SEO? Should I remove these tags from all pages? Is there any benefit to using the canonical tag on the same (original) URL?
White Hat / Black Hat SEO | HuptechWebseo
-
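On the question above: a self-referencing canonical is not harmful — it is a common defensive pattern against URL variants (tracking parameters, session IDs) being indexed as duplicates. If you want to audit that every page's canonical really does point to itself, a minimal sketch using only the standard library (the class and helper names are illustrative):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonicals.append(attrs.get("href"))

def is_self_canonical(page_url, html):
    """True when the page declares exactly one canonical, pointing to itself."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonicals == [page_url]
```

Pages where this returns False (canonical missing, duplicated, or pointing elsewhere) are the ones worth a closer look; the self-referencing ones can be left alone.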
Hacked Websites (Doorways) Ranking First Page of Google
Hello Moz community! I could really use your help with some suggestions here regarding recent changes I've noticed in the Google SERPs for terms I'm currently working on. One of the projects I'm working on is for an online pharmacy, and I've noticed that the SERPs are now being taken up by hacked websites that act as doorways, 301-redirecting to whatever online pharmacy the hacker wants the traffic to go to. They seem to be hacked WordPress sites whose original content is unrelated to online pharmacies. We've submitted these issues to Google as spam, and reported them within Chrome as well, but haven't heard back. We see this issue when searching terms like "Canadian Pharmacy Viagra" and other similar terms. Any other recommendations on how we can fix this issue? Thanks for your time; attached is a screenshot of the results we are seeing for one of our searches.
White Hat / Black Hat SEO | monarkg
-
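When reporting the doorways described above, it helps to document the sneaky redirect itself: the hacked page answers with a 301 whose destination is on a completely different host. A minimal sketch of that check (the function name is illustrative; in practice you'd obtain the final URL with an HTTP client that follows redirects):

```python
from urllib.parse import urlparse

def is_offsite_redirect(requested_url, final_url):
    """True when fetching requested_url ended on a different host,
    i.e. the hacked page is acting as a doorway to another site."""
    return urlparse(requested_url).hostname != urlparse(final_url).hostname
```

A list of (hacked URL, redirect destination) pairs failing this check makes a much stronger spam report than the search results alone.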
Starting every page title with the keyword
I've read everywhere that it's vital to get your target keyword to the front of the title you're writing. Taking into account that Google likes things to look natural, I wanted to check whether writing titles like this, for example: "Photographers Miami - Find the right Equipment and Accessories", repeated for every page (maybe a page on photography in Miami, one on videography in Orlando, etc.) is a smart way to write titles, or whether clearly stacking keywords at the front of every title won't be as beneficial as other approaches.
White Hat / Black Hat SEO | xcyte
-
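One quick way to quantify the pattern the question above worries about is to measure the run of words all your titles share: if every title on the site opens with the same stacked keywords, the shared prefix will be long. A minimal sketch (the helper name is illustrative):

```python
import os

def shared_title_prefix(titles):
    """Return the run of leading words shared by every title.

    os.path.commonprefix compares sequences element-wise, so feeding it
    lists of words yields the common leading words rather than characters.
    """
    word_lists = [title.split() for title in titles]
    return " ".join(os.path.commonprefix(word_lists))
```

A long shared prefix across many pages suggests templated, keyword-stacked titles; varied prefixes suggest the more natural pattern Google tends to reward.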
Google penalty for having bad sites (maybe) while working on 1 good site?
I have a list of websites that are not spam; they are OK sites. It's just that I need to work on the content again, as the sites' content might not be 100% useful for users. They are not bad sites with spammy content; I just want to rewrite some of the content to make really great websites. The goal would be to have great content that earns natural links and gives a great user experience. I have 40 sites, all travel sites related to different destinations around the world. I also have other sites that I haven't worked on for some time. Here are some sites:
www.simplyparis.org
www.simplymadrid.org
www.simplyrome.org
etc... Again, they are not spam sites, just not as useful as they could become. I want to work on a few sites only, to see how it goes. Will this penalise the sites I am working on if I have other sites with average or not-as-good content? I want to make great content, good for link bait 🙂
White Hat / Black Hat SEO | sandyallain
-
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of: a single home page, ~50 "category" pages, and ~425 "book list" pages. (Both of those numbers started out much smaller and grew over time, but they've been at those levels for the last year or so as I've focused my time elsewhere.)

On Friday, June 15 we made a pretty big change to the site -- we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116

Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (OK, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off for the last week, dropping lower and lower every day, as if Google realized it was repurposed content from elsewhere on our site...)

Here's the problem: for the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends.
This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, and as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So:

1. Do you think the drop is related to my upping my pagecount 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that list only one author (which would be the vast majority)?
3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content?

Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books" it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one book of hers among 5 or 6 by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
-
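On question 2 in the post above — rather than deleting the single-book author pages outright, one common option is to identify the thin ones and mark them `noindex` until they have enough unique content to stand alone. A minimal sketch of the selection step, assuming a mapping of author name to the books listed on their page (the data and helper name are illustrative):

```python
def thin_author_pages(author_books, min_books=2):
    """Return authors whose pages list fewer than min_books titles.

    These are the thin, near-duplicate pages most likely to have
    triggered a Panda-style quality demotion; they are candidates
    for a noindex meta tag rather than outright deletion.
    """
    return sorted(author for author, books in author_books.items()
                  if len(books) < min_books)
```

Authors returned here would get `<meta name="robots" content="noindex">` in their page template, while multi-book authors (the genuinely useful aggregation pages) stay indexable.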
Shadow Page for Flash Experience
Hello. I am curious to better understand what I've been told are "shadow pages" for Flash experiences. So for example, go here:
http://instoresnow.walmart.com/Kraft.aspx#/home
View the page as Googlebot and you'll see an HTML page. It is completely different than the Flash page.
1. Is this OK?
2. If I make my shadow page mirror the Flash page, can I put links in it that lead the user to the same places that the Flash experience does?
3. Can I put "Pinterest" pin-able images in my shadow page?
4. Can I create a shadow page for a video that has the transcript in it? Is this the same as closed captioning?
Thanks so much in advance, -GoogleCrush
White Hat / Black Hat SEO | mozcrush
-
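The risk the question above is circling is cloaking: an HTML equivalent of the Flash content is generally fine, but if Googlebot receives substantially different content than users do, that violates Google's guidelines. A rough way to spot-check a page is to fetch it with both user agents and compare the responses (the `fetch` callable is an illustrative stand-in for your HTTP client):

```python
from difflib import SequenceMatcher

def cloaking_ratio(fetch, url):
    """Similarity (0.0-1.0) between what a browser and Googlebot receive.

    fetch(url, user_agent=...) must return the response body as a string.
    A ratio near 1.0 means both audiences see the same content; a low
    ratio suggests the shadow page has drifted into cloaking territory.
    """
    browser_html = fetch(url, user_agent="Mozilla/5.0")
    bot_html = fetch(url, user_agent="Googlebot/2.1")
    return SequenceMatcher(None, browser_html, bot_html).ratio()
```

The threshold for "too different" is a judgment call, but a shadow page that mirrors the Flash experience's text and links should score high here.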
Yahoo Slurp Bot 3.0 Going Crazy
On one of our sites, since the summer, the Yahoo Slurp bot has been crawling our pages about 5 times a minute. We have put a crawl delay on it, but it does not respect our robots.txt. Now the issue is that it's triggering JavaScript (which bots shouldn't), firing our AdSense, ad server, and analytics tags. We've thought of banning the bot altogether, but we get a good amount of Yahoo traffic. We've also thought about programmatically not serving the JavaScript (ad + analytics) tags to it, but we're slightly afraid Yahoo might consider this cloaking. What are the best practices for dealing with this bad bot?
White Hat / Black Hat SEO | tony-75534
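Before banning or cloaking anything in response to the post above, it's worth verifying that the hits really come from Yahoo — spoofed crawler user agents are common, and the standard check is a reverse DNS lookup on the requesting IP (Slurp hosts have historically resolved under crawl.yahoo.net; a full-strength check also does a forward lookup to confirm the name maps back to the same IP). A minimal sketch with an injectable resolver so it can be tested offline (the function name is illustrative):

```python
import socket

def is_real_slurp(ip, resolver=socket.gethostbyaddr):
    """Reverse-DNS check: genuine Yahoo Slurp IPs resolve under crawl.yahoo.net."""
    try:
        hostname = resolver(ip)[0]
    except OSError:
        # No PTR record at all: treat as an impostor.
        return False
    return hostname.endswith(".crawl.yahoo.net")
```

Requests that fail this check can be rate-limited or denied without any cloaking concern, since you'd be filtering impostors rather than the real crawler; for the genuine bot, the safer lever remains robots.txt and Yahoo's own webmaster crawl-rate controls.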