Manual Webspam Action: Same Penalty on All Sites in One Webmaster Tools Account
-
My URL is: www.ebuzznet.com. Today when I checked Webmaster Tools under the manual actions section, I saw a manual spam action with the reason "Thin content with little or no added value." I then checked the other sites in the same Webmaster Tools account; there are 11 sites, and all of them received the same manual action. I never received any email, and there is no notification in the site messages section about this action. I just need confirmation: is this an error in Webmaster Tools, or did all of these sites really receive manual spam actions? Most of the articles on the sites are above 500 words and are quality content (not spun or copied). Looking for suggestions and answers.
-
As per your example above: I have enabled auto-syndication to all my social networks, so only an excerpt is syndicated, along with a link back to the original content. If you open any of those 50 links, it will be a Facebook post. Sharing content that way is not against the Webmaster Guidelines.
I have already started editing and rewriting posts; it will take more than a week.
-
This is very strange: they even penalized a page with no content, http://ivishalverma.blogspot.com/, for thin content. Another strange thing: I added 4 of my sites to a second Webmaster Tools account under a different email ID (the same 4 sites on both accounts, and I am the owner on both), and they penalized all the blogs there too. In total, they penalized two Webmaster Tools accounts with around 18 sites (5 of them on Blogspot).
-
I Googled text from a few of your articles and found tons of duplicates. See, for example, this search. Are you actually writing your own content, or is it sourced from someone else? I also noticed a lot of non-standard and incorrect grammar across multiple articles, which isn't going to help with a Panda-type penalty.
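If you want to spot-check duplication yourself, here's a minimal sketch (not a definitive workflow, and the file name is hypothetical) that pulls a few long sentences from an article and builds exact-match Google queries you can open manually. Lots of results for a quoted sentence usually means the text has been republished:

```python
# A minimal sketch (not a definitive workflow) for spot-checking whether your
# articles have been republished elsewhere: pull a few long sentences from a
# post and build exact-match Google queries to open manually in a browser.
# "article.txt" is a hypothetical file holding one article's plain text.
import re
import urllib.parse

def duplicate_check_queries(article_text, samples=3):
    """Return Google exact-match search URLs for a few sentences of the article."""
    # Keep only reasonably long sentences; short ones match too many pages.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", article_text)
                 if len(s.split()) >= 8]
    queries = []
    for sentence in sentences[:samples]:
        quoted = urllib.parse.quote_plus(f'"{sentence}"')
        queries.append(f"https://www.google.com/search?q={quoted}")
    return queries

if __name__ == "__main__":
    with open("article.txt", encoding="utf-8") as f:  # hypothetical file
        for url in duplicate_check_queries(f.read()):
            print(url)  # many results for a quoted sentence = likely duplication
```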
Google is getting better at measuring users' reactions to your content, and they've even begun to dig into factors that cause users to leave a page and seek another result. Whether or not Google can determine the quality of language, users can, and they don't react well to language that doesn't flow.
Here's what I'd do:
- Make sure your content is unique. If you're sourcing it, make sure it hasn't been republished in part or in full.
- Ensure your content is written by someone who can really engage the audience and sound like an expert writing in English (or whatever your sites' languages are).
- Really try to hook the user right away. Make the posts as visual as possible, avoid large opening paragraphs, and make sure the user knows why they should care immediately. You need to elicit an emotion in your audience early on: fear, anger, amusement, interest, surprise, etc.
- Use tags sparingly, and only when it makes sense. Avoid tags with only one tagged post; the sketch after this list shows one way to find them.
- Credibility: add dates and authors to articles, backed by full bios.
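For the tag cleanup, here's a rough sketch assuming a WordPress site with the standard REST API enabled (the /wp-json/wp/v2/tags endpoint); the site URL below is a placeholder. It lists tags attached to exactly one post, which are candidates for merging or removal:

```python
# A rough sketch, assuming a WordPress site with the standard REST API enabled
# (/wp-json/wp/v2/tags). It lists tags attached to exactly one post, which are
# candidates for merging or removal. The site URL below is a placeholder.
import requests

def single_post_tags(site):
    """Return names of tags that have exactly one tagged post."""
    lonely, page = [], 1
    while True:
        resp = requests.get(f"{site}/wp-json/wp/v2/tags",
                            params={"per_page": 100, "page": page},
                            timeout=10)
        if resp.status_code != 200:  # WordPress errors once you page past the end
            break
        batch = resp.json()
        if not batch:
            break
        lonely += [tag["name"] for tag in batch if tag["count"] == 1]
        page += 1
    return lonely

if __name__ == "__main__":
    for name in single_post_tags("https://www.example.com"):  # placeholder URL
        print(name)
```

Paging simply stops at the first non-200 response, since WordPress returns an error once you request a page past the last batch of tags.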
The way to get out of this penalty is to really add value on each of your pages.
-
This is definitely a unique situation in my experience, so keep that in mind, but after making sure nothing on-page or off-page could have caused the manual action, I'd think about submitting 11 reconsideration requests at the same time, telling them everything you looked into, the tests you ran, and the situation with your entire account getting manual actions. See how they respond to that.
-
A manual action on one or two sites seems legit, but on every site listed in the Webmaster Tools account it looks suspicious. They even penalized a new Blogspot site with no content at all: I had just created it for testing, and no article was ever posted. It is blank, yet it still received a thin-content warning.
-
I do see a red flag right away in the page source: some spammy-looking links right before the footer. This could be an indication of a hack. If all of your websites in Webmaster Tools are on the same server, or your login for all of them is the same, or something similar, it's possible all your sites were hacked. If you put those links there yourself, remove them.
Manual action messages are very vague, so sometimes it takes some digging to identify the root of the problem. The above is an idea of where to begin.
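If you want to spot-check your other sites for similar injected links, here's a minimal sketch assuming requests and BeautifulSoup are installed; the URL is a placeholder, and the "last 20% of the source" cutoff is just a heuristic for "near the footer." It flags off-site links whose markup sits low in the HTML so you can eyeball them; treat the output as leads, not proof of a hack:

```python
# A minimal sketch, assuming requests and BeautifulSoup are installed, for
# flagging off-site links that sit low in a page's HTML (e.g. injected right
# before the footer). The "last 20% of the source" cutoff is only a heuristic,
# and the URL below is a placeholder; treat the output as links to eyeball,
# not proof of a hack.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def suspicious_footer_links(url, tail_fraction=0.2):
    """Return external link targets whose markup appears near the end of the page."""
    html = requests.get(url, timeout=10).text
    cutoff = int(len(html) * (1 - tail_fraction))  # offset where the "tail" begins
    site_host = urlparse(url).netloc
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for a in soup.find_all("a", href=True):
        href = a["href"]
        host = urlparse(href).netloc
        # External link (different host) first appearing in the tail of the source
        if host and host != site_host and html.find(href) >= cutoff:
            flagged.append(href)
    return flagged

if __name__ == "__main__":
    for link in suspicious_footer_links("http://www.example.com/"):  # placeholder
        print(link)
```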