Spam Flags on minutedrone.com
-
Hello,
I launched minutedrone.com a few months ago and started using Moz to improve my SEO. I discovered that I have a high Spam Score: 9/17. What should I do to avoid being penalized by Google? Where should I start?
The service is new, so we only have DA: 10 and PA: 19, and I'm not an expert in SEO.
-
You're right, thanks for the warning!! I will find a freelancer ASAP, because even though the service is only available in French-speaking countries, some of our customers don't speak French.
-
No problem! One more thing that caught my eye. You need a native speaker to reword your English page. Some of the sentences don't translate very well, and apart from killing your conversion rates it's also a bad look for your SEO.
Best of luck, looks like you're offering a cool service!
-
Thank you so much, Elliotte!! This will help a lot!
I will try Screaming Frog and work on those 80 pages. As for the "No Contact Info" flag, I don't know why it appears, because our social links are already in the footer. For HTTP/2, you're right; I'm currently building the site in HTML/CSS with Bootstrap only, so I will wait until I have a developer.
Thanks again!!
-
Let's start with some easy things to fix, shall we?
First, thin content. I crawled your site with Screaming Frog and found you have 482 internal pages. That's a good size! But 80 of those pages have fewer than 300 words, and some have as few as 18. My French is a little rusty, but most of the pages with a low word count look like contact pages. So while not strictly necessary for the user experience, it can't hurt to add some text in HTML that's easily readable by both users and bots. At the very least, provide a call to action that represents your brand so people are more likely to fill out your form. This should help reduce or remove your "thin content" warning.
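If you want to spot-check word counts yourself, here is a rough Python sketch using only the standard library; the 300-word cutoff mirrors the threshold mentioned above, and a real crawler like Screaming Frog is still the better tool for a full audit:

```python
from html.parser import HTMLParser

class VisibleTextCounter(HTMLParser):
    """Counts words in a page's visible text, ignoring scripts and styles."""
    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.word_count = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.word_count += len(data.split())

def visible_word_count(html: str) -> int:
    """Return the number of visible words in an HTML document."""
    parser = VisibleTextCounter()
    parser.feed(html)
    return parser.word_count

THIN_CONTENT_THRESHOLD = 300  # pages below this may trigger a thin-content flag
```

Run `visible_word_count()` over each page's HTML and flag anything under the threshold.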
The "No Contact Info" warning is easily fixed. Add social links to your footer and an email address where users can contact you on your contact page.
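As an illustration of that fix (the URLs and email address below are placeholders, not taken from the actual site), the footer and contact page could include something like:

```html
<!-- Footer: crawlable social profile links (placeholder URLs) -->
<footer>
  <a href="https://www.facebook.com/minutedrone">Facebook</a>
  <a href="https://twitter.com/minutedrone">Twitter</a>
  <!-- Contact page: a plain-text mailto link crawlers can read -->
  <a href="mailto:contact@minutedrone.com">contact@minutedrone.com</a>
</footer>
```

The key is that the links and address live in plain HTML rather than behind JavaScript or images, so crawlers can see them.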
Ignore the "Low number of pages found" warning for the time being. You should check this against Google Search Console to see if Google is finding all the pages you've listed in your sitemap. If Google says they've found everything and indexed it, you're probably pretty safe. When I crawled it with Screaming Frog, I found that all the pages returned either 200 or 301 status codes, so this may just be something wonky with the Moz Crawler. You can also check how Googlebot sees your website with a tool such as httpstatus.io by setting the user agent to Googlebot.
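httpstatus.io does this from the browser; if you'd rather script the check, here is a rough Python sketch (the URL is illustrative, and note that some servers vary responses by IP as well as user agent, so this is only an approximation of what Googlebot sees):

```python
import urllib.request

# Googlebot's published desktop user-agent string
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def googlebot_request(url: str) -> urllib.request.Request:
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

def status_as_googlebot(url: str) -> int:
    """Return the HTTP status code the page serves to the Googlebot UA."""
    with urllib.request.urlopen(googlebot_request(url)) as resp:
        return resp.status  # 200/301 are fine; 4xx/5xx need attention
```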
A couple other notes:
1. You did a great job specifying the language for your English pages, but you should also add hreflang tags to your French pages.
2. You have both https and http pages being generated. Make sure you're only creating pages in https. This is complex and beyond anything I can easily describe in a response here, so you'll have to work on this with your web developer. Once you have everything in https, go ahead and enable HTTP/2 for your website. It will improve security and speed.
3. Finally, check your link profile with a tool like OSE. If you're being linked to from shady websites with suspicious anchor text, you'll want to disavow those links. For anything from Russia, check to make sure they're not scraping your site; I've been getting a lot of that junk with my clients.
That's what I got from a quick browse. I think you will reduce your spam score sufficiently by following the steps listed above, and improve your site's security, speed, and user experience along the way. Good luck!
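To illustrate point 1, hreflang annotations go in the `<head>` of each page and look roughly like this; the `/fr/` and `/en/` paths are hypothetical, so adjust them to the site's actual URL structure, and remember every page must list itself plus all of its alternates:

```html
<!-- Hypothetical URLs: adapt to the real French/English page paths -->
<link rel="alternate" hreflang="fr" href="https://minutedrone.com/fr/" />
<link rel="alternate" hreflang="en" href="https://minutedrone.com/en/" />
<link rel="alternate" hreflang="x-default" href="https://minutedrone.com/" />
```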
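For point 2, once an SSL certificate is in place, a typical Apache `.htaccess` rule to force HTTPS looks like the sketch below (this assumes an Apache host with mod_rewrite enabled; your developer should verify it against the actual server setup):

```apache
RewriteEngine On
# Redirect any plain-HTTP request to its HTTPS equivalent
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```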
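For point 3, if you do find toxic links, the disavow file you upload through Google's Disavow Links tool is plain text in the following format (the domains and URL below are made up):

```text
# Example disavow file for Google's Disavow Links tool
# Disavow every link from an entire domain
domain:spammy-link-farm.example
# Disavow a single URL
http://shady-directory.example/listing?id=123
```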
-
Thanks PlanetHurf, I was responding to the previous answer at the same time. The potential reasons from OSE help, but I definitely don't know where to start. Some reasons will be fixed over time, I think, since the service is new, but maybe something is going wrong on my site. For example, there are a lot of links pointing to the same page, so maybe the problem comes from there?
-
Thanks Andy,
I just checked my Search Console and there are no particular messages.
A 9/17 spam score seems to suggest a big problem and a high risk, as 71% of sites with this score have been banned. There are some default recommendations from the Moz tool, but I don't know where to start. Knowing the most important reasons may help me tackle it:
Low MozTrust or MozRank Score: The site link profile is not trustworthy.
Large Site with Few Links: We found very few sites linking to this site, considering its size.
Ratio of Followed to Nofollowed Subdomains: The ratio of followed to nofollowed subdomains linking to this subdomain is outside the normal range of others in our index.
Ratio of Followed to Nofollowed Domains: The ratio of followed to nofollowed domains linking to this subdomain is outside the normal range of others in our index.
Small Proportion of Branded Links: Links to this subdomain have low amounts of branded anchor text.
Thin Content: A subset of pages within this subdomain have little content.
Low Number of Internal Links: Pages crawled on the subdomain have a small number of internal links.
Low Number of Pages Found: Crawl only gets a valid response to a small number of pages.
No Contact Info: None of the pages crawled contain an email address or links to a social profile.
-
You can also look at Open Site Explorer, here: https://mza.seotoolninja.com/researchtools/ose/spam-analysis/flags?subdomain=minutedrone.com
However, as Andy says, starting with Google should really be your first port of call (assuming you have access): https://www.google.com/webmasters/tools/home
-
Hi,
Open up Search Console (as long as you have it set up) and check to see if there are any warnings from Google.
It might be something very straight forward, but I would start here.
I would then read the Moz guide on how they score these; you might pick up some ideas as to what could be causing this.
It's a bit awkward to say what else might be causing this without spending some time looking at it.
-Andy
Related Questions
-
Spam Links Attack, Negative SEO?
Last April we migrated our old domain www.nyc-officespace-leader.com to a new domain www.metro-manhattan.com. The old domain has been receiving numerous links from very spammy sites such as these:
- adinternet.net/the_worlds_most_visited_web_pages_958.htm
- www.online-advertisement.net/the_worlds_most_visited_web_pages_958.html
- www.webfind.org/the_worlds_most_visited_web_pages_958/
Since the old domain redirects to our new domain, we are concerned this could be very detrimental. Oddly enough, the 50-100 spammy domains that link to us all show a site called "http://theglobe.se/start/" when the linking domain is entered in the browser. What should we do? Should we disavow these links? Is this some kind of an attack? We would very much appreciate some input/advice. Thanks,
Alan
Technical SEO | Kingalan1 -
Blog archive pages are meta noindexed but still flagged as duplicate
Hi all. I know there are several threads related to noindexing blog archives and category pages, so if this has already been answered, please direct me to that post. My blog archive pages have preview text from the posts. Each time I publish a blog post, the last post on any given archive page shifts to the first spot on the next archive page. Moz seems to report these as new duplicate content issues each week. I have my archive pages set to meta noindex, so can I feel good about continuing to ignore these duplicate content issues, or is there something else I should be doing to prevent penalties? TIA!
Technical SEO | mkupfer1 -
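For reference, the meta noindex on archive pages described above is usually written as `noindex, follow` so link equity still flows through the archives (a generic example, not taken from the poster's site):

```html
<!-- In the <head> of each blog archive/category page -->
<meta name="robots" content="noindex, follow">
```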
WWW to non-www on a .com/blog URL
Hi guys, I have had to reset my site from www to non-www via .htaccess, and this worked out just fine. However, the /blog WordPress section will not redirect to the non-www version, even though I have changed the config.php to non-www. Does anyone have an idea as to what I need to do to force the non-www on a blog installed in a subfolder? http://5starweddingdirectory.com/ http://www.5starweddingdirectory.com/blog/ Regards, T
Technical SEO | Taiger0 -
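A common `.htaccess` pattern for forcing non-www across an entire site, including a subfolder WordPress install, is the sketch below (it assumes Apache with mod_rewrite, placed in the root `.htaccess` above WordPress's own rules; the WordPress Site and Home URLs in Settings also need updating so WP stops redirecting back to www):

```apache
RewriteEngine On
# 301 any www request to the bare domain, preserving the path (e.g. /blog/...)
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301]
```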
Setting up a site with different extensions (.co.uk and .com)
Hi, I am setting up a new site but have bought two domains to cover those who may type the wrong version. So I have regionwithchildren.co.uk and regionwithchildren.com. I am just setting up both on my WordPress host with a coming-soon page (to include social links and a sign-up form), but I had a few questions: as the main site is .co.uk, should I just set up a redirect from the .com to the .co.uk? The root folders on the two will be the same (regionwithchildren), and I need to change one as the host can't have two identical folders; what should I change the .com one to? Any other considerations for this kind of setup would be much appreciated. Thanks, Neil
Technical SEO | neilhenderson0 -
Rel=canonical - Identical .com and .us Version of Site
We have a .us and a .com version of our site that we direct customers to based on location to servers. This is not changing for the foreseeable future. We had restricted Google from crawling the .us version of the site and all was fine until I started to see the https version of the .us appearing in the SERPs for certain keywords we keep an eye on. The .com still exists and is sometimes directly above or under the .us. It is occasionally a different page on the site with similar content to the query, or sometimes it just returns the exact same page for both the .com and the .us results. This has me worried about duplicate content issues.
The question(s): Should I just get the https version of the .us to not be crawled/indexed and leave it at that, or should I work to get a rel=canonical set up for the entire .us to .com (making the .com the canonical version)? Are there any major pitfalls I should be aware of in regards to the rel=canonical across the entire domain (both the .us and .com are identical and these newly crawled/indexed .us pages rank pretty nicely sometimes)? Am I better off just correcting it so the .us is no longer crawled and indexed and leaving it at that?
Side question: Have any ecommerce guys noticed that Googlebot has started to crawl/index and serve up https versions of your URLs in the SERPs even if the only way to get into those versions of the pages is to either append the https:// yourself to the URL or to go through a sign-in or checkout page? Is Google, in the wake of their https-everywhere push and potentially making it a ranking signal, forcing the check for the https version of any given URL and choosing to index that? I just can't figure out how it is even finding those URLs to index if it isn't seeing http://www.example.com and then adding the https:// itself and checking... Help/insight on either point would be appreciated.
Technical SEO | TLM0 -
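If the .us-to-.com canonical route is chosen, each .us page would carry a cross-domain canonical pointing at its .com twin, roughly like this (the domains and path are illustrative placeholders):

```html
<!-- In the <head> of https://www.example.us/some-page/ -->
<link rel="canonical" href="https://www.example.com/some-page/">
```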
.com & .ie website how to avoid duplicate blog content?
We have two websites, .com and .ie (both are more or less identical, but for two different markets). How can I avoid duplicate blog content, as much of our .com/blog and .ie/blog content is the same? Maybe our main .com blog articles stay indexable while the .ie blog content is made non-indexable? (This way both markets get to view the content but Google only indexes our .com blog.) Alternatively, I would need to rewrite each article so that it is unique. Advice would be appreciated, thank you.
Technical SEO | AdvanceSystems0 -
How can I prevent duplicate content between www.page.com/ and www.page.com
SEOMoz's recent crawl showed me that I had an error for duplicate content and duplicate page titles. This is a problem because it found the same page twice due to a '/' on the end of one URL, e.g. www.page.com/ vs. www.page.com. My question is: do I need to be concerned about this? And is there anything I should put in my .htaccess file to prevent this from happening? Thanks!
Karl
Technical SEO | onlineexpression -
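One common `.htaccess` answer to the trailing-slash duplicate is to pick one form and 301 the other to it. This sketch (Apache with mod_rewrite assumed) strips the trailing slash from URLs that are not real directories:

```apache
RewriteEngine On
# Remove the trailing slash from URLs that are not actual directories
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [L,R=301]
```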
I have a site that has both http:// and https:// versions indexed, e.g. https://www.homepage.com/ and http://www.homepage.com/. How do I de-index the https:// versions without losing the link juice that is going to the https://homepage.com/ pages?
I can't 301 https:// to http:// since there are some form pages that need to be https://. The site has 20,000+ pages, so individually 301ing each page would be a nightmare. Any suggestions would be greatly appreciated.
Technical SEO | fthead90
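Since a blanket 301 is off the table here, one old-school workaround for keeping a whole protocol out of the index is to serve a different robots file to https requests (an Apache sketch; the `robots_https.txt` filename is made up). Note that a Disallow only blocks crawling rather than removing already-indexed pages; an `X-Robots-Tag: noindex` header on the https responses is the more thorough fix:

```apache
# Serve a separate, more restrictive robots file over HTTPS
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ /robots_https.txt [L]
```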