Is there a tool to find out if a URL has been deemed "spam" by Google?
-
I am currently doing a link audit on one of my sites, and I am coming across some links that appear to be spam. Is there a tool I can plug a URL into to see whether it has been deemed spam by Google?
-
A few things you can try:
- Google the URL. If it doesn't come up, there's a strong possibility it has been de-indexed and carries a penalty (a scripted version of this check is sketched after this answer).
- Use a service like Link Detox, which gives you a rough picture of which links it thinks are nasty.
- Majestic SEO has a neat tool for finding out what it thinks of a site's neighbourhood: https://www.majesticseo.com/reports/neighbourhood-checker
- Similar to the above, http://spyonweb.com/ can be handy for working out link wheels.
- Look into the site's stats, e.g. Trust Flow, authority, etc. I recommend tools like Open Site Explorer, Majestic SEO, or Ahrefs.
Research is the key, and you can dig pretty deep. Hope some of those help, but as to what Google actually thinks, you're still going to have to figure that out on your own.
Good luck!
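If you have a lot of links to audit, that first check ("does the URL come up in Google at all?") can be scripted. Below is a minimal sketch using Google's Custom Search JSON API; the endpoint and parameter names follow Google's public docs, but the API key, search engine ID, and example URL are placeholders you'd supply yourself, and zero results only suggests de-indexing, not proof of a penalty:

    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder - create one in the Google API console
    CSE_ID = "YOUR_CSE_ID"    # placeholder - a custom search engine configured to search the whole web

    def appears_indexed(url):
        # Rough check: does querying the exact URL return any results?
        # No results may mean de-indexing, but could also be crawl problems.
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CSE_ID, "q": url},
            timeout=10,
        )
        resp.raise_for_status()
        info = resp.json().get("searchInformation", {})
        return int(info.get("totalResults", "0")) > 0

    suspect_links = ["http://example.com/spammy-page"]  # placeholder URLs from your audit
    for link in suspect_links:
        print(link, "indexed:", appears_indexed(link))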
-
No, there isn't a tool that does this conclusively. Welcome to Shades of Certainty, I will be your host.
You could try the cache: operator in conjunction with the URL. If the page isn't cached, there are either really big crawl problems or it has been hit as the worst kind of spam. This isn't a silver bullet, but it's one quick step toward finding out (a scripted version of this check is sketched after this answer).
For on-page considerations, you may want to try the Moz On-Page Grader. Google it ain't, but it's better than nothing at all if you're wondering.
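If you want to script that cache: check, here's a minimal sketch. It assumes Google's cache endpoint at webcache.googleusercontent.com returns 200 for a cached page and 404 when no cache exists, which is worth verifying first, since automated requests may also hit rate limits or captchas:

    import requests

    def is_cached_by_google(url):
        # Request Google's cached copy; a 404 usually means no cache exists.
        resp = requests.get(
            "https://webcache.googleusercontent.com/search",
            params={"q": "cache:" + url},
            headers={"User-Agent": "Mozilla/5.0"},  # browser-like UA; Google may still block heavy automation
            timeout=10,
        )
        return resp.status_code == 200

    print(is_cached_by_google("http://example.com/some-page"))  # placeholder URL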
Related Questions
-
Subcategories within "New Arrivals" section - duplicate content?
Hi there, My client runs an e-commerce store selling shoes that features a section called "New Arrivals" with subcategories, such as "shoes," "wedges," "boots," "sandals," etc. There are already main subcategories on the site that target these terms. These are specifically pages for "New Arrivals - Boots," etc. The shoes listed on each new arrivals subcategory page are also listed in the main subcategory page. Given that there is not really any search volume for "Brand + new arrivals in boots," but lots of search volume for "Brand + boots," what is the proper way to handle these new arrivals subcategory pages? Should each subcategory have a rel=canonical tag pointing to the main subcategory? Should they be de-indexed? Should I keep them all indexed but try to make the content as unique as possible? Thank you!
Intermediate & Advanced SEO | FPD_NYC
-
What counts as a "deeper level" in SEO?
Hi, I am trying to make our site more crawlable and get link juice to the "bottom pages" of an ecommerce site. Currently, our site has a big mega menu, and the hierarchy is: Home > CAT 1 > SUBCAT 1 > SUBSUBCAT 1 > PRODUCT. Our URL structure looks like www.domain.com/cat1/subcat1/subsubcat1/, and that page links to the products, but the product URLs look like www.domain.com/product.html. Obviously the ideal thing would be to cut out one of the categories, but I may be unable to do that in the short term - so I was wondering: if I take CAT 1 out of the equation - e.g., just make it a static item that allows the drop-down menu to work, but with no page of its own - does that cut out a level? Thanks, Ben
Intermediate & Advanced SEO | bjs2010
-
Something weird in my Google Webmaster Tools Crawl Errors...
Hey, I recently (this past May) redesigned my e-commerce site from .asp to .php. I am trying to fix all the old pages that didn't make it in the switch with 301 redirects, but I keep getting weird pages coming up in GWT. I have about 400 pages under crawl errors that look like this: "emailus.php?id=MD908070". I delete them and they come back. My site is http://www.moondoggieinc.com. The IDs are product numbers for products that are no longer on the site, but the site is .php now. GWT also does not show a sitemap they are linked in, or any other page they are linked from. Are these hurting me, and how do I get rid of them? Thanks! KristyO
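One way to triage phantom URLs like these is to script a quick status-code check over the list exported from GWT: anything still answering 200 keeps getting recrawled, while a 301 to a live page or a 410 tells Google to drop it. A rough sketch, reusing the example ID from the question:

    import requests

    # Phantom URLs exported from GWT crawl errors (the ID below is the one from the question).
    phantom_urls = [
        "http://www.moondoggieinc.com/emailus.php?id=MD908070",
    ]

    for url in phantom_urls:
        r = requests.head(url, allow_redirects=False, timeout=10)
        # A 301 or 410 response tells Google to drop the URL from its crawl queue;
        # a 200 (or a soft 404) keeps it reappearing in the error report.
        print(url, "->", r.status_code, r.headers.get("Location", ""))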
Intermediate & Advanced SEO | KristyO
-
Is there a way to contact Google besides the Google product forum?
Our traffic from Google has dropped more than 35% and continues to fall. We have been on this forum and Google's webmaster forum trying to get help. We received great advice and have waited months, but instead of our traffic improving, it has worsened. We are being penalized by Google for many keywords, such as "trophies," "trophies and awards," and countless others - we were on page one previously. We filed two reconsideration requests and were told both times that there were no manual penalties. Some of our pages continue to rank well, so it is not across the board (but all of our listings went down a bit). We have made countless changes (please see below). Our busy season was from March to May and we got clobbered. Google, as most people know, is a monopoly when it comes to traffic, so we are getting killed. At first we thought it was Penguin, but it looks like we started getting hit late last year. Lots of unusual things happened - we had a large spike in traffic for two days, then lost our branded keywords, then our main keywords. Our branded keywords came back pretty quickly, but nothing else did. We have received wonderful advice and made most of the changes. We are a very reputable company and have a feeling we are being penalized for something other than spamming. For example, we added a mobile site late last year, and a wholesale system was added around the same time. Since the date does not coincide with Penguin, we think there is some major technical driver, but we have no idea what to do at this point. The webmasters have all been helpful, but nothing is working. We are trying to find out what one does in a situation like this, as we are trying to avoid closing our business. Thank you!
Changes made:
1. We had many crawl errors, so we reduced them significantly.
2. We introduced a mobile website in January, which we thought may have been the cause (splitting traffic, duplicate content, etc.), so we had our mobile provider add the site to their robots.txt file.
3. We were told by a webmaster that there were too many links from our search provider, so we had them put the search pages in a robots.txt file.
4. We were told that we had too much duplicate content. This was / is true, as we have hundreds of legitimate products that are similar: for example, trophies and certificates that are virtually the same but are for different sports or have different colors and sizes. Still, we added more content and added noindex tags to many products. We compared our percentage of duplicates to competitors' and it is far less.
5. At the recommendation of another webmaster, we changed many pages that might have been splitting traffic.
6. Another webmaster told us that too many people were linking into our site with the same text, namely "Trophy Central," and that it might have appeared we were trying to game the system somehow. We have never bought links and don't even have a webmaster, although over the last 10 years we have worked with programmers and SEO companies (but we don't think any have done anything unusual).
7. At the suggestion of another webmaster, we have tried to improve our link profile. For example, we found Yahoo was not linking to our URL.
8. We were told to set up a 404 page, so we did.
9. We were told to ensure that all of the similar domains were pointing to www.trophycentral.com/, so we set up redirects.
10. We were told that a site we have was linking to us from too many places, so we reduced it to 1.
Our key pages have A rankings from SEOmoz for the selected keywords. We have made countless other changes recommended by experts but have seen no improvements (it actually got worse). I am the president of the company and have made most of the above recent changes myself. Our website is trophycentral.com.
Intermediate & Advanced SEO | trophycentraltrophiesandawards
-
Fetch as Googlebot "Unreachable Page"
Hi, we are suddenly getting an "Unreachable Page" error whenever any page of our site is accessed as Googlebot from Webmaster Tools. There are no DNS errors shown in "Crawl Errors". We have two web servers, web1 and web2, which sit behind a software load balancer (HAProxy). The same network configuration has been working for over a year now and never produced any Googlebot errors before the 21st of this month. To check whether there could be an error in the sitemap, .htaccess, or robots.txt, we excluded the load balancer and pointed DNS at web1 and web2 directly; Googlebot was able to access the pages properly and there was no error. But when the load balancer was made active again by pointing DNS back at it, the "Unreachable Page" error started appearing again. The website is properly accessible from a browser, and there are no DNS errors either, as shown by "Crawl Errors". Can you guide me on how to diagnose the issue? I've tried all sorts of combinations, even removed the firewall, but no success. Is there any way to get more details instead of just the "Unreachable Page" error? Regards, shaz
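A quick way to sharpen that comparison is to request the same page with a Googlebot-style User-Agent against the load balancer and against each backend, then diff the status codes and headers. A rough sketch - the IPs and hostname below are placeholders, and note that real Googlebot crawls from Google-owned IPs, so user-agent spoofing won't reproduce IP-level blocking:

    import requests

    HEADERS = {
        # Googlebot's public UA string; real Googlebot also comes from Google-owned
        # IPs, so this can't reproduce IP-based blocking, only UA-based rules.
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    }

    # Placeholder addresses - substitute your load balancer and backend IPs.
    ENDPOINTS = {
        "haproxy": "203.0.113.10",
        "web1": "203.0.113.11",
        "web2": "203.0.113.12",
    }

    for name, ip in ENDPOINTS.items():
        try:
            # Send the request to a specific IP but keep the real Host header,
            # so each endpoint serves the same virtual host.
            r = requests.get(f"http://{ip}/",
                             headers={**HEADERS, "Host": "www.example.com"},
                             timeout=15, allow_redirects=False)
            print(f"{name}: HTTP {r.status_code}, server={r.headers.get('Server')}")
        except requests.RequestException as e:
            print(f"{name}: FAILED - {e}")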
Intermediate & Advanced SEO | shaz_lhr
-
How to remove "Results 1 - 20 of 47" from Google SERP Snippet
We are trying to optimise our SERP snippet in Google to increase CTR, but we have this horrid "Results 1 - 20 of 47" in the description. We feel this gets in the way of the message, and so we wish to remove it, but how? Any ideas, apart from removing the paging from the page?
Intermediate & Advanced SEO | speedyseo
-
Proper use and coding of the rel="canonical" tag
I'm working on a site that has pages for many wedding vendors. There are essentially 3 variations of the page for each vendor with only slightly different content, so they're showing up as "duplicate content" in my SEOmoz campaign. Here's an example of the 3 variations: http://www.weddingreportsma.com/MA-wedding.cfm/vendorID/4161 http://www.weddingreportsma.com/MA-wedding.cfm?vendorID=4161&action=messageWrite http://www.weddingreportsma.com/MA-wedding.cfm?vendorID=4161&action=writeReview Because of this, we placed a rel="canonical" tag in the second 2 pages to try to fix the problem. However, the coding does not seem to validate in the W3 HTML validator. I can't say I understand HTML well enough to understand the error the validator is pointing out. We also added the following to the second 2 types of pages: <meta name="robots" content="noindex"> Am I employing this tag correctly in this case? Here is a snippet of the code below.
<html>
<head>
<title>Reviews on Astonishing Event, Inc from Somerset MA</title>
<link rel="stylesheet" type="text/css" href="/includes/style.css">
<link href="http://www.weddingreportsma.com/MA-wedding.cfm/vendorID/4161" rel="canonical" />
<meta name="robots" content="noindex">
<meta name="keywords" content="Astonishing Event, Inc, Somerset Massachusetts, Massachusetts Wedding Wedding Planners Directory, Massachusetts weddings, wedding Massachusetts ">
<meta name="description" content="Get information and read reviews on Astonishing Event, Inc from Somerset MA. Astonishing Event, Inc appears in the directory of Somerset MA wedding Wedding Planners on WeddingReportsMA.com.">
<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
_uacct = "UA-173959-2";
urchinTracker();
</script>
</head>
Intermediate & Advanced SEO | jeffreytrull1
-
URL Length or Exact Breadcrumb Navigation URL: What's More Important?
Basically my question is as follows - what's better: www.romancingdiamonds.com/gemstone-rings/amethyst-rings/purple-amethyst-ring-14k-white-gold (this would fully match the breadcrumbs), or www.romancingdiamonds.com/amethyst-rings/purple-amethyst-ring-14k-white-gold (cutting out the first-level folder to keep the URL shorter and move the important keywords closer to the root domain)? In this question http://www.seomoz.org/qa/discuss/37982/url-length-vs-url-keywords I was advised to drop a folder in my URL because it may be too long. That's why I'm hesitant to keep the breadcrumb structure the same. To the best of your knowledge, do you think it's best to drop a folder in the URL to keep it shorter and sweeter, or to have a longer URL that matches the breadcrumb structure? Please advise, Shawn
Intermediate & Advanced SEO | Romancing