It's not a no-no, but the benefit you get from it is likely to be limited. Blog commenting can be useful, but as a link building method these days it's pretty weak.
Posts made by matbennett
-
RE: Build Backlinks on this site? - Advice Please
-
RE: Build Backlinks on this site? - Advice Please
Wow - big topic! Seriously, it's a big topic and very dependent on your niche. Any answer that I can give that applies to all sites in any niche is probably not that great an answer.
However, start by understanding who links to your competitors and why they do that. Open Site Explorer is a great tool to find those links, but for the "why?" you'll need to use your own imagination. Once you understand that, you can find new targets and try to get them to link to you.
Here are a couple of resources to get you started, but also search seomoz for more on link building:
http://www.seomoz.org/blog/the-noob-guide-to-link-building
-
RE: Build Backlinks on this site? - Advice Please
No, these really won't help, I'm afraid. The fact that you can just type in your address and add a link is the very reason why they won't help. If you think about it, what logical reason is there for a link like that being any indication of your website's quality or authority?
I'm not saying that it never worked. However, you shouldn't expect it to work now. In fact, if you did too much of that sort of thing it could be damaging.
-
RE: What is the value of getting a Facebook like?
It depends on what you are trying to achieve. I'd agree that a lot of the time people are requesting likes without any real purpose, however they can be very useful too.
A like of your Facebook page is like joining your mailing list. It gives the page owner a way of contacting people with their message (albeit through the filter of Facebook's EdgeRank algorithm). Very useful indeed.
A like of your page/site makes it visible to the connections of the person who likes it (again, only some of them, depending on EdgeRank). Useful, as it gets your site seen.
-
RE: Diigo.com
Diigo is a bookmarking service. I use it myself, largely to save things I find online for later reading in a cross-platform manner: i.e. I spot something on my phone and Diigo it for later reading on my laptop.
Diigo has a few features that similar services don't offer, which is why I switched from Evernote. Group sharing is a win for me, as are some of the annotation features.
You can make your bookmarks public as well, which is where those backlinks you are seeing probably come from. Are they all from a couple of users, or do they seem to be quite widespread?
Social bookmarking as a link building method is pretty old hat to be honest, and generally not too effective. In fact, too high a proportion of this sort of link could be risky. I think that most people would rather have one nice authority link than a load of Diigo links.
Short version then: good service, but not for link building.
-
RE: My text does not show up in Google
I think that your problem is that you have no links pointing to your site. Literally - None.
Links are one of the main means that search engines like Google use to determine the importance of a website. The logic goes that a site that has authority sites linking to it must in itself be useful.
Your site is in a competitive niche and needs to demonstrate that it is worthy of appearing above the other sites in that niche. Having other sites link to you is a vital part of that.
Read this chapter of the seomoz beginners guide http://www.seomoz.org/beginners-guide-to-seo/growing-popularity-and-links - this explains what you need to do fairly well. I'd also suggest reading the rest of that guide whilst you are there as it will really help you out.
Good luck.
-
RE: Could large number of "not selected" pages cause a penalty?
Yes, this can definitely cause problems. In fact this is a common footprint in sites hit by the panda updates.
It sounds like you have some sort of canonical issue on the site: multiple copies of each page are being crawled. Google is finding lots of copies of the same thing, crawling them, but deciding that they are not sufficiently unique/useful to keep in the index. I've been working on a number of sites hit with the same issue and clean up can be a real pain.
The best starting point for reading is probably this article here on SEOmoz : http://www.seomoz.org/learn-seo/duplicate-content . That article includes some useful links on how to diagnose and solve the issues as well, so be sure to check out all the linked resources.
-
RE: Very Big Pr blogs
I checked the PR9. That is definitely fake.
PageRank is a product of the links that point to a site. Therefore you CANNOT have a high PR without high incoming links. Even using the freshest and most generous backlink checkers that site has no more than a handful of poor quality backlinks.
Let's ignore backlinks and just use common sense. What other PR9 sites can you think of? Amazon.com is only an 8, seomoz is only a 6. Apple.com is a 9, as is adobe.com (one of the most linked domains on the planet). Would you say that this blog, with its keyword-anchor-laden, Fiverr-style content, is on a par with those sites? Is it likely to pick up the same quality and quantity of links?
PR is a product of the links. If the links are not there then that PR does not belong on that domain. Just because you can't see how they are faking it doesn't mean that it is not fake.
(An earlier 301 redirect to a higher PR domain is the likely trick used here.)
Here is an OSE comparison that shows the difference. It could be argued that OSE is usually a few weeks behind, but then you have to assume that this site could close the gap on Microsoft.com in a few weeks, which is even more far-fetched!
-
RE: Do a lot of related articles in lower subfolders, boost higher level subfolder keywords?
I think that /related-topic/news can work, but it does depend on the topic and what people are looking for and expecting.
If they go to /related-topic/news, will they miss out on stuff that they are likely to want to see under /other-topic/news? Will people be looking for /news/, which this structure probably wouldn't support that well? Users first - bots second.
One other tip: if you are thinking of adding the date into the URL, have a think about how evergreen your content is likely to be. If it is all really topical then a date in the URL can work well: people search, can see it is recent, and your CTR will go up.
However, if you are using the same content area for longer-term content then it can have the opposite effect. Someone searches a year later, sees the old date and assumes that the content is outdated even if it isn't.
-
RE: SEO For New Website
You're welcome.
**I figured as much but was hoping there might be a few onsite things I could do. **
I think that is a common pattern. On-site is easier, so we all tend to look for it. Even when we really know that the answer is "lots of hard link building to do now" we tend to look for other answers. You don't have to be a noob to fall into that trap!
-
RE: Marking our content as original, where the rel=author tag might not be applied
Hi András ,
I think that you are getting confused about what rel=author actually does. It can help as part of the picture that shows Google who the originator of content is, but it doesn't assert it in the way you seem to be suggesting. I'll come back to that, but let me address another point first:
as our programmer says, what you can see on the internet, that you can basically own.
This is plainly wrong. I would agree that whatever you see on the internet can just be stolen. However that is not the same as owning it, something that international law backs up.
If you have valuable content that is likely to get stolen then you need to do 2 things:
1. Ensure that search engines find your copy first and see you as the originator
2. Police it
#1 you seem to be doing. Manual submission via Webmaster Tools sounds painful to me, but it will do the job. Tweet it, link it, ping it, etc. Do what you can to establish "this was here" early and to get Google to index it.
Part of that same picture is to be seen as trustworthy. Get those high authority citations, ensure your content is always unique, etc.
However, #2 is about you taking responsibility for your content. It's yours, you own it, and there are no internet police, so it is up to you. Try a service like Copyscape, or just use Google Alerts to let you know when people steal stuff. When they do, hit them with a take-down notice, send the same to their hosts, domain registrar etc. - then follow it up with a DMCA request.
This will stop a lot of it. It will also make it a pain in the bum for some of the others (if it is more hassle to steal from you than from someone else, then they will steal from someone else!). It also starts undermining the trust in their sites: if Google receives frequent DMCA requests about particular domains, it helps build that picture. If you see them stealing other people's content, let the other victims know as well and encourage them to do the same.
-
RE: Can you do a 301 redirect without a hosting account?
You can do these with forwarding etc. However that isn't the most efficient way.
What you want to do is to "park" the old domain on the new one. In technical terms this means pointing the nameservers at the new hosting account and ensuring that there is an MX record for them. Most hosting companies do this for you or have a built-in tool. It's very easy and is usually called "domain parking", "domain mapping" or "add-on domains".
Doing that will ensure that visitors (and bots) visiting olddomain.com will be served newdomain.com.
With that working you can just add an .htaccess rule on newdomain.com's hosting to ensure that requests for olddomain.com 301 correctly to newdomain.com. The same directive can tidy up all your subdomain woes at the same time!
This method has the fewest points of failure and creates the fewest redirects. It's also generally the cheapest and easiest to manage. Win!
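As a rough sketch (olddomain.com and newdomain.com are placeholders, not from the question, and this is untested), the .htaccess on the shared hosting account might look something like this:

```
RewriteEngine on
# Any request whose Host header is not exactly www.newdomain.com
# (i.e. the parked olddomain.com, bare newdomain.com, or a stray
# subdomain) gets a single 301 to the same path on www.newdomain.com
RewriteCond %{HTTP_HOST} !^www\.newdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

Treat it as a starting point rather than a drop-in rule; test with curl before relying on it.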
-
RE: What would your Seo tactic's be for this
Out of interest, does that term bring you much traffic? One of the biggest mistakes you can make in SEO is to focus on keywords with limited potential.
However, let's assume lots of people are searching for that phrase.
Forums are hard. Free form forums rarely create blinding content. They can have a great community, but the content itself rarely screams "link to me". One way around this is to use the community to help you create additional stand out content. Methods like surveys are built for forums. That could lead to something like:
- Poll members on something interesting
- Create static page highlighting the survey and its results
- Put out press or maybe an infographic to generate links
Another thing that could work well in your niche is something like a "readers' choice" award. Get people to nominate their favourite clubs, then throw it open to votes. Be sure to let the shortlisted clubs know that votes are being sought. You'll be surprised what links and social signals that can generate. Again, put out press to announce the winner, maybe give them a badge etc.
-
RE: Very Big Pr blogs
Even without looking I would suggest that they probably have fake PR.
There are a couple of ways you can trick Google into displaying a fake toolbar PR. It doesn't actually help your site at all in the rankings, but it can be useful if you are being less than honest (for example, selling links based on PR).
A good way to spot these is to compare PR with DA/PA in open site explorer. Big differences can be a sign that something is amiss. There are some online "fake PR" checkers too, although their reliability varies.
-
RE: Ecommerce good/bad? Showing product description on sub/category page?
I've only looked at a handful of pages / categories, but I am not sure that you are doing yourself too many favours.
In SEO terms I think that you are undermining your unique content pages. By repeating the unique content from the product pages on to the category pages you are effectively introducing duplicates. Yes, it is your own content, but spreading it across those pages is unlikely to do you many favours.
More importantly it's also pretty confusing in places. Some of those categories are quite off putting.
What would I do?
- Remove those descriptions from the cat pages
- Add category descriptions
- Try to flesh out the product descriptions with more unique content and keep an eye out for those that duplicate each other.
I hope that helps.
-
RE: SEO For New Website
**However the client is not using Social Media at all and there's no link building strategy in place. Will this undermine the rankings a lot?**
Yes - totally. You really cannot overestimate the importance of links. In simple terms: if your niche is in any way competitive (even if that competition is just a handful of people with a vague idea) you are not going to rank well without links.
Getting the site set up well and search ready is great, but you need to get moving on your links. I'd much rather work on a poorly optimised site with a load of links than a well optimised site with none.
-
RE: "Too many links" - PageRank question
If that is what makes sense then do it.
Adding a second tier of structure would allow you to direct more rank to certain areas (with a single tier, every category gets an equal share of rank; putting more links into some categories than others forces more PageRank towards those). However, the trade-off is that less rank reaches the bottom pages overall.
Personally I would go with user experience first. If linking to all 70 in the menu makes sense then do that. What is the point in even ranking a site that people don't want to use because it's a pain to navigate? Then use cross linking to add greater emphasis to those that you want to reinforce.
-
RE: Does link building through content syndication still actually work?
Blimey - you've been trawling the archives! Ahh... doesn't Rand look fresh-faced back in 2010!?
This is definitely trickier these days. I think that Google is now better at understanding who the originator of content is, and it also likes duplicate content even less now than it did then. So it's harder to do, and there's more risk in getting it wrong.
Making it unique is still the way to go. However I think that you are largely now only going to get the benefit from the unique part, rather than the whole part. Using syndicated content as a catalyst for UGC can work well. I've just been helping a site that does this: They manually syndicate content from a variety of sources, but their user base tends to add a few hundred words of response quite quickly. This seems to work best when they are also picking up links to the content.
So, yes I think it still works - it's just harder. I'd definitely take the "value add" approach rather than trying to be the authority though.
-
RE: Best url structure
If you saw that address in a search result would you actually click it?
I'd say cramming that many keywords into a URL would send a bad signal anyway. More to the point, though, it is going to look like someone jumped in Marty McFly's DeLorean and came back with a boot full of spam from the late 1990s.
Ranking is NOT the most important thing (even if this would help, which I would doubt). If the listing looks poor quality then that ranking will bring less traffic. Less traffic means less money.
I would much rather see a short URL without the keywords, and use the keywords in the title. Better still, break it up into every page that makes logical sense and have an appropriate URL and matching content for each. There is no "trying" to avoid duplicate content though - you have to avoid it.
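To illustrate with made-up URLs (example.com and "widgets" are placeholders, not from the original question), the difference might look like this:

```html
<!-- Keyword-crammed URL - reads like late-90s spam in a search result:
     http://example.com/cheap-widgets-best-widgets-buy-widgets-online.html -->

<!-- Short URL, with the keywords carried by the title tag instead:
     http://example.com/widgets/ -->
<title>Cheap Widgets - Buy Widgets Online | Example Store</title>
```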
-
RE: Video Hosting and SEO
Why do you say that the self hosting of video is better for SEO? I'd disagree.
I'd much rather host it with a specialised service that ensures fast streaming, widespread CDNs etc. Hosting elsewhere doesn't have to mean letting the host compete against you with your own video, as YouTube does. Vimeo is a good and affordable solution for video hosting that doesn't self-publish all the videos.
-
RE: Do a lot of related articles in lower subfolders, boost higher level subfolder keywords?
I think I follow.
Looking at it in a vacuum I would say that example2 has a tiny advantage. The net link equity of what is pointing back to the category page is the same in both cases, but there is greater emphasis from the internal anchor text. In practical terms this will be very small though.
In a real situation there are so many larger issues at play that you'd struggle to measure this. The effects of incoming links, overall domain authority, on-page optimisation etc. are going to far outweigh it.
Reading between the lines...
I am guessing that you are really asking "should I structure my site like this, or like that?". If that is the question then do what makes for the most usable site. Do though factor in whether more category pages could be useful in their own right as landing pages as well.
Picking the most usable site means a site that people are more likely to enjoy using. That means that they stay longer, hopefully make you some money whilst they are there, mention it to friends, tweet it, share it, link to it etc. Those things can bring real, measurable benefits.
I hope that is useful.
-
RE: How do websites get links here...
Can you share a link to the one with the Klout scores?
Is there a date filter as well? Are you seeing examples where the links in the Twitter box point to content newer than the story, or does it all predate it?
-
RE: Duplicate title-tags with pagination and canonical
I frequently use the page number in titles. It's not a bad solution where you want them all to get indexed.
Keep an eye on whether it affects CTR from the results though. I also like to ensure that there is always a link to the first page of results. This is useful for the user and also helps push more authority to that first page so that it is more likely to be the one that appears.
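For illustration (the category name and URL here are hypothetical), page-numbered titles with a link back to the first page might look like:

```html
<!-- Page 2 of a paginated category - the page number keeps the title unique -->
<title>Blue Widgets - Page 2 | Example Shop</title>

<!-- Somewhere in the pagination controls, always link back to page 1,
     both for users and to push internal authority to that first page -->
<a href="/blue-widgets/">Blue Widgets - Page 1</a>
```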
-
RE: How do websites get links here...
The fact that they are wrapped within the Twitter branding strongly suggests that they are coming from Twitter rather than being under editorial control from huffpost.
The naming of those elements in the HTML would seem to back that up as well: id="twt-top-links-page-1" and id="twitter_top_links"
It's not a widget style I have seen anywhere else, so presumably it uses the Twitter API to do searches. What logic they use to rank those is anyone's guess. It could be as simple as pulling the most tweeted URLs from posts matching the shown search string... or it could be something far more complicated.
Very interesting though... might be worth some experimentation comparing API results with the results on some of the more obscure articles.
-
RE: I've tried everything, and my blog still falling
There are some good points in the other answers, but honestly do the course.
With all due respect a lot of what the other answers are covering is minor detail compared to the big picture stuff that you are missing. You need to understand that stuff if you want your site to do better. Simply following check lists of technical issues will not get you the big jumps you probably want.
The course will take you through the process. It will help you understand how to identify the keywords that are going to bring you traffic, how to target those keywords effectively, how to spot the sort of issues that others are pointing out and how to build the authority you need to rank for those terms.
I'm not knocking the points others have made. They're all good. However you really need to understand the basics properly and then come back and work through those sorts of points.
The good news, though, is that if you are getting those traffic numbers from the site in its current state, it has potential. It should be well worth getting this stuff right.
-
RE: I've tried everything, and my blog still falling
I've only looked briefly and did that through Google translate. However in SEO terms I think that there is a lot of room to improve things.
You might also want to try some different ad positions with your adsense. I suspect that could help with revenue.
-
RE: I've tried everything, and my blog still falling
Looking at that site I would say that you are doing pretty well if you are getting 700-1000 visitors per day. I would have expected that number to be lower.
You have some real basics missing - quite a lot of them, as well. Rather than rattle off a load of stuff out of context, I would say that you need to really understand what SEO entails. One good way of doing that is to work through the whole Beginner's Guide to SEO here on SEOmoz.
That will give you a much better understanding of how search engines can send you more traffic and how you can ensure that your website makes the most of that.
At the end of it I would bet that you would be looking at:
- Starting again with your keyword research
- Restructuring your site to make the most of your keyword research
- Changes to your on page structure
- Ways to get more links
Sorry to just give you a link, but I think that is the best place for you to go next.
-
RE: URL purchasing strategy
This is all really dependent on how big a target the brand is. If I were representing a household name I'd buy every permutation I could lay my hands on. If it was a 1 man band I'd probably buy just the most obvious exact match names. For most projects it would be somewhere in between.
As an example, I've not bought the .xxx names for anything. I don't see that any project I work on is particularly a target for anything that negative and certainly none of them are likely to move in to that market. It all really depends on what you are defending against.
Redirects
First, the obvious point: You do have 301s in place right?
Moving on, your points 1 & 2 assume that those domains have some link equity. You can check this in Open Site Explorer: enter the redirected domain, then read the message in the yellow box. Click the link in there and it will show you the links pointing to that domain.
Open Site Explorer doesn't show you all links, but it will show most of those likely to be passing value.
That should show you which domains are worth keeping based on link equity.
-
RE: Whats Next, noobie needs some help :)
Paid links would be really risky. Buying links to influence results is against Google's quality guidelines, so you could expect a penalty if caught. It certainly works for some, but you need to know what you are doing, as damage caused by such techniques can be very hard to reverse.
I'd start by going back to your keyword research, to be honest. While your site is still very new (in terms of backlinks) I'd be looking to broaden out the number of terms that you are looking to rank for. I'd then ensure that you have pages that target each of those terms, hopefully with content good enough to pick up its own links.
Link building wise I would probably start by offering up accounts for review from sites that have previously reviewed rival platforms. Referral sign ups should be key in your market anyway, so starting with these will give you a double whammy. I'd do the same to sites that target work from home markets, mummy businesses etc.
**should I pay for help? **
If you have the budget now, then yes. It looks like you have a lot of basics still to cover. Even some consultation time that left you with a solid plan that you could work through yourself would probably be very valuable to you at this stage.
-
RE: 404 error
Sounds like you have some bad HTML in one of your templates. Looking at it, there appears to be a missing " on an image tag.
Your server logs should show you the referring page. However I'd guess that those image names are a pretty good clue. Find the appropriate page. View source. Search for the following:
images/products/detail/AD9058RoundGlassTableChairs.jpg target=
(because %20 is an encoded space)
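As a hypothetical reconstruction of what the template probably contains (only the image path comes from the error; the surrounding markup is a guess), the missing quote glues the next attribute onto the URL:

```html
<!-- Broken: the closing quote after the .jpg path is missing, so the
     browser treats " target=" as part of the URL and requests
     ".../AD9058RoundGlassTableChairs.jpg%20target=" - hence the 404 -->
<a href="images/products/detail/AD9058RoundGlassTableChairs.jpg target="_blank">View larger image</a>

<!-- Fixed: restore the quote and the attribute separates out again -->
<a href="images/products/detail/AD9058RoundGlassTableChairs.jpg" target="_blank">View larger image</a>
```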
-
RE: Website evaluation questions-what to ask a client
My favourite is always "Why do you want a website?" or "Why have you got a website?" if they already have one.
It's amazing how many people have absolutely no idea what they want to achieve by having one. I find that this helps get them into a more results-driven mindset, which works well with our approach.
-
RE: Create better conversion at checkout?
How deeply are you analysing your checkout path? That is a very low conversion rate, so I'd imagine that there is a specific issue causing problems.
As Nakul suggests, anything around shipping (or other "surprise" charges like taxes) can be a killer. Look at the step(s) where users are bailing and think about what new information comes to light at that point. Also consider the interaction you are asking them to make at that point:
- Are you asking for sensitive information?
- Are you asking for a LOT of information?
- Is there anything on that page that might cause trust issues?
- Is it clear (REALLY clear) what they are meant to do next?
- What doubts might come into their mind at that point?
- Were they intending to check out, or just doing that to see the final price?
I've never seen much result from trust logos. In fact using images of padlocks together with reassuring messages can sometimes be more effective than expensive trust badges. However the idea should be to find potential issues and eliminate them.
-
RE: Directory site with an URL structure dilemma
You're welcome. As you might have guessed, I've tackled this problem myself a few times!!
-
RE: Forms and link juice
Wow - 40-80 forms. My concerns would be usability and performance. I can't imagine how they are used: I'd love to see the page.
However, I've never seen evidence that forms pass link authority, or at least not in large enough amounts to cause a noticeable effect. If they did, then people's cart pages would be much more authoritative than they generally are.
I've never specifically tested that though.
-
RE: Directory site with an URL structure dilemma
It's painful, but that is your answer:
Q. Why isn't Google ranking these pages better?
A. Because they are not unique or useful.
Google can be annoyingly smart like that. The cheapest/easiest fix is probably to have a paragraph added to the top of each page. So your /budapest/jatekbolt page would have a paragraph about the wonderful choice of restaurants available in Budapest and its rich culinary heritage. (Cue an affordable copywriter to help keep them all different.)
You could also consider adding a field to your business database for a "recommended snippet", which, if filled in, highlights that listing and gives a more in-depth amount of information. You could then have someone look at reviews for the listings in key categories, pick out favourites and write a fresh description for those featured businesses.
The result of that will be a page that has more unique content and is in fact slightly more useful. That puts you in good standing for improved rankings.
-
RE: Directory site with an URL structure dilemma
Actually, I would say that uniqueness probably is an issue. Keep in mind that I don't speak Hungarian, but it looks like everything on that page is a snippet from the sub-pages, i.e. none of the text on that page is unique to that page. Is that correct?
Adding unique content at category level, even just a few lines of natural text that include the main keywords can make quite a difference. I've found it much harder to rank category pages that do not have that.
That would probably be my first job. Even if you just did it on a sample set of pages and monitored those for any improvement. Making them useful (and therefore attracting links) might be harder.
-
RE: Minimum Title Tag Length
I wouldn't worry overly about chasing A grades on the on-page tool here. The goal isn't to please SEOmoz! I use some VERY short titles very effectively. The "too short" warning is really about possible missed opportunity. Go with what makes sense, but use the warning as a flag to look at whether it is the best title to use.
-
RE: Sites with dynamic content - GWT redirects and deletions
Get someone to look at the database queries in ColdFusion. Unless you have tens of millions of flashes, it should be able to handle it on even a reasonably modest server for your traffic levels. It doesn't sound like it should be taxing.
However, it sounds like your problem is some badly structured queries. The good news is that this is probably quicker and easier to fix than upgrading hosting, coding new removal behaviour or any other work-around.
What would you do to avoid Google thinking it's a poorly maintained site?
Sorry to sound glib, but the answer is "maintain it better".
-
RE: Directory site with an URL structure dilemma
My guess would be that the bigger problem is not the URL structure, but the content on those category pages. The change you propose to the URL structure is good in terms of helping the business listing pages and in creating a logical hierarchy, but it isn't going to help those category pages.
I'd start with looking at:
- Content of the category pages: do they have unique content? Is that content useful in its own right?
- Internal linking of category pages: are you linking back up to the categories from the businesses? Are you linking down to them OK? Are those links close to the top of the site hierarchy?
- External links: Are you getting links from other websites to those pages (easier if they are useful)
- On page optimisation: Are the category pages themselves well optimised
I'd question whether there is any benefit at all to your category pages in changing the URL structure of your business pages. However, if there is some, its impact will pale to nothing compared to the above.
-
RE: How can I make Google Webmaster Tools see the robots.txt file when I am doing a .htacces redirec?
You could add an exception to the .htaccess to allow the robots.txt to be loaded. You would do this by adding another condition. I'd use something like:
```
Options +FollowSymLinks -MultiViews
RewriteEngine on
RewriteCond %{REQUEST_URI} !/robots.txt
RewriteCond %{HTTP_HOST} ^(www\.)?michaelswilderhr\.com$ [NC]
RewriteRule ^ http://www.s2esolutions.com/ [R=301,L]
```
Disclaimer: I am lucky enough to have people at work who check these things. This hasn't been checked! Use at your own discretion
However I'll admit that I've never used this. I just stick the 301 in and it all seems to work out fine. Probably done it on hundreds of domains over the years.
-
RE: Hi I wanted to clarify whether what I am describing is a link wheel and is this black hat ?
That's like not trusting builders, or cab drivers, or in fact anyone because there are bad ones out there.
There are plenty of agencies that don't use such tactics. There are others who will use tactics from across the spectrum, but ensure that the client is able to make informed choices based on risk v reward.
-
RE: Local Listings for a Virtual Product
I'd double check the terms, but I reckon this is OK. You have a real physical location at each spot, which is exactly what Google wants to map. They specifically ask whether you serve customers at each location when you submit to Places, so that isn't a criterion for being listed.
-
RE: Keyword Cannibalization/stuffing on an ecommerce category page
Whether you were talking about the anchor text or the titles, it is the same principle. If you are worried about over-doing titles, don't use them on all the links. I'm actually not a heavy user of them myself - only using them when they are needed, rather than whenever there might be some SEO benefit.
Can't help you with the schema question I am afraid. I'm lucky enough to have someone who deals with that for me. I'd put it up as a separate question.
-
RE: Hi I wanted to clarify whether what I am describing is a link wheel and is this black hat ?
It's not what I would call a link wheel. However, it certainly does demonstrate the intention of building a network of sorts. Sadly quite a few SEO firms (including some quite large ones) do this. What is really bad is that they don't tell the clients that they are effectively abusing their websites for them either. A really underhand tactic, if you ask me.
I don't think that the "white hat / black hat" thing is particularly useful. It's over-simplistic and makes our industry sound childish. However, these are not links that have editorial merit, so you would think/hope that they don't pass authority. It is also a clear case of trying to manipulate PageRank, which is against Google's quality guidelines. As such it would seem to be putting the host sites at risk.
Oh... it's really damn easy to spot as well.
-
RE: Exporting Google and Bing Search Results
There is a Chrome extension called "Scrape Similar" that is useful for doing small batches of stuff like this. It does have a couple of limitations: you have to view each page, and Google will not show you all the pages of a large domain. However, it is quite easy and effective for sites with under 1,000 pages.
https://chrome.google.com/webstore/detail/mbigbapnjcgaffohmbkdlecaccepngjd
The process can be sped up using other tools. I use a tool that is designed for black hat forum/comment spamming to do SERP scrapes like that. Even if I did such spamming (I don't), I don't actually think it is a very good tool for the job - but it is rather good at scraping results from Google. Again, though, you are limited to however many results Google/Bing choose to show you.
If you need a bigger list then log files might be the way to go. You can get a list of all crawled URLs for any particular user agent (including the likes of Googlebot) from your server logs. Some hosts limit the size of these, so it might be worth checking before you start, but the data does get collected. The downside here, of course, is that you need access to the logs.
Of course, crawled is not the same as indexed. Once you have that list you might need a further step to see which pages are indexed - possibly cross-referencing it against Google Analytics landing pages, or querying the Google cache for each page (SeoTools for Excel from Niels Bosma is good for this).
Similarly, if you have a definitive list of the URLs on site you could start with that list and query which are cached.
Harder than it seems isn't it? Hopefully one of those methods will put you on the right track.
-
RE: Should I change my product titles from singular to plural to satisfy optimisation?
You didn't get an A grade when I checked it I am afraid. However results will depend on what keyword you enter. When I used the URL you gave above plus the keyword "towel" that gives you a B - mostly downgrading due to keyword stuffing.
The word towel is mentioned 105 times on that page.
This is clearly too much. Whoever put the CMS together probably read something about SEO written in 1998 and thinks that is the way to get results. Definitely get this addressed. Getting your CMS output correct on those template pages is vital.
Tabs can work well, but they do have to be thought through. At the moment it seems that the main purpose of the tabs is to hide keyword-stuffed content, which doesn't help anyone. I actually quite like tabs for product pages, though I'd still have the most important information visible on the default page. That is personal opinion, however. A single page can work equally well, and can actually be useful in terms of forcing you to edit and prioritise the information on that page.
-
RE: Keyword Cannibalization/stuffing on an ecommerce category page
If you were worried about keyword stuffing you could always use images. However if those are logical, natural links I don't think it really causes much of an issue. This is fairly normal behaviour in a website.