Ah - that gets trickier then.
The official line is to clean up everything you can, document your attempts where you can't, really show that you are trying, then go back to Google with a grovelling re-inclusion request.
Have you actually had a warning through Google Webmaster Tools?
Poor quality links might be harming your site, or they might just no longer be helping. If they previously helped and no longer do, that can look a lot like a penalty. If you are registered on Google Webmaster Tools and have not received a penalty, it might not be the best use of your time to try removing links that simply no longer count.
I don't know SEO Link Wheelers, but presumably they create a number of satellite sites, point those at your site, and bolster them with mass link methods. If so, you might only have a few poor links. It could be easier to "dilute" them by getting some good ones.
Do you know when it tanked? Whenever I read the words "nose dive" I usually start by checking the dates against this page: http://www.seomoz.org/google-algorithm-change
If the dates match up you pretty much know what has gone wrong without any further investigation.
Looking at a few of the backlinks to that site I would bet that the drop lines up with one of the Penguin updates. You've got some pretty ropey links there I am afraid.
I didn't get on with DirectoryPress. I just couldn't see the point in building a directory on top of WordPress when it then overrides any WordPress-like functionality.
I'd agree with Brent that it is worth a look - I know others who like it. However, for me it was just a convoluted way to go about things with an underwhelming end result.
Wonder how many people will contact them about speaking for them as a result of this thread! I'd do it for a link from Barclays, that's for sure.
Good niche directories are still useful to both site owners and the public. There is nothing wrong with the format of a directory; it is the use of that format that has been causing problems.
I'm not a big fan of most off-the-shelf software for this sort of thing (I'd either build from scratch or build using Drupal). People I know who use it mostly complain about the software's limitations. That said, PHP Links Directory seems to be a very common choice. If you were building with that, I would suggest spending extra time to make sure it stands out as a quality directory and doesn't look like another spammy one.
It looks like these are given to those who speak as part of the Barclays Entrepreneur Speakers campaign. See here: http://www.barclaysandme.com/Seminars/Entrepreneurspeakers/P1242587029441
Very interesting. Seems a little... err, uhmm... Naa, they wouldn't.
You're welcome. Good luck with finding definitive research, though. There is very little of that around in the world of SEO.
Keyword density has been a bit of a myth since at least the turn of the millennium, to be honest. Search engines are also pretty smart: they can read HTML and know the difference between what makes up the code of an HTML element and what constitutes visible copy.
There are a lot of things that they ignore (because it makes sense to). That doesn't necessarily mean it detracts from anything else going on that page.
The only possible exception I can think of in this case would be a load of keyword-stuffed alt tags. This could logically be used as a signal of over-optimisation, but if they make sense I wouldn't change them.
Worth noting that the meta description isn't one of those 3 markup styles. It is a different thing completely, so you aren't actually mixing schema in your example.
Yes - this is a problem. Or at least it can be.
I'd guess you are using WordPress, which seems to be the worst offending platform for this. There are a number of plugins that will resolve this issue for you. I am not a WordPress guy (I am sure one will be along), but I believe the Yoast SEO plugin is a well-respected solution to this issue.
Why do you think this would be a bad thing? I'd question how much benefit will be gained in most areas by doing this, but I can't see it causing harm and it is good to get in there now with this rather than adding it later (assuming you've backed the right format!).
DA/PA is a good indicator, but no more than that. I didn't say I wouldn't consider it. I said I still maintain that I wouldn't base my decision on it.
Can someone please explain how you would manipulate the PR to be higher than it should be?
The most common way involves 301ing to a higher PR site until it reports the higher value, then removing the redirect. The usual reason for this is to sell links of the fake PR site until the next toolbar PR update.
Worth being aware of.
In a vacuum, yes. However, hopefully you'll be linking in and out anyway. Like most things in SEO, it is good to understand the principle without being a slave to it.
If one area is picking up lots of links then fantastic. You could link back around the site to spread that link equity. Better still - try to ensure it is your money pages that are getting the incoming links!
That example is a bit different as it has been faked. PR is often faked on blog networks that are built for link purposes.
Nice question.
Search engine bots are many-headed beasts. When they read a page they will note what links are on that page and add them to their list to crawl. They might then follow several of them (or none at all) and come back later and start with the next URL on their list.
Instead of thinking of the bot like a visitor who is deciding where to go next, think of pouring sand into the top. It'll flow down every connected route.
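A toy sketch of that frontier behaviour, assuming a made-up link graph (the page names here are invented for illustration): every discovered URL goes onto a queue, and the bot works through the queue rather than choosing a single "next page" the way a human visitor would.

```python
from collections import deque

def crawl(start, link_graph):
    """Breadth-first walk of a toy link graph: each page's links are
    queued when the page is read, then visited in turn later, so the
    crawl 'flows down every connected route' rather than picking one."""
    frontier = deque([start])
    seen = {start}
    order = []
    while frontier:
        url = frontier.popleft()
        order.append(url)
        for link in link_graph.get(url, []):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return order

# Hypothetical site structure for illustration only.
graph = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/about"],
}
print(crawl("/", graph))  # ['/', '/about', '/blog', '/blog/post-1', '/blog/post-2']
```

Real crawlers are far more selective about ordering and revisit timing, but the queue-of-discovered-URLs idea is the core of it.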
First, is DA/PA not connected in any way to PR in the metrics they take into account?
SEOmoz's DA/PA are independently derived figures based on similar concepts to Google's PageRank. The two are totally independent of one another, but there is correlation because of the concepts they share.
Secondly, which one would you go for if you had to choose one: 1) DA/PA 60+ and PR 2, or 2) DA/PA 30+ and PR 4?
Both. Actually, I wouldn't base my decision on either metric. I'm finding all such metrics to have even more limited value this year. I'd much rather look at the site itself and what links to it.
It's all about targeting the right keywords. Sadly it is an area that lots of people get wrong and then are left wondering why they don't have traffic.
However, you are promoting a niche product, so you don't have to compete on the really big terms. You've done well on the villa terms - maybe start down another seam next: perhaps luxury accommodation.
See the attached image.
Broad match searches are 4400. Phrase match are 73. Exact match are 36.
Broad match is useful if you are doing PPC, but less so for organic search. In short they are saying that:
36 searches a month are made for the exact term "switzerland villas"
73 searches a month are made for phrases containing "switzerland villas"
4400 searches a month are made for phrases that google thinks are related to "switzerland villas". That might include "swiss hotels" for all we know.
As I said - it is VITAL to understand what those numbers represent if you are doing your own SEO work.
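As a rough illustration of how the three match types narrow down, here is a sketch. The queries are invented, and the "shares a word" test is a crude stand-in for Google's far smarter notion of "related" in broad match:

```python
def classify(term, query):
    """Crude illustration of keyword match types.
    exact  : the query is the term itself
    phrase : the query contains the term as a contiguous phrase
    broad  : the query merely shares a word with the term (a stand-in
             for Google's much richer idea of relatedness)"""
    if query == term:
        return "exact"
    if term in query:
        return "phrase"
    if set(term.split()) & set(query.split()):
        return "broad"
    return "no match"

term = "switzerland villas"
queries = [
    "switzerland villas",         # exact
    "luxury switzerland villas",  # phrase
    "villas in the alps",         # broad-ish (shares "villas")
    "swiss hotels",               # no shared word here, though Google
]                                 # might still count it as broad
print([classify(term, q) for q in queries])
# ['exact', 'phrase', 'broad', 'no match']
```

The key point the numbers above make is that each bucket contains the one before it: exact ⊆ phrase ⊆ broad, which is why broad counts are always the biggest and least useful for organic targeting.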
Sorry - that just isn't a great phrase. As a phrase match, "Switzerland villas" only has 73 global monthly searches. As an exact search, [Switzerland villas] has just 36. Although keyword tool estimates are never particularly accurate at small numbers, you are not going to get more clicks than there are searches.
The broad match numbers are not really relevant if you are focusing on the organic term.
I'd broaden the phrases you are targeting. Chalets would seem to be a word with more potential for Switzerland.
Looking just at that first key term (initials cdav): the keyword tool shows 210 GLOBAL searches on a broad match and <10 on local. However, switch that to a phrase match and it is <10 on global.
That means that any traffic from that term is a good thing. It just is not a busy phrase.
When doing keyword research it is absolutely vital that you understand the difference between broad, phrase and exact match.
I doubt that is the answer you were hoping for, but I hope it helps nonetheless. I would start researching additional terms.
Can you share the keywords (by PM if not publicly)?
You can track any type of link you like. Google is good at detecting affiliate links built in common formats or run through big networks. However there are plenty of links around that are part of an affiliate program and totally undetectable by algorithmic means.
One example: I've previously written scripts to track referred sales from the referring domain. No tracking URL was used.
If I were building an affiliate program with the SEO benefit of the links in mind, I'd probably use (or at least offer) a method like that. There is also another trick that I've seen used quite heavily in the travel sector using malformed URLs, which I think is slightly more theoretically traceable, but seems to work just fine.
Even simpler: I would suggest that most links that don't go through a known network and don't use an obvious format like &aff_id=xxx or &affiliate=xxx will probably count.
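A minimal sketch of the referring-domain idea mentioned above (the partner domains, IDs, and function name are all my own inventions, not the actual scripts): the sale is attributed from the HTTP Referer header, so the inbound link itself carries no tracking parameter at all.

```python
from urllib.parse import urlparse

# Hypothetical registry of affiliate partners, keyed by their domain.
AFFILIATE_DOMAINS = {
    "partnerblog.example.com": "affiliate-42",
    "travelreviews.example.org": "affiliate-7",
}

def attribute_sale(referer_header):
    """Attribute a sale to an affiliate purely from the referring
    domain. Returns the affiliate ID, or None for organic traffic."""
    if not referer_header:
        return None
    host = urlparse(referer_header).netloc.lower()
    return AFFILIATE_DOMAINS.get(host)

print(attribute_sale("https://partnerblog.example.com/my-review"))  # affiliate-42
print(attribute_sale("https://randomsite.example.net/page"))        # None
```

The trade-off is accuracy: referrers can be stripped or missing, so a real scheme would usually combine this with a cookie set on first visit.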
I'd imagine that doing this is pretty much a textbook example of a paid link. You'd need to consider that and be sure you were happy with that risk. Either way, I would strongly disagree with the other comments that affiliate links don't count. I'd also question the suggestion that affiliates will always nofollow.
More pressing: I would consider whether you are possibly causing a duplicate content issue with your URL structure.
If abcweb.com/affiliate1/nameqaz and abcweb.com/affiliate2/nameqaz both point to the same page and there is no 301 or canonical in place then you could end up with multiple versions of the same page indexed. I'd opt for a 301 if you are stuck with that structure which would also help direct the link equity if that is your concern.
Better still, I'd pass one (or more) GET vars and have these ignored through Webmaster Tools. If you were using abcweb.com/nameqaz?rand=111 and abcweb.com/nameqaz?rand=222 and had ?rand ignored in Webmaster Tools, that would solve a couple of issues.
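To illustrate the 301 option, here's a sketch. The /affiliateN/ URL pattern comes from the question above; the framework-free handler and its return convention are my own simplification of what a real server would do:

```python
import re

# Matches paths like /affiliate1/nameqaz and captures the page slug.
AFFILIATE_PATH = re.compile(r"^/(affiliate\d+)/(?P<page>.+)$")

def handle_request(path):
    """Collapse /affiliateN/<page> onto the canonical /<page> with a
    permanent redirect, so only one version of each page can end up
    indexed. A real site would also log the affiliate segment for
    attribution before redirecting."""
    m = AFFILIATE_PATH.match(path)
    if m:
        return 301, "/" + m.group("page")  # permanent redirect to canonical
    return 200, path                        # serve the canonical page itself

print(handle_request("/affiliate1/nameqaz"))  # (301, '/nameqaz')
print(handle_request("/nameqaz"))             # (200, '/nameqaz')
```

The same collapsing could be done with a rewrite rule at the web server level; the point is simply that every affiliate variant resolves to one canonical URL.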
I'm not endorsing everything above as a great tactic for every project. However it is all correct.
There are really 3 possible causes.
Hopefully that will start you off in the right direction.
It depends on what the errors and warnings are. With that number you will probably find that there are a few common root causes. Fixing one cause will probably clear up a number of them in one go.
Start with the most common critical ones. If you need any help with specific errors just start a thread here. You'll be surprised how quickly you can clean them up.
Hard to say without knowing the detail of what is on the pages. However, it sounds like a perfect set-up for a site to be hit by the Panda updates to me. This is exactly what Panda was built for!
It could be worth checking your traffic levels alongside the dates on this page for a good idea of what changes have already impacted on your site: http://www.seomoz.org/google-algorithm-change (I like to put those dates in as events on Google Analytics). However, even if you haven't yet been hit by this I'd suggest you are risking it.
If you have a lot of "thin content" pages then this can impact on the whole site. Generated pages are probably the quickest way to hit such problems.
You don't need to inform Google that you have removed them. Just remove the pages and make sure each one either returns a 404 error or 301 redirects to the most logical (non-thin) page.
If you have authority and speed on your side then you have a far greater chance of being seen as the originating source of the article.
They have faked the pagerank. Faking only works for the checkers - google isn't actually fooled by it.
If you are worried about a duplicate content penalty then you should noindex it. Considering the possibility of a site-wide Panda penalty if enough of your content is considered to be low quality, this might be a good idea.
The domain suffix is unlikely to have a big impact on this sort of tactic. If this is the way you want to go then just get whichever name is available.
In truth, though, I'd really weigh up the tactic. The return from this approach is likely to be low unless you are set up to pretty much automate this at scale (which I am guessing from the question you are not - and that is the sort of tactic very much frowned on at SEOmoz).
There are two extremes in building such satellites. One is just to throw up some cheap thin content and rely on the exact-match domain to do the work. This is not without cost, but not too expensive. However, you have a risk attached, as this is very likely to come under Google's definition of spam, so it could result in your main site getting penalised if you do a lot of it. The other approach would be to build a quality site that can stand on its own merit - which obviously costs more. Also don't forget to factor in that you will still need to be link building and doing SEO on the satellite sites too.
Considering that an extra site means an extra large step in the conversion process (you have to get users from site A to site B before purchasing and you will lose a lot of them at that stage), you need to be really sure of the returns.
Many people here will tell you that what you are talking about is spam and you should therefore instead concentrate on building a better core site.
I'd say that what you are suggesting is unlikely to be the best return on your time, and you should therefore concentrate that time and effort into the core site unless you are really sure about what you are doing and understand the risks involved.
Your product is interesting enough that you can probably get much better links much more affordably through a content marketing approach and get more traffic and sales as a result. Start a thread here asking for content marketing ideas for a fake snow company and see the responses you get. I reckon it's a dream product to promote!
If NO pages at all have been indexed, then the first thing to check is that you are not blocking Google. The easiest way to check all the most likely issues is to get your site registered on Google Webmaster Tools and then look for any crawl errors.
However, if you want to check manually, start with the robots.txt file - make sure that there are not any disallows in there. If there are, then make sure that they are not blocking the main content (if you are not sure, post the contents back here - happy to look). Also look in the source code of the page for any "noindex" instructions.
If neither of those flags up problems, then you need to ask: does Google know about the site? Has it been linked to from external places, or submitted? How long since it went live?
If you post (or message me) the URL I'd be happy to take a quick look for you.
If anyone did manage to get control of googlebot they could find better uses to put it to than that.
Much more likely is that there are links somewhere to those URLs - they may well be on someone else's site. Google is following the link to see what is there, then finding nothing. However, it works on a file-by-file basis rather than by directory, so it could happen quite a bit.
If you want to stop it clogging up your error logs (and ensure that googlebot cycles are spent indexing better stuff) just block that directory in your robots.txt file.
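You can sanity-check a rule like that with Python's standard-library robots.txt parser before deploying it (the directory name here is a placeholder for whichever one is generating your 404s):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the directory that triggers the 404s.
robots_txt = """\
User-agent: *
Disallow: /old-directory/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The blocked directory is off-limits; everything else stays crawlable.
print(rp.can_fetch("Googlebot", "/old-directory/missing-page.html"))  # False
print(rp.can_fetch("Googlebot", "/current-page.html"))                # True
```

It is an easy way to catch an over-broad Disallow before it quietly blocks pages you actually want indexed.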
I had this same conversation with someone yesterday about a very similar set-up.
In a 2009 blog post, Matt Cutts said that the main reason not to include over 100 links was for user experience. It used to be for technical reasons, but those no longer apply. Here is the post: http://www.mattcutts.com/blog/how-many-links-per-page/
Lots of links can lead to lots of code, which slows things down. It will also be dividing up the page rank fairly heavily. However in the age of mega-menus I don't think that the number of links is, in itself, a problem.
Just for reference (and the answers to your situation may be different) our conversation about this ended with the decision to reduce the number slightly - structuring to leak less page rank to unimportant pages. However overall we still have a LOT of links and are happy with that.
Hate to say it, but if SEO is important to you I would dump the theme. There are ways to mitigate the issue, but you are always going to be "swimming upstream" - working harder than you need to to get anywhere.
It'll get there in its own time. However, a few fresh links and a bit of new front-page content never harm the crawl rate.
or, do it the easy way! Tweetreach : http://tweetreach.com/
Good morning back at you from 22 degrees C Hayling Island, UK :¬)
I don't know of a specific tool, but if the group isn't massive you could do this fairly easily through the Twitter API. You could use the search API, which is really simple, and do a search something like this:
http://search.twitter.com/search.json?q=%23seomoz&result_type=recent&rpp=100
That will return the last 100 tweets that contained #seomoz. You can then cycle through the results, adding the page=X variable, for up to 1,500 pages. So as long as there are fewer than 150,000 results, that should do the job. If there are likely to be more, you would do the search more often and add a date qualifier.
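A sketch of that pagination (this builds URLs for the old v1 search endpoint exactly as given above, which has long since been retired, so treat it as illustrative of the pattern rather than something to run against Twitter today):

```python
from urllib.parse import urlencode

def build_search_urls(hashtag, pages, rpp=100):
    """Build the paginated search URLs for a hashtag: pages 1..pages,
    each asking for up to `rpp` tweets. The old API capped page at
    1,500, so we clamp to that."""
    base = "http://search.twitter.com/search.json"
    urls = []
    for page in range(1, min(pages, 1500) + 1):
        params = urlencode({
            "q": "#" + hashtag,          # '#' is escaped to %23
            "result_type": "recent",
            "rpp": rpp,
            "page": page,
        })
        urls.append(f"{base}?{params}")
    return urls

urls = build_search_urls("seomoz", pages=3)
print(len(urls))   # 3
print(urls[0])
```

You would then fetch each URL in turn and stop as soon as a page came back with fewer than `rpp` results.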
If there isn't a tool to do it, I'd possibly be up for writing it - sounds useful. However, it is simple enough that I suspect someone has done it.
The general principle that we're trying to work towards now is to lose the /blog part completely. I'd much rather have the content "in deep", close to where it will help reinforce sales and where any shares/links will do the most good.
Where the site has regular users who might like the chronological nature of the blog we're using "new on site" type pages, giving headlines and snippets of recent pages. These are quite handy in ensuring that new content gets indexed too!
Is there any software (other than Wordpress) that people would recommend using?
I've never been a huge WordPress fan myself, partly because I find it tricky to integrate with other systems like this. We've been working with Drupal more and more this year. If you are used to nice, friendly WordPress, Drupal is rather less welcoming to set up (see "Drupal learning curve"). However, it is incredibly capable at this sort of thing.
If the other 14 domains are like those 2 then I wouldn't waste my time. Probably wouldn't hurt, but life is too short!
One tool I occasionally find useful for larger lists is Market Samurai. It's a keyword analysis tool much loved by sectors of the market that don't frequent SEOmoz as much. It can be useful, though.
The tool itself is actually a paid one, but you can use it in free mode and get a few features. All the keyword research stuff is included in the free version and is actually more useful than most of the paid stuff!
It is quite common to shorten URLs by taking common words out of them. Remember also that URLs appear in search results, so having something that looks helpful and makes people want to click is an advantage in itself. Which looks more like a quality resource to you?
websitename.com/eat-your-way-to-beauty-with-superfoods
or
That doesn't sound right at all. I'd email [email protected] with that one to be honest.
It looks like you may have been hit by the penguin update.
According to OSE, 1,546 linking root domains out of 1,634 link to you using the anchor text "reputation management". Looking at a few of those links, quite a few of them look like they have come from blog link networks as well.
So you potentially have either 2 or 3 issues that are causing you an immediate problem:
#3 is the scariest, so we'll deal with that first. If the site is set up on Webmaster Tools, log in and check for any messages. If it isn't, get it set up. If you do have a penalty, you should have a message. If not, then breathe a sigh of relief... for now.
How you deal with 1&2 is up to you. Have a good read around "penguin" and "poor quality links". There is LOADS written elsewhere about this.
Check out DistilledU as well. The free module they offer is on keyword research and it is very good: http://www.distilled.net/u/
Ah - understand now
There is currently a note on the rank tracker tool that it will be updated soon. Fingers crossed they might include this feature.
Hi Ehab,
The Rank Tracker and On-page optimisation tools are very different. I'm not sure where you have got confused, so I'll try to explain each and see if that helps.
Rank Tracker: This tool measures where your website appears for particular keyword searches. If, as an example, you were tracking the term "purple hats", then the tool effectively does that search for you once a week and reports back where your site appears.
On-page optimisation tools: This looks at the page you specify, checks it for common problems, and sees how relevant it is to the term you specify. It doesn't take into account any off-page factors (such as links or competition) and is just rating the page's "readiness".
I hope that helps clarify it a bit for you.
My concern with the shorter name is that people might easily miss the S. That is where branded domains perform better.
I don't know the US realtor market well, but I assume this domain will end up plastered over boards outside properties. If that is the case then there is a pretty strong argument for a branded domain rather than a kw led one.