This may be late, but yes you would need to add each sub-directory to GWT to set their location.
If you've already claimed the main site, it should automatically verify the sub-directories for you.
More info (check out the video):
Company: WrightIMC
Favorite thing about SEO: Excessive use of H1 tags
We've read reports, heard from colleagues, and had some of our clients experience a drop in impressions starting on February 20th, going from an average of 500+ impressions down to 10-30.
Anyone else experiencing this?
Hello,
I am having an issue with SEOmoz's Crawl Diagnostics report. There are a lot of crawl errors happening with pages associated with /login.
I will see site.com/login?r=http://.... and have several duplicate content issues associated with those urls.
Seeing this, I checked WMT to see if the Google crawler was showing this error as well. It wasn't.
So what I ended up doing was going to the robots.txt and disallowing rogerbot.
It looks like this:
User-agent: rogerbot
Disallow: /login
However, SEOmoz has crawled again and it's still picking up on those URLs. Any ideas on how to fix this?
Thanks!
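One way to sanity-check the rule itself is Python's built-in robots.txt parser. This is just a sketch: it tests the rule text from the post above against a made-up URL in the same shape as the ones in the report (site.com and the `?r=` parameter are placeholders).

```python
import urllib.robotparser

# The robots.txt rules from the post above (with a space after "Disallow:")
rules = """\
User-agent: rogerbot
Disallow: /login
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /login URLs (including ones with query strings) should be blocked for rogerbot,
# while the rest of the site stays crawlable.
print(rp.can_fetch("rogerbot", "http://site.com/login?r=http://example.com"))  # False
print(rp.can_fetch("rogerbot", "http://site.com/"))  # True
```

If the parser agrees the URLs are blocked, the remaining question is whether the crawler has re-fetched robots.txt since the change; crawlers typically cache it, so the fix may only show up on the next full crawl.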
We haven't experienced any issues (as far as rankings) from claiming with a separate email. However, I think it's good practice and builds trust for that particular account.
From what we've seen so far with the radius, Google will usually serve results based on the user's specific location. So it may not show up for you, but it could show up for other users. Unfortunately, we haven't gathered enough data to say for sure. I recommend getting more citations: check out which citations they have that your client doesn't. I'd also look at the website and see if they are sending any other signals, for example whether they are ranking organically for the other areas.
Try out White Spark's Local Citation Finder.
I think this is going to depend on two things: 1. your site structure, and 2. whether you want those pages indexed.
Rand Fishkin recommends not pointing the canonical tag on paginated results back to the top page, and I agree.
Site Structure
If the final pages can only be found by going through the paginated structure, you'll definitely want them followed. You'd only want to nofollow them to prioritize your crawl rate, but that's not recommended unless you have multiple formats (see the article above).
Indexed
If the content is unique (usually blog content) and you are getting traffic to those pages from searches then it may be worthwhile to keep them indexed.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
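For reference, the markup that help article describes for paginated series uses rel="prev"/rel="next" link tags in the head of each page. A minimal sketch (the URLs here are hypothetical):

```html
<!-- On page 2 of a hypothetical paginated archive -->
<head>
  <link rel="prev" href="http://example.com/articles?page=1">
  <link rel="next" href="http://example.com/articles?page=3">
</head>
```

This tells Google the pages form a sequence, so it can consolidate indexing signals across the series without a canonical pointing everything at page 1.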
I just purchased it out of sheer curiosity. I'm sure you may find some gems, but if you've exhausted your list, you've more than likely run into most of these sites. Still, it may be worth a look; for $27 it's a good list to start with, and at least it's organized.
Also check out: http://pointblankseo.com/link-building-strategies
For Ian's study on Google Plus Box ranking factors: I agree, keyword relevance is not a huge factor. I don't think it really needs to be, since the Plus Box seems to surface influential people on a broad subject/topic. That said, I don't expect to see Madonna popping up in the Google Plus box for SEO anytime soon. I'm not exactly sure where Google pulls the data to associate an influencer with a particular keyword/topic, but I'm sure it has something to do with search trends, engagement (+1s, posts, etc.) and a variety of other factors.
As for your question, "Google+ Ranking Factors - Are Keywords Meaningless?" - that would be a totally different story, and I believe keyword relevance has to be an important factor there.
Thanks Karen. That would explain it!
I'm using Google Chrome with SERP Overlay, and it used to say "Export to CSV"... now it says "Get Keyword Difficulty Report." Was this purposely removed?
This is one feature that I really liked (SEOQuake does this): basically, I could export the search results along with their relevant metrics.
I'm mixed on this.
One thing that I think is often missed is the way subdomains are used vs. subfolders. A lot of times a subfolder is linked within the internal structure of the site, while subdomains are not, so link juice is not passed. That's why you don't see *.wordpress.com or *.blogger.com sites outranking everything under the sun: none of the root domain's juice is passed down to the subdomain.
However, we've seen for our clients that subdomains still work as long as they're part of the internal link structure, with the added benefit of occupying 2-4 SERP results (mostly for branding-type queries).
With that said, reverse proxies are not necessarily the same as 301 redirects. In general, you would use a reverse proxy if you had content on a remote server where the easiest way to serve it would be a subdomain; with a reverse proxy you can serve it from a subfolder instead. A 301 redirect, on the other hand, is generally used to redirect a page to a newer/canonical page.
However, if you're talking about the best solution for handling subdomains (301 vs. reverse proxy), you may end up doing both: if you're using a reverse proxy, you may still have to 301 redirect certain content pieces, especially if your proxy doesn't properly redirect/handle relative paths. That issue is discussed in the comment section.
If you're starting a completely new site, my guess would be that a reverse proxy is better than a 301, as a reverse proxy would not lose any link juice - in theory.
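To make the subfolder-via-reverse-proxy idea concrete, here's a minimal sketch assuming an nginx front end; the host names are hypothetical, and a real setup would also need to handle relative paths in the proxied content (the pitfall mentioned above):

```nginx
server {
    server_name www.example.com;

    # Serve remotely hosted content (e.g. a hosted blog) under a subfolder
    # instead of blog.example.com. Visitors only ever see www.example.com/blog/.
    location /blog/ {
        proxy_pass http://blog-host.example.net/;
        proxy_set_header Host blog-host.example.net;
    }
}
```

Because the content appears under the main domain's path structure, internal links to /blog/ pass authority the same way any other subfolder would, which is the "no link juice lost" argument for the proxy over a 301.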
Integrated Marketing Communications – the IMC in our name – means integrating all the right communications tools so that they work together in harmony to create a competitive advantage, boost sales and profits, save money, drive traffic, and promote a brand. An easy task? No way.
The Internet has fundamentally changed the way that business is done – forever – and, most importantly, how your customers find you in the first place. If you intend to be found online by your potential customers, then you'd better have a plan – because your competition does. At WrightIMC, we use the most current technology, bona fide subject matter experts and strategists, and a boatload of know-how that only comes from experience to not only give you that plan, but to create an honest-to-goodness difference in your bottom line.
If your customers can’t find you when they look for you, then it doesn’t matter how good you are. And, in today’s world, broadcasting a one-way message to an unknown and unquantifiable black hole of potential customers doesn’t work. True connectedness, engagement, and interaction are possible – and they convert people to customers.
We can do that – and we can prove it. When it comes to reaching your audience, the sum of your marketing and communications strategies is greater than the parts. That’s what WrightIMC does better than anyone else.
Single-discipline agencies proliferate today. There are very few agencies that have real experience across the entire spectrum of communications disciplines. We do. We can make your company truly successful at engaging customers. And that means more revenue. Period.