I sent you a direct message
-Dan
Nice Doug, you bring up a good point and that is to not assume the spammy links are the (only) cause. I would:
1. Ask your client to stop doing that
2. Comb through Google Webmaster Tools for errors - 80% of the time you'll find what you need in there.
3. Do what Doug suggests, which is run a check on the site to be sure it's not some sort of on-page factor - don't forget the often overlooked things: robots.txt, meta robots, XML sitemap, etc.
4. And finally what Doug suggests: building quality links to balance out the negative ones is essential.
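To illustrate the kind of overlooked robots.txt problem step 2 and 3 are getting at: a single leftover line (often from a staging site) can block the whole domain. A purely hypothetical example - example.com and /admin/ are placeholders:

```
# Leftover from staging - this blocks ALL crawlers from the ENTIRE site:
User-agent: *
Disallow: /

# What a healthy file usually looks like - block only what you mean to:
User-agent: *
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml
```

The thing to check is that no Disallow rule is broader than intended.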
-Dan
Anytime you want to answer the question "why does this page rank?" use the SERP analysis tool - this tool is PHENOMENAL and provides dozens of metrics for Google results. You have to run a "full report", of which you get 50 a month. Then you have to know how to read the metrics to make a good analysis. Rand wrote a good post on the tool here.
Just looking at the OSE data may not be enough. This report is much more comprehensive.
-Dan
That's exactly right, however don't forget one important detail - the Open Graph meta tags! Facebook reads these to pull the correct data about your page when it gets shared or liked.
You can get more info about the tags here
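For reference, a minimal set of Open Graph tags looks something like this - the titles and URLs below are placeholders, not a prescription:

```html
<head>
  <!-- Open Graph tags Facebook reads when the page is shared or liked -->
  <meta property="og:title" content="Your Page Title" />
  <meta property="og:type" content="website" />
  <meta property="og:url" content="http://www.example.com/your-page/" />
  <meta property="og:image" content="http://www.example.com/images/thumb.jpg" />
  <meta property="og:site_name" content="Your Site Name" />
</head>
```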
-Dan
I can only answer from past experience, but I typically see major changes every 30 days or so.
-Dan
Hi
How are you checking rankings?
I see a few on-site issues;
The first thing to do is check Google Webmaster Tools. This is the best source of problems with the site. Correct every possible thing in there first.
Before thinking it's something malicious, I would
Hope it helps!
-Dan
Great answer - "bingahoo" - love that.
Hi
I think you're asking a question about duplicate content, and Google possibly indexing more than one version of the same page?
To prevent this from happening, add the canonical tag to the head of all your pages;
Each canonical should contain the original URL for that page. This will prevent the same page from being indexed multiple times.
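As an illustration, the tag sits in the head and points at the original URL (example.com is a placeholder):

```html
<head>
  <link rel="canonical" href="http://www.example.com/page/" />
</head>
```

With that in place, example.com/page?ref=partner and example.com/page both declare http://www.example.com/page/ as the one version to index.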
Jassy is likely correct that the engines know how to ignore referral parameters, but you should have a canonical in place in most cases anyway.
-Dan
Tim
The essential pages to redirect are;
Depending on how your site is built there are several options for doing the redirects in an efficient way. Assuming it's a custom site and you've got a good clean URL structure, I might try something like;
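As a sketch of what that might look like in an Apache .htaccess file - purely illustrative, the old and new paths are made up, and on a different server you'd use its own redirect mechanism:

```apache
# 301 redirect individual old URLs to their new homes
Redirect 301 /old-about.html http://www.example.com/about/
Redirect 301 /old-contact.html http://www.example.com/contact/

# Or, if a whole section moved and the slugs stayed the same:
RedirectMatch 301 ^/old-blog/(.*)$ http://www.example.com/blog/$1
```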
If the site is small enough, you might not need something intense like that - cut and paste will do just fine. If it's some other setup, I'd need to know more to suggest a different way to create your redirects.
After you do everything, check webmaster tools for the next few weeks for any 404s that might pop up.
Hope that helps.
-Dan
Brian
As far as which to remove, that's up to him to decide. Without my seeing how the site is put together (plug-ins etc.), he should be able to determine which one is best to use - it's whichever is easier for him, really. From an SEO standpoint you just need to choose one; it doesn't matter which.
Hope that clarifies!
-Dan
Hi
Few things I saw right off the bat:
The other answer about backlinks might have merit to it, but none of that's going to help until you get these technical issues fixed.
I would honestly proceed this way;
1. Get a complete site audit done by a professional
2. Make the technical changes needed
3. Fill out the content some more. The site overall is very "thin" - not much text, especially on important pages that should carry authority like the home page and category pages.
4. Wait for Google to re-crawl
5. Then use this tool to answer the question "why does a site rank where it does?" Make sure to run the full report.
6. Then work on backlinks.
-Dan
ALSO - your site is loading with AND without the www, and /index.html does not redirect. All possible variations of the homepage should redirect to a single version.
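A sketch of fixing both issues, assuming an Apache host with .htaccess available (example.com is a placeholder for your domain):

```apache
RewriteEngine On

# Force the non-www host to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Redirect /index.html to the root
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```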
Hi
I'd also look at this from the user's point of view. In my personal experience the first option (a subfolder) is the least confusing.
-Dan
Hi
To my knowledge, you don't stop it from crawling through the button (like a nofollowed link); rather, you block the robot at the page it ends up on after clicking submit.
Say the user hits submit and it takes them to mydomain.com/confirm.html. On that page you'll want to add <meta name="robots" content="noindex, follow"> if you want it to NOT index the page but follow the links on it, or <meta name="robots" content="noindex, nofollow"> if you want it to NOT index and NOT follow the links on that page.
It's generally advised that this is better done with the meta tag than in robots.txt.
Hopefully I've understood the question correctly!
-Dan
Hi
Open Site Explorer, while extensive, does not have all the links on the web, plus it won't have the most recent links.
You could also try www.majesticseo.com.
You could use a variety of advanced operators to find some more links;
That might catch some more of them. You can try adjusting the time range to get recent ones.
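A few illustrative operator combinations - example.com stands in for the domain you're researching:

```
link:example.com                       -> pages Google reports as linking to the domain
"example.com" -site:example.com        -> mentions of the URL on other sites
"Your Brand Name" -site:example.com    -> brand mentions that may carry links
```

Note that Google's link: operator only ever shows a sample of links, so treat these as a supplement to OSE and Majestic, not a replacement.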
-Dan
Hi
It looks like you have two canonicals on all of your pages. They are being generated by two conflicting sources in your WordPress setup - one looks like it's coming from WooCommerce and the other perhaps from the default WordPress settings.
So the bigger issue is to have your developer decide which source to take the canonical from. This will depend on whether one works better than the other.
Hope that helps!
-Dan
They have links on that domain with that exact tracking parameter. Maybe a campaign at one point?
Tom
Looks like a crazy combination of things;
1. Search [site:benetton.com] and it reveals their Italian page it.benetton.com as their intended 'home' page.
2. The Italian page does not have a robots meta tag, therefore it is indexed but only in google.it
3. Go to google.it and search [benetton]... annnd... you get the Italian site, no tracking URLs - site-links etc, per normal.
4. So the 'robots noindex/nofollow' on the www page must be because they're trying to index the Italian site but not the US site? I'm honestly not an expert on the correct way to handle these international issues. They have basically two identical sites on separate sub-domains, the only difference being the language. Doesn't seem like the right way to handle this to me...
5. www.benetton.com does not redirect properly for /index.html or /index.htm
6. Because of all of this, Google USA is not given a homepage on the www subdomain to index.
7. BUT, there is an internal link (at least one I can identify) pointing to http://www.benetton.com/?utm_source=centrum.sk&utm_medium=listing on this page on the blog - so the robots noindex/nofollow is ineffective, because Google likely sees the presence of the tracking parameters and thinks it's a different page? Or this is one of the few times where Google chooses to ignore the canonical tag (since, as they have said, it's a strong hint rather than a strict directive).
(As a side note, many of their other www US pages have noindex/nofollow, but there are still about 2,500 pages in the index from the US site.)
Conclusion
Thoughts? Does all that make sense?
"Use your Google Analytics to determine which links are not used. If you offer links which are not used, it is a waste and adjustments should be made. An unused link should generally either be modified so it is more visible, or removed."
I would look at this in a holistic manner. First of all, why would you link? Is this just for backlinking/ranking purposes? Then don't do it.
But are you asking because your friend has actual content you want to share? In THAT case, is the content (topic) relevant to the content on your site? And do you generally only link out to other sites for the sake of sharing the content on them? If so for both, I don't see anything wrong with it. And if it is just to share the content, you can just nofollow it too for added "security".
But if it's just to try and get your buddy a link, don't bother.
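For reference, nofollowing the link just means adding the rel attribute (the URL here is a placeholder):

```html
<a href="http://www.friends-site.com/" rel="nofollow">My friend's site</a>
```

That way you can share the content without explicitly passing link equity.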
-Dan
An alternative for removing them from search results, besides a 301, is a meta robots noindex. You would do this if you do not want them passing any link value to the 301'd pages - which it doesn't sound like you would want them to.
-Dan
Good call Ryan. I simply did a site: search for that page and it came up in the index. I also searched for a portion of random text from that page and it still came up. Could it be indexed, but because it's an orphaned page it doesn't rank for the term it's optimized for, since it doesn't get any link juice / domain authority flow?
Nice catch again!
-Dan
Hi Gary
You should have your 301s ready, as you'll want to implement them as soon as the site changes.
A quick way to get all the URLs (new or old) is to spider the site with Screaming Frog or Xenu - get the URLs into a spreadsheet, and that way you have your list of old/new URLs. Depending on how well you know Excel, you can get them matched up and ready to go quite easily with very minimal typing.
Then check Webmaster Tools right away when you make the switch to catch any 404s. Also, re-spider the site to look for the same.
-Dan
Hi
Honestly, I don't think you should worry much about PageRank. I'd be more concerned if your traffic or rankings dropped. Check out this Whiteboard Friday on PageRank. There are many more important things to worry about!
-Dan
PS - In about a 5 second analysis of your backlinks, I'd say you might want to focus on improving that. Not sure if someone has been helping you with linkbuilding or anything, but your backlink profile is severely unnatural. WAY too many exact match anchor text links from irrelevant sites.
Your other option is to use a meta robots tag like this: <meta name="robots" content="noindex">
It's a one-step process instead of two.
Full documentation on URL removal from index here and on the robots meta tag here
Hi
That seems like it should work just fine to me. Just follow these guidelines.
-Dan
With free, you can check whether the name has been taken on all of them; then you have to register yourself. Paid will just do the registration for you, I believe, so I've always stuck with free.
-D
Use KnowEm to find and claim hundreds of social profiles, especially on social sites within a niche you might not have been aware of. The social profiles aren't link building per se (because many aren't followed links), but Alan has listed this at #2 and I would agree. It creates a social web around your site and strengthens trust in the brand.
-Dan
PS - all of Alan's tips are spot on, as always
1. Keyword research - [celebrity name + fashion] or [celebrity name + clothes] in adwords tool (set to exact match). See screenshot.
2. Choose what phrase to optimize for to help guide the content. Try to create content with a viral / social media hook to it. Obviously include the photos, but maybe you could also crowdsource comments on the pictures from your Twitter followers or email list first. Try to find people that have a good social following. Give them a 'sneak peek' at the photo and gather their comments. Or, find a fashion critic to give it a glowing review / write guest commentary.
3. Make sure the article is on-page optimized for the chosen keyword
4. Then when you release the article, the people you collected comments from will share it in their social networks, helping to get it indexed and noticed right away giving it a social boost.
5. Build a few links to the page...
The search volumes seem relatively long-tail for this sort of thing, but if you find the right phrase, it could be highly transactional. Someone searching for "brad pitt fashion" may in fact be highly interested in purchasing something he has actually worn. The trick is in finding the right keywords. Don't just throw up an article and see what happens - a little bit of up-front planning could be a huge win.
Good luck!!
-Dan
I'd look at this in a holistic manner. If your site looks great, functions great, has good information, promotes trust among visitors, has good on-page optimization, etc., this is likely not of concern. The search engines know that an e-commerce site is going to have many add-to-cart buttons on a page and on a site, and I'm sure they take this into consideration when judging an e-commerce site. As long as you're not doing anything sneaky with the buttons, like hiding text behind them, I'm sure you're fine.
Of course, an example of your site - a screenshot or link - is always most helpful when determining these things.
-Dan
Hi Maria
While I can't say I know of the best theme for SEO, here are some considerations;
I would purchase from a trusted vendor like WooThemes or ThemeForest. If from ThemeForest, look for a theme with a 4-star rating or higher and a decent amount of purchases, and check that the developers actively provide support and respond to questions.
I'd say more important is what you do with WordPress, no matter what theme you use. I have been trying Yoast's SEO plugin for WordPress and am very happy with it.
-Dan
Yes it is - what you want is called "event tracking", which tracks clicks or other events and can now also be set up as a goal in Google Analytics.
This is their guide to the setup.
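As a rough sketch of what classic (ga.js) event tracking looks like - the category/action/label values here are made up for illustration, and the exact setup should follow the guide linked above:

```javascript
// Classic async Google Analytics command queue (normally created by the ga.js snippet).
var _gaq = _gaq || [];

// Fire an event when a visitor clicks a download button.
// _trackEvent arguments: category, action, label - all three names are your own choice.
function trackDownloadClick() {
  _gaq.push(['_trackEvent', 'Downloads', 'Click', 'Product Brochure PDF']);
}

// In the page you would wire this to the button, e.g.:
// <a href="/brochure.pdf" onclick="trackDownloadClick()">Download</a>
trackDownloadClick(); // simulated click for illustration
```

Once the event is firing, it can be tied to a goal in the profile's goal settings.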
Yes, you can try Screaming Frog - free up to a 500-page crawl, or you can buy the pro version.
Also, this google webmaster tools video is interesting too: what steps can I take to diagnose a drop in rankings?
Hi Ryan
Yes, you're absolutely correct to question that, good call. What I saw in Majestic (which I hadn't had time to share before) was that all of those sitewides were coming from "bestdirectory4you.com" (see screenshot). What do you think? Not a very high value site, and the same goes for the others in the screenshot. It was all exact match keyword anchor text, on pages with tons of other links, etc.
Secondly, check out the next screenshot - text cut and pasted from the homepage and put into a search with quotes, but we're only getting two inner pages in the results. It seems like something is going on with the homepage not showing (although it is in the index).
In fact a site: search doesn't show the homepage first, but it brings up all the inner MLS pages.
It would seem the drop in rankings has something to do with the homepage losing a lot of its juice, in google anyway?
Thoughts?
-Dan
Your last month of backlink history does look very suspect (see screenshot from Majestic SEO). See the giant spike about a week ago?
I took a quick look at the site of the company who designed your site, and I wouldn't be surprised if this was something that they were involved with, although I can't say that conclusively without further research.
I would HIGHLY suggest doing what Ryan says, especially the part about finding a reputable professional SEO, not a designer who happens to "do SEO" on the side.
-Dan
Hi
If you have not checked out these few resources, highly recommended for local;
Your plan sounds pretty good, although I am not familiar with "Build My Rank".
The one thing I am curious about is, how is his website? Is it technically sound (crawlable, indexable)? Does the navigation/architecture (albeit small) make sense? Is it optimized for on page keywords? Does it have a great design, and is it full of useful information that's easily digestible?
That's just my general sense - you have a good focus on off-site SEO, but don't forget about on-site, especially when it comes to users arriving at the page and taking action: calling, emailing, sharing, or whatever it may be.
Hope that helps! It's exciting, and good luck!
-Dan
I do not know of such a tool - maybe try SEMRush? They have a lot by way of competitive analysis.
You could use the percentages from any of the click through rate reports out there for a rough guess;
Coconut Headphones (there's a 2nd part to this article too)
Bear in mind, everyone's reports are always a bit different. There are so many variables in estimating click-through rate that it's nearly impossible to come up with exact percentages across the board - they can vary by industry, amount of PPC ads, local search vs. general search, whether there are videos or images in the result, etc.
But hope those links help!
-Dan
Hi
Nice photos by the way
How do you arrive at the pages with the dynamic URLs? Try signing out of WordPress and viewing the site while not logged in.
You have the canonical tag in place, so honestly, there is no duplicate content issue. I'm just curious how you discovered the different URLs - that's not usually something a normal user browsing the site will see, at least not in what I've seen with WordPress anyway.
-Dan
EDIT: It just occurred to me, you probably found these URLs in the SEOmoz campaign tools?
Hi
Someone recently showed me PowerMapper SortSite. I have not personally used it, but he ran a full audit right in front of me and it looks pretty powerful. It's $150-$500 though.
In general I love SEOmoz's suite of software. For an audit you may want to try their Crawl test
Also really nice to have for an audit is Screaming Frog - free up to 500 pages, then you have to pay for larger sites.
Depending on the site, I generally use a mix of tools to pull together the info needed, and get it into a mix of Excel, Word, or PowerPoint, depending on how it's going to be presented.
-Dan
You can view a lot in Majestic just by creating a free account.
I would agree with using separate google accounts. I usually put everything like analytics, webmaster tools, youtube etc within a unique account for each client.
-Dan
Hi
I use as many of the services as possible. Also, this article has a great point, about adding yourself to the GPS services as well, among some other local SEO ideas.
-Dan
It appears they already have multiple pages (teen gifts for example).
So I was going to say: run them all through the on-page optimization tool and the SERP analysis tool (run a full report), get all the ranking factors of Gifts.com, and then improve their own for each page.
Thoughts on that?
-Dan
Hi Brett
Few things to add:
-Dan
Hi
The difference to understand about pages vs posts has to do with how they are organized
or another way to think of it...
So, use pages when you have more or less static pages, organized in a folder-type format as described above. Use posts anytime you have content that you want organized dynamically, i.e. blogs, news, product catalogs.
Some of the themes you can get make this more confusing than it needs to be too.
Hope it helps!
-Dan