Posts made by Nobody1560986989723
-
RE: Cloaking? Best Practices Crawling Content Behind Login Box
Can I say I admire your inventiveness? You go to some lengths to avoid registering and, apart from the fact that the majority of people don't know how to do a reverse image search, it probably reflects people's attitude to those sorts of lightbox registration forms.
-
RE: Webmaster Tools finding phantom 404s?
OK, well if it truly doesn't make sense (it does sound odd, and it does seem like you've done the redirects fine) and three months is more than long enough for GWT to have caught up, I'd take the above approach: periodically download the 404 list and see if there are any additions, and check whether Bing Webmaster Tools agrees with GWT.
If everything is redirecting fine, then I'd be inclined to just disregard it for the time being and focus my energies elsewhere. Good luck with it!
-
RE: Webmaster Tools finding phantom 404s?
Maybe temporarily stop the 301 on the old site. Re-run your crawl reports and see if there were any 404s in existence on the old site that you hadn't previously spotted. Fix those links and then reinstate the 301?
Either that or, if you're sure there's no problem, download the phantom 404s to CSV and then only take note of additions to that list in future?
-
RE: Why should I add URL parameters where Meta Robots NOINDEX available?
Don't use
Disallow: /*?
because that may well disallow everything - you will need to be more specific than that.
Read that whole article on pattern matching and then do a search for 'robots.txt pattern matching' and you will find some examples, so you can base your rules on others' experiences.
-
RE: Why should I add URL parameters where Meta Robots NOINDEX available?
In that case I suggest you use pattern matching to block the parameters you don't want crawled.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
-
RE: Cloaking? Best Practices Crawling Content Behind Login Box
Heya there,
Thanks for asking your question here
My first point would be that human visitors don't like to be given forms when they first visit a site, so I would suggest you don't do this.
My alternative strategy would be to provide a home page of good content talking about the data etc that is available on your site and then provide a button for people to register if they want to.
Don't detect the user agent and provide alternative content as, however good your intentions are, that could be considered cloaking. Google is against you serving it different content to what humans see, so don't do it.
Do things differently
-
RE: Webmaster Tools finding phantom 404s?
If you can't tell where the links are coming from, then the next best bet is, as Ben said, to identify the URLs which are being linked to and 301 them to a closely related page (or the home page otherwise); that way you don't lose any potential visitors coming to your site via those links.
-
RE: Why should I add URL parameters where Meta Robots NOINDEX available?
I'd say the first thing to say is that NOINDEX is an assertion on your part that the pages should not be indexed. Search bots have the ability to ignore your instruction - it should be rare that they do, but it's not beyond the realms of possibility.
What I would do in your position is add a disallow line to your robots.txt to completely disallow access to
/patio-umbrellas?canopy_fabric_search*
That should be more effective if you really don't want these URLs in the index.
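For completeness, the robots.txt entry would look something like this (Google treats * as a wildcard, and the User-agent line applies the rule to all crawlers - adjust the pattern if the parameter can also appear after other parameters):
User-agent: *
Disallow: /patio-umbrellas?canopy_fabric_search*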
-
RE: More youtube results from the past 2 days
If you're seeing more YouTube results then maybe it's because those things are more relevant for the search you're doing. In which case you have to look at them and see why they are more relevant, as that may be affecting your SEO campaign. If you need to get a video done to compete, then so long as it's quality and relevant (and done professionally, not rushed) then do that.
Otherwise, just treat the new videos as normal competitors and analyse them accordingly to see why they outrank other pages.
SERPs change and algorithms change, so how you approach your SEO campaign has to change over time as well.
-
RE: Home page canonical issues
I have yet to come across an Analytics campaign where the traffic report doesn't show some traffic from the site we're analysing. The reason is that people visit the home page from the other pages so it's basically an 'internal referral'. What you want to use is the landing page analysis instead, or just use the filters to exclude example.com referrals.
If you're redirecting all variants of the home page to root and your canonical is going to root, then you should be fine.
Check using a site:example.com search in Google and if you only have root and no /default.asp /index.asp etc then it's fine and you don't have an actual canonical problem - which is what I suspect.
I think it's now just a case of ensuring you look at the right analytics reports and don't get confused by the myriad reports it lets you have.
-
RE: Temporary redirects
To answer one question first - a trackback basically is an alert to you, the blog owner, when someone links to your post.
Great definition over at Wikipedia: http://en.wikipedia.org/wiki/Trackbacks
WordPress often auto-creates redirects, as this saves you, the blog owner, from having to consider these things and it minimises 404 Not Found errors. The main things that create a redirect are posts being moved or deleted, and the same goes for comments.
Whether you NEED to do anything about the redirects the SEOmoz software is finding for you will depend on whether the addresses being redirected:
- exist on-site
- are linked to on your site
- or are in the SERPs
Do a site:redlandsorthodontics.com search in Google/Bing to see if those pages exist in the SERPs.
The only thing I would say is that if the pages being redirected don't exist, shouldn't exist, and probably won't be brought into existence, then go into the .htaccess file, look for the offending URL and make it a 301 redirect (a permanent redirect, rather than temporary), as there's no need for it to be temporary.
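As a rough sketch (the paths and domain here are placeholders, not your actual URLs), a permanent redirect in .htaccess can be as simple as:
Redirect 301 /old-post/ http://www.example.com/new-post/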
-
RE: Domain Choice
Seriously, people over-estimate the importance of having a keyword in the domain. Google have said they are weakening the importance of this and have said many a time that a good brand is much more important.
As Kevin rightly says, that is still 1 factor in a list of over 200. There is still opportunity to get the keyword into file names (therefore still in the URL) if you really want to.
But don't over-emphasise something that Google is de-emphasising. You should build a strong brand first rather than making longer and longer domains just to get keywords in. Save your money and keep & use your main branded domain.
-
RE: Are nofollow links really not affecting rank?
The value of a link is a complex thing, to be honest. Your 2m 'no follow' links can have a variety of value depending on
- The number of domains those are placed on
- The position of those links on the linking site
- The value of the page the link is placed on
- The actual anchor text in that link.
Also try not to think of links purely in terms of ranking. At the end of the day, ranking is great, but what you really want is traffic. A 'natural' link profile will have some nofollow links in it (in most cases) and you may find some nofollows are great at driving traffic.
On the flip side you also want a good proportion of followed links as these represent a 'full value vote' for your site, rather than somewhat of a disassociation, which a nofollow can imply.
So get a variety of links from a variety of sources in a variety of ways, but don't ignore a potential linking opportunity just because it may be nofollow, as there may be other advantages to be gained than purely ranking.
-
RE: Would you keep Paid Directory Submissions a part of your SEO Strategy?
I suppose it all depends on what you mean by 'directory'. If you just mean paid inclusion - i.e. there's no value in being on that site, or even no listing on that site if you don't pay, then No, because essentially it's a paid link (which is generally bad).
If that directory is offering you some other paid service on top of a good free listing on a quality directory, then you have to assess that website in terms of its return for you. Do a few minutes' research to determine if being listed on that site is likely to be good for your business beyond just another 'keyword-rich anchor text' link...
You need to assess each site on its own merits, but we don't use paid directories - in our eyes you'd be better investing that time and money in producing some great content and then getting that out over social media. There are usually much better investments for your business finances than a paid directory - especially if there's the risk that a website you've paid to be listed on falls out of the Google index and therefore is giving you precisely nil value.
Remember link building should be a natural thing, so get a variety of websites, a variety of link texts and as is preached here at SEOmoz, get good quality editorial links where you can too.
Hope this helps a little.
-
RE: Image hosting, afraid it will be viewed as doorway
I'm still not sure you have a problem, if I'm honest. If your actual site is fine, then purely serving an image off-site is not a classic 'doorway' website model, so I can't see why you'd be penalised for what is, in my view, quite sensible practice.
-
RE: Technical question about site structure using a CMS, redirects, and canonical tag
That's not a great mechanism for a CMS even before you consider SEO!
Do you understand ASP sufficiently to move the default.asp to the root directory and then apply the rel=canonical?
If the actual homepage is /content/default.asp then there are two things you should probably consider
1. Make the redirect from root to /content/default.asp a 301 as it is permanently at that address, not temporarily
2. Any links you get in need to point to /content/default.asp for maximum effect.
(2) is really tough as it's messy for webmasters and doesn't do your website branding any good. So, to be honest, I would be looking at moving that default page to root, if the choice were mine.
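Whichever URL ends up as the permanent home page, a minimal sketch of the canonical tag for that page's <head> (the domain here is a placeholder) would be:
<link rel="canonical" href="http://www.example.com/" />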
Open to other opinions.
-
RE: Is it possible to have the crawler exclude urls with specific arguments?
You will need to do a block in your robots.txt based on a match:
User-agent: *
Disallow: /*variable=
I think the above would work - the * wildcard matches the parameter wherever it appears in the URL, so substitute your actual parameter name for 'variable'.
The other approach would be to add a line in your page code itself to generate a 'nofollow' robots tag when that argument is present.
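That tag, output in the page's <head> when your code detects the argument, would look something like this (the author mentions 'nofollow'; add noindex as well if you want the page kept out of the index entirely):
<meta name="robots" content="noindex, nofollow">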
This link should help you, by the way:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
-
RE: Image hosting, afraid it will be viewed as doorway
I don't know what my fellow Mozzers here would say, but I don't see how your site could be viewed as a doorway just because you're hosting images off-site.
If the e-commerce site itself functions properly, you have the right content on it and it is unique in what it provides (i.e. no duplicate content), then I don't see how it can be a doorway.
I also assume that the full-size images won't be accessible by a normal 'link', i.e. you only get the right to that image once you've paid for it?
If that's the case then I genuinely can't see any risk to your e-commerce site. Provide great content, a good shopping environment and build your reputation online, as with any business.
Personally, I think you can relax - and from a web design perspective, I don't blame you for looking to host large images off-site
-
RE: Backlink Building Places
My first major comment will be to choose wisely, shop around and make sure any SEO company you take on explains how they go about the link building strategy. Source someone local, or at least in your own country who understands you, your needs and your business environment.
Secondly, you should consider what you are already doing - or think about what you should do. i.e.
- Are you becoming part of communities online?
- Are you contributing to blogs or writing some great content that relates to your website?
- What are you doing to engage people on social media?
- Do you know what your competitors are doing - so you can learn the good stuff and avoid the junk?
Your question sounds simple, but you need to balance knowing what you can (and maybe should) be doing yourself with taking on a business you can work closely with who will help you on this front.
-
RE: Redirect a blog category page to the homepage?
You're welcome - I'm subscribed to this thread by e-mail so if you have other questions as you're working on it, feel free to post here and I will get back to you.
-
RE: Niche sites: how to optimize them?
I do however agree with what Ryan wrote - in that domains with keywords (unless a valid part of a brand) are not a worthwhile focus. Google particularly likes brands, so build a reputation based on quality content in your niches and you should be okay.
There's no harm in building a good niche site, i.e. filling a gap the main competition aren't in, but do it genuinely and do it well - remember you're building a quality site and everything affects your reputation. Oh, and finally, remember there are no short cuts.
-
RE: Niche sites: how to optimize them?
Hi there,
You don't have to do much differently for 'niche' sites in terms of SEO than any non-niche site, to be honest. You will still need to choose your keywords, put together some great content, manage your site structure and on-page optimisation well and ensure a great user experience.
After that, your link building efforts are fairly similar in that you'll need to look for good linking opportunities. Look at your competitors for the keywords you're targeting and then maybe look at some advanced search queries:
http://www.seomoz.org/blog/9-actionable-tips-for-link-prospecting
Niche communities: The thing I would suggest is getting involved in communities based around your niche - maybe hook into some blogs or forums, or other community types, and be of use to the people there - get to know people well (don't just see it as 'link building'). If you're in a niche then you want to get known in that niche.
Social media: Look up some communities through social media or do keyword searches on those sites to identify people you could get involved with or get to know.
Backlinks: How many links will it take to get you noticed? How long is a piece of string? It depends on the quality of the sites linking to you and what your competitors are up to - that will change niche-to-niche.
Web designer: You should be able to get any web designer to create your niche site, but I would suggest you go with a designer who understands SEO in order to get the site structure and content right before the site goes live. Ensure you get an Italian copywriter to look over the content so that keywords are inserted in a natural, linguistically acceptable way - i.e. it reads well.
Hope this is of use to you
-
RE: Bringing a Google+ business page and a Google+ Local (Places) page together
More than you wish you had to have, most definitely! Glad we could help!
-
RE: Forced Page Views and Search Engines?
Oh in that case if there is no actual page, neither for humans nor for search engines, then I don't see how it would affect your SEO efforts, positively or negatively.
-
RE: Large volume of ning files in subdomain - hurting or helping?
Heya,
I don't know what 'Sched.com' is as there's nothing on that domain, or what you mean by a 'Ning' file, but applying basic rules:
- Do what you can/have to, to reduce errors on the site - this may involve restructuring the site or moving files around
- You don't need new domains for storing content, sub-domains or sub-folders will suffice
- Having content/files which are not 'SEO-able' is not an issue. If you focus on the user's experience of the website, reduce clutter and errors and ensure the site is easily crawlable then you are getting things off on the right footing.
- 600 pages in a root domain is crazy, but if they are named helpfully then it doesn't necessarily have to be a problem. I often have sites where an index.php governs the site and then all the content is stored in a sub-folder. It's not necessarily where the files are stored, but how they are managed and organised that makes a difference to the webmaster, website visitors and, indeed, search engines.
- You should be able to fix errors without moving pages off-site, else why have them anywhere?
Hope this helps in some way
-
RE: Rel=author?? google auth?
Apologies for the delayed reply - I've just spotted your question.
Rel=author is really easy to implement on your own site or on guest posts.
1. You have to make sure you are a contributor to the website you're writing for and that you have a link to that site on your Google+ profile.
2. Then, you have two options - the meta tag or a rel="me" part of a link which credits you for the post.
There are a good number of guides on the web - two good ones are here:
http://www.blindfiveyearold.com/how-to-implement-rel-author
http://www.pr2020.com/blog/claim-your-content-how-to-set-up-rel-author-tags
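For the link option, a sketch of what that byline link might look like (the profile ID and name here are placeholders for your own):
<a rel="author" href="https://plus.google.com/your-profile-id">Your Name</a>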
I'm afraid I can't advise on 'Google auth', but if you have a decent blogging platform you should be able to get a good plugin to help you with implementing rel=author.
-
RE: Google sitemap just for a part of site?
You can add multiple sitemaps in Google Webmaster Tools, that's not a problem. So you could, I suppose, add a sitemap of just your new pages.
In my opinion though, I think you should just generate a new one at http://www.xml-sitemaps.com and upload and resubmit it. That would work just as well as a part sitemap.
If your site is really large though, maybe the part sitemap is the right answer, but going forward you'll have to remember which bits appear in which sitemap so you don't have overlap or accidental omissions.
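If you do end up running several sitemaps, one way to keep them organised (the domain and filenames here are placeholders) is a sitemap index file that references each one:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-existing.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-new-pages.xml</loc>
  </sitemap>
</sitemapindex>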
Hope this helps.
-
RE: Redirect a blog category page to the homepage?
Why does the homepage need to rank rather than the category page? The fact that it's not the home page suggests the category page is actually more relevant to that keyword.
So what you should do is add more great content onto the category page and check your on-page optimisation then build some links directly to that category page.
The homepage is not always the most relevant page for a searcher. If you want people to go back to your homepage then provide some on-page encouragement on the category page instead.
Unless you have a compelling reason why the homepage should be the one that ranks - because at this point, Google appears not to agree with you
-
RE: Missing Title Tags on Include Files?
By 'include files', do you mean that you have one central design (e.g. an index.php) and then a query pulls in the correct page?
If your pages are pulled in from that subdirectory and if you're now getting errors, then I suggest you
- Check to see if those pages are included in Google's index, using a "site:" query
- Use a robots.txt disallow command to prevent them from being included in future.
For it to have appeared in Google's index in the first place, either these files are in your sitemap.xml or you've accidentally created a direct link from somewhere to those files. Either way, check your folder security and block anything that shouldn't have access to those files.
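A sketch of that disallow rule, assuming (hypothetically) the include files live in a folder called /includes/ - substitute your actual subdirectory:
User-agent: *
Disallow: /includes/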
-
RE: Yahoo rankings anomaly?
The more I've worked with search engines, the more I've got used to the occasional 'blip' in rankings. The Bing/Yahoo algorithm seems to yield particularly inconsistent rankings as well.
I would suggest that if you were outside the top 50 before, you ignore the sudden great rankings and keep working on your on-page and off-page optimisation.
Bear in mind Yahoo & Bing share data and Bing (as with Google) uses social signals, so up your social media game and interaction (in a non-spammy way).
No easy route to the top I'm afraid, just hard work, but work done in the right direction
-
RE: I think I've been hit by Penguin - Strategy Discussion
Your problem started with 50-60 domains with duplicate content - why do this? If you are working in different geographical regions then honour those places by giving them some unique content to justify your presence in the local search results.
I'd either nofollow or remove the links to your main website, as you essentially have 50-60 identical websites linking to it - so, from Google's perspective, why should they improve your rankings?
Do a search across the Q&As as there are loads of threads on Penguin recovery, but in short it's about doing positive things:
- Un-spammify any of your pages - so any on-page/site issues must be resolved (in your case, this means fixing your lack of unique content)
- Do positive link building
- Ensure a natural anchor text profile
- Don't seek any quick fixes.
Unfortunately for you that may mean changing 60 sites, but as that's where your problem started, that's where your clean-up starts.
-
RE: Bringing a Google+ business page and a Google+ Local (Places) page together
My understanding of the changeover Google is enforcing of Google Places -> Google+ pages is that they are automatically switching them over, but over a period of time.
You're right to not create a business 'person' account in Google+ because that goes against their guidelines. So for now, work with the two solutions and if it hasn't changed over in a week or two, maybe contact the Places Team directly.
Oh and Google historically have not made anything straightforward on the Google Places side of things, so I would just say have a bit of patience marrying up your Google services
-
RE: Forced Page Views and Search Engines?
What do you mean by 'forced pageviews'? Either someone is actually viewing a HTML5 page, in which case a normal page view will be in play, or you provide a link to a non-HTML5 one for those with non-compliant browsers, in which case you will only get a pageview on the other page if they click through.
Unless you're talking about an automatic redirect for non-compliant browsers? Can you clarify what is actually happening on-site please?
-
RE: Sitemap blocking or not blocking, that is the question?
I use Screaming Frog SEO Spider (free version) to check the internal link structure of a website. If a page is blocking ALL spiders it will pick it up.
Another thing I would say would be to check in Google Webmaster Tools to see if there are any crawl errors.
And the last thing I would add is to make sure that you have a non-JavaScript way to find all the pages on your website - through strong internal linking or a manual sitemap page that isn't generated through JS.
Hope this helps
-
RE: Google sitemap just for a part of site?
To be honest a sitemap.xml is predominantly there to inform search engines of all your site's pages and their importance in your structure.
It's helpful to keep this up to date, so that the right pages are appearing quickly in the SERPs.
However, a site with the right content in robots.txt and/or robots META tags will get crawled and, if the site is well structured and the internal links are all present, then all the pages of a site will end up in the SERPs anyway.
My question back to you would be - are there pages that you don't want to appear in Google's results and if so, why? The reason I ask this is that usually it's about excluding pages from search engines rather than making sure pages are included (assuming site structure and internal links are good).
-
RE: Responsive Vs Mobile Sites
Glad I could be of help. If you need any other guidance as you get into it, do just get in touch, I'd be happy to help. My contact details are on my SEOmoz profile
-
RE: Responsive Vs Mobile Sites
Right, you seem to be asking two questions here - responsive or not? And if you head for responsive, could it impact your SEO?
Responsive or not: As with any website question, the issue is going to come down to what's best for your users or your target users. The same question could be 'app or mobile website', for example.
The more I've worked in web design the more I am seeing that when a user is searching on their phone they want the same answers as if they were searching on their laptop or desktop. The relevancy of what they are delivered should not be changed however the format must be changed to suit their device.
The above point being said about 'best for your users', I fall very much into the line of thought that you should be providing exactly the same content to mobile and non-mobile users, it will simply be the design or layout which changes.
A responsive site takes time to code and test - but once the wireframe is sorted and responds well to different devices, then you're sorted as each page should flow across the devices without an issue.
If you have a separate mobile website then you are suddenly coding and managing two websites and, to be honest, a mobile website will need testing across devices and tweaking accordingly so you're almost duplicating your work (you're doing responsive web design but on a second site), something I just don't see the point of, if your whole website fits the majority of devices accessing it.
So for me: responsive
Can responsive affect SEO? You need to make sure that it is done well and that you're not misleading the search engines or users in any way. Personally, I don't see the point in 'hiding a sidebar' when responsive web design and CSS permit you to reformat it and display it in a mobile-friendly way. Why reduce the mobile user's experience if you, with a bit more work, can give them an appropriate and rich experience?
So if you do it properly, you're providing the same content to mobile users but just showing it differently. If you keep that in mind then there should be no negative SEO implications and you never know, your conversions from mobile users and referrals/shares from mobile users may increase above your competitors because you've taken time to give them a great experience.
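As a tiny sketch of that idea (the class name is hypothetical), CSS media queries let you reflow a sidebar rather than hide it:
/* Sidebar sits alongside the content on wide screens */
.sidebar { float: right; width: 30%; }
/* On small screens, reflow it below the content instead of hiding it */
@media (max-width: 600px) {
  .sidebar { float: none; width: 100%; }
}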
Hope this helps - you're not dealing with small issues - we're in the middle of recoding our website for responsive web design, so all the best as you make these decisions.
-
RE: Has my Rich Snippet attempt passed the test?
This is looking okay to me - with the only issue being that the +44 and (0) are outside of your Telephone Itemprop. You want to include one or both inside this.
What you're trying to achieve is a good rich snippet and consistency with the other places your details are shown - e.g. Google Maps, local directories etc. And the area code is part of your local identity, so at the very least, put the zero inside the telephone itemprop.
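A sketch of what that markup might look like (the number here is a placeholder, not yours):
<span itemprop="telephone">+44 (0)117 496 0123</span>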
-
RE: Ok to use rich snippets for same product on multiple pages?
Why don't you move everything to the root domain?
You can keep all the subdomains in existence, put in 301 redirects from the sub-domain to the root domain and then, for any links that may be coming in, ask for them to be changed?
Then the search feature is also tweaked to only look at the root domain?
This would simplify the site a lot and make it much easier to manage in the long term.
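As a rough sketch of those redirects (the host names here are placeholders), the subdomain's .htaccess could send everything to the equivalent path on the root domain:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^sub\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]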
Just my opinion, but hope it helps.
P.S. Spend the money wisely, we're in a recession don't you know?
-
RE: Site Re-Design - Running old XML site map for 301's
Heya,
Seems like on the whole you've at least thought through the redesign etc, so well done on that score. Here's my opinion on your questions though:
1. Leaving the XML file on-site for a short while won't do any harm - but make sure you have a new XML file containing the correct website structure going forward. Make sure your new structure has strong canonical tags so that the redirected pages are also recognised at their new URLs. As far as I know, the XML file doesn't help 'pass the juice'; the 301s will.
The other consideration is to ensure you systematically look at external links to old pages and get them changed to point to new pages, as the juice value of 301s diminishes over time.
2. You should only see a drop in your rankings if your new page content and titles become less relevant. So ensure your on-page optimisation is done well for your target keywords as soon after that page 'going live' as possible. If you drop in rankings, it may be temporary, or as you say with fresh content it may give it a boost. The issue is not your rankings, but ensuring relevance for your page. If you drop, compare your on-page optimisation to those of your competitors and see what they are doing differently.
If you do everything systematically/methodically and do it well then you should be fine.
-
RE: Does the duplicate content on the crawl errors report test content on external websites?
I would slightly disagree with you on one point - the need to check every line. There are high probabilities, given the vastness of the web, of individual lines being found somewhere else. Where duplicate content becomes an issue is if a decent-sized section is found elsewhere or a large proportion of your page is not unique. So for an individual page, maybe checking half paragraphs wouldn't take too long, unless you have to do it site-wide.
I hadn't come across Copyscape, so that's helpful for me going forward. Thanks!
-
RE: Should I delete a page or remove links on a penalized page?
Start with the page itself - is it adding value to your website? If not then even before you think about links you should consider re-writing or removing it.
If it's a quality page with useful or interesting information on it then so long as the page isn't spammy then you should be okay to keep it.
In a recent WBF, Rand suggested that ways to deal with negative SEO (whether caused by yourself or someone else) include:
1. Trying to remove the 'bad links', assuming you've worked out which ones are bad
2. Building more good links to outweigh the bad ones.
Like you said, if you have good links, you may not want to lose them. If you're talking about a page that has been penalised (potentially during Penguin) then fix any issues with the site's content itself and then file a reconsideration request.
And then get on with some great link building and promoting of your page.
-
RE: Ok to use rich snippets for same product on multiple pages?
My first question would be: why are you showcasing the same products on different pages? Do you have 'featured products', 'most popular' products or 'special offers', for example, or is it something else?
After considering that, assuming there's a logical user-driven reason for doing it - I can't see a major problem if the rich snippets are consistent for each product.
Of course, depending on whether this is manual or automatic/CMS-driven, it will become a headache to keep product snippets updated in multiple locations, so on that front I wouldn't encourage it. Get a balance though - if you can encourage click-through from the root to the sub-domain then the snippets will be picked up anyway. Don't make running your site more hard work than is strictly necessary.
Just my £0.02p (I'm in the UK so I can't give you 2 cents ;))
-
RE: Does the duplicate content on the crawl errors report test content on external websites?
As far as I can tell it's just on your own site. To check externally, copy a few sentences from the page you're testing, surround them in " " and paste them into a Google query. Repeat for a variety of sentences across the page.
-
RE: Strange meta description shown in SERPS
Is the snippet Google is using less relevant than what you've provided in the META DESCRIPTION or is it still okay? As description isn't used for rankings it won't harm your SEO, but could impact your click-through if Google's choice is not as hot for encouraging clicks. They don't always make the best decisions.
The only other thing is to make sure you have only one meta description per page. I'm hoping that sounds obvious, but it occurred to me because I have seen a site with more than one, in which case Google makes its own mind up.