Posts made by alecfwilson
-
RE: Is it ok to correct someone who spelled and styled our name incorrectly in a blog post?
Thanks - that was my feeling as well, but as I'd never done it before, I'm interested to hear others' thoughts on it.
-
Is it ok to correct someone who spelled and styled our name incorrectly in a blog post?
A writer recently recommended my company, Swiftype, in a blog post. However, he called us Swift Type, which splits our name into two words, adds an extra t, and capitalizes a second word. Would it be out of line to email him and gently correct him? I appreciate the recommendation, and I will absolutely focus on thanking him for it, complimenting his content (which is fairly good), and just adding the correction at the end of the email. While it isn't a huge deal, a search for Swift Type brings up the Swift programming language and SwiftKey as first-page results, while a search for Swiftype returns only our content (he also didn't link to us, so a reader would have to search to find us - reaching out could also be a chance to spur him to add a link).
-
Glossary pages - keyword stuffing danger?
I've put together a glossary of terms related to my industry that have SEO value, and I'm planning to build out a section of our site with a unique page for each term. However, most of these terms have synonyms or are highly similar to other valuable terms. If I were to build the glossary and, on each page (each of which will have a high-quality, valuable, and accurate definition and more), write something like "{term}, also commonly referred to as {synonym}, {synonym}," would I run the risk of keyword-stuffing penalties? My only other idea, beyond creating separate glossary pages defining each synonym, is to use schema.org markup to add the synonyms to the HTML of the page, but that could be seen as even more grey-hat keyword stuffing.
I guess one other option would be to work the synonyms into the definition so that the presence of the keyword reads more organically.
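For what it's worth, here is a rough sketch of what the schema.org option might look like as JSON-LD. The DefinedTerm type and the term/synonyms below are illustrative placeholders I've chosen, not anything specified in the thread:

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "DefinedTerm",
  "name": "Faceted Search",
  "alternateName": ["Faceted Navigation", "Guided Navigation"],
  "description": "A technique that lets users refine search results by applying multiple filters at once."
}
</script>
```

The alternateName property is the generic schema.org way to declare synonyms, which keeps them out of the visible page copy entirely.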
Thanks!
-
RE: Can i create more blogs/websites for SEO?
The value you're trying to capture - links from relevant sites with high-quality content - is a smart goal. However, creating websites solely to link to your own site will trigger link-scheme algorithm penalties. And even if it didn't trigger a penalty and simply added no value, high-quality, unique content will drive more value if you host it on your own site and then reach out to bloggers and other writers to point them toward it.
-
Looking at acquiring a competitor with a high organic ranking (WordPress Plugin)
I'm looking at acquiring a competitor who operates a WordPress plugin with the same function as ours (although ours is much better - they used the plugin to acquire leads for other software), and we're trying to determine the best way to execute the deal. They are the number 1 result on a very high-value search and number 3 on another (which appears to be the source of all their customer acquisition; it's simply a well-named, long-established WordPress plugin, so they haven't had to fight much to keep their ranking). Ideally, we'd be able to take over those rankings, but I'm not sure that's possible. For one thing, they don't have a website beyond the WordPress plugin, so even if we wanted to, we couldn't 301 redirect it (as far as I know). Second, wouldn't a 301 redirect likely result in losing the ranking entirely? We are actually highly relevant to anyone who ends up at this plugin (the owner has been recommending us, and contacted us because his customers who have left have had a good experience with us), and we wouldn't be redirecting to piggyback on the link juice - we'd be trying to acquire the customers who have been discovering this plugin organically.
I understand why Google's policies default to making the purchase of high-ranking domains not very valuable. However, it seems like if you've built a company that derives a high volume of its leads (and thus acquisition value) from organic search, there should be a way for a company in the same space to acquire at least the rankings, if not the link juice.
Hopefully that was clear; if not, let me know where I lost you and I'll attempt to clarify. Thanks!
-
RE: For URLs that require login, should our redirect be 301 or 302?
Thanks - so the 302 redirect suggests the page has moved temporarily, which it hasn't; it simply can't be accessed without logging in. Since it will be accessible after a login, is that still considered a temporary move? I will absolutely make pages that require login noindex. Should I also block them from being crawled (via robots.txt)?
-
For URLs that require login, should our redirect be 301 or 302?
We have a login-required section of our website that is being crawled and reported as a potential issue in Webmaster Tools. I'm not sure what the best solution is - should URLs requiring a login be noindexed and blocked from crawling? Right now we have them 302 redirecting to the login page, but since a 302 signals a temporary move, it doesn't seem like the right solution. Is a 301 better?
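For concreteness, here is a minimal sketch of the redirect-plus-noindex combination discussed in this thread, written as Apache directives. The /account/ and /login paths are hypothetical, it assumes mod_rewrite and mod_headers are enabled, and a real application would check the session before redirecting - this only shows the shape of the response:

```
# Hypothetical .htaccess inside the login-gated directory (e.g. /account/)
RewriteEngine On
# Temporary (302) redirect to the login page, which lives outside this directory
RewriteRule .* /login [R=302,L]
# Ask search engines not to index these gated responses
Header always set X-Robots-Tag "noindex"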
-
RE: Spotted Hidden Omiod Links in Footer - What do you think is Going on Here?
Looks to me like someone added some kind of SEO tracker to the footer at some point that no one else was aware of. Have you tried installing Ghostery and seeing if it shows up as a tracker? I agree with the other commenter that it is almost certainly not an SEO issue, and that you should be safe removing it.
-
RE: Image Optimization & Duplicate Content Issues
If you have permission to use their images, just get the images from them, name the files accurately, and give them accurate alt text. Duplicate-content issues generally concern your page text, not images. Since the point of image names and alt text is to help Google understand the images, it's not a big issue if an image shares alt text with another or appears multiple times on the site (especially since they should all be served from one images directory, no matter where they appear on the website). Also, images are much more likely to be legitimately reused than text, as licensing photos is a long-accepted practice.
-
RE: Will links be counted?
They'll be easily visible to a crawler. Links are extracted based on their HTML/CSS elements, which also provide signals of importance and of website structure. If the raw text of the link is in the page's HTML, a crawler will understand it, so don't worry about it.
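As a trivial illustration (my example markup, not the asker's), this is all a crawler needs to discover and weight a link, while a script-only "link" is far less reliable:

```
<!-- A plain anchor in the raw HTML: easily extracted, with anchor text as a relevance signal -->
<a href="/pricing">See our pricing plans</a>

<!-- A link that only exists via JavaScript is much harder for crawlers to follow -->
<span onclick="window.location='/pricing'">See our pricing plans</span>
```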
-
RE: Does Lightbox Gallery Clicks Count Positively Toward Page Rank?
Counting clicks and click-throughs is a somewhat outdated approach to SEO. Google has gotten much better at determining relevance with other, more sophisticated signals. You should be worried about clicks and CTR for your paid search, but for SEO, they're less important.
-
RE: Backlink level of domain authority
Maybe, but it's more likely that they've simply moved to an updated, more sophisticated version. Generally speaking, a PageRank above 4 is good. Also, try installing the MozBar and judging by Domain Authority or MozRank. For the metrics scored out of 10, look for 4 or above; for those out of 100, 30 or above.
-
RE: How to fix Medium Priority Issues by mozpro crawled report??
It's all going to depend on your CMS, but if you want to see what's being flagged, right-click a page on your site, select View Source, then search for "meta". The meta description tag is what should be there (almost all CMSs allow you to edit it). Your page titles should be much shorter, and you shouldn't have multiple pages with the same title. Again, nearly every CMS makes this very easy to edit.
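For reference, the relevant tags look like this in the page source (the title and description contents below are placeholders of mine):

```
<head>
  <!-- Keep titles unique and concise (roughly 50-60 characters displays fully in SERPs) -->
  <title>Blue Widgets for Small Offices | Acme Co.</title>
  <!-- The description Google may use as the SERP snippet (~155 characters) -->
  <meta name="description" content="Acme's blue widgets cut office setup time in half. Free shipping on orders over $50.">
</head>
```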
-
RE: Added 301 redirects, pages still earning duplicate content warning
How long ago did you add the redirects? As with many SEO changes, you'll need to give Google time to recrawl and reindex your site. This can take several weeks in some cases, and even then, it will sometimes take multiple recrawls for Google to entirely index the changes. This assumes everything else was done correctly and nothing else is causing the duplicate content warnings.
-
RE: Domain.com/keyword1.keyword2.html vs domain.com/keyword1-keyword2.html
Dots aren't used within URLs to separate words; they only mark boundaries - between subdomain and root domain, between root domain and TLD, and before a file extension at the end of the URL string. Use dashes to separate words.
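An annotated example (mine, not from the thread):

```
https://blog.example.com/keyword1-keyword2.html
        blog.example      -> dot separates subdomain from root domain
        example.com       -> dot separates root domain from TLD
        .html             -> dot marks the file format
        keyword1-keyword2 -> dashes, not dots, separate words
```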
-
RE: Weird Google SERPs after New Domain Transfer 301
This is one of those things that just takes time. We moved our blog from /blog to a blog. subdomain about two months ago and have only recently seen the SERPs reflect it. The results also improved gradually rather than shifting overnight.
-
RE: Outranking a definition result
Could you provide a bit more context? If you are simply trying to outrank the definition of a keyword, it's not going to happen: Google pulls that definition from its Knowledge Vault, and its algorithm has determined that the result most likely to satisfy that query is the definition itself. However, if, for example, you are trying to rank for the name of a business that is also an ordinary word (say, "Twist"), you may be able to outrank it in local searches. Set up a Google My Business account and a Google Places page, and start optimizing for local SEO (I've never done this myself, so I don't have great tips).
In short: if Google's algorithm believes most searchers are looking for the definition of a word, you're likely not going to outrank it. If it's the name of a business, you may eventually be able to; if it's just a keyword, it's almost certainly impossible.
-
RE: Wrong pages ranking for key terms
Agreed. I work at a company that builds search engines for websites (TechCrunch uses our search), and one of our biggest advantages over a product like Google Site Search is that we'll crawl your site on demand (we're obviously not a competitor to Google's web search product, so our on-demand recrawls aren't an option for SEO rankings). Google, in its attempt to index the entire internet, cannot recrawl your site as soon as you make changes, nor can it determine perfect relevance for every page immediately upon crawling. Unfortunately, a lot of the time the answer to "I fixed everything and it still hasn't helped" is to wait it out. My recommendation is to keep doing what you're doing, work on good, sustainable SEO, and give Google a bit more time to crawl and index the new pages. It doesn't look like anything you are doing is causing the problem, so I don't have much more to offer here.
Good luck!
-
RE: How do I know if my SEO person is creating solid links vs spammy links?
Some good suggestions above - try some backlink-checking tools, check their Domain Authority, etc. However, in my opinion, the best way to ensure your SEO person is building good links is to learn the basic difference between a good and a bad link and actually check them yourself. The bigger your site and the more links you build, the less feasible this is, but the principle - that you should be able to look at the links being built and tell good from bad - still applies. If your SEO employee (I assume it's one person) is building good links, they shouldn't be building massive numbers of them anyway, unless the links are coming organically (through content or a product so popular that high-quality links appear without traditional link building). Also, how are you measuring success? Ranking growth? Number of links? Quality of links? If you ask your SEO person to report on the links being built, ask them to include measures like Domain Authority and Page Authority, and then audit the links periodically, you'll learn enough about SEO to measure their performance yourself (seriously, try Googling "audit my backlinks" - there are some great tools out there, as well as reasonably simple explanations of the major things to look out for).
I also agree with those mentioning that outsourcing SEO is a dangerous (if somewhat necessary) strategy. In my opinion, learning SEO basics is one of the most valuable things a small business owner can do, since it will both improve your ability to market online and protect you against hiring a bad employee.
-
RE: 50% drop in search, no changes to site over 2 days, no notifications, A rank...
Sorry to hear about that. If the problem is spammy backlinks, using disavow is a good start, but you'll need to wait for Google to recrawl them all. Make sure you use multiple tools to identify the backlinks; different backlink checkers behave differently, and using several will help you find the maximum number to disavow. Even disavowing won't recover all the losses - SEO ranking is somewhat of a king-of-the-hill game, and getting knocked down means you still have to climb back up once you've addressed the penalties. You'll also want to reach out to as many of the spammy linking sites as you can and see if you can get the links removed (frustrating, and many simply won't respond, but actually removing the links is the most effective version of disavowing them). You can also file a reconsideration request if you feel the penalty was unfounded (see: https://support.google.com/webmasters/answer/35843?hl=en).
Finally, while it doesn't directly address the drop in rankings, you should consider investing in your other SEO signals. As others have mentioned, adding a blog that updates your page content regularly will help, as will updating your sitemap so that Googlebot doesn't think you are misleading it about the freshness of your content. You could also periodically update parts of the site outside the blog to keep page freshness up. Adding structured data will help search engines find the right content on your site, and right now you haven't added any (http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fapplianceassistant.com%2F). Google PageSpeed Insights has a lot of suggestions for increasing page speed (https://developers.google.com/speed/pagespeed/insights/?url=applianceassistant.com). You may also want to consider moving to HTTPS; since you're dealing with a penalty, you should be maximizing every other SEO signal to compensate.
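On the disavow step specifically, the file you upload in Webmaster Tools is just plain text in Google's documented format - something like this sketch (the domains and URL are invented placeholders):

```
# Spammy links found via multiple backlink checkers; removal requests sent, no response
domain:spammy-link-farm.example
domain:cheap-seo-directory.example
http://some-blog-network.example/page-with-the-spammy-link.html
```

Lines starting with # are comments, the domain: prefix disavows an entire site, and bare URLs disavow individual pages.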
Unfortunately, getting hit by an algorithm change that wasn't meant to target your site specifically (but rather the extraordinarily abusive ones) doesn't have a quick fix. You just need to start practicing more sustainable SEO and mitigate the damage as best you can.
-
RE: How Many Words To Make Content 'unique?'
Great question, and great answers from some of the other commenters. I've struggled with this question myself in building landing pages. The 20% rule is a good one, and it makes sense, especially as Google gets better at semantic search and "keywords" become a bit less important in favor of query meaning. In a perfect world (one where search engines could understand queries the way a friend would when you told them what you searched for), if you cannot come up with 20% of a landing page that is entirely unique to that page, you shouldn't be building a landing page for it. In the world we actually operate in, it's a nice guideline.
My method for long-tail landing page creation: figure out which head keyword the long-tail page is most closely related to (if you are trying to reuse the same value prop), then rewrite every sentence - altering word choice, sentence structure, and page organization (it's a nice opportunity to test those things as well; a long-tail page that does unexpectedly well may give you insight into a better-converting format). At that point, I add the unique content. For keywords that aren't different enough to support truly unique content, I'll generally write a section summarizing a few of the others together, or add a different customer testimonial.
To the commenter who mentioned that you can create content that looks unique to search engines but would make humans laugh: a landing page for long-tail keywords really shouldn't be something a customer can reach without coming from an external referrer. The root domain shouldn't link out to both domain.com/landing-page-head-kw and domain.com/landing-page-long-tail-kw.
-
RE: Wordpress SEO/ Ecommerce , Site with Multiple Domains ( International ) & Canonical URLs
As I understand it, you are asking about serving the same content on two different domains that differ only in the TLD (.au & .nz).
Based on my reading of Google's webmaster guidelines, if the only difference is the domain and which country the visitors come from, you should not have any issues so long as you use rel-alternate-hreflang as you described. If you have multiple domains targeting the same users, just be sure to add a rel="canonical" tag.
Google's webmaster guidelines state:
"Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this might not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both
example.de/
andexample.com/de/
show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers."source: https://support.google.com/webmasters/answer/182192?hl=en#3
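For the .au/.nz case specifically, the hreflang annotations in each page's head would look something like this (the domains and path are placeholders I've made up):

```
<!-- Included on both regional versions of the page -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/products/" />
<link rel="alternate" hreflang="en-nz" href="https://www.example.co.nz/products/" />
<!-- Optional default for searchers who match neither region -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/products/" />
```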
-
RE: How does switching to HTTPS effect Google Analytics?
I'm not a netsec expert or a technical SEO expert, but I run SEO for my company and have been looking into this for a while. The tips I can give you: add rel="canonical" tags to all the http:// versions of your pages pointing to the https:// versions. Once you get an SSL certificate, claim all four variations of your URL (http://, http://www., https://, https://www.) in Webmaster Tools and designate which is canonical (this makes sure Googlebot knows which is canonical; you'll still want to add the tag to your site's pages). Finally, if you do decide to switch to HTTPS (which I highly recommend - some people, myself included, now instinctively type https:// over http://, and if someone points a link at https:// when you aren't using it, Chrome will display a yellow warning interstitial and a red X over the https:// in the address bar), get an SHA-2 certificate rather than SHA-1, as Google is sunsetting SHA-1 over the next year.
To the comment on page load speed: HTTPS does slow page load, but generally not by a substantial amount. There are plenty of other ways to address page load time that can offset the hit, and if you've already done all of that, the hit won't hurt you, since you're in better shape than everyone else. Also, while Google's use of HTTPS as a ranking signal hasn't seemed to impact results much so far, it's a near certainty, based on Google's current behavior, that it will become less a signal and more a necessity, and as more websites adopt it, the less the slight page load hit will matter. Websites are essentially required to adopt it at some point in their lifecycle, as growth makes security much more of a concern.
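The canonical tag itself is a one-liner in the head of each http:// page (the URL below is a placeholder):

```
<!-- Served on http://www.example.com/page/, pointing search engines at the HTTPS version -->
<link rel="canonical" href="https://www.example.com/page/" />
```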
Further reading on SHA-1 and SHA-2:
https://konklone.com/post/why-google-is-hurrying-the-web-to-kill-sha-1
http://googleonlinesecurity.blogspot.com/2014/09/gradually-sunsetting-sha-1.html
-
RE: Coupon websites as affiliates
No problem! Affiliate marketing is a space that could really use a shake-up of the current landscape. I'd say nearly all affiliate marketers are great and very helpful (and affiliates are too valuable a lead source to ignore either way), but some elements of how the space works attract less-than-aboveboard players. I've seen great value from affiliates, even when we were constantly fighting the coupon skimmers. One last thing that may or may not be applicable: not all companies that use affiliates even offer a coupon, but affiliates will still advertise {{company}} coupons. My current company is in this position, which makes it a bit easier to keep an eye on the skimmers - I just added a line to our policy stating that affiliates may not advertise coupons. When you do offer a coupon, the only real recourse against skimmers is to make sure they aren't bidding on branded search terms or using your trademark in their root or subdomain name.
-
RE: Product or Shop in URL
For SEO purposes, either structure will work so long as it's clear where in the site navigation the page sits, since that just makes the site easier to crawl. For optimizing for sales, the other commenters are right that you are getting bogged down in things that won't substantially impact customer behavior. Really, the only things you should worry about with your URL in terms of conversion are whether you are using SHA-2 (people trying to reach an https:// version of your site will get a warning and no green reassurance in the address bar, and a red warning in the address bar if you aren't using SSL at all) and whether a visitor can look at the URL and get a good idea of what the page contains. Beyond that, you're optimizing at the extreme margins.
However, if you really do want to attempt to optimize your URLs to an extreme, it really depends on your site structure. If the entirety of the website is a store, then the /shop/ subdirectory is unnecessary. If you have a store as a part of your website, a subdirectory of /shop/ or /store/ or something like that would be helpful in indicating where in the site the URL is pointing. Similarly, having a product subdirectory makes sense if you have multiple categories of products (in your case, say you had both vitamins and paleo cooking ingredients, each with multiple SKUs within the product category). However, if your store only has 9 SKUs, all of which are vitamins, /products/vitamins is unnecessary for indicating where in the site you are (or, you could use /products/ to direct to a page listing all 9 SKUs, in which case the /products/keyword would ultimately turn into products/nameOfProduct).
If you have a site that has a store with multiple product categories all with multiple SKUs, you could consider /store/products/keyword as a format, although that starts to get a bit long. Have you considered using a store.website.com subdomain?
My URL process: is it the best indicator of where in the site navigation the visitor is? In most cases, this means it should indicate that they're in a store, where in the store's navigation they are, and what item they're looking at (the keyword). If that's true, it's a good URL. The secondary concern is keeping the URL from getting too long - aim for the most concise but clear indication of where in your site's navigation the user is. The final piece of the URL string (which identifies the specific page) is where you can add the keyword you care about.
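Applied to the vitamin-store example above, that process gives a hierarchy like this (paths invented for illustration):

```
example.com/store/                            -> store home
example.com/store/vitamins/                   -> product category
example.com/store/vitamins/multivitamin-90ct  -> specific product, keyword in the last segment
```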
-
RE: Coupon websites as affiliates
Hi unikey,
I'm not 100% sure on negative SEO impact (I'd be interested to hear whether the assumptions I've drawn from experience are accurate), but I previously worked for a company that did ~10% of its $75mm annual revenue through affiliates, and we primarily used ShareASale. We never saw a negative SEO impact from the affiliates (and we approved nearly all of them, at least initially), which I assume is because ShareASale affiliate links are all nofollow. Since ShareASale's value prop is massively expanding your reach to potential customers through their network, they're strongly incentivized to make sure there is no SEO downside to using their platform.
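For anyone curious what that looks like in practice, a nofollow affiliate link is just an ordinary anchor with the rel attribute set (the URL below is an invented placeholder, not ShareASale's real link format):

```
<a href="https://affiliate-network.example/track?merchant=1234&amp;affiliate=5678"
   rel="nofollow">Get 20% off at Acme</a>
```

The rel="nofollow" tells Google not to pass link equity through the link, which is why large numbers of affiliate placements shouldn't create a negative SEO risk for the merchant.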
However, while this isn't negative-SEO related, I do have a decent amount of experience with affiliates, so you may be interested. You should be somewhat wary of affiliates, especially when the link you provide has a coupon attached. Some (not all, and definitely not the good ones) will spin up tons of sites like cheap{{company}}coupons.com to try to capture the search traffic generated by people seeing a coupon-code box on your checkout page. Since the affiliate model works by giving a revshare to the affiliate for acquiring the customer, tracked by a browser cookie, they should only receive their cut if they sent you the customer as the original referrer, or sent someone who was not already planning to purchase. Someone searching for {{company}} coupons isn't someone the affiliate sent you. The basic ShareASale policy forbids this, but you need to periodically run searches for your brand + coupons and make sure no one is violating the policy. Luckily, this only really becomes an issue if you are seeing substantial success with affiliates, but I try to check in on it once a month or so.
For a better summary of the problem, and a counterpoint, see the links below. (Don't worry too much about whether affiliate marketing is or isn't a scam; what matters is whether it works for you. These are just more eloquent summaries of the potential upsides and downsides of affiliates.)
http://venturebeat.com/2013/08/12/the-big-ugly-affiliate-marketing-scam/
http://venturebeat.com/2013/08/15/affiliate-marketing-not-bad/