If I mention a client in a blog post about SEO, do I have to use a rel=nofollow link?
-
I do SEO and web work (obviously, that's why I'm here). I want to write a blog post congratulating my client on getting to #1 in the local listings for a search for "plumber". When I include a link to my client's site, should it be rel="nofollow"? Could they be penalized if I don't?
Thanks,
Adam -
Thanks everyone!
I really appreciate all of the feedback.
-
I'd second the answers of all responses here, particularly Anthony's.
At the end of the day, a do-follow link will not hurt your client's site, especially one placed for reasons such as yours. If that were the case, a lot of sites would be in trouble, and so would Google's system.
-
Did they pay you directly to place the link on your site? No.
Go right ahead and link to them with a followed link. People link to business partners, suppliers, vendors, etc. all the time. That's as natural as it gets. Your companies are connected offline as well as online. As others have mentioned, I'd still play it safe and link to them with their company name, not some keyword-rich anchor.
-
If you just want to congratulate them, Adam, then yes, the sensible thing to do is add a nofollow to the link. The alternative would be to form the link as their bare URL, or to say at the end "Click here to visit their site". Use something that isn't a money keyphrase and won't be looked upon as an obvious attempt to pass a spammy link.
-Andy
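To make the options in this thread concrete, here is a rough sketch of the two link styles being discussed (the URL and anchor text below are placeholders, not the real client's):

```html
<!-- Option 1: followed link with a branded (company name) anchor -->
<a href="https://example-plumbing.com">Example Plumbing Co.</a>

<!-- Option 2: nofollow link, the safer choice for a keyword-style anchor -->
<a href="https://example-plumbing.com" rel="nofollow">the #1 plumber in town</a>
```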
-
I'd say it depends on what external links they currently have and what your domain authority looks like. If you've got great domain authority, you will definitely help them out. If the client has no or very few external links, it would help them for you to endorse their services. I don't think they would be penalized for the link. It's legitimate and not spammy.
Related Questions
-
Do you use hreflang tags when each localised page exists only in that language?
Hi, I have 2 questions I am seeking an answer for. We have a home page in English (GB), and we also have products which are specifically served to the US. For these pages, where the phone number is American, the spelling is American, and the address is American, do we need to implement hreflang tags? The page isn't a version of another page in English; the page only exists in that one language. Secondly, is it recommended to create a second home page and then localise that page for US users? I'd be really grateful if anyone has any pointers, as Google's forum doesn't explain best practice for this case (as far as I can tell). Many thanks
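For reference, hreflang is only meaningful between equivalent pages in different languages or regions. A minimal sketch (with placeholder URLs, assuming a GB and a US version of the same page both existed) of what the annotations would look like:

```html
<!-- Placed in the <head> of BOTH variants -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product/" />
```

If a page has no equivalent in another language or region, there is nothing for hreflang to point at, which is the crux of the question above.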
Local Website Optimization | Adam_PirateStudios
-
SEO for Franchises - Subdomains or Folders?
Wondering if there has ever been any recent consensus on the best SEO strategy for a franchise. I feel it is safe to assume that just having one corporate website with a "store locator" that only brings up the address, phone and hours of a location is not optimal.

Yes, the important thing is to get a Google Places for Business listing for each location so you can come up in the 3-pack and regular Maps results, BUT the rankings for the 3-pack are largely determined by the site's authority and relevance to the specific search term used, IN ADDITION TO the proximity of the business to the search user's physical location.

Apparently it is widely believed that domain authority does not transfer from www.mycorporatedomain.com to somecity.mycorporatedomain.com. And of course we also know there is a potential for a duplicate content penalty, so you can't just duplicate your main site for a number of locations and change the address and phone number on the contact page. If the products and/or services are identical for each location, it's somewhat ridiculous to try to rewrite many sections of the website, since the information is no different despite the location.

It seems in general more people are advocates of putting location pages or micro-sites in a subfolder of the corporate domain so that they can benefit from the domain's authority. HOWEVER, it is also widely known that the home page (root URL) of any domain carries more weight in the eyes of Google.

So let's assume the best strategy is to create a micro-site where the phone and address are different anywhere they appear, the contact page is customized to that location, and the "Meet the Staff" page is customized to that location. The site uses the same style template, if you will, as the main site. Let's also assume you can build a custom home page that has some different content but still shares the same look and some of the same information as the main site. But let's say that between the different phone, address, maybe some different images, and 20% of the content rewritten a bit, Google doesn't view it as dupe content.

So would the best strategy then be to have the location home page be somecity.mycorporatedomain.com, and for the product and services pages that are identical to the main site, use a rel canonical pointing to the main site? Or do you make the "home page" for the local business a subfolder of the main site? So I guess what it boils down to is whether the domain authority has more of an effect than having a unique home page on a subdomain.

What about this? Say the only things different on the local site are the contact details (phone/address) in the header and/or footer of every page, the contact form page, and the meet-the-staff page. All other content is identical to the corp site, including the home page. I think in that case you need to serve the pages dynamically. So you would need to serve the pages using a PHP script that detects the subfolder name to determine the location and dynamically replaces the phone and address, and serves different contact and staff pages. You could have a vanity domain mycity.mycorporatedomain.com that does a 301 redirect to the subfolder home page. (This is all of course assuming the subfolder method is the way to go.)
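A sketch of the rel canonical option raised above, using the post's placeholder domains (the service path is invented for illustration): on a location subdomain's duplicated service page, the canonical points back to the corporate version.

```html
<!-- In the <head> of https://somecity.mycorporatedomain.com/services/water-heaters/ -->
<link rel="canonical" href="https://www.mycorporatedomain.com/services/water-heaters/" />
```

Note the trade-off this implies: pages carrying a cross-domain canonical effectively hand their ranking potential to the corporate page, so only the truly duplicated pages would get it, not the localized home, contact, or staff pages.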
Local Website Optimization | SeoJaz
-
Best SEO practice for a project gallery (image gallery)? I need SEO professionals' advice.
Hi, I have a website that is powerful and I don't want to hurt it: http://dreamgaragedoor.com/ Right now I need a projects gallery page where people go to find images of our models, products and services. I have created the page; it will have 6 sliders, and each slider has at least 10 images inside. First question: would having this many images hurt my website or not? Second: what ALT text should I use for this many pictures on one page? For example, I think having ALT text like the below on one page would be bad SEO-wise: Sliding-gate-1, Sliding-gate-2, Sliding-gate-3, Sliding-gate-4, ... Please take a look at the gallery page and let me have your pro ideas: http://dreamgaragedoor.com/galleries/ Thanks
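One common recommendation here (a sketch only; the filenames and descriptions below are invented for illustration, not taken from the actual site) is to give each image a unique, descriptive ALT instead of a numbered series:

```html
<!-- Instead of alt="Sliding-gate-1", alt="Sliding-gate-2", ... -->
<img src="sliding-gate-cedar.jpg" alt="Cedar wood sliding gate with steel frame" />
<img src="sliding-gate-black-aluminum.jpg" alt="Black aluminum sliding driveway gate" />
<img src="garage-door-carriage.jpg" alt="White carriage-style garage door on a two-car garage" />
```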
Local Website Optimization | Mishel298
-
What is the SEO effect of schema subtype deprecation? Do I really have to update the subtype if there isn't a suitable alternative?
Could someone please elaborate on the SEO effect of schema subtype deprecation? Does it even matter? The Local business properties section of developers.google.com says to: Define each local business location as a LocalBusiness type. Use the most specific LocalBusiness sub-type possible; for example, Restaurant, DaySpa, HealthClub, and so on. Unfortunately, the ProfessionalService page of schema.org states that ProfessionalService has been deprecated, and many of my clients don't fit anywhere else (or if they do, it's not a LocalBusiness subtype). I find it inconvenient to have to modify my different clients' JSON-LD from LocalBusiness to ProfessionalService and back to LocalBusiness. I'm not saying this happens every day, but how does one keep up with it all? I'm really trying to take advantage of the numerous types, attributes, etc. in structured data, but I feel the more I implement, the harder it will be to update later (true of many things, of course). I do feel this is important and that a better workflow could be the answer. If you have something that works for you, please let us know. If you think it's not important, tell us why not (why Google is wrong). I understand there is always a better use of our time, but I'd like to limit the discussion to solving this Google/schema.org deprecation issue specifically.
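As a concrete illustration (business details are placeholders), the Google guidance quoted above amounts to JSON-LD like the following, using the most specific subtype that fits, e.g. Plumber, and falling back to plain LocalBusiness when nothing narrower applies:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Example Plumbing Co.",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "90210"
  },
  "telephone": "+1-555-555-0100"
}
</script>
```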
Local Website Optimization | bulletproofsearch
-
What more can be done to get Google to change the landing pages it uses for certain search terms?
For one of my SEO campaigns, Google is using the website's home page as the landing page for the majority of search terms being tracked. The website splits its products by region, so we want specific region pages to rank for search terms related to each region, rather than the home page. We have optimised each regional page to a reasonably high standard and ensured there is a good amount of internal linking and signposting to those region pages; however, Google is still using the home page. The only complication is that for the first few months there were canonical tags on these pages pointing to the home page. These were removed around 3 months ago, and we've checked that the region pages are indexed properly. Is there anything we are missing? Has anyone had any success in getting Google to change its landing pages?
Local Website Optimization | ClickHub-Harry
-
Local SEO - Adding the location to the URL
Hi there, My client has a product URL: www.company.com/product. They serve only one state in the US. The existing URL is currently ranking between positions 8 and 15 for local searches. Would it be worthwhile to add the location to the URL in order to get a higher position, or is it risky given the rankings we have at the moment? Is the potential advantage really worth the risk? Thank you for your opinions!
Sander
Local Website Optimization | WeAreDigital_BE
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has its own local website, for example restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank. Some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy, we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages', which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations.

We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We use major aggregators to distribute our local citations for our branch offices, and we make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on newly developing branches to create their Google listings (My Business & G+). We use local business schema markup for all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc.

Our leads are calls and form submissions. We use several call tracking services to monitor calls, callers' locations, etc. We are testing CallRail to start monitoring the landing pages and keywords that generate our leads.

Parts that I want to change: Some of the local sites have over 100 pages targeted at 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words, all talking about services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages; they add about 100 words about the city location. This is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce.

Content protocol / strategy is only tested based on ranking! We have a tool that monitors ranking on all domains. This does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic behind this is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location; if you are not seen, then you will not get traffic or leads. Ranking for power locations is poor, while less competitive, low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time). The team checks rankings every day for about a week to determine whether an experiment was a success or not.

What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages': I know, based on dozens of chats here in the community and in Moz blogs, that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Creating blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. Digging deeper into call metrics and their sources.

Now I am at a roadblock because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires testing two pages that are the same except for one variable. We'd either noindex these or canonicalize; both work against testing ranking for the same term.

Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
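On the "internal duplicate content analyzer" point: a quick way to put a number on how similar the city pages are is a pairwise text-similarity check. A minimal sketch follows; the page texts are invented placeholders, and the 0.8 threshold is an arbitrary illustration, not any official Google cutoff.

```python
# Sketch: score how similar two city pages are, word by word.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 ratio; identical texts score 1.0."""
    return SequenceMatcher(None, a.split(), b.split()).ratio()

# Two hypothetical doorway pages that differ only in the city name.
pasadena = "We provide water damage restoration in Pasadena. Our certified techs respond 24/7."
glendale = "We provide water damage restoration in Glendale. Our certified techs respond 24/7."

score = similarity(pasadena, glendale)
print(f"similarity: {score:.2f}")  # pages differing only in the city name score near 1.0
```

Running this across every pair of city pages on a domain and flagging pairs above a chosen threshold would give the team hard numbers to argue with, instead of rankings alone.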
Local Website Optimization | MilestoneSEO_LA
-
Local SEO + Best Practice for locations
Hi All, Based on a hypothetical scenario, let's say you are a plumber. You live and operate in Chelsea in London. You have established a Google Places profile and incorporated schema data to tell Google your fixed place of business. In addition, you operate in several nearby towns with no fixed-location presence, i.e. Brentford, Bromley, Catford, Cheswick and Tottenham. I create a feature-rich page on 'How to find a quality plumber'. Within the page I incorporate the following description: blah blah, as a quality plumber serving the community of Chelsea, we also offer our services to the nearby towns of Brentford, Bromley, Catford, Cheswick and Tottenham. I create hyperlinks for the towns (Brentford, Bromley, Catford, Cheswick and Tottenham) that allow the user to see in detail a full list of services, operating hours, etc. Naturally, all towns will have their own unique content (no duplication).

Question: Is the above scenario the correct way to do local SEO, or is this approach considered spammy to Google? Thanks, Mark
Local Website Optimization | Mark_Ch