Can span tags within links affect Google's perception of them?
-
Hi, All!
This might be really obvious, but I have little coding experience, so when in doubt - ask...
One of our client sites has navigation that looks (in part) like this:
<a href="http://www.mysite.com/section1"><span><img src="images/arrow6.gif" width="13" height="7" alt="Section 1">Section 1</span></a>
W3C told us the span tags invalidate. I ignored most of their comments because I didn't think they would impact what search engines saw, but because these span tags are right in the links, it raised a question.
Anyone know if this is for sure a problem/not a problem?
Thanks in advance!
Aviva B
-
Thanks, Ryan. Good ideas, and we'll see what "the authorities" choose to do.
-
If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?
Without having any information about the site, it's not possible to offer any credible details, odds, or measurements of worth. If you are asking for a guess, I would say it is very unlikely for the div tags to cause any SEO problems. But that's the problem with invalid code: you don't know how it will be handled.
The bigger concern I have is if that line of code was coded so poorly, there are likely other coding issues with the site.
May I suggest asking a couple of developers for an estimate of how much it would cost to adjust the site's code so it validates?
-
Thanks, Ryan. Point well taken. I think I may copy and paste this for the client in question. If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?
Aviva
-
Thanks, Kyle. We're not the design/webmaster team, so while it might not have been a good idea to do that in the first place, our job here is just to tell our client what MUST change for SEO and what doesn't need to change, even though it might not have been ideal. The challenges of not having unlimited budget...
Thanks,
Aviva
-
Simply from a front-end development perspective, why would you place a span inside of an <a>? If you are trying to force a block-element style, why not simply apply it through the CSS stylesheet to the <a> tag itself?
If you supply a URL, I can give more specific coding advice.
Thanks - Kyle
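A minimal sketch of the approach Kyle describes, assuming the span was only there for styling. The URL and image attributes come from the question; the class name and stylesheet rule are hypothetical:

```html
<!-- Valid alternative: no span inside the anchor; style the <a> directly. -->
<a class="nav-link" href="http://www.mysite.com/section1">
  <img src="images/arrow6.gif" width="13" height="7" alt="">Section 1
</a>
```

```css
/* Hypothetical stylesheet rule replacing whatever the inline span was doing */
.nav-link {
  display: block;        /* forces the block-level behavior without extra markup */
  text-decoration: none;
}
```

This keeps the markup valid while achieving the same visual result, so there is nothing left for a crawler to misinterpret.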
-
The problem with using invalid code is that every browser may handle it differently. Even if your current browser handles it fine today, the next time it updates, the results may change.
Code validation reflects rules that representatives from all the major browsers got together and agreed on. The biggest problem with invalid code is people thinking their site is fine and only later finding out (or worse, never finding out) that their site does not appear correctly in various browsers.
You have IE6, IE7, IE8, IE9, IE10, Chrome, Firefox, Opera, Safari, and other browsers on the market. You have a variety of phones, iPads, and other devices. It is more important than ever to use valid code. If your page doesn't fully validate, it should still be almost valid, and the few errors that remain should be thoroughly researched so that you are consciously choosing not to validate on those particular items. An example would be if you are using HTML5 and the validation tool has not yet been updated for all the latest changes.
With the above noted, I am not aware of any problem with your code. The challenge is that since it is not valid, you cannot predict how it will be handled by Google. Even if it is handled correctly today, a change can be made at any time which can impact you.
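Ryan's point about predictability can be sketched with any lenient HTML parser: valid markup tokenizes the same way everywhere, while tag soup is parser-dependent. A minimal sketch using Python's standard-library parser on a cleaned-up version of the link from the question (the URL is the placeholder from the original post):

```python
from html.parser import HTMLParser

# Cleaned-up, valid version of the navigation link from the question.
VALID = '<a href="http://www.mysite.com/section1"><span>Section 1</span></a>'

class TagLogger(HTMLParser):
    """Records every token the parser emits, in order."""
    def __init__(self):
        super().__init__()
        self.events = []

    def handle_starttag(self, tag, attrs):
        self.events.append(("start", tag, attrs))

    def handle_endtag(self, tag):
        self.events.append(("end", tag))

    def handle_data(self, data):
        self.events.append(("data", data))

parser = TagLogger()
parser.feed(VALID)

# Valid markup yields one unambiguous token stream:
# start <a>, start <span>, text, end </span>, end </a>.
for event in parser.events:
    print(event)
```

Feeding the broken original (`<a <span="">…`) through different parsers produces different token streams, which is exactly why you cannot predict how a crawler will read it.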
-
Thanks, Andy. You've seen sites that have used span tags the same way?
-
To be honest, I can't see, from an SEO perspective, how Google would view these in a negative way. I can only tell you that from all of the sites that I have seen, I have never seen this as a problem.
Someone else might come up with a definitive answer, but I would say that there is nothing wrong with these span tags for SEO.
Cheers,
Andy