Null Alt Image Tags vs Missing Alt Image Tags
-
Hi,
Would it be better for organic search to have a null alt image tag programmatically added to thousands of images that lack alt image tags, or to just leave them as is?
Adding tailored alt image tags to thousands of images is not an option.
Is having sitewide alt image tags really important to organic search overall, or what? Right now, probably 10% of the site's images have alt img tags. A huge number of those images are pages that aren
Thanks!
-
Thanks, guys.
I've adjusted alt image tags on the pages that really matter to me for organic. The tens of thousands of other images/pages are just going to have to chillax.
-
No problem at all. To be honest, it's really not a huge deal and probably not worth the dev budget or man-hours required.
In most cases with a site like this, I'd be more inclined to add good alt text for all images on the most popular pages, then, as you work through other pages over the life of the campaign, update the alt text as you go.
If you're already updating the page title or content on a page, it's not that much extra effort to do the alt text while you're there.
-
Hi Eric & Chris,
Thanks for the help. Given the size of the site, tens of thousands of pages with more than one image per page on average, I guess my real question is how much trouble this is worth. I don't think the image file names will reliably yield alt img text. So, about the most one could do is a site-wide empty tag. Is this really worth it for organic search? It seems like kind of a phony manipulation to appeal to a search algorithm in maybe some microscopic way. But I could be wrong, so that's why I'm asking here. If it really matters, we'll do it. But if it doesn't, I'd rather not. Especially when you consider that the next thing will be that having empty alt img tags will some day be a small negative, right? That would be so Google of them.
-
Is it possible to use a script to write them? An alternative option is to run a Screaming Frog crawl looking for all images, export the results into Excel, and use the image file names to help create the tags. That's assuming you've named the images with something specific instead of leaving the defaults (e.g. image4893054893.jpg). Ideally you would want to include image alt tags, and many platforms can make that easy. Could you give a little more information about your situation? There might be a pattern you can use to update on a large scale. I would not apply the same tag to all images, because that really doesn't help search engines understand the photo and wouldn't be useful to users with vision impairments. If you don't have the time to do it yourself, hire someone (a virtual assistant) to assign the alt tags. Screaming Frog will make it really easy to find all the image files.
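To illustrate the file-name approach: after exporting the image URLs from a Screaming Frog crawl, something like the following rough Python sketch could turn descriptive file names into candidate alt text. The function name and heuristics here are mine, purely hypothetical, and it deliberately returns an empty string for auto-generated names like image4893054893.jpg, since a meaningless alt is worse than an empty one:

```python
import os
import re

def alt_from_filename(path):
    """Turn an image file name like 'red-running-shoes_2.jpg' into
    candidate alt text ('red running shoes'). Returns '' for names
    that look auto-generated, so they can be given an empty alt."""
    name = os.path.splitext(os.path.basename(path))[0]
    # Split the name on hyphens, underscores, and whitespace
    words = [w for w in re.split(r"[-_\s]+", name) if w]
    # Drop purely numeric tokens (IDs, dimensions, counters)
    words = [w for w in words if not w.isdigit()]
    # Default camera/CMS names (img0123, image4893054893, DSC0042...)
    # carry no meaning, so fall back to an empty string
    if not words or re.fullmatch(r"(img|image|dsc|photo)\d*",
                                 "".join(words), re.IGNORECASE):
        return ""
    return " ".join(words).lower()
```

Any output like this would still want a human sanity check before going live, but it turns a manual job into a review job.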
-
Naturally, in a perfect world, meaningful attributes should be added. Assuming you're a mere mortal with a limited number of hours in the day... the best short-term solution is going to be having the alt attribute applied but empty.
To my knowledge (happy to be pointed towards data showing otherwise), there's no real ranking difference between these two options. The reason I prefer to add a blank alt in this instance is that assistive technologies (like screen readers for vision-impaired users) are going to have a much better experience on your site this way.
If you have a blank alt, screen readers will essentially ignore the image, since they're going to read "". On the other hand, if you don't have an alt attribute in the img tag at all, they're going to read the source instead. Even a short img src is cumbersome to listen to, especially if you have an image-heavy site!
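To make that distinction concrete, here's a small, hypothetical audit sketch using Python's standard html.parser (the class name and the sample markup are mine). It classifies img tags the same three ways a screen reader would encounter them: described, empty alt, or no alt at all:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Classify <img> tags as having descriptive alt text, an empty
    alt (alt=""), or no alt attribute at all -- the case where a
    screen reader falls back to reading out the src."""
    def __init__(self):
        super().__init__()
        self.described, self.empty, self.missing = [], [], []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "?")
        if "alt" not in attrs:
            self.missing.append(src)        # reader announces the src
        elif not (attrs["alt"] or "").strip():
            self.empty.append(src)          # reader skips the image
        else:
            self.described.append(src)      # reader announces the alt

audit = AltAudit()
audit.feed('<img src="a.jpg" alt="red shoes">'
           '<img src="b.jpg" alt="">'
           '<img src="c.jpg">')
```

Running something like this over your templates would show how many images fall into the worst bucket, the missing-alt one, before and after the bulk fix.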