Do search engines treat masked text differently than solid color fonts?
-
In my attempt to decrease page load times, I ditched my custom fonts for Google Fonts. I figured out how to apply CSS mask-image to give this blazing-fast-loading Google Font a chalk texture, which was an awesome improvement over the 3-5 second load times of those locally hosted web fonts.
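For context, here is roughly what that setup looks like; the selector, font name, and texture path below are simplified placeholders rather than my actual code:

```css
/* Chalk effect: the mask image's alpha channel decides which parts
   of the glyphs get painted, so a grainy texture punches small
   transparent patches into the text. */
.chalk-heading {
  font-family: 'Cabin Sketch', cursive; /* any fast-loading Google Font */
  color: #f4f4f4;                       /* chalk white on a dark board */
  -webkit-mask-image: url('/img/chalk-grain.png'); /* Chromium/WebKit */
  mask-image: url('/img/chalk-grain.png');
  -webkit-mask-size: cover;
  mask-size: cover;
}
```

Browsers that don't understand mask-image just ignore those declarations and render the solid color.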
I've read that Google penalizes a site for poor contrast ratios between the background and text, but do search engines go by the CSS, or do they somehow compare the actual rendered site as an image? Using CSS mask-image to give my text that chalk appearance does produce minor transparent patches in the text.
So have I saved 3 seconds on page load only to have search engines knock points off for funky text issues? All input welcome. The temporary site is here: https://website-1b14f.firebaseapp.com/
Kevin
-
Great job optimizing your page load times by switching to Google Fonts and applying a chalk texture with CSS mask-image! Regarding contrast ratios, Google renders pages when it crawls them, so what matters is the rendered result, not just the CSS as written. To stay on the safe side, use a clear, readable font at a reasonable size (at least 14px) and make sure the masked text still reads well against your background, both for your users and to avoid any potential SEO issues.
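If you want to check the numbers yourself, accessibility audits (e.g. Lighthouse) use the WCAG 2.x contrast-ratio formula, which you can compute from any two colors; here is a quick sketch:

```javascript
// WCAG 2.x relative luminance of an sRGB color given as [r, g, b], 0-255 each.
function luminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((v) => {
    const c = v / 255; // normalize, then linearize the sRGB channel
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio between two colors: (L1 + 0.05) / (L2 + 0.05), lighter on top.
function contrastRatio(a, b) {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// White on black is the maximum possible ratio, 21:1.
console.log(contrastRatio([255, 255, 255], [0, 0, 0]).toFixed(1)); // "21.0"
```

WCAG asks for at least 4.5:1 for normal body text (3:1 for large text), so you can sample the masked text's effective color against the background and see where you land.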
-
Yes, search engines generally treat hidden text differently from solid-color fonts. Note, though, that "masked text" in the SEO sense refers to text hidden or obscured from users (such as text set to the same color as the background), which is not the same thing as CSS mask-image styling. Hidden text is often seen as an attempt to manipulate search engine rankings and can result in penalties if detected.
Search engines like Google aim to provide users with relevant and valuable content. Masked text can be used in black hat SEO techniques to stuff keywords or hide spammy content from users while trying to manipulate search engine rankings. As a result, search engines are vigilant about detecting and penalizing such practices.
In contrast, solid-color fonts that are visible and legible to users are considered legitimate and are not penalized by search engines. Keep your content visible and readable so that it is properly indexed and ranked based on its actual value and relevance to users.
-
Hi Christy,
Site launched! The e-commerce part is still under development but the basic site has been up a couple months. Masked text doing great! No issues whatsoever on the SEO side. Ranking super high still and load speeds are good. Service workers will be activated in the coming weeks as we build out our food delivery platform. So, I'll mark my question as answered. https://www.88k.com.tw
-
Site not launched yet but no warnings on any SEO tools. You can run this site through any tests you want and see. https://website-1b14f.firebaseapp.com/
Schema all good and AMP valid. Content coming up next... FYI this is not a public site and content will change as we test new designs and functionality.
-
Hi Kevin,
Have you launched yet? We'd love an update on this!
Christy
-
Thanks for your thoughts. You're right that I can't find a single article on this anywhere, but I've never been conservative when it comes to SEO; I'm always looking to see what's possible. Since unsupported browsers (Firefox/Opera) simply display the original text without the mask-image, I'm going to assume Google's search bots won't care about the image mask either.
On the SEO side, this method shaves 3 to 5 seconds off load times, so that can't be bad. The effects are amazing, even on Chinese fonts. I'll report back after launch and post here.
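For anyone else trying this: a feature query makes the fallback explicit instead of relying on unsupported browsers silently ignoring the property (the selector and texture path here are placeholders):

```css
/* Solid, high-contrast text everywhere by default... */
.chalk-heading {
  color: #f4f4f4;
}

/* ...and apply the chalk mask only where the browser supports it. */
@supports (mask-image: url('')) or (-webkit-mask-image: url('')) {
  .chalk-heading {
    -webkit-mask-image: url('/img/chalk-grain.png');
    mask-image: url('/img/chalk-grain.png');
    -webkit-mask-size: cover;
    mask-size: cover;
  }
}
```

That way every browser, and every crawler, sees readable solid text unless masking is actually available.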