How important is my code to text ratio on web pages?
-
I've been getting mixed reviews on this and I'm trying to figure out whether it's something I should be concerned with. We have a higher-than-recommended amount of code relative to "content," but not by a crazy amount.
Thanks!
-
Hi,
I am also very confused by the conflicting information out there.
On one hand, the general advice appears to be to get your text-to-code ratio over 25%,
but on the other hand, Moz now includes this as one of its "Spam Score" factors, and I quote: "**Site Mark-up is Abnormally Small - **There's a high ratio of visible text compared to HTML, JavaScript, etc."
So which is right for the best SEO in the SERPs?
Help,
Glenn
-
Hi,
I would like to share one article on this. I hope that helps you.
http://blog.woorank.com/2013/03/are-text-to-html-ratios-important/
Thanks
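If you want a rough way to measure the ratio on your own pages, here's a minimal sketch using only Python's standard library. This is a simplification: it counts visible characters against total page bytes, and search engines don't publish how (or whether) they compute anything like this.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.text_parts = []
        self.skip_depth = 0  # > 0 while inside script/style

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if self.skip_depth == 0:
            self.text_parts.append(data)

def text_to_code_ratio(html):
    """Visible-text characters divided by total HTML characters."""
    parser = TextExtractor()
    parser.feed(html)
    visible = "".join(parser.text_parts).strip()
    return len(visible) / len(html) if html else 0.0

page = ("<html><head><script>var x=1;</script></head>"
        "<body><p>Hello world</p></body></html>")
print(round(text_to_code_ratio(page) * 100, 1))  # percentage of visible text
```

Run it against your own saved HTML to see where you stand relative to that 25% rule of thumb.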
-
There's a discussion on cache expiration here: http://stackoverflow.com/questions/6486805/html-cache-control-max-age, but you'll want someone familiar with these recommendations to make the changes.
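As an illustration only (your server setup may differ), on an Apache server with mod_headers enabled, a rule like this in `.htaccess` would apply a cache header to every image in the matched formats; nginx and IIS use different syntax:

```apacheconf
# Hypothetical .htaccess sketch — assumes Apache with mod_headers enabled
<FilesMatch "\.(jpg|jpeg|png|gif|webp)$">
  Header set Cache-Control "private, max-age=600"
</FilesMatch>
```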
-
Awesome, thanks Ryan. How would I add "private, max-age=600" as code? Is there a line I can put in the tags to show that every image on that page should have that cache rule?
-
The biggest issue is with how this affects your load speed. Plug in your site here to see: https://developers.google.com/speed/pagespeed/insights/
Sites in the red often have trouble ranking well.
Related Questions
-
What to do about endless size pages
I'm working on a site that sells products that come in many different sizes. One product may come in 30 sizes. The products themselves are identical, except for the size. There are collections pages that gather several kinds of product in a particular size, and then there are individual product pages for one product in a specific size. The collections page for widgets size 30 has the same content as widgets size 29. The single product page for gold-widget-size-30 has the same content as the single product page for gold-widget-size-29. To make matters worse, they all have the same tags and very little written content. The site is in Shopify. Last month there were almost 400 pages that produced organic visits, mostly in the 1 to 4 per month range, but all together about 1000 visits. There are several hundred more that produced no organic traffic, but are duplicate (except for size) and part of this giant ball of tangled string. What do you think I should do? Thanks... Mike
Intermediate & Advanced SEO
-
How to optimize count of interlinking by increasing Interlinking count of chosen landing pages and decreasing for less important pages within the site?
We have pulled our internal link counts (internal links only, not outbound links) from Google Webmaster Tools and discovered that our most significant pages have fewer internal links than our less significant pages. Our objective is to reverse this: increase the internal link count of important pages and reduce it for less important pages, so that maximum link juice is passed to the right pages, thereby increasing SEO traffic.
Intermediate & Advanced SEO
-
Category Pages
I'm debating what the best category structure is for a recipe website and was looking to get some advice. It's a recipe/travel/health-fitness blog, but recipes reign on the site. Should it be: Option A websitename.com/recipe/type-of-recipe/specific-recipe or Option B websitename.com/type-of-recipe/specific-recipe (just cutting out the 'recipe' category name)? Any advice would be appreciated! Thanks!
Intermediate & Advanced SEO
-
500 and 508 pages?
Hi, we just did a massive deep crawl (using the tool deepcrawl.co.uk/) on the site: http://tinyurl.com/nu6ww4z http://i.imgur.com/vGmCdHK.jpg It reported a lot of URLs as either 508 or 500 errors. For the URLs reported as 508 or 500, once the crawl finished we put them directly into Screaming Frog, and they all came back with status code 200. Could it be because DeepCrawl hammered the site and the server couldn't handle the load, or something similar? Cheers, Chris
Intermediate & Advanced SEO
-
Help with duplicate pages
Hi there, I have a client whose site I am currently reviewing prior to an SEO campaign. They still work with the development team who built the site (not my company). I have discovered 311 instances of duplicate content within the crawl report. The duplicate content appears to be 1, 2, or 3 versions of the same pages but with differing URLs. Example: http://www.sitename.com http://sitename.com http://sitename.com/index.php And other pages follow a similar or the same pattern. I suppose my question is mainly what could be causing this and how can I fix it? Or is it something that will have to be fixed by the website developers? Thanks in advance Darren
Intermediate & Advanced SEO
-
Code Monitor Recommendations
Hi all, I was wondering if you have any recommendations for a code monitor? We'd like to keep track of any code and content changes on a couple of websites. We've taken a look at Page Monitor: https://chrome.google.com/webstore/detail/page-monitor/pemhgklkefakciniebenbfclihhmmfcd?hl=en but I'm not sure if it tracks code changes? Any suggestions for free or paid tools would be appreciated. Edit: We'd also like to avoid a tool that requires any tracking code changes or anything that involves a database/FTP connection.
Intermediate & Advanced SEO
-
Thousands of Web Pages Disappeared from Google Index
The site is - http://shop.riversideexports.com We checked webmaster tools; nothing strange. Then we manually resubmitted using webmaster tools about a month ago. Now we're only seeing about 15 pages indexed. The rest of the sites on our network are heavily indexed and ranking really well, BUT the sites that are using a subdomain are not. Could this be a subdomain issue? If so, how? If not, what is causing this? Please advise. UPDATE: What we can also share is that the site was cleared twice in its lifetime - all pages deleted and re-generated. Both times we had full indexing - now this site hovers at 15 results in the index. We have many other sites in the network that have very similar attributes (such as redundant or empty meta) and none have behaved this way. The broader question is: how do we get the indexing back?
Intermediate & Advanced SEO
-
Linking to local pages on main page - keyword self-cannibalization issue?
Hi guys, Our website has this landing page: www.example.com/service1/ Is this considered keyword self-cannibalization if on the above page we link to local pages such as: www.example.com/service1-in-chicago/ www.example.com/service1-in-newyork/ www.example.com/service1-in-texas/ Many thanks David
Intermediate & Advanced SEO