Page HTML is great for humans, but seems to be very bad for bots?
-
We recently switched platforms and now use Joomla for our website. Our product pages underwent a huge transformation, and while they seem user-friendly for a human, when you look at one of them in SEOBrowser it seems that we are doing a horrible job optimizing the page, and our HTML almost makes us look spammy.
Here is an example of a product page on our site:
http://urbanitystudios.com/custom-invitations-and-announcements/shop-by-event/cocktail/beer-mug
And if you take a look in something like SEOBrowser, it makes us look not so good.
For example, all of our footer and header links show up, our color picker is a bunch of PNGs (over 60, to be exact), and our tabs are the same on every single product page (except for the product description and reviews)...
In thinking about the bots:
1. How do we handle all of the links from the footer and header, and the duplicate content in the tabs?
2. How do we signal to bots that the most important thing on the page is the product description?
3. We installed schema markup for price, product image, etc., but can we take it further?
4. How do we handle the "attribute" section (e.g., our color picker, our text input, etc.)?
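For context on question 3, this is roughly the shape of what we have today: a JSON-LD Product block covering name, image, and price. All of the values below are placeholders rather than our live data, but "taking it further" would presumably mean filling in more of the optional Product and Offer properties (description, availability, SKU, aggregate rating once we have reviews):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Beer Mug Party Invitation",
  "image": "https://urbanitystudios.com/images/beer-mug-invitation.jpg",
  "description": "Customizable beer mug cocktail party invitation with editable colors and text.",
  "sku": "BEER-MUG-001",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "1.50",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

(The image URL, SKU, and price here are made up for illustration.)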
Any clarification I need to provide, please let me know.
-
Out of curiosity, what did you think this page was for? Thanks for your insight.
-
Just being honest....
I had absolutely no idea that this was a page for designing an invitation. None at all - until I read your reply.
If this were my site, I would not allow a cool color picker or a coding challenge or whatever to compromise my success by pushing the description down below the fold. I would find a way to make it work, because I bet this will kill the conversion rate.
It's easier to double your income from current traffic than it is to double your traffic.
-
Hi EGOL,
Completely agree on beefing up the content as well as making the product name more relevant. We have run into cannibalization issues before, so we have made our product names less competitive with our category pages, and we are working on making the page titles incredibly relevant (we haven't done this yet, but "Beer Mug Party Invitation" is an example of what we'll change the page title to).
We struggle with bringing the product description above the fold because the call to action is to play with the colors and see how customizable and flexible our products really are. We don't want folks to miss that by seeing the product description first.
As far as the HTML of the page goes, though, what are your thoughts? You'll see that the color picker (for example) pulls 66 PNGs right in a row, named with a bunch of random numbers, which tells the bot nothing about the page. However, that is how the code is built to make the interface work.
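For illustration, one idea we've been kicking around is pulling the swatch images out of the HTML entirely and into CSS, so a bot sees a short list of labeled colors instead of 66 img tags. A simplified sketch (the class names, labels, and file path below are made up):

```html
<!-- Instead of 66 <img src="swatch_0417.png"> tags, a labeled list
     whose colors come from a single CSS sprite image -->
<ul class="color-picker">
  <li><button class="swatch swatch-navy" title="Navy">Navy</button></li>
  <li><button class="swatch swatch-kelly" title="Kelly Green">Kelly Green</button></li>
</ul>

<style>
  .swatch {
    width: 24px;
    height: 24px;
    text-indent: -9999px; /* hide the text label visually; bots still read it */
    background-image: url(/images/swatch-sprite.png); /* one request instead of 66 */
  }
  .swatch-navy  { background-position: 0 0; }
  .swatch-kelly { background-position: -24px 0; }
</style>
```

This would also cut the page's request count, though it obviously depends on whether the picker's JavaScript can be rebuilt around it.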
-
First, I would try to serve visitors by getting the product description up above the fold, immediately visible to the people who visit the website.
Second, I would expand the title tag, because "Beer Mug" puts you into generic competition when you want to compete for easier SERPs such as "custom printed beer mug" (or appropriate language for your product).
Third, your description is really, really short. I honestly believe it has a very good chance of being filtered as trivial content. So I would take my most important product and start beefing up its description first. As you do that, you will add more relevant words to the page, so in addition to lifting your content above trivial, you will be qualifying for long-tail traffic. Another benefit is that it adds sales appeal and reduces the number of questions that come in by email and phone.
At my office we spend lots of time improving trivial content. I spend a hundred hours a month on that: taking twenty-word pages with one image and improving them to 200-word pages with four images. That is for retail pages. Informative pages go up to over a thousand words, eight images, and a video. (Those numbers are just examples - we don't have word-count goals.) The payback in traffic can be very high if you are in a busy niche and have a site with a little authority.