How can I provide titles and descriptive text for our list of USPs on the same page, optimized for both usability and SEO?
-
I am rebuilding our website together with an agency and I am stuck with the following problem:
We have a page which should give the visitor a quick and convincing impression of why they should choose our company. On this page we want to show our USPs (Unique Selling Points), each with a title and a short description. My preferred way of presenting the USPs would be a list of the titles (which lets visitors see all USPs without having to read a lot of text) where each title can be clicked to expand its description (in case you want to know more about that specific USP); clicking another title collapses the previously expanded description and expands the new one (similar to this page: http://www.berlin-city-immobilien.de/38.html - I'm talking about the list in the middle of the page under the headline "Dabei profitieren Sie von folgenden Vorteilen", i.e. "you benefit from the following advantages"). Since I also want to use these descriptions as on-page SEO texts, I checked whether Google might not index "click to expand" content, or at least value it less than plain text in the body of the page, and I stumbled over this article: https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html. According to this article, Google will definitely discount the descriptions on my page.
Does anyone have an idea how to solve this problem? Either a different way to show the titles and descriptions on the page, or a workaround so Google will not treat the descriptions as "click to expand" text.
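For reference, this is roughly the kind of accordion I have in mind (just a sketch; the class names and markup are made up, not what the agency is actually building). All descriptions are in the HTML source, and JavaScript only toggles their visibility:

```html
<!-- Every description is in the HTML source; JS only collapses/expands. -->
<ul class="usp-list">
  <li>
    <h3 class="usp-title">USP title 1 (click to expand)</h3>
    <div class="usp-description">Short description of USP 1 ...</div>
  </li>
  <li>
    <h3 class="usp-title">USP title 2 (click to expand)</h3>
    <div class="usp-description">Short description of USP 2 ...</div>
  </li>
</ul>
<script>
  // Collapse all descriptions on load; on click, expand the clicked
  // one and collapse whichever one was open before.
  var titles = document.querySelectorAll('.usp-title');
  for (var i = 0; i < titles.length; i++) {
    titles[i].nextElementSibling.style.display = 'none';
    titles[i].addEventListener('click', function () {
      var open = document.querySelector('.usp-description[data-open]');
      if (open) {
        open.style.display = 'none';
        open.removeAttribute('data-open');
      }
      var desc = this.nextElementSibling;
      desc.style.display = 'block';
      desc.setAttribute('data-open', '');
    });
  }
</script>
```

My worry is exactly this pattern: the descriptions start out hidden, so Google may discount them.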
Thank you already in advance for your input.
Ben -
First of all thank you both for taking the time to answer my question.
@Russ
I was also wondering whether I could display the text first and then collapse it with some JS, but I also read somewhere that Google is, or will be, analyzing JS, and of course this could lead to a penalty, if not now then at some point in the future. So I think I will follow your advice and stick with your first suggestion.
As to your first suggestion: in this case the user has to click more, so it is a slight limitation when it comes to usability, but I guess to some extent I have to accept a compromise. Do you think it is a problem if content (in this case the headline and teaser) is repeated on the same page?
@ Dimitrii
Well, what Matt is saying is that they won't treat it as spam and penalize the website. But he does not say anything about how the click-to-expand content is weighted.
The solution with separate pages will not work in my case, as I need all the descriptions on one page for SEO, and it is also a slight usability limitation since the user has to keep switching between pages.
-
Hi there.
Well, the same article you are referring to contains this text:
Amazon used to use a lot of tabs, but now they seem to output most of the content directly on the page, making the user scroll and scroll to see it. Google's own help documents do use click-to-expand, but only to reveal the questions.
Also there was this video from Matt: https://www.youtube.com/watch?v=UpK1VGJN4XY
I understand that a lot of this content contradicts itself, but I'd look at the problem like this: it's no secret at all that Google puts (or at least states that it puts) user experience first. So look at your page and ask whether users, after they land on it, would be happy; whether everything makes sense from the user's point of view; and whether the "expand" buttons are large enough and clearly convey that clicking them will expand the content.
Also, as Matt said: are there 8 pages of content hidden and only displayed after you click "expand", ruining your day?
I believe that as long as it looks good, makes sense to the user and is good content, there shouldn't be any problems. The only workaround I see is, instead of expandable content, to simply have links to other pages. I've seen both scenarios work.
Hope this helps.
-
This is a question that is getting a lot more attention lately. You have two choices...
1. Accept the reality that Google doesn't want to rank you for content that is hidden...
In this case, I would recommend starting with the list of your USPs at the top, maybe each with one sentence below it explaining it (like a headline and a tagline). Below that, repeat the headlines, but each with a much longer description. Make the first listings link to the anchored headlines below, so if you click on the 1st USP, you are taken to its full description further down the page. Then use a "return to top" anchor to bring you back to the list. This would allow you to get your USPs front-and-center and still get the content on the page.
2. Or try to get around it...
Start with the content showing and then hide it with some JS event like a scroll, mouseover, timed event, etc.
In the end, I would recommend finding a way to accomplish #1 so you don't worry about losing ill-gotten gains by tricking Google.
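Roughly, the markup for #1 could look like this (the IDs and headlines are just placeholders, adapt them to your USPs):

```html
<!-- Linked list of USP headlines at the top of the page ... -->
<ul id="top">
  <li><a href="#usp-1">USP headline 1</a> - one-sentence teaser</li>
  <li><a href="#usp-2">USP headline 2</a> - one-sentence teaser</li>
</ul>

<!-- ... full descriptions further down the same page, joined by anchors. -->
<h3 id="usp-1">USP headline 1</h3>
<p>Full description of USP 1 ... <a href="#top">return to top</a></p>

<h3 id="usp-2">USP headline 2</h3>
<p>Full description of USP 2 ... <a href="#top">return to top</a></p>
```

All of the text is plainly visible in the page body, so there is nothing for Google to discount, and the list at the top still gives users the quick overview you want.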