Which is more effective for tabbed content: jQuery + CSS tabs or unique pages for each tab?
-
We are building a from-scratch directory site and trying to determine the best way to structure our pages. Each general listing page has four sections of specific information.
Which is the better strategy for SEO: using tabs (e.g. jQuery + CSS) and putting all content on one page (and will all of the content still be indexable with jQuery?), or creating unique pages for each section?
- JQuery: sitename.com/listing-name#section1
- Unique Pages: sitename.com/listing-name/section1
If I go with option one, I risk not being crawlable by Google if it can't read through the scripting. However, I feel like the individual pages won't rank if there's only a small amount of content in each section. Is it better to keep all the content on one page and focus on building links to it? Or is it better to build out the section pages and work on adding quality content to them, so that long term there is more specificity for long-tail search and a better quality search experience on Google?
We are also set up to have "../listing-type/listing-name" but are considering removing 'listing-type' and just having "../listing-name/". Do you think this is more advantageous for boosting rankings?
I know that was like five questions. I've been doing a lot of research and these are the things that I'm still scratching my head about. Some general direction would be really great!
Thank You!
-
Thanks Casey. I'm interested to see if there are any varying opinions. I've had a few votes in favor of your methodology, but a couple of critics favor the alternative.
I think one page with all of the content will work well for us. Plus, the user experience should improve, since we won't have to load a new page each time the user wants to see additional information.
-
Hi Grant,
There are multiple ways of going about this I am sure, but here is my take.
To me, this sort of depends on the content of all four tabs and whether it is relevant and valuable for the user on this page. Here are a couple of questions to ask yourself:
- Does the user really want to load a new page to see a small section that may or may not have belonged on the previous page?
- Does it make sense for a user to go to a new page? (Is there a ton of content in these sections?)
- Is each section targeting a new keyword, or supporting the main keyword?
jQuery + CSS will be just fine
As long as your developer knows what he is doing with jQuery (or, better yet, pure CSS) tabs, Google will index all of the content on the page. Google should see sitename.com/listing-name#section1 as sitename.com/listing-name. Just make sure the code structure is set up to support the content hierarchy.
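To make the crawlability point concrete, here is a minimal sketch in plain JavaScript (the section ids are placeholders, not your actual markup) of hash-based tabs where every section ships in the initial HTML, so a crawler sees the whole page at sitename.com/listing-name regardless of which tab is open:

```javascript
// All four sections live in the HTML; the hash only decides which one is
// visible. A crawler that ignores the script still sees all of the content.
// "section1"..."section4" are placeholder ids, not real ones from the site.
function sectionFromUrl(url, sections, fallback) {
  // "sitename.com/listing-name#section2" -> "section2"
  const hash = url.split('#')[1] || '';
  return sections.includes(hash) ? hash : fallback;
}

const sections = ['section1', 'section2', 'section3', 'section4'];
console.log(sectionFromUrl('https://sitename.com/listing-name#section2', sections, 'section1'));
// -> section2
console.log(sectionFromUrl('https://sitename.com/listing-name', sections, 'section1'));
// -> section1
```

In the browser you would toggle a CSS class on the matching section (or let a jQuery tabs plugin do it); either way, the content never leaves the DOM, which is what keeps it indexable.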
**../listing-type/listing-name/ vs ../listing-name/**
I think this could come down to what these listings are. If this were, say, a real estate website, it would make sense to set it up like:
- ../house/123-main-st/
- ../apartment/432-main-st/
If it makes sense to add a listing type, I say go for it.
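If it helps to see the two URL schemes side by side, here is a tiny hypothetical helper (the function name and base URL are made up for illustration) showing that the only difference between them is one extra path segment:

```javascript
// Builds either ../listing-type/listing-name/ or ../listing-name/ from the
// same inputs; in the flat scheme the "type" segment is simply omitted.
function listingUrl(base, name, type) {
  const parts = type ? [type, name] : [name];
  return base + '/' + parts.map(encodeURIComponent).join('/') + '/';
}

console.log(listingUrl('https://sitename.com', '123-main-st', 'house'));
// -> https://sitename.com/house/123-main-st/
console.log(listingUrl('https://sitename.com', 'listing-name'));
// -> https://sitename.com/listing-name/
```

Whichever scheme you pick, generating the URLs from one place like this keeps them consistent across the site.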
Again, this can differ based on what type of content you are providing, but it should give you a good sense of general direction.
Thanks,
Casey