Best Format to Index a Large Data Set
-
Hello Moz,
I've been working on a piece of content that includes two large data sets I've organized into tables, and I would like the data indexed. I want to know the best way to code the data for search engines while still providing a good visual experience for users. I actually created the piece three times and am deciding which format to go with, so I would love your professional opinions.
1. HTML5 - all the data is coded using <table> tags and lives directly in the page's <body>. This is the most straightforward method and I know this will get indexed; however, it is also the ugliest-looking table and the least functional.
2. JavaScript - I used Google Charts and loaded all the data into a DataTable that renders the table client-side.
3. Fusion Tables - I embedded the two data sets as Google Fusion Table embeds.
-
Hi there,
I would go with option #1 and use CSS to improve the design of the table. You can then use JavaScript to add functionality if necessary without the risk of some content being un-indexable.
Some examples of nicely designed tables using HTML and CSS: https://www.freshdesignweb.com/free-css-tables/
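To illustrate the approach, here is a minimal sketch (the data, class names, and styles are placeholders, not from the original answer): the full data set sits in plain HTML where crawlers can read it, and CSS handles the presentation.

<table class="data-table">
  <caption>Data set 1: monthly totals (illustrative values)</caption>
  <thead>
    <tr><th scope="col">Month</th><th scope="col">Total</th></tr>
  </thead>
  <tbody>
    <tr><td>January</td><td>1,240</td></tr>
    <tr><td>February</td><td>1,418</td></tr>
  </tbody>
</table>

<style>
  /* Presentation only - the indexable data above is untouched */
  .data-table { border-collapse: collapse; width: 100%; }
  .data-table th, .data-table td { border: 1px solid #ddd; padding: 8px; text-align: left; }
  .data-table tbody tr:nth-child(odd) { background: #f7f7f7; }
  .data-table caption { font-weight: bold; padding: 8px; }
</style>

Because everything lives in the markup, nothing depends on script execution for indexing.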
-
So it looks like option 3 is off the table. I ran "Fetch as Google" and confirmed that the two Fusion Table embeds are not visible to Google. Both the HTML and JavaScript versions look visible when I fetch and render; however, I would appreciate any insights on which format would give me the greatest benefit.
Preferably I would like to use the JavaScript tables, but I don't want to sacrifice search performance for aesthetic appeal.
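One way to get both, sketched below assuming the standard Google Charts loader (the element ID and values are illustrative): keep the raw data in an indexable HTML table on the page and draw the Google Charts table from a mirror of it with JavaScript, so the polished version is a progressive enhancement rather than the only copy of the data.

<div id="table_div"></div>
<script src="https://www.gstatic.com/charts/loader.js"></script>
<script>
  google.charts.load('current', { packages: ['table'] });
  google.charts.setOnLoadCallback(drawTable);

  function drawTable() {
    // Mirrors the indexable HTML table elsewhere on the page;
    // values here are placeholders.
    var data = google.visualization.arrayToDataTable([
      ['Month', 'Total'],
      ['January', 1240],
      ['February', 1418]
    ]);
    var table = new google.visualization.Table(document.getElementById('table_div'));
    table.draw(data, { width: '100%' });
  }
</script>

Crawlers always have the markup version to index, while users with JavaScript see the enhanced table.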
-
Related Questions
-
Index a URL without directly linking to it?
Hi everyone, here's a duplicate content challenge I'm facing. Let's assume that we sell brown, blue, white, and black 'Nike Shoes model 2017'. For technical reasons, we really need four URLs to properly show these variations on our website. We find substantial search volume on 'Nike Shoes model 2017', but none on any of the color variants. Would it be theoretically possible to show pages A, B, C, and D on the website and:
1. Give each page a canonical to page X, which is the 'default' page that we want to rank in Google (a product page that has a color selector) but is not directly linked from the site
2. Mention page X in the sitemap.xml (and not A, B, C, or D)
...so the 'clean' URL gets indexed and the color variations do not? In other words: is it possible to rank a page that is only discovered via sitemap and canonicals?
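For reference, the two pieces of the setup described above would look like this (the domain and path are placeholders, not the asker's site):

<!-- In the <head> of pages A-D, the color variants -->
<link rel="canonical" href="https://www.example.com/nike-shoes-model-2017/">

<!-- sitemap.xml lists only page X -->
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/nike-shoes-model-2017/</loc>
  </url>
</urlset>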
Intermediate & Advanced SEO | Adriaan.Multiply
-
What are the best SEO websites you read daily?
Hi Moz members, can you please suggest some of the best SEO websites where you read articles every day, apart from Moz?
Intermediate & Advanced SEO | SEO_GB
-
How can I optimize pages in an index stack?
I have created an index stack. My home page is http://www.southernwhitewater.com. My home page (if you look at it through the MozBar for Chrome) incorporates all the pages in the index. Is this bad? I would prefer to index each page separately, as per my site index in the footer. What is the best way to optimize all these pages individually and still have customers arrive at the top, with links directed to the home page (which is actually the 1st page)? I feel a rel=canonical might be needed somewhere. Any help would be great!!
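For reference, the tag the asker is reaching for is rel=canonical. Placed in the <head> of each page and pointing at that page's own preferred URL, it tells Google to index each page separately (the path below is a made-up example, not a real page on the site):

<link rel="canonical" href="http://www.southernwhitewater.com/example-trip-page/">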
Intermediate & Advanced SEO | VelocityWebsites
-
Best support site software to use
Hi guys, We currently use Desk to run our company support site. It seems OK (I don't administer it); however, it is very template-driven and doesn't allow useful tools such as adding metadata to each page (hence in our Moz crawl tests we get a large number of missing-metadata errors, which seems like a lost opportunity for us to optimise the site).
Our support team is looking to implement MadCap Flare as an information management tool; however, this tool outputs HTML as iframes, which obviously makes it hard for Google to crawl the content. We recently implemented HubSpot as our content marketing platform, which is great, and we'd love to have the support site hosted on it (great for tracking traffic etc.); however, as far as I'm aware, MadCap Flare doesn't integrate directly with HubSpot... so I'm looking for suggestions on what others are successfully using to host/manage their SEO-optimised support sites? Cheers, Matt
Intermediate & Advanced SEO | SnapComms
-
Is it best to have products and reviews on the same URL?
Hi Moz, Is it better to have products and reviews on the same URL or on different URLs? I suspect that combining these into one page will help with rankings overall, even though some rankings for product review terms may suffer. This is for a hair products company with tens of thousands, if not hundreds of thousands, of reviews. Thanks for reading!
Intermediate & Advanced SEO | DA2013
-
Thousands of 404 Pages Indexed - Recommendations?
Background: I have a newly acquired client who has had a lot of issues over the past few months. What happened is he had a major issue with broken dynamic URLs, where they would start infinite loops due to redirects and relative links. His previous SEO didn't pay attention to the sitemaps created by a backend generator, and it caused hundreds of thousands of pages to be indexed. Useless pages.
These useless pages all brought up a 404 page that didn't return a 404 server response (it returned a 200), which created a ton of duplicate content and bad links (relative linking).
Now here I am, cleaning up this mess. I've fixed the 404 page so it returns a 404 server response. Google Webmaster Tools is now returning thousands of "not found" errors - a great start. I fixed all site errors that caused infinite redirects, and I cleaned up the sitemap and submitted it. When I search site:www.(domainname).com I am still getting an insane number of pages that no longer exist.
My question: how does Google handle all of these 404s? My client wants all the bad pages removed now, but I don't have that much control over it. It's a slow process getting Google to remove pages that return a 404, and he is still dropping in rankings. Is there a way of speeding up the process? It's not reasonable to enter tens of thousands of pages into the URL Removal Tool. I want to clean house and have Google index just the pages in the sitemap.
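As a quick sanity check on a fix like this (the URL below is a placeholder), the corrected error page should now show a 404 status line in its response headers rather than a 200:

curl -I https://www.example.com/some-removed-page
HTTP/1.1 404 Not Found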
Intermediate & Advanced SEO | BeTheBoss
-
XML Sitemap Index Percentage (Large Sites)
Hi all, I'm wanting to hear from those who have experience dealing with large sites (tens or hundreds of millions of pages): what's a typical (or the highest) percentage of indexed pages vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows you the pages submitted and indexed for each of your sitemaps. I'm trying to figure out:
1. The average index % out there
2. Whether there is a ceiling (i.e. it will never reach 100%)
3. Whether it's possible to improve the indexing percentage further
Just to give you some background, sitemap index files (per sitemaps.org) have been implemented to improve crawl efficiency, and I'm wanting to find other ways to improve this further. I've been thinking about looking at the URL parameters to exclude, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and utilise the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure yet whether this is the best path to take, or whether I'm just flogging a dead horse if there is such a ceiling or if I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | danng
-
Removing a Page From Google's Index
We accidentally generated some pages on our site that ended up getting indexed by Google. We have corrected the issue on the site, and we now return a 404 for all of those pages. Should we manually remove the extra pages from Google's index, or should we just let Google figure out that they are 404s? What's the best practice here?
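One option worth knowing about, sketched here assuming an Apache server with mod_alias (the path is a placeholder): returning 410 Gone instead of 404 signals that the pages are permanently removed, which Google generally drops from the index somewhat faster than a plain 404:

# .htaccess - the "gone" status returns 410 with no redirect target
Redirect gone /accidental-page.html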
Intermediate & Advanced SEO | dbuckles