Benefits of having an HTML sitemap?
-
We are currently migrating our site to a new CMS, and as part of this migration I'm getting push-back from my development team regarding the HTML sitemap. We have a very large news site with tens of thousands of pages. Our current HTML sitemap greatly helps with distributing PageRank to article pages, but it is not geared towards the user. The dev team doesn't see the benefit of recreating the HTML sitemap, despite my argument that we don't want to lose all these internal links, since removing thousands of links could have a negative impact on our Domain Authority.
Should I give in and concede the HTML sitemap since we have an XML one? Or am I right that we don't want to get rid of it?
-
You're right, you don't want to get rid of it. Read the link I sent you; it explains why.
This is not a big job, and it is very important.
-
Our HTML sitemap is broken down into separate pages that never contain more than 250 links. All of these pages link back to the homepage and to their section/subsection via the top nav.
The issue I'm having is not that they don't know how to recreate our HTML sitemap in the new CMS; it's that they don't believe it serves a purpose, and given limited resources they would rather spend the time on other, more crucial work.
My biggest concern is the removal of thousands of internal links. Should I be worried about this?
-
The XML and HTML sitemaps do completely different jobs.
You don't want more than 250 links on a page: http://thatsit.com.au/seo/reports/violation/the-page-contains-too-many-hyperlinks
But 250 × 250 is 62,500, so with just one extra level you can link to a lot of pages.
All pages should be linked somehow, and all should link back to your home page and, if possible, your landing pages.
I don't know your developers, but many CMS developers are limited by the capabilities of the CMS and expect the customer to bend to those limitations. I can't see why they wouldn't do it if they have the ability.
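To make the two-level structure concrete, here is a minimal sketch (not from any particular CMS; the function and file names are illustrative) that chunks a large article list into sitemap pages of at most 250 links, plus a hub page linking to every chunk. With one hub level, 250 chunk pages × 250 links each puts 62,500 articles within two clicks:

```python
# Paginate articles into HTML sitemap pages of <= 250 links, plus a hub page.
# Each chunk page links back to the homepage and the hub, so PageRank can
# flow both down to articles and back up.

MAX_LINKS = 250

def chunk(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def build_sitemap_pages(articles):
    """articles: list of (url, title). Returns {filename: html_fragment}."""
    pages = {}
    hub_links = []
    for n, group in enumerate(chunk(articles, MAX_LINKS), start=1):
        name = f"sitemap-{n}.html"
        links = "\n".join(
            f'<li><a href="{url}">{title}</a></li>' for url, title in group
        )
        # Back-links to the homepage and the sitemap hub on every chunk page.
        pages[name] = (
            '<p><a href="/">Home</a> | <a href="/sitemap.html">Sitemap</a></p>\n'
            f"<ul>\n{links}\n</ul>"
        )
        hub_links.append(f'<li><a href="/{name}">Sitemap page {n}</a></li>')
    pages["sitemap.html"] = "<ul>\n" + "\n".join(hub_links) + "\n</ul>"
    return pages
```

The output fragments would still need to be wrapped in the site's normal template so the top nav (and its homepage link) appears on every sitemap page.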
Have a read of this page for a quick explanation of how PageRank flows:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
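The flow idea can also be seen in a few lines of code. Below is a textbook power-iteration sketch of PageRank (standard 0.85 damping factor; the toy graph is mine, not from the linked article) showing how a sitemap page that links to every article, and back to home, channels rank to pages that would otherwise receive none:

```python
# Minimal PageRank power iteration over an internal-link graph.
# links: {page: [pages it links to]}. Total rank stays ~1.0 across iterations.

def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                # Dangling page: spread its rank evenly across all pages.
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# Toy site: the sitemap page passes rank down to deep articles,
# and every page links back to home.
graph = {
    "home": ["sitemap"],
    "sitemap": ["home", "article1", "article2"],
    "article1": ["home"],
    "article2": ["home"],
}
scores = pagerank(graph)
```

Delete the sitemap's outbound links from a graph like this and the article pages' scores collapse toward the bare (1 - damping)/n floor, which is exactly the internal-link loss the original question worries about.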