Best server-side XML sitemap generator?
-
I have tried xml-sitemaps, which tends to crash when spidering my site(s) and requires multiple manual resumes, which isn't practical for our businesses. Please let me know if there are any other server-side generators that could work across multiple enterprise-sized websites. Image sitemap support would also be helpful.
One with multiple starting URLs would help with spidering/indexing the most important sections of our sites.
Also, has anyone heard of or used Dyno Mapper? It looks like a good solution for us, but I was wondering whether anyone has had any experience with this product.
-
Our company uses a custom CMS for each of our niche classified websites. Ideally, we would like an online/cloud-based platform that would generate our sitemaps for us. Dyno Mapper looks like a good contender, but the $108 monthly fee seems a bit steep.
-
What technology stack is running the websites? Most CMS-based systems have some method of building XML sitemaps from the database, which is vastly superior to having to build them from a crawl.
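As a concrete illustration of the database approach, here is a minimal Python sketch that writes a sitemap.xml straight from CMS records. The table and column names (`listings`, `url`, `updated_at`, `published`) are hypothetical stand-ins for whatever the custom CMS actually stores; no crawl is involved, so nothing can hang mid-spider.

```python
# Minimal sketch: build sitemap.xml from database rows instead of a crawl.
# Table/column names ("listings", "url", "updated_at", "published") are
# hypothetical -- substitute whatever your custom CMS actually stores.
import sqlite3
import xml.etree.ElementTree as ET

def build_sitemap(db_path: str, out_path: str = "sitemap.xml") -> int:
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT url, updated_at FROM listings WHERE published = 1"
    )
    count = 0
    for loc, lastmod in rows:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        count += 1
    conn.close()
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)
    return count  # number of URLs written
```

Run nightly from cron, this sidesteps the crash-and-resume problem entirely; the "multiple starting URLs" requirement disappears because the database already knows every published page.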
Related Questions
-
Should I bother with a Video Sitemap?
Morning all, I've started a pretty aggressive Video content push in recent weeks. All our videos are on our YouTube channel. I decided to go with hosting the videos on YouTube based on my research on moz.com, especially considering the potential reach of the content on YouTube. What I'm finding is that the YouTube channel is doing great. We've hit 200 subscribers and 15K views in a little under a month. Wayyyy more than I could have ever hoped for. But the blog posts on our website are getting minimal traffic and no search visibility. That doesn't necessarily bother me, since the intention of our marketing campaign is to use YouTube to drive traffic to our website. So I guess my question is really more to do with optimizing the site with Video Sitemaps and best practices for Google Webmaster Tools. Right now we have YouTube videos embedded on blog posts like this one that have a time-stamp. But I've been working to create Gallery-style pages (no time-stamp) which would have multiple YouTube videos embedded on them like this one. These make it easier for visitors to watch multiple videos without needing to skip around to multiple blog posts. The challenge I'm running into is that when I go to submit a Video Sitemap to GWT I get an error saying that I have duplicate page content within the video sitemap. I've used several WP plugins to do this. It seems that when there is a video embedded on multiple URLs (pages + posts) the plugins will ignore the posts and only add the pages to the video sitemap. Here is my regular Sitemap Here is my video Sitemap I've attached a screenshot of my current Yoast Video SEO config if that's useful for reference. Does anyone have experience with using multiple sitemaps in GWT? I'm starting to think that maybe I shouldn't even bother with a video sitemap. Maybe those gallery-style pages should just go in the regular sitemap? Any thoughts or advice would be highly appreciated! Thanks
Intermediate & Advanced SEO | TMHoward860 -
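For reference, Google's video sitemap format nests one `<video:video>` element per video under a single `<url>` entry, so a gallery page embedding several videos gets one `<url>` with several `<video:video>` children; that is why the plugins collapse duplicate URLs. A minimal entry might look like this (the page URL, thumbnail, and video ID are hypothetical placeholders):

```xml
<url>
  <loc>https://example.com/video-gallery/</loc>
  <video:video>
    <video:thumbnail_loc>https://example.com/thumbs/clip1.jpg</video:thumbnail_loc>
    <video:title>Example clip</video:title>
    <video:description>Short description of the clip.</video:description>
    <video:player_loc>https://www.youtube.com/embed/VIDEO_ID</video:player_loc>
  </video:video>
</url>
```

Each `<loc>` must appear only once across the whole video sitemap, so the gallery pages and the individual blog posts cannot both list the same video under duplicate URLs.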
Best method to update navigation structure
Hey guys, We're doing a total revamp of our site and will be completely changing our navigation structure. Similar pages will exist on the new site, but the URLs will be totally changed. Most incoming links just point to our root domain, so I'm not worried about those, but the rest of the site does concern me. I am setting up 1:1 301 redirects for the new navigation structure to handle getting incoming links where they need to go, but what I'm wondering is what is the best way to make sure the SERPs are updated quickly without trashing my domain quality, and ensuring my page and domain authority are maintained. The old links won't be anywhere on the new site. We're swapping the DNS record to the new site so the only way for the old URLs to be hit will be incoming links from other sites. I was thinking about creating a sitemap with the old URLs listed and leaving that active for a few weeks, then swapping it out for an updated one. Currently we don't have one (kind of starting from the bottom with SEO) Also, we could use the old URLs for a few weeks on the new site to ensure they all get updated as well. It'd be a bit of work, but may be worth it. I read this article and most of that seems to be covered, but just wanted to get the opinions of those who may have done this before. It's a pretty big deal for us. http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well Am I getting into trouble if I do any of the above, or is this the way to go? PS: I should also add that we are not changing our domain. The site will remain on the same domain. Just with a completely new navigation structure.
Intermediate & Advanced SEO | CodyWheeler0 -
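The 1:1 redirect mapping described above is easy to generate from a spreadsheet rather than written by hand. A hedged sketch, assuming an Apache host with mod_alias and a hypothetical `old_path,new_path` CSV exported from the old site's URL list:

```python
# Sketch: turn a CSV of old->new paths into Apache "Redirect 301" rules.
# Assumes Apache with mod_alias enabled; the CSV layout (old_path,new_path)
# is a hypothetical export from the old navigation structure.
import csv

def redirect_rules(csv_path: str) -> list[str]:
    rules = []
    with open(csv_path, newline="") as f:
        for old, new in csv.reader(f):
            rules.append(f"Redirect 301 {old} {new}")
    return rules  # paste into .htaccess or a vhost config
```

Generating the rules from the same data that drives the new navigation keeps the mapping complete, which matters more for preserving authority than any temporary old-URL sitemap.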
Sitemaps recommended by Google
Google in its guidelines recommends creating a sitemap. Do they mean a /sitemap.xml file, or does the sitemap need to be directly on the website? Does it make any difference? Thank you,
Intermediate & Advanced SEO | seoanalytics0 -
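Per the sitemaps.org protocol, the file can have any name and sit at any crawlable URL, but a sitemap may only reference URLs at or below its own directory, which is why the root-level /sitemap.xml is the convention. Its location can also be advertised in robots.txt (hostname is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

With the robots.txt directive in place, crawlers can discover the sitemap without a Search Console submission.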
How to Best Establish Ownership when Content is Duplicated?
A client (Website A) has allowed one of their franchisees to use some of the content from their site on the franchisee site (Website B). This franchisee lifted the content word for word, so - my question is how to best establish that Website A is the original author? Since there is a business relationship between the two sites, I'm thinking of requiring Website B to add a rel=canonical tag to each page using the duplicated content and referencing the original URL on site A. Will that work, or is there a better solution? This content is primarily informational product content (not blog posts or articles), so I'm thinking rel=author may not be appropriate.
Intermediate & Advanced SEO | Allie_Williams0 -
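A cross-domain rel=canonical is just a `<link rel="canonical" href="...">` tag in the head of each Website B page pointing at the matching URL on Website A, and Google does honor it across domains. A stdlib-only sketch for spot-checking that the franchisee actually put the tag in place (the URLs are hypothetical):

```python
# Sketch: extract a page's rel=canonical so you can verify it points at
# the original URL on site A. Stdlib-only; URLs in use are hypothetical.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(html: str):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical  # None if the page declares no canonical
```

Fetching each Website B page and asserting `find_canonical(page)` equals the Website A URL turns the business agreement into something you can audit.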
Best url structure
I am making a new site for a company that services many cities. I was thinking of a URL structure like this: website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5. Will this be the best approach to optimize the site for the keyword plus 5 different cities, as long as I keep the total URL under the SEOmoz-recommended 115 characters? Or would it be better to build separate pages for each city, rewording the main services to avoid duplicate content?
Intermediate & Advanced SEO | jlane90 -
Best practices for handling https content?
Hi Mozzers - I'm having an issue with https content on my site that I need help with. Basically we have some pages that are meant to be secured, cart pages, auth pages, etc, and then we have the rest of the site that isn't secured. I need those pages to load correctly and independently of one another so that we are using both protocols correctly. Problem is - when a secure page is rendered the resources behind it (scripts, etc) won't load with the unsecured paths that are in our master page files currently. One solution would be to render the entire site in https only, however this really scares me from an SEO standpoint. I don't know if I want to put my eggs in that basket. Another solution is to structure the site so that secure pages are built differently from unsecured pages, but that requires a bit of re-structuring and new SOPs to be put in place. I guess my question is really about best practices when using https. How can I avoid duplication issues? When do I need to use rel=canonical? What is the best way to do things here to avoid heavy maintenance moving forward?
Intermediate & Advanced SEO | CodyWheeler0 -
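One common middle ground, assuming the scripts and styles are served from your own host: reference assets protocol-relatively (or always via https) so the same master pages work under both schemes and nothing breaks on the secured cart and auth pages. A hedged sketch of the rewrite (the hostname is a placeholder, not the asker's site):

```python
# Sketch: rewrite hard-coded http:// asset references to protocol-relative
# URLs so they load cleanly on both http and https pages.
# The hostname is a hypothetical placeholder.
import re

def make_protocol_relative(html: str, host: str = "www.example.com") -> str:
    # http://host/... -> //host/...  (inherits whichever scheme the page used)
    return re.sub(rf"http://{re.escape(host)}/", f"//{host}/", html)
```

For the duplication worry, serving every page under both schemes is where rel=canonical earns its keep: each page canonicals to one preferred protocol so only one version is indexed.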
Export Website into XML File
Hi, I am having an agency optimize the content on my sites. I need to create an XML schema before I export the content into XML. What is the best way to export an entire site's content, including meta tags, and what are the steps to do it?
Intermediate & Advanced SEO | Melia0 -
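There is no single standard for this kind of handoff, but a small script against the CMS database is usually the cleanest route. A sketch, assuming hypothetical page records with url/title/meta_description/body fields; the element names form the ad-hoc schema the agency would receive:

```python
# Sketch: serialize page records (including meta tags) to XML for an
# agency handoff. The record fields are hypothetical -- map them to
# whatever your CMS actually exposes.
import xml.etree.ElementTree as ET

def export_pages(pages: list[dict], out_path: str) -> None:
    root = ET.Element("site")
    for p in pages:
        page = ET.SubElement(root, "page", url=p["url"])
        ET.SubElement(page, "title").text = p["title"]
        ET.SubElement(page, "meta_description").text = p["meta_description"]
        ET.SubElement(page, "body").text = p["body"]
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)
```

Agreeing on the element names with the agency first effectively is the "XML Schema" step; the export itself is then mechanical.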
Which is best structure for Multiple XML Sitemap?
I have read some great blog posts on multiple XML sitemaps on the following websites over the past week: SEOmoz, Distilled, Google Webmaster Central Blog, Search Engine Land, SEO Inc. I have created multiple XML sitemaps for my eCommerce website with the following structure and submitted them to Google Webmaster Tools: http://www.vistastores.com/main_sitemap.xml http://www.vistastores.com/products_sitemap.xml But I am not satisfied with my second XML sitemap, because it contains more than 7K+ product page URLs and Google seems to crawl it very slowly! I want to split my XML sitemap with one of the following structures. With root-level categories: http://www.vistastores.com/outdoor_sitemap.xml http://www.vistastores.com/furniture_sitemap.xml http://www.vistastores.com/kitchen_dining_sitemap.xml http://www.vistastores.com/home_decor_sitemap.xml OR with end-level categories: http://www.vistastores.com/table_lamps_sitemap.xml http://www.vistastores.com/floor_lamps_sitemap.xml etc. So, which is the best structure for multiple XML sitemaps?
Intermediate & Advanced SEO | CommercePundit0
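Whichever split is chosen, the pieces are tied together with a sitemap index file. Note that the protocol allows up to 50,000 URLs per child sitemap, so 7K+ URLs is well within limits for one file; the splitting mainly helps you see per-category indexation in Webmaster Tools. A sketch that generates the index for category files like those proposed above:

```python
# Sketch: generate a sitemap index file that ties per-category sitemaps
# together, following the sitemaps.org <sitemapindex> format.
import xml.etree.ElementTree as ET

def build_sitemap_index(sitemap_urls: list[str], out_path: str) -> None:
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    idx = ET.Element("sitemapindex", xmlns=ns)
    for url in sitemap_urls:
        sm = ET.SubElement(idx, "sitemap")
        ET.SubElement(sm, "loc").text = url
    ET.ElementTree(idx).write(out_path, encoding="utf-8", xml_declaration=True)
```

Submitting only the index to Webmaster Tools is enough; the child sitemaps are discovered from it, and each one reports its indexed-URL count separately.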