Any Good XML Sitemaps Generator?
-
I was wondering if anyone could recommend the XML sitemap generators they use. I've been using XML-Sitemaps, and it's been a little hit and miss for me. On some sites it works great; on others it has serious problems indexing pages. I've also used Google's, but unfortunately it's not very flexible to use.
Any recommendations would be much appreciated.
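For context on what these tools actually produce: a sitemap is just the sitemaps.org XML protocol, and the generation step itself is trivial once you have a URL list (the hard part the tools above compete on is crawling). A minimal sketch in Python — `make_sitemap` is a hypothetical helper, not part of any tool mentioned here:

```python
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # <loc> is the only required child of <url>; escape &, <, > in URLs
        lines.append('  <url><loc>%s</loc></url>' % escape(url))
    lines.append('</urlset>')
    return "\n".join(lines)

print(make_sitemap(["http://example.com/", "http://example.com/about"]))
```

Optional per-URL fields like `<lastmod>` and `<changefreq>` are what the fancier generators fill in for you.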
-
I would give DYNO Mapper a try. It's a great sitemap generator that includes link/inventory information, Google Analytics integration, and comments for collaboration. It can import XML or scan URLs pretty accurately. It is probably more useful for planning architecture than for exporting for Google, though.
-
I will give GSiteCrawler a shot. I've been very impressed with the free Screaming Frog crawler, and if I were to select an alternative today, it'd be this. I'm just debating with myself the cost of the paid version.
-
Hi, you can try using GSiteCrawler.
Download it from here: http://gsitecrawler.com/en/download/
Best regards,
Devanur Rafi
-
It's HTML with embedded cgi-bin eCommerce software that produces dynamic pages.
-
Is your site built on WordPress? If so, there are several plugins that work well.
-
The trouble XML-Sitemaps seems to have is with rich snippets and structured data; it has trouble recognizing pages that contain that markup. I was wondering if anybody else has had a similar problem? The sites I have without snippets work fine.
-
Google Webmaster Tools recommends http://www.xml-sitemaps.com/, and I have to say, I've been happy using it.
Related Questions
-
Are HTML Sitemaps Still Effective With "Noindex, Follow"?
A site we're working on has hundreds of thousands of inventory pages that are generally "orphaned" pages. To reach them, you need to do a lot of faceting on the search results page. They appear in our XML sitemaps as well, but I'd still consider these orphan pages. To assist with crawling and indexation, we'd like to create HTML sitemaps to link to these pages. Due to the nature (and categorization) of these products, this would mean we'll be creating thousands of individual HTML sitemap pages, which we're hesitant to put into the index. Would the sitemaps still be effective if we add a noindex, follow meta tag? Does this indicate lower quality content in some way, or will it make no difference in how search engines will handle the links therein?
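For reference, the "noindex, follow" directive being discussed is a single meta tag in the `<head>` of each HTML sitemap page — a sketch of what that would look like:

```html
<!-- Keep the sitemap page itself out of the index,
     but still let crawlers follow the links it contains -->
<meta name="robots" content="noindex, follow">
```

Note that search engines have to be able to crawl the page for the tag to be seen, so these pages shouldn't also be blocked in robots.txt.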
Intermediate & Advanced SEO | mothner
-
Would changing permalink structure of 7,500 articles be good or bad?
Morning everyone, I'm the tech at a large men's lifestyle publisher and we're currently running the old /year/month/ URL structure in WordPress. Now, I've read countless articles about the pros and cons of date-based vs. post-name formats (/2016/06/sample-post/ vs. /sample-post/), and considering we produce both evergreen and daily news content, we're stuck making a decision. Currently we receive about 10,000 organic referrals per day (it has been stuck at this level for 12 months), but considering we have 7,500 articles, have 10 full-time staff, and have been around for close to 7 years, we think we're underperforming. Now, provided we 301 redirect every old article to the new structure, is there any other reason not to make this change? Any advice would be appreciated.
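For what it's worth, the 301 half of a /year/month/ to /post-name/ move can be sketched in Apache .htaccess terms — this pattern is illustrative only, and real WordPress installs usually handle it via the permalink settings plus a redirect plugin:

```apache
# Hypothetical sketch: 301 /2016/06/sample-post/ -> /sample-post/
RewriteEngine On
RewriteRule ^[0-9]{4}/[0-9]{2}/(.+)$ /$1 [R=301,L]
```

The key point either way is that every old URL must answer with a single 301 hop to its exact new counterpart, not to the homepage.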
Intermediate & Advanced SEO | lucwiesman
-
What are partial URLs, and why is this causing a sitemap error?
Hi mozzers, I have a client that recorded 7 errors when generating an XML sitemap. One of the errors appears to be coming from partial URLs, and apparently I would need to exclude them from the sitemap. What are they exactly, and why would they cause an error in the sitemap? Thanks!
Intermediate & Advanced SEO | Ideas-Money-Art
-
How do I fix my sitemap?
I have no idea how this happened, but our sitemap was http://www.kempruge.com/sitemap.xml; now it's http://www.kempruge.com/category/news/feed/ and Google won't index it. It 404s. Obviously I must have done something wrong, but I don't know what, and more importantly, I don't know how to find it in the WordPress backend to change it. I tried a 301 redirect, but GWT still 404'd it. Any ideas? And it's been like this for a few weeks; I've just neglected it, so I can't just reset the site without losing a lot of work. Thanks, Ruben
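For reference, the 301 attempt described above would look something like this as an Apache .htaccess rule — a hypothetical sketch that assumes the real sitemap still exists at /sitemap.xml and that the broken path isn't being intercepted by WordPress's own rewrite rules first:

```apache
# Hypothetical sketch: send requests for the broken path to the real sitemap
Redirect 301 /category/news/feed/ /sitemap.xml
```

If GWT still reports a 404 after something like this, the rule probably isn't firing, which is worth verifying with a direct request (e.g., curl -I) before resubmitting the sitemap.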
Intermediate & Advanced SEO | KempRugeLawGroup
-
Generating Rich Snippets without Structured Data
I noticed something in Google search results today that I can't explain. Any help would be appreciated. I performed a real estate based search and the top result featured a rich snippet showcasing the following:

Address | Price | Bd/Ba
912 Garden District Dr #17, Charlotte, NC 28202 | $179,990 | 3 / 2
222 S Caldwell St #1602, Charlotte, NC 28202 | $389,238 | 2 / 2 1/2

However, when I visit the page associated with this information, there is no Schema to be found. In fact, the page is, for the most part, just a large table listing homes on the market. The table headings are Address, Price, and Bd/Ba. Is it common for Google to use table-based data to generate rich snippets? What is the best way to influence this? In the absence of Schema (as the page we are talking about has no Schema implementation), does Google default to table data? Has anyone seen this behavior before and, if so, can you point me to it? EDIT: I've now come across a few other examples where the information is not in a table, but rather in divs. Why are such sites (you can find some by searching for "[ZIPCODE] real estate") getting this treatment?
Intermediate & Advanced SEO | RyanOD
-
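For comparison, the explicit way to mark up a listing like the rows in the question above is JSON-LD structured data. This is only an illustrative sketch — the values are copied from the example, and the exact schema.org types a real estate site should use depend on what the page represents:

```json
{
  "@context": "https://schema.org",
  "@type": "Offer",
  "price": "179990",
  "priceCurrency": "USD",
  "itemOffered": {
    "@type": "SingleFamilyResidence",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "912 Garden District Dr #17",
      "addressLocality": "Charlotte",
      "addressRegion": "NC",
      "postalCode": "28202"
    },
    "numberOfRooms": 3
  }
}
```

The point of the question stands either way: Google can sometimes extract this information from a plain, well-labeled table without any such markup, but explicit structured data is the only way to reliably influence it.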
Can I, in Google's good graces, check for Googlebot to turn on/off tracking parameters in URLs?
Basically, we use a number of parameters in our URLs for event tracking, and Google could be crawling an infinite number of these URLs. I'm already using the canonical tag to point at the non-tracking versions of those URLs... but that doesn't stop the crawling. I want to know if I can do conditional 301s, or just detect the user agent, as a way to know when NOT to append those parameters. I'm just trying to follow their guidelines about allowing bots to crawl without things like session IDs, but they don't tell you HOW to do this. Thanks!
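For reference, the canonical setup the poster already has in place means every tracking-parameter variant carries the same tag as the clean URL — example.com and the path here are placeholders:

```html
<!-- Served identically on /page/ and on /page/?utm_source=...&event=... -->
<link rel="canonical" href="http://example.com/page/">
```

Serving different content or redirects based on user-agent detection is the part that risks being treated as cloaking, which is why the canonical tag plus URL parameter settings in Webmaster Tools is the usual answer here.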
Intermediate & Advanced SEO | KenShafer
-
How good or bad are free WordPress themes for SEO purposes?
I was wondering if the free WordPress themes would suffice, as long as the right plugins were added for SEO purposes?
Intermediate & Advanced SEO | bronxpad
-
What's a good place for a copywriter to start researching the more technical aspects of SEO?
I've been working as a copywriter for about a year and a half now, and I feel like the more advanced SEO topics (rel= tags, .htaccess files, etc.) are a bit over my head. Is there a website where I can read up on all of these things? I have a basic understanding of them, but I couldn't talk about them for very long, and I want to become more well-rounded as a search marketer. Thanks!!
Intermediate & Advanced SEO | nxmassa