Sitemap generator only finding part of my website's URLs
-
Hi everyone,
When creating my XML sitemap, the generator is only able to detect a portion of the website. I am missing at least 20 URLs (blog pages plus newly created resource pages). I have checked the missing URLs: all of them are indexable and none are blocked by robots.txt.
Any idea why this is happening? I need to make sure all the URLs I want are included in the generated XML sitemap.
Thanks!
-
Gaston,
Interestingly enough, by default the generator only located about half of the URLs. I hope that one of those two fields will do the trick.
-
Hi Taysir,
I've never used that service. I suspect that the section you refer to should do the trick.
I assume you know how many URLs there are in the whole site, so you can compare the number pro-sitemaps.com finds against your own count. Best of luck!
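That comparison can be sketched as a simple set difference. A minimal sketch, assuming hypothetical URL lists (in practice you would load your own known URL list and the generator's output):

```python
# Hypothetical URL sets for illustration; swap in your real lists.
known_urls = {
    "https://www.example.com/",
    "https://www.example.com/blog/post-1/",
    "https://www.example.com/resources/guide/",
}
generator_urls = {
    "https://www.example.com/",
    "https://www.example.com/blog/post-1/",
}

# URLs the generator failed to discover
missing = sorted(known_urls - generator_urls)
print(missing)
```

Any URL in `missing` is a page the generator's crawl never reached, which usually points at an internal-linking gap.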
GR -
Thanks for your response, Gaston. These pages are definitely not blocked by the robots.txt file; I think it is an internal linking problem. I subscribed to pro-sitemaps.com and was wondering if I should use this section to add the remaining sitemap URLs that are missing: https://cl.ly/0k0t093f0Y1T
Do you think this would do the trick?
-
Google provides a basic template, so you could build the sitemap manually if you wished, and Google also lists several dozen open-source sitemap generators.
If Google Webmaster Tools can't fully read the one you generated, an alternate generator may well fix that for you. Good luck!
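For reference, that basic template is just the sitemaps.org XML protocol, so a hand-rolled sitemap takes only a few lines. A minimal sketch, using hypothetical example URLs:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap string per the sitemaps.org protocol."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n  </url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Hypothetical URLs for illustration
sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/blog/first-post/",
])
print(sitemap)
```

Write the result to a file at the site root (e.g. /sitemap.xml) and submit that URL in Webmaster Tools; optional tags like `<lastmod>` can be added per `<url>` entry in the same way.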
-
Hi Taysir!
Have you tried any other crawler to check whether those pages can be found?
I'd strongly suggest the Screaming Frog SEO Spider; the free version allows up to 500 URLs. It also has a feature to create sitemaps from the crawled URLs, though I don't know whether that's available in the free version.
Here is some info about that feature: XML sitemap generator - Screaming Frog
The usual reasons pages aren't findable are:
- Poor internal linking
- Not having a sitemap (which is how you discover issues like this)
- Blocked resources in robots.txt
- Blocked pages with robots meta tag
That being said, it's completely normal for Google to have indexed pages that you can't find in an ad-hoc crawl; Googlebot could have discovered those pages through external links.
Also keep in mind that blocking pages with robots.txt or a robots meta tag will not prevent them from being indexed, nor will adding blocking rules get already-indexed pages deindexed. Hope it helps.
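If you want to rule out the robots.txt item programmatically, Python's standard library can evaluate the same rules a crawler would. A minimal sketch, assuming a hypothetical robots.txt and example URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch it from
# https://www.example.com/robots.txt
robots_txt = """\
User-agent: *
Disallow: /intranet/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether a crawler obeying these rules may fetch each URL
for url in ("https://www.example.com/blog/post-1/",
            "https://www.example.com/intranet/login"):
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, verdict)
```

Running this over the 20 missing URLs would confirm in seconds whether robots.txt is the culprit or whether the problem is internal linking.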
Best luck
GR
Related Questions
-
Ecommerce Preferred URL Structure for Printing Website
Hello Mozers! We are adding an ecommerce functionality to our existing website.
Technical SEO | | CheapyPP
Our company offers a wide range of commercial printing and mail services. We have done a pretty good job over the years of building content, both in terms of our print offerings and a blog section highlighting those offerings. We have finally bitten the bullet and decided to add end-to-end ecommerce functionality: users will be able to price, pay, upload, and order through our website. My question to the community is which subfolder we should use.
The ecommerce functionality is third-party software and needs to sit in a subfolder, and we can't seem to find a good fit. Most of our content pages for print items look like this:
www.website/printing/ (pillar page)
www.website/printing/flyer-printing/
www.website/printing/booklet-printing/
www.website/printing/door-hangers/
www.website/printing/business-cards/
Options would be order-printing/ or prints/. We were thinking /orders/ would be best, but we're not certain and wanted some feedback from the community. If we went this route, the URL structure would be:
order/business-cards (the default ecommerce page)
order/business-cards/full-uv-coating-both-sides (individual product page)
What are your thoughts? CH -
Can you help by advising how to stop a URL from referring to another URL on my website please?
How do I stop a redirect from one URL to another due to a 404 error? Referred URL: (https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/%5Bnull%20id=43484%5D) Referring URL: (https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/)
Technical SEO | | Nichole.wynter2020 -
No structured sitemap
Hello. We face the problem that a lot of sitemaps are structurally not good. In this case we used the WP sitemap plugin to generate the website sitemap and Google XML Sitemaps to generate the sitemap for Google. We also bought the Yoast premium plugin, but the backend warns that the Google XML Sitemaps plugin may cause problems in combination with Yoast. Normally the Google XML Sitemaps generator improves SEO by producing sitemaps for the best indexation by search engines, but the structure is not as we want it. Would Yoast be a better solution for generating structured sitemaps?
This is a section from the current sitemap of www.rovana.be:
Products: Reepgordijn, Plissé - Dupli gordijn, Duo rolgordijn, Paneelgordijn, Jaloezie - Vlinderjaloezie, Poorten, Muggenramen, Velux accessoires, Rolgordijn, Vouwgordijn, Buitenjaloezie, Voorzetrolluik, Glasdak, Glaswand, Vouwdak, Pergola, Verlichting - Verwarming, Automatisering, Lamellendak, Verandazonwering, Screens, Koepel zonwering
This is how we think the sitemap should look; we would like more structure in the different product categories:
Producten
- Zonwering: Zonnescherm, Screens, Verandazonwering, Koepel zonwering, Automatisering, Verwarming – verlichting
- Terrasoverkapping: Lamellendak, Pergola, Vouwdak, Glasdak, Glaswand
- Raamdecoratie: Rolgordijn, Paneelgordijn, Duo rolgordijn, Vouwgordijn, Plissé – dupli gordijn, Jaloezie – vlinderjaloezie, Reepgordijn, Velux accessoires
- Rolluiken: Voorzetrolluiken, Buitenjaloezie, Velux accessoires
- Muggenramen: Muggenraam, Velux accessoires
- Poorten: Sectionaal poort
Is it technically possible to create sitemaps like this in WordPress, and how exactly do we proceed? What is the impact of these changes on SEO? How can we make this work? Thanks!
Technical SEO | | conversal -
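One way to get that kind of per-category grouping at the sitemap level, independent of any plugin, is a sitemap index file that points at one sitemap per product category. A minimal sketch in Python, with hypothetical per-category sitemap filenames:

```python
def build_sitemap_index(sitemap_urls):
    """Build a sitemap index that references one sitemap file per category."""
    entries = "\n".join(
        f"  <sitemap>\n    <loc>{u}</loc>\n  </sitemap>" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>\n"
    )

# Hypothetical category sitemap filenames for illustration
index_xml = build_sitemap_index([
    "https://www.rovana.be/sitemap-zonwering.xml",
    "https://www.rovana.be/sitemap-raamdecoratie.xml",
    "https://www.rovana.be/sitemap-rolluiken.xml",
])
print(index_xml)
```

Search engines process each referenced sitemap separately, so the grouping is visible in Search Console; Yoast, for instance, generates a sitemap index of this shape by default.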
Having Problems to Index all URLs on Sitemap
Hi all, again! Thanks in advance! My client's site is having problems getting all its pages indexed. I even bought the full version of XML Sitemaps and the number of URLs increased, but we still can't get all of them indexed. What could the reasons be? The robots.txt is open to all robots; we only prohibit users and spiders from entering our intranet. I've read that duplicate content and 404s can be the reason. Anything else?
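Since duplicate content is one of the suspects, a quick first check is whether the sitemap itself lists any URL more than once. A minimal sketch, parsing a hypothetical sitemap with the standard library:

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical sitemap content; in practice, read your generated sitemap.xml
sitemap_xml = """\
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/page-a/</loc></url>
  <url><loc>https://www.example.com/page-a/</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
locs = [el.text for el in root.findall("sm:url/sm:loc", ns)]

# URLs that appear more than once in the sitemap
dupes = [url for url, count in Counter(locs).items() if count > 1]
print(dupes)
```

For the 404 side, you could loop over the same `locs` list, fetch each URL, and flag anything that doesn't return a 200 status.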
Technical SEO | | Tintanus -
Sitemap indexation
Three days ago I submitted a new sitemap for a new platform. It contains 23,412 pages, but so far only 4 (!!) are indexed according to Webmaster Tools. Why so few? Our staging environment (more than 50K pages) got indexed within a few days by mistake.
Technical SEO | | Morten_Hjort -
How to fix these unwanted URLs?
Right now I have a one-page WordPress website, but Google also shows wp-content URLs. Kindly check the query below in Google: site:http://baltimoreelite.com/ How can I fix this issue?
Technical SEO | | marknorman -
How do I resolve Twin domains? redirect website.com to www.website.com?
I am new to this website. I tried to run a campaign and got a warning that website.com resolves to www.website.com, which hinders SERP performance by competing for keyword indexing (website.com is my domain name). I would appreciate help with this. Thanks. S.H. PS: here is the exact wording of the error: We have detected that the domain www.yfvaccine.com and the domain yfvaccine.com both respond to web requests and do not redirect. Having two "twin" domains that both resolve forces them to battle for SERP positions, making your SEO efforts less effective. We suggest redirecting one, then entering the other here.
Technical SEO | | sherohass -
Website Ranking Issue
Hi, We have been performing our own onsite and offsite SEO, along with external assistance, and have ranked well over the years with minimal impact from Google updates. However, the last so-called Panda update has affected us heavily, pushing our main phrase 'web design melbourne' from 2nd to 7th, where we have sat for almost 2 months now on Google.com.au, irrespective of onsite or offsite work. We have been trying to find signs of any onsite, IP, duplicate content, title, or other issues that may be holding us back, to no avail. The only flag Google Webmaster Tools shows is a number of bad internal site links, which I think is a glitch with the CMS we are using. Even the SEOmoz tool gives us a higher ranking than most competitors on page 1 of Google.com.au for our main phrase. The biggest difference between us and our competitors is that we chose to target an internal page specific to the topic rather than our homepage. With this said, we have also reduced our keyword density and content quantity in line with the other sites' homepages. Can anyone help shed some light on this? Perhaps there is something obvious we have missed, or somewhere else we should be looking. Thanks.
Technical SEO | | paulsid