Sitemap indexation
-
Three days ago I submitted a new sitemap for a new platform. It contains 23,412 pages, but so far only 4 pages (!!) are indexed according to Webmaster Tools. Why so few? Our staging environment (more than 50K pages) got indexed by mistake within a few days.
-
Thanks! I'll see if this changes anything.
-
It's not that complicated; it's really easy...
In Google Webmaster Tools, go to Crawl > Fetch as Google. The top-level URL will be displayed at the top of the page. Press the Fetch button to the right.
Google will fetch the page, and the result will be displayed underneath on the same page. To the right of this line you will see a button to submit to the index. When you press it, a pop-up box will appear where you can choose to submit either just this page or this page and all pages linked from it. Select the all-links option (you can only use the full crawl/submit option 10 times in a calendar month; submitting single pages is capped at 500 times a month) and then press Submit.
Google will then queue all those pages for its index.
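As a complement to the Fetch as Google route, you can also re-ping Google with the sitemap itself whenever it changes, using the ping endpoint Google supported at the time (it has since been retired). A minimal sketch in Python; the sitemap path is an assumption, substitute your real one:

```python
# Ping Google's sitemap endpoint so the sitemap gets re-fetched.
import urllib.parse
import urllib.request

sitemap_url = "https://musik.dk/sitemap.xml"  # hypothetical path
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping) as resp:
    # A 200 response only means the ping was received,
    # not that anything has been indexed.
    print(resp.status, resp.reason)
```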
Hope that helps.
Bruce
-
Regarding the error: Google crawled our https://stage.musik.dk instead of just https://musik.dk. We have now put authorization on the subdomain, which produces errors in our account. I made another post about this, and it seems it shouldn't harm our ranking.
Webmaster Tools is an extremely messy tool to work with when you have multiple subdomains plus HTTP/HTTPS variants.
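For what it's worth, HTTP auth will keep Googlebot out going forward, but a robots.txt served only on the staging host is a common belt-and-braces addition. A sketch, assuming stage.musik.dk can serve its own robots.txt:

```
# robots.txt on stage.musik.dk only -- never on the live domain
User-agent: *
Disallow: /
```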
-
Yeah. I've tested it several times, with no errors. Today it's up to 35 indexed pages, but there's a long way to go...
-
What do you mean by manually submitting the site? It's more than 23,000 links, so a manual process is kind of a no-go.
-
Hi,
Are you sure you submitted the right sitemap format/files? We've had it in the past that our sitemap was broken up into multiple files and we had to submit sitemap-index.xml, sitemap-1.xml ... sitemap-16.xml. Have you checked it again and again?
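For reference, a sitemap index that ties the split files together looks roughly like this (file names and dates are placeholders); you then submit only sitemap-index.xml in Webmaster Tools, and the individual files are discovered through it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-1.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-2.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
  <!-- ... one <sitemap> entry per file, up to sitemap-16.xml -->
</sitemapindex>
```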
regards
Jarno
-
Not sure what the problem was with the "by mistake".
Go to Google Webmaster Tools and manually submit the home page and all links from it. This will at least get the ball rolling while you investigate the other possible problems. Also revisit the sitemap to check that it is complete and hasn't missed off a bunch of pages.
Bruce
Related Questions
-
Paginated pages are being indexed?
I have lots of paginated pages which are being indexed. Should I add the noindex tag to page 2 onwards? The pages currently have previous and next tags in place. Page one also has a self-referencing canonical.
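If you do go the noindex route for page 2 onwards, the head of those pages might look like the sketch below; the URLs are placeholders, and the point is that "noindex, follow" drops the pages from the index while keeping the links on them crawlable:

```html
<!-- On page 2 and onwards: stay out of the index but keep links crawlable -->
<meta name="robots" content="noindex, follow">
<!-- Existing pagination hints, unchanged -->
<link rel="prev" href="http://www.example.com/category?page=1">
<link rel="next" href="http://www.example.com/category?page=3">
```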
Technical SEO | WTH
-
Sitemap Rules
Hello there, I have some questions pertaining to sitemaps that I would appreciate some guidance on.
1. Can an XML sitemap contain URLs that are blocked by robots.txt? Logically it makes sense to me not to include pages blocked by robots.txt, but I would like some clarity on the matter, i.e. will having pages blocked by robots.txt in a sitemap negatively impact the benefit of the sitemap? (See the sketch below.)
2. Can an XML sitemap include URLs from multiple subdomains? For example: http://www.example.com/www-sitemap.xml would include the home page URLs of two other subdomains, i.e. http://blog.example.com/ & http://blog2.example.com/
Thanks
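To make point 1 concrete (paths are hypothetical): with the robots.txt below, a sitemap that lists http://www.example.com/private/some-page.html is sending mixed signals, and Google will typically just report "URL blocked by robots.txt" warnings against the sitemap, since it cannot fetch the page to index it.

```
# robots.txt (hypothetical)
User-agent: *
Disallow: /private/
```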
Technical SEO | SEONOW123
-
Is there a maximum sitemap size?
Hi all, Over the last month we've included all images, videos, etc. in our sitemap and now its loading time is rather high (http://www.troteclaser.com/sitemap.xml). Is there a maximum sitemap size recommended by Google?
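For reference, the sitemaps.org protocol that Google follows caps each sitemap file at 50,000 URLs and 10 MB uncompressed (the size limit was later raised to 50 MB); anything larger should be split into multiple files tied together with a sitemap index. A minimal splitting sketch in Python, where the URL list and file names are placeholders:

```python
# Split a URL list into sitemap files of at most 50,000 entries each.
from xml.sax.saxutils import escape

MAX_URLS = 50000

def write_sitemap(filename, urls):
    with open(filename, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write("  <url><loc>%s</loc></url>\n" % escape(url))
        f.write("</urlset>\n")

# Placeholder URLs; in practice these come from your CMS or crawler.
urls = ["http://www.example.com/page-%d" % i for i in range(1, 120001)]
for n, start in enumerate(range(0, len(urls), MAX_URLS), start=1):
    write_sitemap("sitemap-%d.xml" % n, urls[start:start + MAX_URLS])
```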
Technical SEO | Troteclaser
-
Image & Video Sitemaps - Submitted vs. Indexed
Hi Mozzers, I have read all the relevant blogs from media indexing experts like Phil Nottingham and have followed Google's best practices as well as advice from similar discussions on here.
We have submitted video and image sitemaps to Webmaster Tools. The image sitemap has 33 indexed out of 720 submitted images, and the video sitemap 170 indexed out of 738 submitted. With the image sitemap the indexed number (33) has remained steady while the submitted count has grown by over 100 in the last month. The video sitemap does show signs of new videos being indexed, but still nowhere near the number that were submitted.
Thus far I have followed Google's guidelines on sitemap structure. We are using CloudFront, so I have added and verified our CloudFront server in the same Webmaster Tools account. There is no duplicate content, and robots.txt is not blocking anything within the sitemap.
If anyone has any advice, it would be most appreciated. Image sitemap: http://www.clowdy.com/sitemap.images.xml
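For comparison, a single entry in an image sitemap that references a CloudFront-hosted file would look roughly like this (the page and image URLs are made up); the pattern only works if the CloudFront host is verified in the same account, as described above:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.clowdy.com/some-page</loc>
    <image:image>
      <!-- Hypothetical CloudFront distribution hostname -->
      <image:loc>http://d1234abcd.cloudfront.net/uploads/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```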
Technical SEO | Morrreau
-
Page not indexed but still has a PageRank, how?
The page http://www.optiproerp.com/products.aspx is not indexed in Google but still has a PageRank of 1. How is that possible? Regards
Technical SEO | IM_Learner
-
Website is not indexed in Google
Hi Guys, I have a problem with a website from a customer. His website is not indexed in Google (except for the homepage), and I could not find anything that could possibly be the cause. I have already checked the robots.txt, the sitemap, and the plugins on the website. In the HTML code I also couldn't find anything that would make indexing harder than usual.
This is the website I am talking about: http://www.xxxx.nl/ (Dutch)
The only thing I can guess at now is the Google sandbox, but even that is quite unlikely. I hope you guys discover something I could not find! Thanks in advance 🙂
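A quick sanity check covering the usual suspects, sketched in Python (the domain is a placeholder, since it was redacted above): it prints the robots.txt, the X-Robots-Tag response header, and any meta robots tags on the homepage, any of which can block indexing.

```python
# Check the three common indexing blockers on a site.
import re
import urllib.request

site = "http://www.example.nl"  # placeholder; the real domain was redacted

# 1. robots.txt
print(urllib.request.urlopen(site + "/robots.txt").read().decode("utf-8", "replace"))

# 2. X-Robots-Tag header and 3. meta robots tag on the homepage
resp = urllib.request.urlopen(site + "/")
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))
html = resp.read().decode("utf-8", "replace")
print(re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, flags=re.I))
```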
Technical SEO | B.Great
-
Getting More Pages Indexed
We have a large ecommerce site (Magento based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages fluctuates, but currently fewer than 300,000 pages are indexed out of 4 million submitted.
How can we get the number of indexed pages higher? Changing the crawl rate settings and resubmitting sitemaps doesn't seem to have an effect on the number of pages indexed. Am I correct in assuming that most individual product pages just don't carry enough link juice to be considered important enough by Google to be indexed yet? Let me know if there are any suggestions or tips for getting more pages indexed.
Technical SEO | Mattchstick
-
URL restructure and phasing out HTML sitemap
Hi SEOmozzers, I love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up our complex URL structures and remove the ugly sub-sitemap approach currently used. I have already found a number of suggestions, but it looks like I am dealing with several challenges that I need to resolve in a single release. Here is the current setup:

The website is an ecommerce site (a department store) with around 30k products, using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other URL params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not at all descriptive to our users. Many of these URLs are indexed by search engines; we currently have 1.2 million of them indexed, including session IDs and all other nasty URL params.

Next to this, the site uses a "sub site" that is sort of optimized for SEO; I am not 100% sure this counts as cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. Its layout is similar to our main site, but all complex HTML elements like multi-select and the large top navigation menu are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url; currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages we try to push them back to the main site as quickly as possible.

My planned approach to improve this:
1. Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.
2. Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage. See the sketch after this post.

I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; let me know if anything is not clear. Thanks!
Chris
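One way to wire up the redirects for step 2, assuming an Apache front end (the server type, map file, and rules are assumptions, not from the question):

```apache
# Map old sub-site URLs onto the new SEO-friendly ones.
# subsite-redirects.txt format (one entry per line):
#   optimized-url   http://www.domain.tld/new-url
RewriteEngine On
RewriteMap subsitemap txt:/etc/apache2/subsite-redirects.txt

# Known equivalents: 301 to the mapped URL.
RewriteCond ${subsitemap:$1|NOT_FOUND} !NOT_FOUND
RewriteRule ^/1/(.+)$ ${subsitemap:$1} [R=301,L]

# Anything under /1/ with no match: 301 to the homepage.
RewriteRule ^/1/ http://www.domain.tld/ [R=301,L]
```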
Technical SEO | eCommerceSEO