Submitting an 'HTTPS' sitemap.xml to Bing
-
I have been trying to submit my sitemap to Bing [via their webmaster tools] for well over a week and it continues to report 'pending'. My site is HTTPS and the sitemap is accepted by Google. I questioned Bing about this and got this response:
To set your expectations, our Sitemap fetchers use a different pipeline and because of this, we cannot crawl Sitemaps in HTTPS format. We require that you submit an HTTP version of sitemap in order for Bing to properly crawl the file. Please go ahead and delete the current Sitemap and resubmit a new one in HTTP.
Currently I don't, and can't, have an HTTP version of my site & sitemap, and my developers are telling me that 3 hours' worth of dev time will go into coming up with a work-around, which I'm not sure I want to invest in [I have more important things to concentrate my spend on!].
Has anyone been faced with this problem? Is there any quick/cheap alternative, or do I just accept that Bing won't crawl my sitemap until they update their end?!
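One low-effort pattern sometimes suggested for this is to leave the site itself on HTTPS but expose the sitemap file at an HTTP URL, with every `<loc>` entry still pointing at the HTTPS pages. Whether Bing accepts HTTPS entries in an HTTP-hosted file is worth verifying, since strict readings of the sitemap protocol tie entries to the sitemap's own location. A minimal sketch of generating such a file with Python's standard library; `example.com` and the paths are placeholders:

```python
# Sketch: build a sitemap.xml whose <loc> entries point at the HTTPS
# pages. The resulting file could then be exposed at an HTTP URL
# (e.g. http://example.com/sitemap.xml) for Bing's fetcher.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(https_urls):
    # Register the sitemap namespace as the default so tags render
    # without a prefix, as in a hand-written sitemap.
    ET.register_namespace("", NS)
    urlset = ET.Element("{%s}urlset" % NS)
    for page in https_urls:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        loc = ET.SubElement(url, "{%s}loc" % NS)
        loc.text = page
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```

The generated string can be written to whatever path the HTTP endpoint serves from; the point is only that the fetch URL and the listed URLs need not share a scheme.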
-
Hi Matthew, your response makes perfect sense. Thankfully Bing [seems to be!!] indexing my site - well, certainly the pages that count, as we are showing up in search results. We've been trying to come up with a work-around, but all solutions involve an element of dev time, which I don't really think is money well spent - at the moment, anyway!
Cheers
Iain
-
Hey Iain. If it were me, I'd probably just accept that Bing can't crawl the sitemap and let it go. XML sitemaps are important, but not something that will generally make a huge, life-altering difference to your website's performance.
Now, I say "probably" because I'm wondering if you are having indexing problems with Bing. Are there pages you want Bing to index that it can't reach easily (or at all) without an XML sitemap? If so, then maybe it is worth the 3 hours of dev time to get the XML sitemap in place. Alternatively, you could find other ways to link to the pages Bing isn't currently indexing (on your site or others) to get those pages noticed.
Related Questions
-
Can I redo my submitted sitemap to Google?
We are an electronic hardware manufacturer with a fairly large catalog of products. I dynamically built our site and we have over 705,000 unique products that we can offer. With our PHP framework I was able to create sitemaps that hold every product's unique URL. After doing all of that, I submitted our data to Google, then waited with a cocktail, encouraged that we'd climb the ranks of Google organically. Well, that didn't happen. Besides several other problems (lack of overall unique content, appearance of duplicate content, no meta descriptions, no unique page titles, poor use of heading tags, and no rel canonical tags), how can I get a "do-over" with Google and my submitted sitemaps? Can they be re-submitted? Can they even be deleted?
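Sitemaps submitted to Google can be deleted and resubmitted from Search Console, so a "do-over" is mostly a matter of fixing the files and content first. One structural point worth checking at this scale: the sitemap protocol caps each file at 50,000 URLs, so a 705,000-product catalog needs multiple sitemap files tied together by a sitemap index. A rough sketch of the chunking; the domain and file names are placeholders:

```python
# Sketch: split a large URL list into sitemap files of at most
# 50,000 URLs each (the protocol's per-file limit) and derive the
# entries a sitemap index would list.
MAX_URLS = 50_000

def chunk_urls(urls, limit=MAX_URLS):
    # Slice the full list into consecutive chunks of at most `limit`.
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

def index_entries(n_chunks, base="https://example.com"):
    # File names for the sitemap index (placeholder naming scheme).
    return ["%s/sitemap-%d.xml" % (base, i + 1) for i in range(n_chunks)]

urls = ["https://example.com/product/%d" % i for i in range(705_000)]
chunks = chunk_urls(urls)
print(len(chunks))                     # 15 sitemap files
print(index_entries(len(chunks))[0])   # https://example.com/sitemap-1.xml
```

Each chunk would be written out as its own sitemap file, with the index file submitted once in Search Console.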
Reporting & Analytics | jandk40140
-
Google Analytics Set-Up for a site with both http & https pages
We have a client that migrated to HTTPS last September. The site uses canonicals pointing to the HTTPS version. The client's IT team is reluctant to put 301 redirects from the non-secure to the secure version, and we are not sure why they object. We ran a Screaming Frog report and it is showing both URLs for the same page (http and https). The non-secure version has a canonical pointing to the secure version. For every secure page there is a non-secure version in Screaming Frog, so Google must be ignoring the canonical and still indexing the page; however, when we run a site: search, we see that most URLs are the secure version. At that time we did not change the Google Analytics setup option to use "https" instead of "http", BUT GA appears to be recording data correctly. Yesterday we set up a new profile and selected "https", but our question is: does the Google Analytics http/https setting make a difference, and if so, what difference?
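As a side check, the Screaming Frog observation (every page appearing under both schemes) can be reproduced from any crawl export by grouping URLs on everything except the scheme and flagging pages seen under both. A small sketch with made-up URLs:

```python
# Sketch: flag pages that a crawl reports under both http and https.
from urllib.parse import urlsplit

def duplicate_scheme_pairs(crawled):
    seen = {}
    for u in crawled:
        parts = urlsplit(u)
        # Group on host + path, ignoring the scheme.
        key = (parts.netloc, parts.path)
        seen.setdefault(key, set()).add(parts.scheme)
    # Keep only pages crawled under both schemes.
    return [key for key, schemes in seen.items()
            if {"http", "https"} <= schemes]

crawl = [
    "http://example.com/pricing",
    "https://example.com/pricing",
    "https://example.com/contact",
]
print(duplicate_scheme_pairs(crawl))  # [('example.com', '/pricing')]
```

Running this against the full export gives a concrete list of pages the 301 redirects (once the IT team agrees) would need to cover.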
Reporting & Analytics | RosemaryB1
-
HTTPS question
I had a site with an SSL certificate on it. It has now been taken off. We are getting 404 errors on the weekly report for pages that were indexed as https. What is the best way to get rid of these? How can I take them off the map, or do we need to put the SSL certificate back on? Thanks
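If reinstating the certificate is off the table, the usual cleanup is to 301-redirect each old https URL to its http equivalent so the 404s resolve. Note, though, that a crawler can only receive that redirect if the https endpoint still answers with a valid certificate, so some form of SSL may be unavoidable. A sketch of deriving the redirect targets; the URLs are placeholders:

```python
# Sketch: derive http equivalents for pages that were indexed under
# https, e.g. to feed a 301 redirect map.
def to_http(url):
    # Swap only the scheme; leave everything else untouched.
    if url.startswith("https://"):
        return "http://" + url[len("https://"):]
    return url

indexed = ["https://example.com/", "https://example.com/blog/post"]
redirects = {u: to_http(u) for u in indexed}
print(redirects["https://example.com/blog/post"])  # http://example.com/blog/post
```

The resulting mapping would be translated into whatever redirect syntax the web server uses.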
Reporting & Analytics | sharpster0
-
What's the new "Value" column in GWT about?
I was checking out our GWT this morning and noticed a new column on the far right labeled "Value". Currently, there isn't anything of value (no pun intended) listed, just $NaN. Anyone else see this or know what it might be?
Reporting & Analytics | Shawn_Huber0
-
Where have the 'most changed keyword rankings' gone from the weekly summary emails?
Since the change to Moz, we have noticed that the weekly summary emails do not show the 'most changed keyword rankings' table. We found these extremely helpful and would be disappointed to see them go. Are they going to make a comeback?
Reporting & Analytics | RedAntSolutions2
-
Google Analytics Content Experiments don't deliver 50/50?
Our A/B test is actually delivering at about a 70/30 page view rate: 70% in favor of the original version and only 30% of the new. We are sending 100% of our traffic to this homepage test. Has anyone else experienced this? There seem to be a lot of folks experiencing this... anyone know why?
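Two things worth noting. First, Content Experiments used a multi-armed bandit by default, which deliberately shifts traffic toward the better-performing variation, so drift away from 50/50 is expected unless even distribution is enabled in the experiment settings. Second, you can sanity-check whether an observed split could be random noise with a normal approximation to the binomial; the counts below are made up:

```python
# Sketch: z-score for an observed split against an expected 50/50
# random assignment, via the normal approximation to the binomial.
import math

def z_score(successes, trials, p=0.5):
    mean = trials * p
    sd = math.sqrt(trials * p * (1 - p))
    return (successes - mean) / sd

# e.g. 700 of 1,000 sessions saw the original page
z = z_score(700, 1000)
print(round(z, 1))  # 12.6 -> far outside +/-1.96, not a chance fluctuation
```

A |z| above roughly 1.96 means the split would be very unlikely under true 50/50 assignment, which for a 70/30 result at any reasonable traffic volume points to deliberate reallocation rather than bad luck.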
Reporting & Analytics | VistageSEO0
-
Duplicate content? Split URLs? I don't know what to call this but it's seriously messing up my Google Analytics reports
Hi Friends, This issue is crimping my analytics efforts and I really need some help. I just don't trust the analytics data at this point. I don't know if my problem should be called duplicate content or what, but the SEOmoz crawler shows the following URLs (below) on my nonprofit's website. These are all versions of our main landing pages, and all Google Analytics data is getting split between them. For instance, I'll get stats for the /camp page and different stats for the /camp/ page. In order to make my report I need to consolidate the two sets of stats and re-do all the calculations. My CMS is looking into the issue and has supposedly set up redirects to the pages without the trailing slash, but they said that setting up the rel canonical is not relevant to our situation. If anyone has insights or suggestions I would be grateful to hear them. I'm at my wit's end (and it was a short journey from my wit's beginning ...) Thanks.
URLs:
www.enf.org/camp
www.enf.org/camp/
www.enf.org/foundation
www.enf.org/foundation/
www.enf.org/Garden
www.enf.org/garden
www.enf.org/Hante_Adventures
www.enf.org/hante_adventures
www.enf.org/hante_adventures/
www.enf.org/oases
www.enf.org/oases/
www.enf.org/outdoor_academy
www.enf.org/outdoor_academy/
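While the redirects are being sorted out, the split stats can be consolidated programmatically rather than by hand: normalize each URL (trailing slash, letter case) and sum the metrics per normalized key. A sketch with made-up paths and counts:

```python
# Sketch: consolidate analytics rows whose URLs differ only by a
# trailing slash or letter case.
from collections import defaultdict

def canonical(path):
    # Lowercase and strip the trailing slash; keep "/" for the root.
    path = path.lower()
    return path.rstrip("/") or "/"

def merge_pageviews(rows):
    totals = defaultdict(int)
    for path, views in rows:
        totals[canonical(path)] += views
    return dict(totals)

rows = [("/camp", 120), ("/camp/", 80), ("/Garden", 10), ("/garden", 5)]
print(merge_pageviews(rows))  # {'/camp': 200, '/garden': 15}
```

The same normalization applied to an exported analytics report removes the manual recalculation step, though it obviously doesn't fix the underlying duplicate URLs.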
Reporting & Analytics | DMoff0
-
Differences in keyword rankings in Google and Bing and Yahoo
Hi there, We have some keywords that are ranked so far apart on the search engines it's puzzling. For example, we have keywords ranked at, say, 10, 9, 7 etc. on Google, but not in the top 50 on Bing or Yahoo. Stuff like that. Surely the algorithms can't be that far apart? Is this indeed normal? Does anyone have the same issues? Thanks
Reporting & Analytics | inhouseninja0