CMS Auto-Generated Sitemap Workaround?
-
Hey Moz Community,
The Shopify ecommerce platform auto-generates XML sitemaps and a robots.txt file for you.
Frustratingly, there is no way to augment either of these. If I noindex a page, it will still show up in the sitemap, causing an inconsistency with the sitemap submitted to GWT.
In theory, if I put MY version of the sitemap on the site and point GWT to MY version, would this solve the inconsistency? Or would Googlebot still go in and crawl the default /sitemap.xml anyway?
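For context, the sitemap I'd maintain myself would just be a static XML file built from my own page list. A rough sketch of generating one that skips noindexed pages (the URLs and page list here are hypothetical examples, not my real store):

```python
# Sketch: build a hand-maintained sitemap that excludes noindexed pages.
# The page list and URLs below are hypothetical.
from xml.sax.saxutils import escape

# Hypothetical page list: (URL, is_noindexed)
pages = [
    ("https://example-store.com/", False),
    ("https://example-store.com/collections/sale", False),
    ("https://example-store.com/pages/thank-you", True),  # noindexed, so excluded
]

def build_sitemap(pages):
    """Return sitemap XML containing only the indexable URLs."""
    entries = [
        "  <url><loc>%s</loc></url>" % escape(url)
        for url, noindexed in pages
        if not noindexed
    ]
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

print(build_sitemap(pages))
```

The output would then be uploaded somewhere like /my-sitemap.xml and submitted to GWT in place of the default.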
Any suggestions and insight is greatly appreciated!
-
Hi Dylan,
I haven't worked much with the technical side of Shopify, so I wasn't aware of this. That's very restrictive, though.
I hope you can get this sorted OK.
-Andy
-
Hey Andy,
I haven't directly asked the powers that be at Shopify to turn it off. However, our developer here gave me a fairly technical explanation that essentially comes down to this: because Shopify runs on Rails, and because of the way it handles requests to the server, changing something would change it for everyone.
Thanks for pointing out that the robots file references the old sitemap, by the way. Close one!
-
Hi Dylan,
When you upload a sitemap, you get the option to state which one it is and where it's located. If you were to upload your own and then point to it, this should be fine. Just remember to remove the auto-generated one and never add it back.
Are you able to edit the robots file? If so, you can point to your sitemap in there too. However, if a sitemap reference is already in it, don't add your own; otherwise Google will see two, and inconsistencies will occur.
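If you can edit it, the pointer is just a one-line Sitemap directive in robots.txt, something like this (domain and filename are hypothetical):

```text
# robots.txt — keep only ONE Sitemap line, pointing at the sitemap you maintain
User-agent: *
Disallow:

Sitemap: https://example-store.com/my-sitemap.xml
```

The key point is that there should be exactly one Sitemap line, and it should match the sitemap you submitted in GWT.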
Are they unwilling to turn this off for you?
-Andy