WordPress Blog Integrated into eCommerce Site - Should we use one XML sitemap or two?
-
Hi guys,
I wonder whether you can help me with a couple of SEO queries:
We have an ecommerce website (www.exampleecommercesite.com) with its own XML sitemap, which we have submitted to Google Search Console. However, we recently decided to add a blog to the site for SEO purposes.
The blog is on a subdomain of the site: blog.exampleecommercesite.com.
(We wanted to have it as www.exampleecommercesite.com/blog but our server made it very difficult and it wasn't technically possible at the time)
1. Should we add blog.exampleecommercesite.com as a separate property in Google Webmaster Tools?
2. Should we create a separate XML sitemap for the blog content, or are there more SEO benefits if we have one sitemap covering both the blog and the ecommerce site?
I'd appreciate your opinions on the topic!
Thank you and have a good start of the week!
-
Since your blog is on a subdomain, yes, you will need to set up a separate WMT profile for it. The blog will also need its own robots.txt and XML sitemap files, since, technically speaking, subdomains are regarded as a different site.
-
Thanks Logan! This makes perfect sense, but we just wanted to be 100% sure.
-
Glad to help!
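To make Logan's point concrete, here is a minimal sketch of what the blog subdomain's robots.txt might look like. The /sitemap.xml path is an assumption (WordPress sitemap plugins vary); the Sitemap directive itself must be an absolute URL:

```text
# Served at https://blog.exampleecommercesite.com/robots.txt
# Allow all crawlers (an empty Disallow blocks nothing)
User-agent: *
Disallow:

# Point crawlers at the blog's own sitemap; the URL must be absolute
Sitemap: https://blog.exampleecommercesite.com/sitemap.xml
```

The robots.txt at www.exampleecommercesite.com is never consulted for the subdomain, which is why each host needs its own file.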
Related Questions
-
On one site, a 3rd party asks visitors for feedback via a pop-up that covers 30-50% of the bottom of the screen, depending on screen size. Is the 3rd party or the site in danger of being penalized under the intrusive interstitial guidelines?
I am wondering whether the intrusive interstitial penalty affects all kinds of pop-ups regardless of their nature, e.g. if a third party asks for feedback through a discreet pop-up that appears from the bottom of the screen and covers at most 50% of it. Is the site or the third party asking for the feedback subject to the intrusive interstitial penalty? Also, does the fact that the pop-up covers 30% on some screens and 50% on others play any role?
Algorithm Updates | deels-SEO
-
Atom, RSS feed or XML sitemap - which is better?
Hey Mozers! I'm reaching out to you today because I'm trying to find out more information about Atom and RSS feeds. I'm not sure which my retail company should use for the sitemap: Atom? RSS? XML? Why would you choose one over the others, and what are the benefits?
Algorithm Updates | rpaiva
-
Proactively Use GWT Removal Tool?
I have a bunch of links on my site from sexualproblems.net (not a porn site, it's a legit doctor's site whose owner I've talked to on the phone in America). The problem is his site got hacked and has tons of links on his homepage to other pages, and mine is one of them. I have asked him multiple times to take the link down, but his webmaster is his teenage son, who basically just doesn't feel like it. My question is, since I don't think they will take the link down, should I proactively remove it or just wait till I get a message from Google? I'd rather not tell Google I have spam links on my site, even if I am trying to get them removed. However, I have no idea if that's a legitimate fear or not. I could see the link being removed and everything continuing fine, or I could see the removal request being a giant red flag that gets my site audited. Any advice? Ruben
Algorithm Updates | KempRugeLawGroup
-
Should I use the Disavow Tool at this point?
After Penguin, our site: www.stadriemblems.com jumped up to #1 for the keyword "embroidered patches." Now, months later, it's at the top of page two. I'm pretty sure this is because we do have a few shady links (I didn't do it!) that perhaps Penguin didn't catch the first time around, but now Google is either discounting them or counting them against us. My question is, since I'm pretty sure those links are the reason we are gradually declining, should I submit them to Google as disavowed, even though technically we're not penalized . . . yet? I have done everything possible to get them removed, and it's not happening.
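If you do go the disavow route, the file Google accepts is a plain-text list with one entry per line; lines starting with # are comments, and a domain: prefix disavows every link from that host. A sketch of the format with made-up domains:

```text
# Owner contacted twice (2012-10-01, 2012-11-15), no response
domain:spammy-directory-example.com

# A single specific URL can also be listed on its own line
http://another-example.com/links/page1.html
```

The domains above are hypothetical; you would list the actual shady linking domains you identified.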
Algorithm Updates | UnderRugSwept
-
Recovered from Penguin/Panda, but which one?
So the good news is that for the first time since April 24th, one of our websites is back in the search results as of around December 12, but I am still unsure whether it was Panda or Penguin (or both) that was impacting the site. Note this was not a manual penalty. I diagnosed it as a Penguin issue (drop on April 24th, aggressive on-page optimisation, around 10% of links from spammy directories like addyourfreelinks.com with anchor text built by a questionable agency), but on further advice it was thought that Panda was also an issue: it is a hotel microsite, so there was duplication with our own brand site and across third-party travel sites, and there were a number of pages with thin content. I figured it was a good time to clean everything up to address both. Here is a summary of actions taken: submitted a disavow file on October 24th with all questionable links, including actions taken and comments; cleaned up some content so it is less aggressively targeting certain keywords; amended several third-party listings with duplicate content; noindexed pages that were directly duplicated with our brand site; built a few good-quality links over the last month; and cleaned up 404s in Webmaster Tools over the last week. I have searched to see if there were any algorithm updates around December 12 but cannot find any mentions. Thoughts?
Algorithm Updates | jay.raman
-
Why does Google Alerts call my website a blog?
Our company started a WordPress blog about 14 years ago. It has since added a third-party forum, a user-submitted photo gallery, and a huge database of searchable products. We also have almost 4,000 posts. With all that said, Google Alerts often lists our content under blogs rather than websites; sometimes it shows up in both. Does anyone know what criteria Google uses for determining the type of content, and how we can signal to them that we are a website?
Algorithm Updates | TMI.com
-
How to Link a Network of Sites w/o Penguin Penalties (header links)
I work for a network of sites that offer country-exclusive content. The content for the US is different from Canada, Australia, the UK, etc., but covers the same subjects. To make navigation easy, we have included in the header of every page a drop-down with links to the other countries, like what most of you do with Facebook/Twitter buttons. So every page on every site has the same links, with the same anchor text. Example: Penguins in Canada, Penguins in Australia, Penguins in the USA. Because every page of every site has the same links (it's in the header), the "links containing this anchor text" ratio is through the roof in Open Site Explorer. Do you think this would be a reason for Penguin penalization? If you think this would hurt us, what would you suggest? Nofollow links? Remove the links entirely and create a single page of links? Other suggestions?
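For the nofollow option mentioned above, the change is just a rel attribute on each header link. A sketch with hypothetical URLs and class names:

```html
<!-- Country switcher in the shared header; rel="nofollow" asks search
     engines not to pass link equity through these sitewide links -->
<nav class="country-switcher">
  <a href="https://example.ca/penguins" rel="nofollow">Penguins in Canada</a>
  <a href="https://example.com.au/penguins" rel="nofollow">Penguins in Australia</a>
  <a href="https://example.com/penguins" rel="nofollow">Penguins in the USA</a>
</nav>
```

Note this only stops link equity flowing; the repeated anchor text still appears in the markup on every page.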
Algorithm Updates | BeTheBoss
-
Big site SEO: Maintain HTML sitemaps, or scrap them in the era of XML?
We have dynamically updated XML sitemaps which we feed to Google et al. Our XML sitemap is updated constantly and takes minimal hands-on management to maintain. However, we still have an HTML version (which we link to from our homepage), a legacy from back in the pre-XML days. As this HTML version is static, we're finding it contains a lot of broken links and is not of much use to anyone. So my question is this: does Google (or any other search engine) still need both, or are XML sitemaps enough?
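For anyone weighing the maintenance cost the poster describes, dynamic generation really can be a few lines of code. This is a sketch in Python with hypothetical URLs; a real implementation would pull the loc/lastmod values from the CMS database instead of a hard-coded list:

```python
# Sketch of a dynamically generated XML sitemap (hypothetical URLs).
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    """Serialize (loc, lastmod) pairs into a sitemaps.org-format XML string."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

pages = [
    ("https://www.example.com/", "2017-01-09"),
    ("https://www.example.com/products/widget", "2017-01-05"),
]
print(build_sitemap(pages))
```

A static HTML sitemap could be regenerated from the same data on a schedule, which would at least remove the broken-link problem while the "keep or scrap" question is decided.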
Algorithm Updates | linklater