Domain restructure, sitemaps and indexing
-
I've got a hand-coded site with around 1500 unique articles and a hand-coded sitemap. Very old school.
The URL structure is a bit of a mess, so to make things easier for the developer who'll be making the site database-driven, I thought I'd recategorise the content. Same content, but with a new URL structure. (I thought I'd juice up the URLs for SEO purposes while I was at it.)
To this end, I took categories like:
/body/amazing-big-shoes/
/style/red-boots/
/technology/cyber-boots/
and rehoused all the content like so, doing it all manually with FTP:
/boots/amazing-boots/
/boots/red-boots/
/boots/cyber-boots/
I placed 301 redirects in the .htaccess file like so:
redirect 301 /body/amazing-big-shoes/ http://www.site.co.uk/boots/amazing-boots/
(I didn't set up a redirect for each article, just for the categories, which seemed to make the articles redirect nicely.)
Then I went into sitemap.xml and manually overwrote all the entries to reflect the new url structure, but keeping the old dates of the original entries, like so:
<url>
  <loc>http://www.site.co.uk/boots/amazing-boots/index.php</loc>
  <lastmod>2008-07-08</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.5</priority>
</url>
Then I resubmitted the sitemap to Google Webmaster Tools.
This was done 4 days ago. Webmaster Tools showed that the number of articles indexed - previously 1400 of 1500 - had dropped to 860; today it has climbed back to 939.
Did I adopt correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely?
I appreciate I've made a lot of changes in one fell swoop, which could be a bit of a no-no...?
PS Apologies if this question appears twice on Q&A - hopefully I haven't double-posted
-
If your developer will be making the website dynamic via a system like WordPress, there are automated ways to keep your sitemap up to date: every time you publish a new page, the system regenerates the sitemap and even pings the search engines to tell them it has been updated.
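Under the hood, that "ping" is nothing fancy - Google has historically accepted a simple GET request at a ping endpoint with your sitemap URL attached. A rough sketch in Python of what a plugin does after you publish (the sitemap URL is a stand-in for your own):

import urllib.parse
import urllib.request

# Stand-in sitemap location - replace with your real sitemap URL.
sitemap = "http://www.site.co.uk/sitemap.xml"

# Google's sitemap ping endpoint; a simple GET asks it to re-fetch the sitemap.
ping_url = ("http://www.google.com/webmasters/tools/ping?sitemap="
            + urllib.parse.quote(sitemap, safe=""))
urllib.request.urlopen(ping_url)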
It will be a "set it and forget it" type of thing with sitemaps if you are moving in that direction.
Good luck!
-
Oh no, what you did is perfect! I guess I meant the site architecture/navigation, but you answered that in your original post when you said "same content", so disregard that question.
Congrats.
-
Sadly, I did change the internal linking structure, so internal links now point to the new URLs, not the old ones. The good news is that even with that change, Google seems to be keeping abreast of it all. The number of URLs indexed has jumped - in a day - from 939 to 1024, so good old Google is clearly keeping up with the changes. Looks like my fears were unfounded. Yay!
-
Looks perfect to me too. Did the internal linking structure change at all, or is it still the same? If it's all the same, you should be right back where you were in no time. And you should see some benefits from having a more common-sense, easy-to-understand URL structure. Cheers!
-
That's fair. I get that you're not recommending it personally - but it does seem popular, with consistently good feedback from people, so I'll give it a go.
-
Just to clarify: I know the sitemap tool I mentioned is very popular. Many small sites use it because it is online, fast and free. I have used it a few times myself. I can't necessarily say I recommend it, because I have never personally purchased the software. But if I were looking to obtain a sitemap for your site, I would start with that tool and perhaps take a look at some others.
-
Thanks Ryan, that's a weight off my mind. I'll definitely take your advice on the sitemap generator, too. Thanks for the recommendation - I'd seen a few around but wasn't sure, so it's great to be pointed in the right direction!
-
Did I adopt correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely?
I would say yes to all three, but I'll clarify with details below.
When you submit a sitemap to Google with 1500 pages, there is no guarantee they will index all of them. It sounds like you have done a lot of intensive, manual work. Fortunately, you have done things the correct way by properly redirecting each page to its new URL. If Google indexed 1400 pages before, they should index around that same number once again. It may take several weeks, depending on a few factors such as your site's Domain Authority, its navigation, and how many links each page has received.
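For anyone wondering why category-level rules were enough: Apache's Redirect directive (mod_alias) matches by path prefix and appends the rest of the requested path to the target, so one rule per category forwards every article beneath it. A minimal .htaccess sketch along the lines of what you described - the paths here are just the examples from your post, and your real file will have more categories:

# One rule per category; /body/amazing-big-shoes/some-article.php
# gets forwarded to /boots/amazing-boots/some-article.php automatically.
Redirect 301 /body/amazing-big-shoes/ http://www.site.co.uk/boots/amazing-boots/
Redirect 301 /style/red-boots/ http://www.site.co.uk/boots/red-boots/
Redirect 301 /technology/cyber-boots/ http://www.site.co.uk/boots/cyber-boots/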
With respect to the sitemap, I would highly recommend using sitemap generation software. It is simply not reasonable to manually maintain a sitemap with 1500 entries. I would have updated the lastmod dates in the sitemap, but it may not make any difference.
A popular sitemap tool: http://www.xml-sitemaps.com/. The free version only generates 500 pages, but for $20 you can buy the full version and automate it.
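If your developer would rather script it than buy a generator, the heart of what these tools produce is very simple. Here is a minimal sketch in Python, assuming you can already pull (URL, lastmod) pairs from somewhere such as the new database - the entries below are placeholders, not your real pages:

from xml.sax.saxutils import escape

# Placeholder data - in practice this comes from your database or a crawl.
pages = [
    ("http://www.site.co.uk/boots/amazing-boots/index.php", "2008-07-08"),
    ("http://www.site.co.uk/boots/red-boots/index.php", "2008-07-08"),
]

# Write a sitemaps.org-compliant sitemap.xml with one <url> entry per page.
with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for loc, lastmod in pages:
        f.write("  <url>\n")
        f.write("    <loc>%s</loc>\n" % escape(loc))
        f.write("    <lastmod>%s</lastmod>\n" % lastmod)
        f.write("    <changefreq>monthly</changefreq>\n")
        f.write("    <priority>0.5</priority>\n")
        f.write("  </url>\n")
    f.write("</urlset>\n")

Regenerate and re-upload that file whenever content changes, and the lastmod dates stay honest without any hand-editing.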