Domain restructure, sitemaps and indexing
-
I've got a hand-coded site with around 1500 unique articles and a hand-coded sitemap. Very old school.
The URL structure is a bit of a mess, so to make things easier for a developer who'll be making the site database-driven, I thought I'd recategorise the content: same content, but with a new URL structure. (I thought I'd juice up the URLs for SEO purposes while I was at it.)
To this end, I took categories like:
/body/amazing-big-shoes/
/style/red-boots/
/technology/cyber-boots/
And rehoused all the content like so, doing it all manually with FTP:
/boots/amazing-boots/
/boots/red-boots/
/boots/cyber-boots/
I placed 301 redirects in the .htaccess file like so:
Redirect 301 /body/amazing-boots/ http://www.site.co.uk/boots/amazing-boots/
(I'm not doing a redirect for each article, just one per category; since Apache matches these redirects by URL prefix, the articles underneath each category redirect nicely too.)
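With ~1500 articles across a handful of categories, the category-level rules could also be generated rather than typed by hand. This is a minimal sketch, not the poster's actual setup: the mapping below is hypothetical (built from the example paths in the question), and it relies on Apache's mod_alias `Redirect` matching by URL-path prefix, which is why one rule per category covers every article beneath it.

```python
# Hypothetical old -> new category mapping, based on the examples in the post.
CATEGORY_MAP = {
    "/body/amazing-big-shoes/": "/boots/amazing-boots/",
    "/style/red-boots/": "/boots/red-boots/",
    "/technology/cyber-boots/": "/boots/cyber-boots/",
}

def htaccess_redirects(mapping, host="http://www.site.co.uk"):
    """Emit one category-level Redirect 301 line per mapping entry.

    Apache's mod_alias Redirect matches by URL-path prefix and appends the
    remainder, so a single category rule also redirects every article under
    it, e.g. /style/red-boots/some-article.php ->
    /boots/red-boots/some-article.php.
    """
    return "\n".join(
        f"Redirect 301 {old} {host}{new}" for old, new in mapping.items()
    )

print(htaccess_redirects(CATEGORY_MAP))
```

Paste the output into .htaccess; keeping the mapping in one place also makes it easy to hand over to the developer later.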
Then I went into sitemap.xml and manually overwrote all the entries to reflect the new URL structure, keeping the original dates of the entries, like so:
<url>
  <loc>http://www.site.co.uk/boots/amazing-boots/index.php</loc>
  <lastmod>2008-07-08</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.5</priority>
</url>
And resubmitted the sitemap to Google Webmaster Tools.
This was done four days ago. Webmaster Tools showed the number of articles indexed drop from 1400 (of 1500) to 860, and today it has climbed back to 939.
Did I adopt correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely?
I appreciate I've made a lot of changes in one fell swoop, which could be a bit of a no-no...?
PS Apologies if this question appears twice on Q&A - hopefully I haven't double-posted
-
If your developer will be making the website dynamic via a system like WordPress, there will be automated ways to keep your sitemap up to date every time you publish a new page, and it will even ping the search engines when the sitemap is updated. Sitemaps will be a "set it and forget it" type of thing if you are moving in that direction.
Good luck!
-
Oh no, what you did is perfect! I guess I meant the site architecture/navigation, but you answered that in your original post when you said "same content", so disregard the question. Congrats.
-
Sadly, I did change the internal linking structure, so internal links now point to the new URLs, not the old ones. The good news is that even with that change, Google seems to be keeping abreast of it all. The number of URLs indexed has jumped in a day from 939 to 1024, so good old Google is clearly keeping up with the changes. Looks like my fears were unfounded. Yay!
-
Looks perfect to me too. Did the internal linking structure change at all, or is that still the same? If it's all the same, you should be right back where you were in no time, and you should see some benefit from having a more common-sense, easy-to-understand URL structure. Cheers!
-
That's fair. I get that you're not recommending it personally, but it does seem popular, with consistently good feedback from people, so I'll give it a go.
-
Just to clarify: I know the sitemap tool I mentioned is very popular. Many small sites use it because it is online, fast, and free, and I have used it a few times myself. I can't say I outright recommend it because I have never personally purchased the software. If I were looking to generate a sitemap for your site, I would start with that tool but might take a look at some others as well.
-
Thanks Ryan, that's a weight off my mind. I'll definitely take your advice on the sitemap generator, too. Thanks for the recommendation; I'd seen a few around but wasn't sure, so it's great to be pointed in the right direction!
-
"Did I adopt correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely?"
I would say yes to all three, but I'll clarify with details below.
When you submit a sitemap to Google with 1500 pages, there is no guarantee they will index all of them. It sounds like you have done a lot of intensive, manual work. Fortunately, you have done things the correct way by properly redirecting each old URL to its new one. If Google indexed 1400 pages before, it should index around the same number again. It may take several weeks, depending on factors such as your site's Domain Authority, its navigation, and how many links each page has received.
With respect to the sitemap, I would highly recommend using sitemap-generation software; it is simply not reasonable to maintain a sitemap with 1500 entries by hand. I would have updated the lastmod dates as well, though it may not make any difference.
A popular sitemap tool is http://www.xml-sitemaps.com/. The free version only generates up to 500 pages, but for $20 you can buy the full version and automate it.
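If you'd rather not buy a tool, a sitemap this simple can also be scripted once you have the URL list (which your developer's database will eventually provide). This is a minimal, hand-rolled sketch, not what the generator tool above does; real generators also crawl the site to discover the URLs for you, and the example URL and date here are just illustrations.

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls, lastmod=None):
    """Build a minimal sitemap.xml string from a list of absolute URLs.

    Each entry gets the same lastmod/changefreq/priority, matching the
    flat style of the hand-edited sitemap in the question.
    """
    lastmod = lastmod or date.today().isoformat()
    entries = "\n".join(
        "<url><loc>{}</loc><lastmod>{}</lastmod>"
        "<changefreq>monthly</changefreq><priority>0.5</priority></url>".format(
            escape(u), lastmod
        )
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

print(build_sitemap(["http://www.site.co.uk/boots/red-boots/"], lastmod="2008-07-08"))
```

Once the site is database-driven, the same function can run on every publish, which gets you the "set it and forget it" behaviour mentioned earlier.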