What is the best way to change your site's folder structure?
-
Hi,
Our site was originally created with a very flat folder structure - most of the pages are at the top level. Because we will be adding more content, I want to tidy up the structure first. I just wanted to check what the best way to go about this was. Is it best to:
1. Configure all the new 301 redirects to point to the new pages first, while leaving the actual links on our site pointing to the old pages, then change the links on the site after a few weeks.
2. Configure the redirects and change the actual links on my website at the same time, so both point to the new locations.
My thinking is that if I go with option 1, I give Google a chance to process all the redirects and update the locations in its index before I start pointing my own links to the new locations. But does it make any difference? What is the best way to go about making this sort of change to minimize any loss in rankings, PageRank, etc.?
Thanks for the help.
-
-
Cyklop Studio is correct. Definitely the second option.
301 your pages, change all links, then crawl your site so you can find any links you missed.
Imagine for a moment you moved to a new home. You go to the post office and forward your mail to the new home (i.e. a 301). Would you still keep asking everyone to send your mail to your old home (i.e. keep the old links)? No. You would update everyone you can, then as you notice forwarded mail you can contact those people or businesses you missed and alert them to your new address.
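To make the last step - crawling to find anything you missed - concrete, here is a minimal sketch of a check you could run once the redirects are live. It assumes Python with the requests library installed, and the old-to-new URL pairs shown (example.com paths) are placeholders for whatever mapping you used when setting up the 301s.

```python
import requests

# Hypothetical mapping of old flat URLs to their new folder locations.
# Replace with the real list used when the 301 redirects were configured.
REDIRECT_MAP = {
    "https://www.example.com/blue-widgets.html": "https://www.example.com/widgets/blue-widgets/",
    "https://www.example.com/red-widgets.html": "https://www.example.com/widgets/red-widgets/",
}

def check_redirects(redirect_map):
    """Request each old URL without following redirects and flag anything
    that is not a 301 pointing at the expected new URL."""
    for old_url, new_url in redirect_map.items():
        response = requests.head(old_url, allow_redirects=False, timeout=10)
        status = response.status_code
        location = response.headers.get("Location", "")
        if status != 301:
            print(f"NOT A 301 ({status}): {old_url}")
        elif location.rstrip("/") != new_url.rstrip("/"):
            print(f"WRONG TARGET: {old_url} -> {location} (expected {new_url})")
        else:
            print(f"OK: {old_url} -> {new_url}")

if __name__ == "__main__":
    check_redirects(REDIRECT_MAP)
```

Run it (or a full crawl with your tool of choice) after flipping the links; anything other than an OK line is a page still sending mixed signals.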
-
This is something I am considering doing as well with my site. I would really like to change the folder structure around. The biggest issue is that some of my inner pages are ranking well in the SERPs, and I want that success to continue even after the switch is done.
-
My gut feeling says #2. I'll spend the rest of this post thinking out loud about why I think that is the better option (though I don't think there is actually a 'wrong' and a 'right' option here; both have their advantages and disadvantages).
- Both your visitors and the search engines will stop visiting the old URLs as quickly as possible (saving you bandwidth on the redirects).
- Less 'code overhead' regarding cases such as 'did I change that one already?'
- You are treating search engine robots and human visitors equally.
Love to see what others have to say about this!
-
-
Definitely number 2. If you leave the old links in place, you have conflicting information for Google: one signal saying the page has moved over here, and another saying everything is still where it used to be.
Related Questions
-
Messy older site
I am taking over a website that doesn't have any canonical tags and has spotty redirects. It looks like they have http://, https://, www and non-www pages indexed, but GA is only set up for the http:// non-www home page. Should all versions of the site be set up in GA and Search Console? I think so, but wanted to confirm. Thanks in advance.
Technical SEO | SpodekandCo0
Best way to deal with 100 product pages
It feels good to be BACK. I miss Moz. I left for a long time but happy to be back! 🙂 My client is a local HVAC company. They sell Lennox systems. Lennox provides a tool that we hooked up to that allows visitors to their site to 'see' 120+ different kinds of air quality, furnace and AC units. The problem (I think it's a problem) is that Google and other crawl tools are seeing these 100+ pages, which are not unique, helpful or related to my client. There is a little bit of cookie-cutter text, images and specs, and that's it. Are these pages potentially hurting my client? I can't imagine they are helping. Best way to deal with these? Thank you! Thank you! Matthew
Technical SEO | Localseo41440
Best Schema Advice
Hi, I am new here and I have searched but not found a definitive answer for this. I am sorting out a website for a scaffolding company operating in a particular area. They are only interested in targeting that area, and from what I have read on here I need to mark the site up with schema giving their company name and address. My issue is that I seem to find lots of conflicting advice about what should go in it and how it should be laid out. I would love to know people's opinions on the best guide for setting up schema correctly for a site like this (the general shape of the markup is sketched below). They use WordPress, and I am OK with inserting code into the site etc.; I just want to make sure I get it right from the start. Once I have done this, I understand that I need to get local citations using the same NAP as the site markup. Sorry for what might seem like a daft question, but I am a designer and I am still learning the ins and outs of SEO. Thanks
Technical SEO | kirstyseo0
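For what it's worth, the general shape of LocalBusiness markup is fairly stable whichever guide you follow. Below is a minimal sketch, written as a small Python script purely so the JSON-LD it prints is easy to copy; every business detail in it is a hypothetical placeholder, and the exact properties you include should be checked against the current schema.org and Google structured data documentation.

```python
import json

# All business details below are hypothetical placeholders, not a real company.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Scaffolding Ltd",
    "url": "https://www.example-scaffolding.co.uk/",
    "telephone": "+44 1234 567890",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Exampletown",
        "postalCode": "EX1 2AB",
        "addressCountry": "GB",
    },
    "areaServed": "Exampletown and surrounding area",
}

# Print the JSON-LD to paste inside a <script type="application/ld+json"> tag.
print(json.dumps(local_business, indent=2))
```

The printed block goes inside a script tag of type application/ld+json on the page, and the name, address and phone number in it should match the local citations exactly, as the question notes.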
Best way to get SEO-friendly URLs on a huge old website
Hi folks, hope someone may be able to help with this conundrum. A client site runs on old tech (IIS 6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure such as http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888, and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/ (a rough sketch of the kind of old-to-new URL mapping I mean is below). I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl, and to implement all URLs would be 10x the volume of config. I am starting to wonder if there is a better way:
1. Upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included?
2. Rebuild the site entirely (preferably on PHP with a decent URL structure).
3. Accept that the URLs can't be made friendly on a site this size and focus on other aspects.
4. Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live.
None of the options are great: they either involve lots of work/cost, or they involve keeping a site which performs well but could do so much better, with poor URLs. Any thoughts from the great minds in the SEOmoz community appreciated! Cheers Simon
Technical SEO | SCL-SEO1
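Purely for illustration, here is a minimal Python sketch of the kind of old-to-new mapping the question describes: it parses the dynamic search URLs and emits a flat redirect map (one old URL and one clean URL per line) that could be fed into whichever rewrite layer is eventually chosen. The example URLs and the /category/{idb}/item/{ida}/ scheme are assumptions for the sketch, not the client's actual structure.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical examples of the dynamic URLs described in the question.
OLD_URLS = [
    "http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888",
    "http://www.domain.com/search/results.aspx?ida=20500&idb=12&idc=77",
]

def clean_url(old_url):
    """Build a folder-style URL from the query-string parameters.
    The /category/{idb}/item/{ida}/ scheme is an assumption, not the real site's."""
    query = parse_qs(urlparse(old_url).query)
    ida = query.get("ida", ["unknown"])[0]
    idb = query.get("idb", ["unknown"])[0]
    return f"http://www.domain.com/category/{idb}/item/{ida}/"

if __name__ == "__main__":
    # Emit a simple "old-URL new-URL" map, one pair per line.
    for old in OLD_URLS:
        print(old, clean_url(old))
```

A precomputed map like this, loaded as a static lookup (IIS 7's URL Rewrite module supports rewrite maps, and most other rewrite layers have an equivalent), is generally far cheaper per request than evaluating a long list of regex rules, which may be part of what was slowing the staging site down.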
How to find all the links to my site
Hi, I have been trying to find all the links to my site http://www.clairehegarty.co.uk but I am not having any luck. I have used Open Site Explorer but it is not showing all the links, and when I go to my Google Webmaster Tools page it shows me more pages than the SEOmoz tool does. Can anyone help me sort this out and find out exactly what links are pointing to my site? Many thanks
Technical SEO | ClaireH-1848860
Best way to condense content on a page?
We want to add a video transcript to the same page as the video, but it doesn't really fit the design of the page. Is it fine to use CSS/DIVs to either have a "click to read full transcript" or a scroll box?
Technical SEO | nicole.healthline0
How to set up Tumblr at blog.site.com to give juice to site.com
Is it possible to get a subdomain, blog.site.com, that is hosted on Tumblr to count toward site.com? I hoped I could point it in Webmaster Tools like we do with www, but alas no. Any help would be greatly appreciated.
Technical SEO | oznappies0
Recently revamped site structure - now not even ranking for brand name, but lots of content - what happened? (Yup, the site has been crawled a few times since.) Any ideas? Did I make a classic mistake? Any advice appreciated :)
I've completely disappeared off Google - what happened? Even my brand name keyword does not bring up my website. I feel lost, confused and baffled about what my next steps should be. ANY advice would be welcome, since there's no going back to the way the site was set up.
Technical SEO | JeanieWalker0