Changing my site (dramatically)
-
I am about to do a complete site change. I am moving to WordPress. I am ranked #2 in the SERPs. Will I lose rank for changing everything on my site? I have 500 pages indexed but am about to have 30k indexed. It is a real estate site that is switching from a "framed" solution to a listing-indexed solution.
If I make good use of my keywords etc. (on-site optimization), will I be at risk of losing rank just for changing my site?
-
I have thousands of duplicate content warnings.
I am using a hosted server/realtor package that doesn't let me see the source code. I basically have duplicates of every page on my site somehow. I've used the company for 8 yrs, though. I am in deep trouble. I fear I can't stay with them.
-
Joseph,
In scenarios like these, I put together a site map and review the overall value of the site and what it's providing to its users.
Once I know the value I am currently providing, I put together an action plan (301 redirects, site optimization, site architecture, etc.) with the proposed site changes and see if restructuring the site will benefit the visitor.
Once you are able to visualize the overall flow of the old site and the new site, you can decide whether or not it's worthwhile to move forward. Keep in mind that whatever you are doing has to give users a reason to use your website and stay on it.
You might start with some conversion optimization of certain pages to find where your leaks are, try to plug them, and keep forging forward.
If you find that it's impossible to plug the leaks in your current structure, then you may need to do what you are describing in your post.
-
No, no - if you set up the 301s, the originals don't have to remain. A visiting browser knows right away that the old page is gone and has been permanently replaced.
However, Egol's advice is spot-on. This is complicated, and you have a LOT of pages involved. Your best bet is to hire an expert to help you with this. It may cost you some $$, but it'll pay off in a hurry.
I'm not pitching myself here - in fact, keep me out of the running so that it's clear I have no axe to grind.
But this is very, very technically complex.
-
This may be harder than I could have imagined. If 90% of the backlinks to my site point at my root domain (PensacolaRealEstate.com), then I should not lose any link juice, right? The backlinks will lead to my root domain, period; however, if I change my site and cut off some of the URLs, then I lose the backlinks to those pages. This could hurt my domain authority (a sum of ALL the backlinks to ALL the pages on my domain) but not my page strength (the root domain). Right?
If I were to set up these 301 redirects, it means I have to leave the files for those pages on the server but not actually link my new site to them, right?
-
Don't rush into this. You gotta do it smart.
When you change your site the goal should be to improve it and make it more competitive.
That is done by carefully improving the structure and the optimization... and also preserving your link assets with 301 redirects. Also avoiding problems like changing the URLs without redirecting.
If you don't know how to do these things it is essential to get a competent person to help you or to do it for you. There are lots of good instructions for moving a site on the web.
Now... your 30K pages.... Those could suck a lot of strength from your site. These could produce a duplicate content problem (Panda).
I would not be surprised if your rankings dropped after adding them. To minimize ranking loss you want to have very few links into them from the rest of your site and all of those pages linking out generously to the rest of your site.
To minimize panda problems you might combine some of these listings, placing several on a page. This page count reduction can reduce the rankings loss.
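That page-count reduction is essentially a chunking step. A minimal sketch in Python of grouping individual listings onto shared pages; the listing data and the listings-per-page count are hypothetical placeholders, not recommendations for your site:

```python
def combine_listings(listings, per_page=5):
    """Group individual listings into combined pages of `per_page` each."""
    pages = []
    for i in range(0, len(listings), per_page):
        pages.append(listings[i:i + per_page])
    return pages

# 13 hypothetical listings collapse into 3 pages instead of 13.
listings = [f"listing-{n}" for n in range(1, 14)]
pages = combine_listings(listings, per_page=5)
print(len(pages))  # prints 3
```

Tuning `per_page` is the trade-off Egol describes: fewer, meatier pages reduce Panda exposure, but each combined page must still be reachable by enough internal links to stay indexed.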
The problem with that is having enough links into those pages to get them indexed and hold them in the index. So if you put all of them up at the same time you will need fifty to sixty really strong links that hit nodes deep in the site. These links will channel spiders deep into the site and force them to chew their way out through those pages, indexing them as they go.
These links must be permanent, or at least held in place until other links are developed. If you remove them, then Google spider activity on those pages will fall off and Google will forget them.
If you don't have experience with big sites find someone who has that experience who can help you design the structure of these new pages.
(not recommending myself... I only work on my own sites)
-
Chances are the listings are repeated on other web sites, so those probably won't have much positive impact on your rankings. They shouldn't do much harm, either, except that Google's going to waste a lot of time crawling all those duped pages.
The best way to preserve what you can is to ensure that you redirect all of the old page addresses to their new equivalents. For each old page address that now lives at a new URL, set up a 301 redirect from the old URL to the new one. That will ensure that search engines can quickly update their indexes.
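With hundreds of old addresses, a redirect map is easier to generate than to hand-write. A minimal sketch in Python that emits Apache `.htaccess` `Redirect 301` lines from an old-URL-to-new-URL mapping; the example paths are hypothetical stand-ins, not your real pages:

```python
def redirect_rules(mapping):
    """Return one 'Redirect 301 old new' .htaccess line per mapping entry."""
    return [f"Redirect 301 {old} {new}" for old, new in mapping.items()]

# Hypothetical old framed pages mapped to their new listing URLs.
mapping = {
    "/frames/listing-123.html": "/listings/123-main-street/",
    "/frames/about.html": "/about/",
}
for rule in redirect_rules(mapping):
    print(rule)
```

Note that Apache's `Redirect` directive matches on the path only; old URLs that carry query strings need `RewriteRule`/`RewriteCond` instead.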
Also, be sure to set up a full XML site map and submit it to Google and Bing Webmaster Tools. That'll give you very rapid feedback regarding any problems.
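For a 30k-page site you would normally lean on a sitemap plugin or generator, but the file format itself is simple. A minimal sketch of building such an XML sitemap with Python's standard library; the URLs are hypothetical placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemaps.org-format XML sitemap string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/listings/123-main-street/",
])
print(sitemap)
```

The sitemaps.org protocol caps a single file at 50,000 URLs, so a 30k-page site still fits in one file, submitted once to each webmaster console.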
Related Questions
-
Hey guys, so I changed my title on the website
The old one was very keyword stuffed and repeated words. It's now gone from search for those main keywords. Do you think it's just in flux now and it will be back?
On-Page Optimization | | johan81 -
Rel-nofollow for price comparison site?
I run a price comparison site, so we have TONS of outbound links. Should my outbound links be marked with 'rel=nofollow'?
On-Page Optimization | | lancerpanz0 -
Does CloudFlare Benefit my site?
I am using the $20 / month CloudFlare service. My site is not running faster, so my question is: is this service adding real value? I do not have a lot of highly sensitive customer data (some email addresses and customer names and nothing more - no credit cards). I am already using Amazon Cloud Server and it appears to work fine. Any thoughts appreciated.
On-Page Optimization | | khi50 -
Locating broken links on site?
Hey guys, I'm using Screaming Frog to help locate some broken links on a client's site and I've managed to pick up two. However, I can't seem to find whereabouts they're located on the site in order to fix them! Is there a way I can do this? Cheers!
On-Page Optimization | | Webrevolve0 -
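Screaming Frog's inlinks report (or any crawl export that maps each page to its outbound links) turns this into a simple lookup. A minimal sketch in Python, with hypothetical data standing in for a real crawl export:

```python
def pages_linking_to(crawl, broken_urls):
    """Return {broken_url: [source pages that link to it]}."""
    found = {u: [] for u in broken_urls}
    for page, links in crawl.items():
        for link in links:
            if link in found:
                found[link].append(page)
    return found

# Hypothetical crawl export: source page -> outbound links.
crawl = {
    "/index.html": ["/about.html", "/old-page.html"],
    "/about.html": ["/old-page.html"],
}
print(pages_linking_to(crawl, ["/old-page.html"]))
```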
Crawl structure for web site about jobs?
Hi there, we now have a client who has a job-offering web site. There are many crawl errors on it. My question is: how should the URL structure of a jobs website look, and which pages should be indexed? What is the best way, and what are tips, for optimizing a job website? Right now the posted job pages are dynamic, like: examplejob.com/detail-job/1891222223/Careers-for-Mens---Womens/Experienced-Web-Design-Need I see many job websites allow their job offers to be indexed, and maybe this is useful because some people find jobs when searching directly in Google. Are they using dynamic URLs for that? And my related question is: what happens when the job offer expires? When Google crawls that page again, should it be redirected to a 404 page, or should the original job offer text remain with added info that the offer has expired? Otherwise, if it's only marked as expired, maybe there will be too much duplicate content across many, many pages.
On-Page Optimization | | vladokan0 -
Altering site structure
I work for a business that operates several sites that were developed a very long time ago. We've been making many different changes over the past 12-18 months to improve these sites in several different ways. One area that we've never discussed or attempted is general site structure. It's pretty obvious that when the business was started they had never heard of information architecture or usability design. To make matters worse, the internal linking strategy appears to have been to link everything to everything. Well, after being told that it couldn't be done - I'm getting our team to say we must focus on this, if for no other reason than to help consumers figure out how to navigate through our site. Today we essentially have a series of category/information pages. In some cases, we hang more detailed topical content related to a category/informational page in a hub-and-spoke manner. Although remember what I said about linking everything to everything. In reality there is a series of subtopics that should have been designed for every category/informational area. Instead, what happened is that in some cases the subtopic is integrated into the hub or category page, in other situations it is hung off the page as a spoke page, and in others the subtopic isn't even covered. The plan is to standardize - each category will have 'n' subtopics (~10-12, we're still working this out). From a navigational standpoint, users will be able to easily navigate across categories, between subtopics within a category, and between categories within adjacent/similar subtopics. This is essentially a grid, if that makes sense. The question is this - we have some keywords that do well in SEO and many, many more that do not, and the trend has not been our friend.
We're considering keeping the URLs of the pages associated with strong keywords the same within the nav structure, even though this might mean the URL for a spoke page will be inconsistent with the spoke page name from a different category. I don't see any real danger for pages that either are not associated with any ranking keywords or only very weak keywords. Maybe I'm wrong. What things should we consider in this change? We believe that this standardization should help consumers find the information they are looking for in a much more efficient manner, so page views/visit should go up. Additionally, this prepares us for category and subtopic comparison pages and other added functionality being added in a logical manner. We also think that as we add depth about a subtopic, it will be easier for us to acquire links to our site because the subtopics within a category will appeal to different websites. This is by no means a small project. We have hundreds and hundreds of pages. Do folks think this is a worthwhile endeavor? We've spent a lot of time cleaning up H1 tags, structure of our pages, anchor tags, page load order and speed, image caching, etc. Site structure, URL length and internal link structure are essentially what is left. Once these are done we intend to really get going on better and more organized content on our site. Thoughts?
On-Page Optimization | | Allstar1 -
Duplicate Content - Meta Data for International Site Roll Out
Hi All, We have a site targeting Ireland, so all on-page SEO is completed and launched on the Irish site. We are now rolling out this site to the UK... how much of this content & SEO meta data has to be changed for Google not to recognise it as duplicate content? Site structure is as follows: http://www.domain.com/ie-en/ - Irish site http://www.domain.com/uk-en/ - UK site Or will it even be considered duplicate content, given that we have the UK and Irish signals in the subfolders, will be using geo-targeting in Webmaster Tools, and will have UK-specific addresses and phone numbers? We will be rolling this site out to many more countries, so it would be great to get this straight from the start so we don't waste time creating many versions of the meta data unnecessarily! Many thanks Emma
On-Page Optimization | | john_Digino0 -
How deep should I go with a directory site?
I am creating a new site which has a directory component. Based on what I have read, I am inclined to keep the site architecture as flat as possible. However, the natural layout that I have come up with in my head has the directory listings 5 or 6 pages deep in the site structure. I saw in another post that someone in a similar situation was suggesting that going deep like this is fine so long as there are many internal links to the deeper pages to indicate that they are important. Should I make a conscious effort to make the site architecture as flat as possible? Are there any specific guides/resources that address this particular issue that I should be aware of? Thanks!
On-Page Optimization | | fastestmanalive0