Delay release of content or fix after release
-
I am in the midst of moving my site to a new platform. As part of that I am reviewing each and every article for SEO - titles, URLs, content, formatting/structure, and so on. I have about 200 articles to move across, and my eventual plan is to look at each article and update it for these factors.
I have all the old content moved across to the new server as-is (the old server is still the one to which my domain's DNS records point). At a high level I have two choices:
- Point DNS to the new server, which will expose the same content (not particularly SEO-friendly yet), and then work through each article, fixing the various elements to make them more user-friendly.
- Go through each article, fixing content, structure, etc., and THEN update DNS to point to the new server.
Obviously the second option adds time before I can switch across. I'd estimate it will take me a few weeks to get through the articles. Option 1 allows me to switch pretty soon and then start going through the articles and updating them.
An important point here is that the articles already have new (SEO-friendly) URLs and titles on the new server, and I have 301 redirects in place pointing from the old URLs to the new ones. So it's "only" the content of each article that will be changing on the new server, rather than the URLs, etc.
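For context, the redirect rules are along these lines (a minimal sketch assuming the new server runs Apache with mod_rewrite; the paths below are made-up examples, not my real URLs):

```apache
# .htaccess sketch: one permanent redirect per article,
# mapping an old CMS path to the new SEO-friendly permalink.
RewriteEngine On
RewriteRule ^articles/2014/old-page-name\.html$ /new-seo-friendly-slug/ [R=301,L]
RewriteRule ^articles/2014/another-old-page\.html$ /another-new-slug/ [R=301,L]
```

The `R=301` flag is what tells search engines the move is permanent so the old URLs' equity should pass to the new ones.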
So, I'd be interested in any suggestions on the best approach - move across to the new server now and then fix content or wait till all the content is done and then switch to the new server.
Thanks.
Mark
-
I would definitely at least clean up the article HTML and structure before launching the pages, since you don't want people who might land on them before they're updated to have a weird experience. As far as optimizing them for SEO, I think you could go ahead and make the pages live and roll out edits as you make them. Prioritizing the pages based on highest-traffic/best-converting first is the way to go. If switching your platform is going to make your site easier to crawl, you definitely want to do that sooner rather than later - plus, having the new pages live will allow them to start accumulating some links even before you make keyword-related changes.
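To make that prioritization concrete, here is a rough sketch of ordering the worklist by traffic and conversions. The CSV columns and the weighting are hypothetical - it assumes you can export per-URL sessions and conversions from your analytics:

```python
import csv
import io

# Hypothetical analytics export: url, sessions, conversions.
sample = """url,sessions,conversions
/old-article-a,120,2
/old-article-b,950,14
/old-article-c,430,9
"""

reader = csv.DictReader(io.StringIO(sample))
rows = list(reader)

def priority(row):
    # Arbitrary weighting: a conversion counts as much as 50 sessions.
    return int(row["sessions"]) + 50 * int(row["conversions"])

# Highest-traffic / best-converting articles get cleaned up first.
worklist = sorted(rows, key=priority, reverse=True)
for row in worklist:
    print(row["url"])
```

Swap in your real export and tune the weighting to taste; the point is just to fix the busiest pages first.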
In general with a major change like this I recommend changing as few other things as possible simultaneously. It's OK to make more gradual changes, and it gives Google fewer things to get used to at one time.
-
If search engines did not catch up with changes we make and improve our ranking for positive changes, there'd be little point to Search Engine Optimization.
If Google is already seeing your pages anyway and the move will only make them better (even if they are still not where you'd like them to be), then you can go ahead and move them if you like, as long as the move will not create a confusing situation for the people looking at the pages.
As you fix the pages to your satisfaction, wait for them to be crawled again or resubmit them using Fetch as Google to possibly get them crawled faster. [And as far as H2 tags, if that is your main worry, I wouldn't worry too much--they probably won't make much difference.]
-
Thank you for the response, Linda. This is a slightly tricky one because I don't have a specific deadline per se, but I also want to build a plan that gets me over to the new server as soon as possible, without falling into the trap of the switchover date just "floating". Let me put it this way.
I have the following "phases" for each of the articles (as a reminder, I have around 200 such articles):
1. Create all articles: using the planned titles, categories and URLs, but with no content.
2. Move content across from the old site to the new articles. Done with straight cut-and-paste (don't ask about importing - long story :)). This gets the data into WordPress posts as-is, but it includes HTML markup from the old CMS, doesn't correctly use styles (some articles look pretty messy) and doesn't make consistent use of H2 tags (H1 is the title). Most articles look "OK", but a) some are messy though readable to the human eye and b) the lack of H2 tags means there's no structure from an SEO perspective.
3. Clean up article HTML/structure. Review each article, cleaning up the HTML and ensuring the content still makes sense and reads well. HTML clean-up includes removing markup specific to the old CMS and making sure each article has structure through use of H2 tags.
4. Review each article for SEO. I'll be using the Yoast SEO plugin and making the changes it recommends. The keywords are already decided (the URLs and titles in step 1 reflect those decisions), so for each article I will be reviewing the rest of the content and making sure it looks acceptable from an SEO perspective.
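As an aside, a quick audit script could flag which posts still need attention in step 3 before the manual pass. This is a minimal sketch using only the Python standard library; the "legacy" class names are hypothetical stand-ins for whatever markup the old CMS actually injected:

```python
from html.parser import HTMLParser

class ArticleAudit(HTMLParser):
    """Count H2 headings and leftover old-CMS markup in a post body."""

    # Hypothetical class-name fragments the old CMS might have injected.
    LEGACY_MARKERS = {"oldcms", "ms-rte"}

    def __init__(self):
        super().__init__()
        self.h2_count = 0
        self.legacy_tags = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.h2_count += 1
        classes = dict(attrs).get("class") or ""
        if any(marker in classes for marker in self.LEGACY_MARKERS):
            self.legacy_tags += 1

def audit(html):
    """Return a small report for one article's HTML body."""
    parser = ArticleAudit()
    parser.feed(html)
    return {"h2_count": parser.h2_count, "legacy_tags": parser.legacy_tags}
```

Running `audit()` over each post body would give a worklist: anything with `h2_count == 0` or `legacy_tags > 0` still needs clean-up.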
I am currently done with step 2 (all articles moved across, albeit some looking somewhat untidy and without any document structure). I am starting to work through step 3 now, but this is a time-consuming process.
I guess what this all boils down to is whether, if I switch across now, search engines will "catch up" later when I revise the content for structure and SEO. The existing site is not good - so, as it stands, search engines don't look on it kindly.
One option is to just bite the bullet and move across (I'd see benefits from the title and URL changes, with the associated 301 redirects in place) and subsequently do steps 3 and 4. I'd actually like to do that but ONLY if I can be confident the search engines will end up in the same place as they would if I just waited till step 4 is done.
Another option is to finish step 3, move to the new server and then start updating articles for SEO (step 4).
Thanks.
Mark
-
Why are you switching? If there is no reason to be in a rush, then I'd wait and make the change when everything is ready--a few weeks isn't that long.
If there is a particular reason for haste (like you were having technical problems with the old platform or a lot of your traffic is mobile and you want to make the April 21 Google deadline), then I think it depends on the state of the content.
If it is not perfect but still makes sense with the new titles and URLs, I'd do the update for your most important content and switch. If it is terrible, I'd wait. There is no point getting traffic for bad content.