Discourage search engines from indexing this site AFTER a site launch
-
Hi,
I unticked "Discourage search engines from indexing this site" a few months before the initial release of my website. I don't want the site to be found by search engines until the official release (still a few months away). Do you think that ticking this box again will harm the website's long-term ranking or have any other repercussions for the site? Do you have any additional advice for keeping the site out of the SERPs temporarily, until the official release, in a way that won't harm it afterwards?
Thanks for your answers.
-
I would just leave that box in WordPress ticked, or put a meta robots noindex tag on all of your pages. When you want the site to be indexed, remove the tags (or untick the box) and fetch your pages in Google Search Console (GSC).
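For what it's worth, that checkbox and a hand-placed noindex send effectively the same signal; a minimal sketch of the tag, assuming you add it to the head of every page you want kept out of the index:
<!-- keep the page out of the index while the site is private; remove at launch -->
<meta name="robots" content="noindex, nofollow" />
One caveat: don't also block the whole site in robots.txt while the tag is in place - if crawlers can't fetch the pages, they can't see the noindex, and later they won't notice that it has been removed.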
-
Hey there
I presume you're using WordPress here. In my experience - no, that won't have a long-term detrimental effect on your site's ability to rank once the site goes live.
If you're concerned, however, you could install a "construction" or "coming soon" page, which allows the site itself to be indexed but prevents your other URLs from being found or crawled (as long as you don't submit a sitemap until you're ready).
SeedProd's free plugin is highly recommended, and I've used it to good effect before: https://en-gb.wordpress.org/plugins/coming-soon/
Hope this helps.
Related Questions
-
Checking subdomains/site structure of a website for International SEO
Dear Moz community, I am looking into two websites for a friend and we want to understand the following: What is the site structure as per the subdomains? e.g. currently it is .com/en/ or .com/ru/ or .com/zh/ Using the crawl report, each page has an en or other language version. I take it this means we have to create copy, meta titles and descriptions for each of the languages, even if the page is the same but in a different language? To avoid duplicate content, would you suggest canonical tags be put in place? To check the hreflang markup, I couldn't find anything in the code, which makes me think a script is automatically translating this? This is the first time I have started to look at international SEO and I want to understand what to look for in an audit of existing sites. Thank you,
Local Website Optimization | TAT1000
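For anyone auditing this, hreflang annotations normally show up as link elements in the head of each language version (they can also live in an XML sitemap or HTTP headers). A minimal sketch, assuming the .com/en/, .com/ru/ and .com/zh/ folder structure mentioned above - the domain and the /widgets/ path are placeholders:
<!-- in the <head> of the English page; every language version needs the same reciprocal set -->
<link rel="canonical" href="https://www.example.com/en/widgets/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/widgets/" />
<link rel="alternate" hreflang="ru" href="https://www.example.com/ru/widgets/" />
<link rel="alternate" hreflang="zh" href="https://www.example.com/zh/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/widgets/" />
Note that each language version generally self-canonicalises; pointing canonicals across languages isn't the usual fix for translated pages, since translations aren't duplicates of each other.
-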
Ecommerce Site Structure -- "/our_locations" page: helpful or harmful?
Hello! We are a retailer with brick-and-mortar stores in different cities. We have a website (ourbusiness.com), which includes a blog (ourbusiness.com/blog) and a separate ecommerce site for each store in subfolders (ourbusiness.com/Boston-store and ourbusiness.com/Atlanta-store). NB: We do this for non-business reasons and have no choice. So, this is not like REI (for example) or other stores with lots of locations but one central ecommerce operation. Most experts seem to recommend a site structure that echoes REI's, i.e. a home page principally devoted to ecommerce (rei.com) includes an Our Locations-type page (rei.com/stores) which links to local store pages like (rei.com/stores/fresno). I understand how this would help REI, since their homepage is devoted to ecommerce and they need a store locator page that doesn't compete with the shopping experience. But since we can't send people to products directly from our home page, is there any reason for us not to put the store locator function right on the home page? That is, is there any reason in our case to prefer (A) ourbusiness.com/our_locations/Boston_store over (B) ourbusiness.com/Boston-store? As I see it, the extra page (/our_locations/) could actually hurt, as it puts products one click further away from customers and one link deeper for bots. On the other hand, it may make the multi-store structure clearer to bots (and maybe people) and help us in local search. Finally, would it make a difference if there were 10 stores vs 2? Thanks for any thoughts!
Local Website Optimization | RankAmateur...1
-
Localized Search Results
I'll try to set up this question: I go to Google.com and set the search tools to a particular city that I am not in (say I live in Nashville but set the search tools to Rockville, MD). I do a search for a specific term without a location modifier, such as "chrysler town and country", and I don't see the website I'm looking for in the first 100 results. Then I keep the search tools the same but change the search to "chrysler town and country rockville md", and the website I'm looking for is now the #1 result. What would explain the difference? I would have expected the website to have a similar ranking in both situations.
Local Website Optimization | perkfriday0
-
Search Result Brings Up Home Page
Hi all, I've been doing some SEO work on our new CMS site, which has been up for a few months now. When I search for a particular page which I know has good SEO and received a 'B' rating in the Moz on-page grader, our home page shows up and not the relevant page. I don't exactly want to worsen the SEO on our home page so that it stops showing up, but why would my home page, which is fairly generic and covers a large area of the business, show up rather than the specific page whose exact keywords I am searching for? For example: if I searched for 'bananas', my home page for fruits would show up rather than my bananas page, which has been tailored around those keywords. Thanks
Local Website Optimization | HB170
-
Client with business website as well as franchise site
I have a client who has created a Weebly web presence alongside the website provided by his franchise. What is my best strategy, given that he does not want the franchise site to outperform his Weebly presence?
Local Website Optimization | Sans_Terra0
-
Can too many 301 redirects damage my Ecommerce Site - SEO Issue
Hello all, I have an ecommerce website doing online hire. We operate from a large number of locations (roughly 100), and my 100 or so categories each have individual location pages against them, for example:
Carpet Cleaners (category): www.mysite/hire-carpetcleaners
carpet cleaner hire Manchester: www.mysite/hire-carpetcleaners/Manchester
carpet cleaner hire London
carpet cleaner hire Liverpool
Patio Heaters (category)
patio heater hire Manchester
patio heater hire London
patio heater hire Liverpool
And so on. I have unique content for some of these pages, but given that my site has 40,000-odd URLs, I do have a large amount of thin/duplicate content, and it's financially not possible to get unique content written for every single page for all my locations and categories. Historically I used to rank very well for these location pages, although this year things have dropped off, and recently I was hit by the Panda 4.0 update, which I understand targets thin content. Therefore, what I am in the process of doing is reducing the number of locations I want to rank for and have pages for, thus allowing me to achieve a higher percentage of unique content over duplicate/thin content across the whole site, and to concentrate only on a handful of locations I can realistically get unique content written for. My questions are as follows. By reducing the number of locations, my website will 301 redirect each location page I drop back to its parent category, e.g. the carpet cleaner hire Liverpool page will redirect back to the parent carpet cleaner hire page. Given that I have nearly 100 categories to do, the site will generate thousands of 301 redirects when I reduce down to a handful of locations per category. The alternative is that I can 404 those pages. What do you think I should do? Will it harm me to have so many 301s? It's essentially the same page with a location name in it redirecting back to the parent; some of these do have unique content, but most don't. My other question is: on some of these categories with location pages I currently rank very well locally, although there is no real traffic for these location-based keywords (according to Keyword Planner). Shall I bin them or keep them? Lastly, once I have reduced the number of location pages, I will still have thin content until I can get the unique content written for them. Should I remove these pages until that point or leave them as they are? It will take a few months to get the whole site onto unique content; once complete, I should be able to reduce my site from 40,000-odd pages to, say, 5,000 pages. Any advice would be greatly appreciated, thanks.
Pete
Local Website Optimization | PeteC12
-
Site was hacked - do I need to change my phone number?
The site I was leasing was hacked about a year ago. I've since bought a new domain and changed everything about the site; it is completely separate from the old domain. However, my current phone number is still visible on the old site and I can't get it taken down. I really don't want to get a new phone number, but the SEO success of my new site is extremely important. Is there any way that my old phone number appearing on the hacked site could hurt my new site's standing in Google? Thank you so much!
Local Website Optimization | OptimizationMegan0
-
Launching Hundreds of Local Pages At Once or Tiered? If Tiered, At What Intervals Would You Recommend?
Greetings Mozzers, This is a long question, so please bear with me 🙂 We are an IT and management training company that offers over 180 courses on a wide array of topics. Our students can attend these courses in multiple ways, either in person or remotely via a technology called AnyWare. We've also opened AnyWare centers, where you can physically go to a particular location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in, say, LA. You get all the in-class benefits and interaction with the students and the instructor as if you were in the classroom. Recently we've opened 43 AnyWare centers, giving way to excellent local search opportunities for our website (e.g. think "sharepoint training in new york", or whatever city we are located in). Each location has a physical address, phone number, and an employee working there, so we pass the standards for existence on Google Places (which I've set up). So, why all this background? Well, we'd like to start getting as much visibility as possible for queries that follow the format of "course topic area that we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into detailed specifics, but doing some simple math (22 topic areas multiplied by 43 locations) we get over 800 new pages that will eventually need to be crawled and added to our site. As a test, we launched the pages for DC and New York 3 months ago and have seen great increases in visibility. For example, here are the two pages for SharePoint training in DC and NY (44 local pages are live in total right now):
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training
So, now that we've seen the desired results, my next question is: how do we launch the rest of the hundreds of pages in a "white hat" manner? I'm a big fan of white hat techniques and of not pissing off Google. Given the scale of the project, we also did our best to make the content as unique as possible. Yes, there are many similarities, but the courses do differ, as do the addresses from location to location. After watching Matt Cutts's video here: http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058 about adding too many pages at once, I'd prefer to proceed cautiously, even if the example he uses in the video has to do with tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to those searching in their area, so there's nothing black hat about it 🙂 But I still don't want to be reviewed manually lol. So, at what interval should we launch the remaining pages so as not to raise any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better, of course, but I have some antsy managers I'm accountable to, and even with this kind of warning and research, I need to proceed somehow, the right way. Thanks again and sorry for the detailed message!
Local Website Optimization | CSawatzky