Hosting Multiple Websites Within The Same Server Space
-
Hi,
So, I have a client who wants to host two websites (which you could refer to as sister sites) on the same hosting account.
For some reason, I was under the impression that doing so may be detrimental for SEO purposes. Am I correct in thinking this? Can anyone point me to supporting documentation or share comments here? I look forward to hearing what you all have to say. Thanks for reading!
-
It makes no difference, as long as they are set up as separate accounts. If you have sites running out of subdirectories of other sites, that can cause problems if the domain settings are not correct. Basically, make sure each site has its own cPanel account and you should be good to go. Sharing the same IP address makes no difference.
If you are still worried about this, I would assign a separate IP address to each account, if you have WHM access.
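For reference, "separate accounts on one shared IP" boils down to name-based virtual hosting, where the server picks the site from the Host header. A minimal sketch of the idea (Apache syntax shown only as an illustration; cPanel generates equivalent configuration for you, and the domains and paths here are placeholders, not anything from this thread):

```apache
# Two independent sites sharing one IP, distinguished by the Host header.
<VirtualHost *:80>
    ServerName site-one.example
    DocumentRoot /home/siteone/public_html
</VirtualHost>

<VirtualHost *:80>
    ServerName site-two.example
    DocumentRoot /home/sitetwo/public_html
</VirtualHost>
```

As long as each site resolves to its own DocumentRoot like this, search engines see two ordinary, separate websites regardless of the shared IP.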
-
I have also never had any problems. I run multiple websites on my servers that are not sisters, but actually twins. As long as the content is not recycled, it should not hurt your SEO.
-
I've had the same experience as Gaston: no problem at all ranking websites hosted on the same server.
-
Hi there.
I've found no SEO problems in hosting many sites in the same hosting space.
Best of luck.
GR.
Related Questions
-
Backlinks from customers' websites. Good or bad? Violation?
Hi all, Let's say a company has 100 customers and somehow gets a backlink from all of their websites. Usually we see "powered by xyz", etc. Is something wrong with this? Is this a legitimate backlink strategy, or a violation of Google's guidelines? Generally, most of the customers' websites do not have good DA; is there any benefit in getting backlinks from such below-average-DA websites? Thanks
White Hat / Black Hat SEO | vtmoz
-
Google URL Shortener- Should I use one or multiple???
I have a client with a number of YouTube videos. I'm using Google URL Shortener to allow the link to show in the YouTube text (as it's a long URL). Many of these links go to the same page, e.g. .com/services-page. Should I use a single short URL for every video linking to .com/services-page, or should they be unique each time? If unique, would Google possibly think I'm trying to manipulate results? Thanks in advance. I'm just not sure on this one and hope someone knows the best practice here. Thanks!
White Hat / Black Hat SEO | mgordon
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized over bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a portion of the bot traffic. The portion, and for which bots, can be calculated at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU
-
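The dynamic-503 idea above can be sketched roughly as follows (Python purely for illustration; the thresholds, load metric, and user-agent matching are all assumptions, not a tested policy). When the function returns True, the response should be a 503 carrying a Retry-After header so well-behaved crawlers back off:

```python
import time

# Illustrative thresholds -- real values would be tuned per server.
LOAD_THRESHOLD = 4.0      # load average above which we start shedding bot traffic
WINDOW_SECONDS = 60       # sliding window for per-bot request counting
MAX_BOT_REQUESTS = 30     # requests allowed per bot per window under high load

BOT_SIGNATURES = ("bingbot", "ahrefsbot", "googlebot")

_request_log = {}  # bot name -> timestamps of recent requests


def is_bot(user_agent):
    """Crude substring check; a real setup might verify crawlers via reverse DNS."""
    ua = user_agent.lower()
    return next((b for b in BOT_SIGNATURES if b in ua), None)


def should_serve_503(user_agent, server_load, now=None):
    """Decide whether this request gets a 503 (plus Retry-After) instead of content.

    User traffic is never throttled; bots are throttled only when the whole
    server (not just one site) is under load, matching requirements 1-3 above.
    """
    bot = is_bot(user_agent)
    if bot is None or server_load < LOAD_THRESHOLD:
        return False
    now = time.time() if now is None else now
    recent = [t for t in _request_log.get(bot, []) if now - t < WINDOW_SECONDS]
    recent.append(now)
    _request_log[bot] = recent
    return len(recent) > MAX_BOT_REQUESTS
```

Because the counter is keyed by bot rather than by site, a single instance of this check in front of all sites throttles overall bot traffic against total server load, which per-site robots.txt or Webmaster Tools settings cannot do.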
Lots of websites copied my original content from my own website, what should I do?
1. Should I ask them to remove and replace the content with their unique and original content? 2. Should I ask them to link to the URL where the original content is located? 3. Should I use a tool to easily track these "copycat" sites and automatically add links from their site to my site? Thanks in advance!
White Hat / Black Hat SEO | esiow2013
-
Website that lost ranking and is now starting to recover
Hello guys,
About a month ago we got a message in Webmaster Tools saying that we have unnatural links to the website.
We dropped from 200 keywords that were on pages 1-2 to pages 5-8.
We checked our links and noticed that someone had pointed more than 1,000 links to our site. We applied for a reconsideration request and also sent Google a file with the links we asked to have removed. Yesterday we got a message from Google saying: manual spam action revoked.
We checked the rankings today and saw that where we previously had 3 keywords on the first page, we now have 24. Very good improvement, but still very far from the 130 keywords that were on the first page a month ago. I wanted to ask: what can we expect?
Are we going to get the lost rankings back?
Does this happen overnight? Or maybe the big change will come in the next Penguin update? Bottom line: what is the chance of getting back the rankings we had before?
This is the most important thing right now... Thank you
White Hat / Black Hat SEO | WayneRooney
-
Loss of 85-90% of organic traffic within the last 2 weeks.
Hey everybody, We have a client that recently came to us asking for SEO help. We did some initial analysis of their current SEO status and most everything looked pretty good. The on-page work was solid, nothing really lacking there other than missing alt tags on all images. Their link profile looked good too: lots of good links from quality, relevant sources. The client has done some good press releases. They could probably use a bit more focus in their content, as it is somewhat general and not keyword-focused. Initially it didn't look like they needed any help with their SEO, so I was a bit curious as to why they contacted us. Today we got their Google Analytics information and immediately noticed that they have had an 85-90 percent drop in organic traffic from all major search engines that started about two weeks ago. If all their SEO looks to be done properly, any ideas what would account for the massive drop in traffic? The only thing that looks like it may have happened is that they dropped a couple of spots, from position #1 to positions 2-3, for some of their highest-traffic terms. Even if that is the case, I would not expect such a large drop in organic traffic. Just curious what anyone else can attribute the huge drop to, or what else may help identify the issue. It's almost as if analytics was turned off or removed from the site, but that is not the case.
White Hat / Black Hat SEO | Whebb
-
Is there a danger linking to and from one website too many times?
Basically my web developer has suggested that instead of using a subfolder to create English and Korean versions of the site, I should create two different websites and then link them together, serving each page in English or Korean, whichever the case may be. My immediate reaction is that search engines may perceive this kind of linking as manipulative; as you can imagine, there will be a lot of links (one for every page). Do you think it is OK to create two websites and link them together page by page? Or do you think the sites will get penalized by search engines for link farming or link exchanging? Regards, Tom
White Hat / Black Hat SEO | CoGri
-
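For the scenario above, the standard way to pair language versions is `hreflang` annotation (`<link rel="alternate" hreflang="...">` in each page's head), which tells search engines the pages are translations of each other rather than a link exchange. A minimal sketch (Python used only to generate the tags; the English/Korean domains are hypothetical placeholders):

```python
def hreflang_tags(url_pairs):
    """Build the <link rel="alternate"> tags each page of a language pair
    should carry in its <head>. url_pairs maps language code -> URL.
    Every language version of the page emits the same full set of tags,
    including one pointing at itself.
    """
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(url_pairs.items())
    )


# Hypothetical English/Korean sister domains for one page:
tags = hreflang_tags({
    "en": "https://example.com/services",
    "ko": "https://example.kr/services",
})
```

With these annotations in place, the page-by-page links become a declared translation relationship instead of raw cross-site linking, which removes most of the manipulation concern.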
Server-Side JavaScript Redirects
I would like to use a server-side JavaScript redirect for visitors referred from a certain site only. So let's say anyone clicking a link to my site's page-A from seomoz.org would automatically be redirected to page-B. All other users, as well as direct and search-engine traffic, would only see the regular page-A. The reason I am doing this is that the linking site links to page-A, which doesn't serve the user the correct content. Rather than contacting the webmaster to change the link to point to page-B, I want to redirect them. Is there any danger of Google penalizing this as cloaking? And how would they be able to tell?
White Hat / Black Hat SEO | zachc_coffeeforless.com
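The mechanic described above can be sketched as follows (Python purely for illustration, since the poster's server-side JavaScript isn't shown; the rule table is hypothetical). The key point is that the decision keys off the Referer host only, never the user agent, which is the main distinction from classic cloaking, where crawlers are shown different content than users:

```python
REDIRECT_RULES = {
    # referring host -> (path visitors arrive at, path to send them to)
    "seomoz.org": ("/page-a", "/page-b"),
}


def resolve(path, referer_host):
    """Return (status, location) for a request.

    Only visitors referred from a listed host are redirected; direct,
    search-engine, and other referred traffic sees the page unchanged.
    """
    rule = REDIRECT_RULES.get(referer_host)
    if rule and path == rule[0]:
        return 302, rule[1]
    return 200, path
```

Because a crawler fetching page-A with no Referer (or any other Referer) gets the same 200 response as a normal visitor, nothing is served to search engines that users don't also see; only the referred segment is rerouted.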