My website is coming up under a proxy server "HideMyAss.com." How do I stop this from happening?
-
We've noticed that when we search our web copy in Google, the first result is under a proxy server, "HideMyAss.com," and our actual website is nowhere in sight. We've called Google, and the 2-3 people we spoke with really didn't have an answer for us.
Any suggestions or ideas would be greatly appreciated.
-
That's very frustrating! I've never had a problem with HMA specifically, but that script usually works.
You may want to try searching for things like "break out of frame proxy" or "break out of frame php" to see if anyone out there has come up with a better solution.
-
We've added the break-out-of-frame script, but it doesn't work. HideMyAss is too smart for that script.
We also tried disallowing access from proxy servers via PHP (keyed off common proxy request headers; see below) and .htaccess. The PHP check also blocked some pages served through our CDN, so we might try turning the CDN off and running the PHP check again. The .htaccess rules don't work at all against HideMyAss.
<?php if (isset($_SERVER['HTTP_VIA'])) { die("Proxy access not allowed"); } ?>
-
Great call on both the canonical tags and breaking out of the frame. I imagine the site will be indexed properly within a week using these tactics.
-
Thanks! I'll get with our Web Developer and see how that works.
-
Interesting. Google is usually smart about not indexing proxies, but here it looks like it's doing exactly that.
You can try using JavaScript to break your site out of any frames when a visitor loads your pages. This will break the user out of the frame if they come in via the proxy, and may also help prevent Google from indexing the proxy's framed copy:
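A minimal sketch of such a frame-busting script (the function name is illustrative; a standard browser environment is assumed):

```javascript
// Minimal frame-busting sketch. If this page is being displayed inside
// another site's frame -- as it is when served through a frame-based
// proxy -- replace the top-level URL with this page's own URL, breaking
// the visitor out of the frame.
function breakOutOfFrame(win) {
  // win.top is the outermost window; win.self is this page's own window.
  if (win.top !== win.self) {
    // We are framed: point the outermost window at our own URL.
    win.top.location.replace(win.self.location.href);
    return true;
  }
  return false; // already the top-level window; nothing to do
}

// Run on page load in the browser:
if (typeof window !== "undefined") {
  breakOutOfFrame(window);
}
```

Keep in mind this is not bulletproof: a determined proxy can strip or sandbox scripts before serving the framed page, which may be why simple frame-busters fail against some proxies.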
-
Interesting, I am now seeing what you are talking about...
-
It happens when we paste some of our web copy into Google. For instance, if you enter this text into Google, the proxy result comes up first: "At American Addiction Centers, treatment programs are tailored to meet the precise goals and needs of each individual. Our staff develops working relationships with clients while reviving mind, body and spirit."
And that's true for most of our web content.
-
What query are you searching for? The website comes up fine for me.
-
Hi Takeshi,
We have the canonical tag in the header of all our pages. And there are no manual penalties listed in Webmaster Tools. Any thoughts?
-
Sure! Here it is http://americanaddictioncenters.com/
Thanks! - L
-
Make sure you have the canonical tag in the header of all your pages. It helps Google determine the original source of a page:
https://support.google.com/webmasters/answer/139394?hl=en
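For reference, the canonical tag sits in each page's <head> and points at that page's own preferred URL (the URL below is illustrative):

```html
<head>
  <!-- Tells search engines which URL is the original source of this content -->
  <link rel="canonical" href="https://www.example.com/your-page/" />
</head>
```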
Also, check in Webmaster Tools to see if your site has a manual penalty; being penalized can cause other sites to show up above you in the SERPs for your own content.
-
Can you share your URL so we can take a look? Feel free to PM me if you want to keep it private.
Related Questions
-
A website with some similar guideline points - does this create an issue?
Hey guys, please clarify my doubt at the earliest. We just revamped the website with new content and hired a content writer for our services pages. I just came across 2 pages with similar guidelines in the content. These pages show some similarity in their bullet points. Please take a look and let me know whether this creates any ranking issues or not. Page 1: https://www.socprollect-mea.com/business-setup-in-ajman/ Page 2: https://www.socprollect-mea.com/business-registration-in-ajman-free-zones/ Reply ASAP
White Hat / Black Hat SEO | nazfazy0
Chrome 79 shows a "Did you mean...?" warning on our domain, pointing to another website
In Chrome 79 a large, scary warning is shown to users on our site: "Did you mean this other domain? This site's domain looks similar to X domain. Attackers sometimes mimic sites by making small, hard-to-see changes to the domain." Screenshot: https://imgur.com/a/NOGEyLM Our online business is reputable, with no black hat SEO practices, has been established since the early 2000s, and has a relatively high DA. We don't have any warnings / manual actions in Google Search Console, so I can't request a review there. I reported it several weeks ago via Google's incorrect phishing warning form at google.com/safebrowsing/report_error/, but the warning continues to display. Does the Moz community have any suggestions on how to fix this, or general thoughts? Thanks!
White Hat / Black Hat SEO | sb10300
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized above bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU1
How to save a website from negative SEO?
Hi, I have read a couple of good blog posts on negative SEO and learned about a few solutions which may help me protect my website during a negative SEO attack. Here, I want to share my experience and live data regarding negative SEO. Someone is creating bad inbound links to my website; I found out about it via Google Webmaster Tools. Honestly, I have implemented certain solutions, like the Google disavow tool, contacting certain websites, and more. But I can see a negative impact on organic visits, which have been going down for the last two months, and I think these bad inbound links are the biggest reason. You can visit the following URLs to learn more. Can anyone share your experience of saving a website from negative SEO? How can I save a website from negative SEO (~bad inbound links)? https://docs.google.com/file/d/0BxyEDFdgDN-iR0xMd2FHeVlzYVU/edit https://drive.google.com/file/d/0BxyEDFdgDN-iMEtneXU1YmhWX2s/edit?usp=sharing https://drive.google.com/file/d/0BxyEDFdgDN-iSzNXdEJRdVJJVGM/edit?usp=sharing
White Hat / Black Hat SEO | CommercePundit0
Website that lost rankings and is now starting to recover
Hello guys,
About a month ago we got a message in Webmaster Tools saying that we have unnatural links to the website.
We dropped from 200 keywords that were on pages 1-2 to pages 5-8.
We checked our links and noticed that someone had pointed more than 1,000 links at our site. We applied for a reconsideration request, plus we sent Google a file with the links we asked to have removed. Yesterday we got a message from Google saying: Manual spam action revoked.
We checked the rankings today and saw that where we had 3 keywords on the first page, we now have 24. A very good improvement, but still very far from the 130 keywords that were on the first page a month ago. I wanted to ask, what can we expect?
Are we going to get the lost rankings back now?
Does this happen overnight? Maybe the big change will come in the next Penguin update? Bottom line, what is the chance of getting back the rankings we had before?
This is the most important thing right now... Thank you
White Hat / Black Hat SEO | WayneRooney0
Changing domains from .net to .com after 7 months of traffic loss
We have been in business since 2005, and we always used the .net version, as it was the only one available when we started. In about 2007 we bought the .com version from the person who owned it, but we kept using the .net since customers were already used to it. In January we started to see a loss in search engine traffic, not to mention being outranked by several sites (95% of them spammers). We have no manual penalty, but it could be algorithmic; we are not sure if we even have some sort of penalty or if our niche is just too spammed. We are now considering moving the site to the .com version, as all our attempts at regaining our rankings were useless (backlink cleanup, disavow tool usage, excellent link building, excellent content creation, and social interactions). Our DA and PA are both higher than any of the other pages ranking on top. We have about 3k pages indexed. What do you guys think? Should we move the site to the .com? (Note that the change is ranking-wise, not in terms of branding.) And if we do, should we 301 all pages, or rel=canonical to avoid a possible "penalty flow" to the other domain? Note: for years, the .com version was/is 301'd to the .net one. Thank you all!
White Hat / Black Hat SEO | FedeEinhorn0
Has anyone used tribepro.com?
Does the concept really work? Any experience? I've registered, and so far I think it's hard to measure whether the shares are spam or genuine. Would love to see it work for someone. Thanks
White Hat / Black Hat SEO | LauraHT0