Is this organic search sketchiness worth unwinding?
-
Started working on a site and learned that the person before me had done a fairly sketchy maneuver and am wondering if it's a net gain to fix it.
The site has pages that they wanted third-party links pointing to. The trouble is, those pages aren't the sort that attract natural links, so boosting them in search that way is hard.
So, the woman before me started a new blog site in the same general topic area as the first/main site. The idea was to build up even the smallest bit of authority for the new blog, without tipping Google off to shared ownership. So, the new blog has a different owner/address/registrar/host and no Google Analytics or Webmaster Tools account to share access to.
Then, as one way of adding links to the new blog, she took some links that originally pointed to the main site and redirected them to the blog site.
And voila! ...Totally controllable blog site with a bit of authority linking to select pages on the main site!
At this point, I could un-redirect those links that give the blog site some of its authority. I could delete the links to the main site on the blog pages.
However, on some level it may have actually helped the pages linked to on the main site.
The whole thing is so sketchy I wonder if I should reverse it.
I could also just leave it alone and not risk hurting the pages that the blog currently links to.
What do you think? Is there a serious risk to the main site in this existing setup? The main site has hundreds of other links pointing to it, a Moz Domain Authority of 43, and thousands of pages of content; it's 8 years old and has an Open Site Explorer Spam Score of 1. So, it's not a trainwreck of sketchiness apart from this issue.
To me, the weird signal for Google is that third-party sites have links that (on-page, code-wise) still point to the main site, but that resolve via the main site's redirects to the blog site. BTW, the blog site points to other established sites besides the main site, so it isn't tied exclusively to the main site.
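In case it helps anyone picture the setup, here's a minimal sketch (assuming Python with the `requests` library; the URL is a hypothetical placeholder, not one of the actual pages) for tracing where one of those links actually ends up:

```python
# Minimal sketch: trace the redirect chain for a URL.
# Assumes the `requests` library is installed; the URL is a hypothetical placeholder.
import requests

def trace_redirects(url: str) -> None:
    """Print each hop in the redirect chain, then the final destination."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:  # intermediate 3xx responses, in order
        print(f"{hop.status_code}  {hop.url}")
    print(f"{response.status_code}  {response.url}  <- final destination")

trace_redirects("https://www.main-site-example.com/some-redirected-page/")
```

Running that against the URLs the third-party sites actually link to would confirm exactly which ones hop over to the blog before deciding what to unwind.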
Please let me know what you think. Thanks!
-
I agree with the two methods that both you and Gaston have pointed out.
The downside to reversing those links is that the domain authority could drop a bit, which could impact rankings in the SERPs. If that happens, the client might think you're doing something wrong and causing their rankings to fall when, in reality, you were trying to clean up sketchy links. In my opinion, I'd keep them; they'll help your work perform better. Disavowing them could yield worse results than what their former SEO achieved, and then you're left playing defense and assigning blame.
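For what it's worth, if you ever did go the disavow route, it's just a plain text file uploaded through Search Console. A minimal sketch, with placeholder domains and URLs standing in for the real ones, would look like this:

```
# Hypothetical disavow file - domains and URLs below are placeholders
# Disavow an entire referring domain
domain:satellite-blog-example.com

# Or disavow a single linking URL
https://some-other-site-example.com/page-with-the-link/
```

But again, I'd hold off on that here; disavowing is hard to walk back if rankings dip afterward.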
Hope this helps!
-
Well, I like Gaston's answers on these boards, and at the same time I was curious whether that was the consensus: leave it alone because there's no real risk.
-
Hi 94501! Did Gaston answer your question? If so, would you mind marking his response as a "Good Answer"?
Otherwise, how else can we help?
-
Thanks, Gaston!
Any other insights, folks?
Mike
-
Hi there,
There are two ways out of this, and you've already pointed them out:
- Reverse those links
- Leave everything as it is now
On one hand, if you aren't comfortable with those links, just reverse them all.
On the other hand, you've said that the main site has plenty of other links, so those 'unnatural links' won't do any harm, and that the satellite blog has very few connections to it. I'd say there's almost no risk, so I'd leave it as it is now.
Hope it helps.
GR.