Is this organic search sketchiness worth unwinding?
-
I started working on a site and learned that the person before me had pulled a fairly sketchy maneuver, and I'm wondering whether fixing it would be a net gain.
The site has pages it wanted third-party links pointing to, to boost them in search. Thing is, those pages aren't the kind that attract natural links.
So, the woman before me started a new blog site in the same general topic area as the first/main site. The idea was to build up even the smallest bit of authority for the new blog, without tipping Google off to shared ownership. So, the new blog has a different owner/address/registrar/host and no Google Analytics or Webmaster Tools account to share access to.
Then, as one method of adding links to the new blog, she took some links that originally pointed to the main site and redirected them to the blog site.
And voila! ...Totally controllable blog site with a bit of authority linking to select pages on the main site!
At this point, I could un-redirect the links that give the blog site some of its authority, and I could delete the blog pages' links to the main site.
However, on some level it may have actually helped the pages linked to on the main site.
The whole thing is so sketchy I wonder if I should reverse it.
I could also just leave it alone and not risk hurting the pages that the blog currently links to.
What do you think? Is there serious risk to the main site in this existing setup? The main site has hundreds of other links pointing to it, a Moz Domain Authority of 43, thousands of pages of content, eight years of history, and an Open Site Explorer Spam Score of 1. So it's not a trainwreck of sketchiness apart from this one issue.
To me, the weird connection for Google is that third-party sites have links that (in the on-page code) still point to the main site, but that resolve via the main site's redirects to the blog site. BTW, the blog site points to other established sites besides the main site, so it isn't serving the main site exclusively.
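For anyone wanting to verify a setup like this before touching it, here's a minimal sketch in Python (the URLs are hypothetical placeholders, not the real sites) that follows each link's redirect chain and reports where it finally lands:

```python
# Minimal redirect-chain audit: fetch each URL, follow redirects, and
# print every hop, so you can see which main-site URLs now land on the blog.
# All URLs here are hypothetical placeholders.
import requests

urls_to_check = [
    "https://www.main-site-example.com/old-linked-page-1/",
    "https://www.main-site-example.com/old-linked-page-2/",
]

for url in urls_to_check:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        # resp.history holds the intermediate redirect responses, in order.
        hops = [r.url for r in resp.history] + [resp.url]
        print(f"{url} ({len(resp.history)} redirect(s)):")
        for hop in hops:
            print(f"    -> {hop}")
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
```

That at least gives you a concrete list of which third-party links are being siphoned to the blog before you decide what to reverse.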
Please let me know what you think. Thanks!
-
I agree with the two options that both you and Gaston have pointed out.
The downside to reversing those links is that the domain authority could drop a bit, which could impact rankings on the SERPs. If that happens, the client might think you're doing something wrong and causing their rankings to fall when, in fact, you were trying to clean up sketchy links. In my opinion, I'd keep them; they'll make your work perform better. Disavowing them could yield worse results than what their former SEO delivered, and then you're stuck playing defense and taking the blame.
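If you did go the disavow route anyway, the file Google's disavow tool expects is plain text, one domain or URL per line. A sketch of generating one (the domain here is a hypothetical stand-in for the satellite blog):

```python
# Sketch of writing a Google disavow file. The format is plain text:
# "#" starts a comment, "domain:example.com" disavows a whole domain,
# and a bare URL disavows a single page. The domain below is hypothetical.
domains_to_disavow = ["satellite-blog-example.com"]

with open("disavow.txt", "w") as f:
    f.write("# Links inherited from the previous SEO's satellite blog setup\n")
    for domain in domains_to_disavow:
        f.write(f"domain:{domain}\n")
```

You'd upload that through Search Console's disavow tool, but as noted above, that defensive move may cost more than it saves.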
Hope this helps!
-
Well, I like Gaston's answers on these boards, and at the same time I was curious whether that was the consensus: leave it because there's no real risk.
-
Hi 94501! Did Gaston answer your question? If so, would you mind marking his response as a "Good Answer"?
Otherwise, how else can we help?
-
Thanks, Gaston!
Any other insights, folks?
Mike
-
Hi there,
There are two ways out here, and you've already pointed them out:
- Reverse those links
- Leave everything as it is now.
On one hand, if you aren't comfortable with those links, just reverse them all.
On the other hand, you've said that the main site has a lot of links, that those 'unnatural links' won't do any harm, and that the satellite blog has very few connections to it. I'd say there is almost no risk, so I'd leave it as it is now.
Hope it helps.
GR.