Is there a problem with Google?
-
I have one or two competitors (in the UK) in my field who buy expired 1–8 year old domains on random subjects (SEO, travel, health, you name it). They are in the printing business, and they stick one or two articles (unrelated to what was on the site before) on these domains, and that's it.
I think they stick to PA and DA above 30, and most have 10–100 links, so these are well-used expired domains. They are hosted in the USA, and most have different IPs, although they now have so many (over 70% of their backlink profile) that some share the same IP.
On further investigation, none of the blogs have any contact details, but they have been a little smart here: they add content to the About Us page (something like "I used to run xxx but now do xxx"), and they add one or two tabs with article-length content on the same subject the site used to cover, with titles to match.
So basically they are finding expired 1–10 year old domains that (from what I can see) have been expired for six months at most, putting one or two print-related articles on the home page (maybe adding a third on the subject the blog used to cover), adding one to three articles via tabs at the top on the subjects the site used to cover, registering the details via [email protected], and that's it.
They have been ranking via this method for the last couple of years (through all the Google updates).
Does Google not have any way to combat link networks other than the obvious stuff such as public link networks? It just seems that if you know what you are doing, you get away with it; if you're big enough, you get away with it; but the middle-ground (mum and pop) sites get f***ed over by spam pointing at their site that no spammer would dream of doing anyway.
-
It sucks, but they will get caught out eventually (as Christina says, it's happening already).
It's a tempting trick to try and copy, but they have such a head start that it would be hard to beat them at their own game. Best to concentrate on white-hat backlinks; then, when the reckoning comes, you will have the head start on them.
-
I've noticed this too, but after the last algorithm update, the sites I kept my eye on dropped out of the search engines altogether, and the others moved down off page one.
Related Questions
-
How long will Google take to disavow a link?
I just want to know how long Google will take to disavow a link. I uploaded my file on 18 Dec 2020, and today is 5 January 2021, and that link still appears in my Search Console under Top linking domains. Has anyone done this recently, and how long did it take? I listed the domain entry below; hopefully it will disavow all the links [subdomain + www + without www] coming from that domain. domain:abcd.com Help me out, please...
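For reference, Google's documented disavow file format is a plain-text list, one rule per line, with "#" comments; abcd.com below is just the placeholder from the question:

```
# Disavow file: plain text (UTF-8), one rule per line.
# Lines starting with "#" are comments and are ignored.

# Disavow all links from this domain, including its
# subdomains (www.abcd.com, sub.abcd.com, etc.):
domain:abcd.com

# Alternatively, disavow a single page rather than a
# whole domain by listing its full URL:
# http://spam.example.com/stuff/bad-page.html
```

Note that uploading a disavow file does not remove links from the Search Console links report: that report shows links Google has found during crawling, not which ones are being counted, so seeing the domain there weeks later is expected. The disavow is applied gradually as Google recrawls the linking pages.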
White Hat / Black Hat SEO | seotoolsland.com
-
Does Google's crawler understand and flag a blog post whose text asserts sponsorship alongside a dofollow outbound link?
I kind of know the answer, but I just wanted to get some feedback from others. For the sake of argument, assume there are no other issues with the linking blog, such as too many ads or thin content. Question: if you pay for a blog post with a dofollow link, and the blog post says something to the effect of "this post has been sponsored by...", will Google's crawlers detect that and flag it as an unnatural link?
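For context, Google's documented way to disclose a paid placement is an attribute on the link itself, not the surrounding prose; the URL and anchor text below are placeholders:

```html
<!-- Disclosing a paid link the way Google documents it:
     rel="sponsored" (rel="nofollow" also qualifies) tells the
     crawler the link was paid for and should pass no credit. -->
<a href="https://example.com/" rel="sponsored">Sponsor name</a>
```

Whether the crawler automatically pairs a visible "sponsored by" disclosure with an unmarked dofollow link is not documented, but paid links that pass PageRank violate Google's link scheme guidelines either way, so the safe pattern is the markup above.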
White Hat / Black Hat SEO | kekepeche
-
Separating syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a "news" website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as Google requests, and I am not sure which approach is better:

A) Put all syndicated content under "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta tag on every page. But in this case, I am not sure what will happen if we link to these articles from other parts of the website. We will waste our link juice, right? Also, Google will not crawl these pages, so it will never see the noindex tag. Is this OK for Google and Google News?

B) NOINDEX meta tag on every page only. Google will crawl these pages but will not show them in the results. We will still lose link juice from links pointing to these pages, right?

So... is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
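A minimal sketch of option B, using the "/syndicated/" path from the question: each syndicated page carries a robots meta tag, and robots.txt leaves the path crawlable so Google can actually fetch the pages and see the tag.

```html
<!-- On every syndicated article page: keep the page out of
     the index but let Google crawl it. "follow" means links
     on the page can still be followed and pass signals. -->
<meta name="robots" content="noindex, follow">
```

This is also why option A is self-defeating: a `Disallow: /syndicated/` rule in robots.txt stops Google from fetching those pages at all, so it never sees the noindex tag, and the blocked URLs can still end up indexed (URL-only) via external links. For noindex to work, the page must remain crawlable.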
White Hat / Black Hat SEO | Lukas_TheCurious
-
I have plenty of backlinks, but the site does not seem to come up on Google's first page.
My site has been jumping up and down for many months now, but it never stays on Google's first page. I have plenty of backlinks and have shared content on social media. What could I be doing wrong? Any help will be appreciated. The content is legitimate. I have recently added some internal links; could this be the cause? Please help.
White Hat / Black Hat SEO | samafaq
-
Should you include keywords in your domain name to rank well on Google Places?
Is it okay to include keywords in your domain name (as well as your business name) to rank well on Google Places? In my opinion, this is very spammy, and sites using this technique will be slapped by Google sooner or later.
White Hat / Black Hat SEO | thegoatman
-
Can anyone recommend a Google-friendly way of utilising a large number of individual yet similar domains related to one main site?
I have a client who has one main service website, on which they have local landing pages for some of the areas in which they operate. They have since purchased 20 or so domains (although in the process of acquiring more) for which the domain names are all localised versions of the service they offer. Rather than redirecting these to the main site, they wish to operate them all separately with the goal of ranking for the specific localised terms related to each of the domains. One option would be to create microsites (hosted on individual C class IPs etc) with unique, location specific content on each of the domains. Another suggestion would be to park the domains and have them pointing at the individual local landing pages on the main site, so the domains would just be a window through which to view the pages which have already been created. The client is aware of the recent EMD update which could affect the above. Of course, we would wish to go with the most Google-friendly option, so I was wondering if anyone could offer some advice about how would be best to handle this? Many thanks in advance!
White Hat / Black Hat SEO | AndrewAkesson
-
Why does Google still display such bad search results?
When I search the keyword "Backlink คือ" in Google (Google.co.th), I see these domains, which are keyword spamming with terrible content (spun content that cannot be understood):

อํานาจเจริญ.dmc.tv/?p=19
ฉะเชิงเทรา.dmc.tv/?p=28
พังงา.dmc.tv/?tag=backlink
หนองคาย.dmc.tv/?p=97
ขอนแก่น.dmc.tv/?tag=backlink
ชัยนาท.dmc.tv/?p=70
ตราด.dmc.tv/?tag=backlink
etc.

As you can see from the search results, my questions are:
1. How can I report this network to Google?
2. Why did this network rank in the top 10 for three weeks before its rankings dropped?
3. Why does a Facebook page rank in these Google search results?

Please make this clear for me.
White Hat / Black Hat SEO | taradmkt
-
Google Penalizes Too Much SEO
I just read this interesting article about a new Google penalty that will roll out in the coming weeks/months as part of changes to the algorithm. The penalty will target websites that are over-optimized, or over-SEO'ed. What do you think about this? Is this a good thing or a bad thing for us as SEO marketers? Here's the link: SEL.com/to-much-seo I'm really curious about your points of view. Regards, Jarno
White Hat / Black Hat SEO | JarnoNijzing