Google Penalty?
-
Hello,
I read a statement somewhere that said: "2 identical URLs linked to 2 different popular key phrases next to each other (on the same website/domain) will lead to a Google penalty. Google knows that both terms are popular. This means Google will ignore the links to your site (you'll get no benefit) and the site hosting your links loses authority."
What are your thoughts on this statement?
Thank you.
-
We ran a test linking to a new domain (let's say it was asdfdfhfgj.com) and found that the second anchor text pointing to the same URL carries a lot less weight than the first one.
We linked to asdfdfhfgj.com from a respected domain/page. Two links, different anchor texts. Both anchor texts were random strings that previously returned no results on Google. For the first few months the page only ranked for the first random string, not the second. This is obviously just one data point, but it does suggest something about the way Google treats multiple links to the same URL from a single page.
However, this effect is far from a penalty, and I doubt Google would ever penalize a site for having two links to the same URL on one page.
-
Would you mind sharing the source of this information?
I would like to read it in full context, and also check the date. I don't believe that statement has ever been true, but I haven't been involved with SEO long enough to be certain.
The statement is inaccurate.
-
In most cases there's certainly less value in the second link (even more so for the target page, since it's going to the exact same destination and the source page has less link juice to pass after the first link anyway). And yes, Google can measure several aspects of a keyword: popularity, commercial intent, search frequency, buzz/trend, etc. But I don't think one instance of this on a site would penalize an entire domain. A high incidence of it would dilute the site in the hub/authority sense (see http://en.wikipedia.org/wiki/HITS_algorithm), since it's only pointing to one resource.
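For anyone who hasn't run into HITS before, here's a rough sketch of the hub/authority iteration that Wikipedia page describes; the page names and the link graph below are invented purely for illustration, not taken from any real site:

```typescript
// Minimal HITS (hub/authority) sketch over a tiny, made-up link graph.
type Graph = Record<string, string[]>; // page -> pages it links out to

function normalize(scores: Record<string, number>): void {
  // Scale scores so their squared values sum to 1 (standard HITS step).
  const norm = Math.sqrt(Object.values(scores).reduce((s, v) => s + v * v, 0)) || 1;
  for (const k of Object.keys(scores)) scores[k] /= norm;
}

function hits(graph: Graph, iterations = 20) {
  const pages = Object.keys(graph);
  const hubs: Record<string, number> = {};
  const auths: Record<string, number> = {};
  for (const p of pages) { hubs[p] = 1; auths[p] = 1; }

  for (let i = 0; i < iterations; i++) {
    // Authority score: sum of the hub scores of pages that link to you.
    for (const p of pages) auths[p] = 0;
    for (const p of pages) {
      for (const target of graph[p]) {
        if (target in auths) auths[target] += hubs[p];
      }
    }
    normalize(auths);

    // Hub score: sum of the authority scores of the pages you link out to.
    for (const p of pages) {
      hubs[p] = graph[p].reduce((sum, t) => sum + (auths[t] ?? 0), 0);
    }
    normalize(hubs);
  }
  return { hubs, auths };
}

// A page whose outbound links all go to one URL ends up with a narrower
// hub profile than one linking to several distinct resources.
const graph: Graph = {
  broadHub: ["targetPage", "otherResource"],
  narrowHub: ["targetPage"],
  targetPage: [],
  otherResource: [],
};
console.log(hits(graph));
```

The practical point for this thread is in the last comment: piling links onto a single destination narrows a page's hub profile rather than triggering any penalty.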
-
What are your thoughts on this statement?
It hits really high on my BS meter.
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people view in order to optimize our recommendations. To do this we fire POST requests back to our backends every few seconds, once enough user-initiated actions have happened (think of scrolling, for example). To keep bots from distorting the statistics, we ignore their values server-side. Based on some internal logging, we can see that Googlebot is also performing these POST requests during its JavaScript crawling; over a 7-day period that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. We have several questions about this:
1. Do these requests count towards crawl budgets?
2. If they do, and we want to prevent this from happening, what would be the preferred option: preventing the request in the frontend code, or blocking the request with a robots.txt line? The latter question comes from the fact that an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The latter is slightly less convenient from a development perspective, as all the logic is spread throughout the application. I'm aware one should not cloak or make pages appear differently to search engine crawlers. However, these requests do not change anything in the page's behaviour; they purely send some anonymous data so we can improve future recommendations.
Technical SEO | rogier_slag
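To make the two options in question 2 concrete, here's a minimal sketch of the frontend approach (with the robots.txt alternative in a comment); the /api/telemetry path, the bot pattern, and the payload shape are all assumptions for illustration, not Magnet.me's actual setup:

```typescript
// Sketch: skip firing the telemetry beacon when the visitor looks like a known
// crawler. The endpoint path, bot list, and payload shape are hypothetical.

const BOT_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider|yandex/i;

function isLikelyBot(): boolean {
  return BOT_PATTERN.test(navigator.userAgent);
}

interface TelemetryEvent {
  type: string;   // e.g. "scroll" or "view"
  itemId: string; // the item the visitor interacted with
}

function sendTelemetry(events: TelemetryEvent[]): void {
  if (isLikelyBot()) {
    return; // the server discards bot data anyway, so don't send it at all
  }
  // sendBeacon queues a small POST without blocking navigation.
  navigator.sendBeacon("/api/telemetry", JSON.stringify({ events }));
}

// The robots.txt alternative would be a single rule along these lines
// (path hypothetical), which should stop well-behaved crawlers from
// requesting the endpoint at all:
//
//   User-agent: *
//   Disallow: /api/telemetry
```
-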
Top Landing Page has disappeared from Google Search Console but still shows at the top of Google
Hi, one of the top landing pages on my website has disappeared from the GSC Search Analytics results, but I still get good traffic from that page. What might be the reason for GSC to stop showing it in the results?
Technical SEO | eduapps
-
Google SERPs Show Different Title
Hi guys, can anyone please help with my situation? My domain is www.greedybins.com.au. I set up a different title on every page, and it has been 2 months since I made the changes. I keep checking by using site:www.greedybins.com.au in search. So far, only 1 title is shown correctly in the SERPs. I used Yoast SEO before and changed to the All In One SEO Pack plugin; the titles are changing, but still not exactly as I wrote them. Somehow Google alters them by itself. I have tried fetching and submitting the sitemap a couple of times from Google Search Console. Could anyone please advise?
Technical SEO | ray.soms
-
Fake links indexing in Google
Hello everyone, I have an interesting situation occurring here, and I'm hoping someone has seen something of this nature or can offer some advice. We recently installed WordPress on a subdomain for our business and have been blogging through it. We added the Google Webmaster Tools meta tag, and I've noticed an increase in 404 links. I brought this up to our server admin, and he verified that there were a lot of IPs pinging our server looking for links that don't exist. We've combed through our server files and nothing seems to be compromised. Today, we noticed that when you search site:ourdomain.com on Google, the WordPress subdomain shows hundreds of these fake links that return a 404 page when you visit them. Just curious if anyone has seen anything like this: what it may be, how we can stop it, and whether it could negatively impact us in any way. Should we even worry about it? Here's the link to the Google results: https://www.google.com/search?q=site%3Amshowells.com&oq=site%3A&aqs=chrome.0.69i59j69i57j69i58.1905j0j1&sourceid=chrome&es_sm=91&ie=UTF-8 (odd links show up on pages 2-3+)
Technical SEO | mshowells
-
Google Sitemap - How Long Does it Take Google To Index?
We changed our sitemap about 1 month ago and Google has yet to index it. We have run a site: search and we still have many pages indexed, but we are wondering how long it takes for Google to index our sitemap. The last sitemap we put up had thousands of pages indexed within a fortnight, but for some reason this version is taking way longer. We are also confident that there are no errors in this version. Help!
Technical SEO | JamesDFA
-
Is this a Google dance?
My website keeps moving up and down in the rankings but stays within pages 2 to 3. Every day it's at a new position.
Technical SEO | ragivan
-
Google Places & Multiple Accounts?
As an agency that manages multiple accounts, should I have all my Google Places listings under one account, or should I create a separate account with a unique username and password for each client? Thanks,
Gary Downey
Technical SEO | fun52dig
-
Problems with Google cache
Hi, can you please advise if the following website is corrupted in the eyes of Google? It was written in Umbraco, I have taken it over from another developer, and I am confused as to why it is behaving the way it is. cache:www.tangoholidaysolutions.com When I run this, all I see is the header, the start of the main content and then the footer. If I view the text-only version, all the content is visible. The 2nd issue I have with this site is as follows: Main page: http://www.tangoholidaysolutions.com/holiday-lettings-spain/ This page is made up of widgets, i.e. locations, featured villas and content. However, the widgets are also webpages in their own right, e.g. http://www.tangoholidaysolutions.com/holiday-lettings-spain/location-picker/ My concern is that these partial pages will affect the SEO performance of the site. In an ideal world I would have the CMS set up so these widgets are not classed as pages, but I am working on this. Thanks, Andy
Technical SEO | iprosoftware