Why does our competitor with lower DR and PA outrank us in Google?
-
Hi everyone,
I really don't understand why our competitor with a lower DR and PA outranks us on Google.lv (Google Latvia). Below is a screenshot showing that our company ranks #2 for the keyword "gāzes baloni" in Google. Our DR is 24 and our PA is 26, whereas our competitor's DR is 23 and their PA is 19.
The content on our page is much better too: we have a clear title, description, Q&A section, etc., whereas our competitor has very limited content, just photos of the product and titles. Any suggestions would be highly appreciated.
Thank you very much in advance.
-
Hi Jeroen,
Fair enough. We are optimising this landing page and will publish a new version soon. Hopefully it will help us reach #1 for this keyword.
Thank you very much for your advice anyway.
-
The thing is, a link from a "DA 25 domain" isn't just going to cut it, really. It's only a signal, not the data you might need for Google to determine whether it's a good link or not. There are a few other factors in play that could add to the SEO positions as well: domain age, content, location of the server, site loading speed (e.g. PageSpeed Insights), domain history. If SEO were as simple as having more / better DA-based domains in a linking profile, everybody would be rich by now.
I suggest you do this: create a new H2/H3 and a paragraph based on the subject, something with 200 to 300 words, place it on the page you're attempting to rank, and wait a week. See what happens. Sometimes throwing an update onto your page might just be enough to get better rankings, as "freshness" is also a quality signal.
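As a rough sketch, a fresh content block like the one described could look like this in the page's HTML (the heading and copy below are placeholders, not content from this thread):

```html
<!-- Hypothetical fresh content block; wording is a placeholder -->
<h2>How to choose the right gas cylinder</h2>
<p>
  200–300 words of original, topic-relevant text goes here:
  sizes, safety tips, refill options, and so on.
</p>
```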
-
Hi Jeroen,
Thank you very much for your reply. That's the thing: I do understand that there are many factors that affect ranking positions, and the backlink profile is among the most important ones. But our competitor has only one backlink to this page, and not the most valuable one (with a DA of 25). What's more, we also have a backlink from the exact same website (it is a forum), plus eight additional domains linking to our page. The loading speed is pretty much the same, both websites are optimised for mobile, etc.
-
There are a few possible factors for that, but you shouldn't stare blindly at PA/DA from Moz or Ahrefs in general. Google uses a different method to calculate a site's SEO strength. I guess they could have a few stronger backlinks and/or anchors.
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds once enough user-initiated actions have happened (think of scrolling, for example). In order to stop bots from distorting statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests in its JavaScript crawling. In a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyway, and it is quite a number, we considered reducing this for bots. We have several questions about this:
Technical SEO | rogier_slag
1. Do these requests count towards crawl budgets?
2. If they do, and we want to prevent this from happening: what would be the preferred option? Either preventing the request in the frontend code, or blocking the request with a robots.txt line? The question arises because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The frontend option is also slightly less convenient from a development perspective, as the logic would be spread throughout the application. I'm aware one should not cloak or make pages appear differently to search engine crawlers. However, these requests do not change anything in the page's behaviour; they purely send some anonymous data so we can improve future recommendations.
-
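For option two in the question above, a robots.txt rule is a one-line change. A minimal sketch, assuming a hypothetical /api/telemetry endpoint (the real path isn't given in the question):

```text
# robots.txt — block crawlers from fetching the telemetry endpoint
User-agent: Googlebot
Disallow: /api/telemetry
```

Note that robots.txt stops compliant crawlers from fetching the URL without changing what the page serves to users, so it avoids the cloaking concern; it does not, however, remove the frontend code that fires the request.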
Google Cache issue
Hi, We’ve got a really specific issue – we have an SEO team in-house, and have had numerous agencies look at this – but no one can get to the bottom of it. We’re a UK travel company with a number of great positions on the search engines – our brand is www.jet2holidays.com. If you try ‘Majorca holidays’, ‘tenerife holidays’, ‘gran canaria holidays’ etc. you’ll see us in the top few positions on Google when searching from the UK. However, none of our destination pages (and it’s only the destination pages) show a ‘cached’ option next to them. Example: https://www.google.com/search?q=majorca+holidays&oq=majorca+holidays&aqs=chrome..69i57j69i60l3.2151j0j9&sourceid=chrome&ie=UTF-8 This isn’t affecting our rankings, but we’re fairly certain it is affecting our ability to be included in the Featured Snippets. We’ve checked, and there aren’t any noarchive tags on the pages, example: https://www.jet2holidays.com/destinations/balearics/majorca Anyone have any ideas?
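For anyone auditing this on their own pages: the noarchive directive the poster checked for can appear in two places, and the HTTP header variant is easy to miss if you only look at the HTML source:

```html
<!-- In the page <head>: suppresses the "Cached" link in Google results -->
<meta name="robots" content="noarchive">
```

The same directive can also be sent as an HTTP response header (X-Robots-Tag: noarchive), so it's worth checking the destination pages' response headers as well as their markup.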
Technical SEO | fredgray
-
How do I avoid this issue of duplicate content with Google?
I have an ecommerce website which sells a product that has many different variations based on a vehicle’s make, model, and year. Currently, we sell this product on one page, “www.cargoliner.com/products.php?did=10001”, and we show a modal to sort through each make, model, and year. This is important because, based on the make, model, and year, we have different prices/configurations for each. For example, for the Jeep Wrangler and Jeep Cherokee, we might have different products: Ultimate Pet Liner - Jeep Wrangler 2011-2013 - $350 Ultimate Pet Liner - Jeep Wrangler 2014-2015 - $350 Ultimate Pet Liner - Jeep Cherokee 2011-2015 - $400 Although the typical consumer might think we have one product (the Ultimate Pet Liner), we look at these as many different products, each with a different configuration and different variants. We do NOT have unique content for each make, model, and year. We have the same content and images for each. When the customer selects their make, model, and year, we just search and replace the text to match. For example, when a customer selects 2015 Jeep Wrangler from the modal, we do a search and replace so the page will have the same URL (www.cargoliner.com/products.php?did=10001) but the product title will say “2015 Jeep Wrangler”. Here’s my problem: We want all of these individual products to have their own unique URLs (cargoliner.com/products/2015-jeep-wrangler) so we can reference them in emails to customers, and ideally we start creating unique content for them. Our only problem is that there will be hundreds of them, and they don’t have unique content other than the swapped-in product title and change of variants. Also, we don’t want our URL www.cargoliner.com/products.php?did=10001 to lose its link juice. Here are my questions: My assumption is that I should just keep my URL www.cargoliner.com/products.php?did=10001 and be able to sort through the products on that page.
Then I should go ahead and make individual URLs for each of these products (i.e. cargoliner.com/products/2015-jeep-wrangler) but just add a "noindex, nofollow" to the page. Is this what I should do? How reliable is a "noindex, nofollow" on a webpage? Does Google still index it? Am I at risk of duplicate content penalties? Thanks!
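For reference, the tag described above would look like this on each generated product page (a sketch; the URL pattern is taken from the question):

```html
<!-- On cargoliner.com/products/2015-jeep-wrangler and its siblings -->
<meta name="robots" content="noindex, nofollow">
```

An alternative often used for thin, near-duplicate variant pages is a rel="canonical" link pointing back at the main product URL, which consolidates ranking signals on one page rather than removing the variants from the index entirely.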
Technical SEO | kirbyfike
-
Does the Google index have an expiration?
Hi, I have this in mind and I think you can help me. Suppose that I have a page, something like this: www.mysite.com/politics, where I have a list of the current month's news. Great: every time the bot checks this URL, it indexes the links that are there. What happens next month? All those links are no longer visible to the user unless he searches in a search box or on Google. Does Google keep those links? In the current month Google sees that those links are there, but next month it does not, even though the pages are still alive. So, my question is: does Google keep these links forever if they are alive but linked from nowhere on the site (the bot can't find them anymore, but they work)? Thanks
Technical SEO | informatica810
-
Google is indexing my directories
I'm sure this has been asked before, but I was looking at all of Google's results for my site and I found dozens of results for directories such as: Index of /scouting/blog/wp-includes/js/swfupload/plugins Obviously I don't want those indexed. How do I prevent Google from indexing them? Also, it only seems to be doing it with WordPress, not any of the directories on my main site. (We have a WordPress blog, which is only a portion of the site.)
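Those "Index of /..." pages are Apache's auto-generated directory listings. One common fix, assuming the blog runs on Apache (typical for WordPress hosts), is to disable listings in the blog's .htaccess; a minimal sketch:

```apache
# Stop Apache from generating "Index of /..." listing pages
Options -Indexes
```

With listings disabled, requests for those directory URLs return 403, and the already-indexed entries should drop out of Google's index over time.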
Technical SEO | UnderRugSwept
-
Google Knowledge Graph related question
I have a client who is facing age discrimination in the film industry. (Big surprise there.) The problem is, when you type in his name, Google's new Knowledge Graph displays a brief bio about him to the right of the search results. This bio snippet includes his year of birth. Wikipedia is credited as the source for the bio information about him, and yet his Wikipedia entry doesn't include his age or birth date. Neither does his IMDb bio. So the question is: how can he figure out where Google is getting that birth date from? He wants to try to remove it, not falsify it. Thanks for any help you can offer.
Technical SEO | JamesAMartin
-
Google Webmaster Account Creation
When creating a Google Webmaster account, is it advised to create two accounts for the one domain: one for non-www and one for www?
Technical SEO | daracreative
-
Google Shopping Australia/Google Merchant Centre
So Google Shopping has finally landed in Australia, so we've got some work to do hooking it up to our clients' ecommerce sites. Right now we have a handful of clients who are set up; the feed is getting in there OK, but all products are sitting in "disapproved" status in the dashboard, and clicking into each individual product, the status says "awaiting review". I logged a support ticket with Google to get some more info on this, as it doesn't look right to me (i.e. the disapproved status in the dashboard), and got a useless templated answer. It seems that if I switch the country destination to US, the products are approved and live in google.com shopping search within the hour. Switch back to Australia and they go back to disapproved status. Anyone having the same issue / seen this before? I simply don't trust Google support and am wondering if there are other factors at play here.
Technical SEO | Brendo