SEO ANALYSIS ON A NEW SITE
-
Hi, I'd be grateful if anyone could provide some SEO analysis of a new website: http://www.ppilegalservices.co.uk/
The main keyword is "mis-sold PPI".
It's a very competitive keyword, but the site isn't appearing in Google results even for long-tail keywords; so far it has only ranked for brand keywords like "PPI Legal Services".
I'm also running out of ideas for creating quality content. Any tips, please?
many thanks
-
Thanks for your answer, Keri.
It's a completely new domain, and I've hardly built any backlinks; I was just working on some infographics and content.
-
Hi Marcus,
Thanks for taking the time to answer my question in detail.
-
Hey
Yeah, I have to second what Keri says here. This is crazy competitive so you are not going to rank a new site for these terms overnight.
The site itself has no domain authority, there is no useful or interesting content, and I can't see that any of the blog posts have won any links.
Content-wise, if this were me, I would probably look to create the best free guide on how to reclaim your PPI and use that to pull in links. Even that would be problematic, though, as MoneySavingExpert (a big brand in the UK) has already done it, so you'd be fighting a bit of a losing battle.
If we discount the MoneySavingExpert site, we have several other heavy-hitting domains:
Then, amongst them, we have some other PPI sites, but looking at the link profiles, we are talking 15,000 linking sites in some cases. Now, obviously, most of that is going to be low quality, but that is what you are up against here.
So, the only way to do this white hat is to go after the really long tail: dig into the keywords, look at PPI for specific vendors, types of loans, etc., and start to build some really strong content and promote it as best you can.
For starters, look at some of the suggested searches:
- mis sold ppi calculator
- mis sold ppi template letter
- reclaim ppi
- ppi refunds
- mis sold ppi barclays
- mis sold ppi scams
- mis sold ppi reasons
- mis sold ppi hsbc
Maybe take one of those and break it down even further, so for "mis sold ppi hsbc":
- mis sold ppi hsbc loan
- mis sold ppi hsbc credit card
- hsbc ppi claims
- hsbc ppi refunds
- hsbc ppi reclaim
- hsbc loans ppi
- hsbc fined ppi
- hsbc ppi credit card
If we break one of those down even further, we get even more ideas. Searches related to "mis sold ppi hsbc credit card":
- is there ppi on hsbc credit card
- hsbc credit card ppi claim address
- payment protection insurance hsbc credit card
- hsbc credit card ppi reclaim
- hsbc mis sold payment protection
- hsbc payment protection insurance mis sold
- hsbc personal loan ppi
- hsbc mis selling ppi
You could probably go deeper still.
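The breakdown above (seed term → vendor → product type) can be semi-automated. Here's a minimal sketch of that idea; note the vendor and product modifier lists below are illustrative assumptions, not real search-suggestion data, and you'd still want to validate each candidate against actual suggested searches and search volume:

```python
# Sketch: expand a seed phrase into long-tail keyword candidates by
# combining it with vendor and product-type modifiers, mirroring the
# manual breakdown above. Modifier lists here are hypothetical examples.

def expand_keywords(seed, vendors, products):
    """Return sorted seed+vendor and seed+vendor+product combinations."""
    candidates = set()
    for vendor in vendors:
        candidates.add(f"{seed} {vendor}")          # e.g. "mis sold ppi hsbc"
        for product in products:
            candidates.add(f"{seed} {vendor} {product}")
    return sorted(candidates)

vendors = ["hsbc", "barclays", "lloyds", "natwest"]
products = ["loan", "credit card", "mortgage"]

for kw in expand_keywords("mis sold ppi", vendors, products):
    print(kw)
```

Each output line is a candidate page or blog-post topic; the point is that even four vendors and three product types already yield sixteen distinct long-tail angles to research.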
There is always a way, but you are just not going to drop into a market like this and shoot to the top of the deck. If you could lay down a 12-month strategy to build the most genuinely helpful PPI blog out there, covering every minute detail and starting at the long tail, then you could create something that would outrank the mostly spam-based strategies of the few PPI companies on the first page.
Hope this helps
Marcus
-
A quick look in OSE (Open Site Explorer) shows a bunch of low-quality backlinks that appear to have been removed, and nothing of quality. Was there a site on this domain before yours?
It looks like that's a really competitive term in the UK. What I'd do is look at the top ten sites and ask myself, "What content can I put together that will beat out the content on the top sites?" Most of the sites I see are one-page lead-generation forms. It's going to be tough to make a name for yourself with all of that junk out there.
Can you do something like http://www.moneysavingexpert.com/reclaim/ppi-loan-insurance where you have a lot of good, quality content? This will make it useful for the users, give people a reason to link to you, and give you great long-tail content.