Are backlinks the reason for my site's much lower SERP ranking, despite similar content?
-
Hi all,
I'm trying to determine why my site (surfaceoptics.com) ranks so much lower than my competitors' sites. I don't believe the site / page content explains this difference in ranking, and I've done on-site / on-page SEO work with little or no improvement. In fact, I believe my site is very similar in quality to competitor sites that rank much higher for my target keyword: hyperspectral imaging.
This leads me to believe there is a technical problem with the site that I'm not seeing, or that the answer lies in our backlink profile. The problem is that I've compared our site with 4 of our competitors in Open Site Explorer and I'm not seeing a strong trend when it comes to backlinks either. Some competitors have more links / better backlink profiles, but other sites have no external links to their pages, lower PA and DA, and still outrank us by 30+ positions.
How should I go about determining if the problem is backlinks or some technical issue with the site?
-
Thanks for both of your responses!
@IrvCo_Interactive I do have Google Webmaster Tools set up and I do not have any warnings. But I do see 70 errors under Search Appearance > Structured Data, all regarding Missing: author and/or Missing: updated. I will look into what that's about.
I am working to fix the weird subdomains coming up in Google for our site. We never used these subdomains, and someone suggested that they might be the result of a wildcard A record in my DNS. This other Moz article suggests using .htaccess to 301 them, so that might be the fix I use: http://moz.com/blog/find-your-sites-biggest-technical-flaws-in-60-minutes
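For what it's worth, a minimal .htaccess sketch of that 301 approach might look like the following. This assumes Apache with mod_rewrite enabled, and that the stray subdomains resolve to the same document root as the main site; adjust the canonical hostname to your own setup.

```apache
# Sketch only: 301-redirect every non-canonical hostname (including
# stray wildcard subdomains) to www.surfaceoptics.com, preserving
# the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.surfaceoptics\.com$ [NC]
RewriteRule ^(.*)$ http://www.surfaceoptics.com/$1 [R=301,L]
```

Removing the wildcard A record itself (or narrowing it to only the hosts you actually use) would also stop new stray subdomains from resolving at all.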
I am not quite sure what I need to do to make sure my site's internal link structure is solid; I just set it up in ways that seemed intuitive from a user perspective. I'm sure there are articles out there on this subject, though.
@Moosa Hemani Indeed that site http://www.hvhtek.com/associates/surfopt/surfopt.html does use a few paragraphs from our site. What tool did you use to identify that this duplication was out there? I have done almost no work on link building and neither has anyone else, so in a sense what we have now is our natural link profile. But our site has been around for a long time so we may have gathered some bad links anyway.
-
There are tons of factors involved when Google ranks a site, so this could be anything beyond just links and content quality. There are other factors like on-page optimization, site architecture, how you have been discussed on other websites, social media, and much more…
I gave your website a quick look, and here are the things I noticed in one go!
- Duplication of Content
I believe there are two different websites where you are using similar content. I just checked one page, and Google says the content on your website is more or less the same as the content available at this URL: http://www.hvhtek.com/associates/surfopt/surfopt.html
I believe this page also represents your company, but technically they are two different domains containing duplicate content.
The idea is to have unique, quality content on the website; this can help you improve your organic visibility.
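As an illustration of one common way to handle this kind of cross-domain duplication (a sketch, not something prescribed in this thread; the URLs below are hypothetical placeholders): if both domains must keep showing the same text, a cross-domain canonical tag on the secondary page tells Google which version should rank.

```html
<!-- Sketch: placed in the <head> of the duplicate page on the
     secondary domain. Both URLs are hypothetical placeholders. -->
<link rel="canonical" href="http://www.surfaceoptics.com/technology/hyperspectral-imaging.html" />

<!-- Alternative: keep the duplicate page out of the index entirely,
     while still letting crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```

The canonical route consolidates signals onto one version; the noindex route simply removes the duplicate from search. Which is appropriate depends on whether the secondary page needs to appear in results at all.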
- Links
In my honest opinion, links are playing their part too!
Open Site Explorer says you are getting 182 links from 10 root domains. You should probably work on increasing the number of linking root domains, and if you are using tactics like blog commenting or forum posting to get links, consider avoiding them, as this will hurt your rankings instead of building them.
Many of the links are nofollow (probably coming from blog comments). It is important to have links from diverse sources, and not all of them should be either nofollow or dofollow.
Keeping the link profile as natural as possible is the way to go if you want to outrank your competitors for your desired key phrases.
This is just a quick analysis; digging into the website in detail might help you catch more areas that need your attention. There can be multiple factors at play which, taken together, keep you from ranking above your competitors and winning organic traffic!
Hope this helps!
-
There are hundreds of ranking factors, so essentially what you are asking is the entire process of SEO.
Do you have Google Webmaster Tools for your site? That might be a good starting point. It's not just the quantity of links but the quality as Google sees it. Open Site Explorer is a great tool, but you should also check whether Google has any warnings or alerts in your account.
I also like to look at how Google indexes a site. You get interesting results when you type site:surfaceoptics.com into Google. Is the order of the pages what you would expect? Are high-value pages coming up above low-value pages? If not, maybe your internal linking needs to be rethought. Do the email servers need to be indexable by Google? (pop, smtp, mail, mx, etc.) That seems a little wonky to me; I don't see that too often. You should maybe block those with a robots.txt file or something.
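To sketch that last suggestion (hostnames are hypothetical, based only on the subdomain types mentioned above): robots.txt is per-host, so the file on www.surfaceoptics.com does not cover mail.surfaceoptics.com. Each unwanted subdomain would need to serve its own blocking file at its root:

```text
# Sketch: served at e.g. http://mail.surfaceoptics.com/robots.txt
# (one copy per unwanted subdomain, since robots.txt only applies
# to the exact host it is served from).
User-agent: *
Disallow: /
```

A 301 redirect to the main host, or removing the DNS records so the subdomains stop resolving, would arguably be cleaner: Disallow stops future crawling, but URLs that are already indexed can linger in results.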
Related Questions
-
How Many Words To Make Content 'unique?'
Hi All, I'm currently working on creating a variety of new pages for my website. These pages are based on different keyword searches for cars, for example used BMW in London, used BMW in Edinburgh, and many more similar variations. I'm writing some content for each page so that they're completely unique from each other (the cars displayed on each page will also be different, so this would not be duplicated either). My question is really: how much content do you think I'll need on each page? What is optimal? What would be the minimum? Thanks for your help!
Technical SEO | Sandicliffe
-
Duplicate Content - What's the best bad idea?
Hi all, I have 1000s of products where the product description is very technical and extremely hard to rewrite or make unique. I'll probably have to use the content provided by the brands, which can already be found on dozens of other sites. My options are: use the Google on/off tags ("don't index") or put the content in an image. Are there any other options? We'd always write our own unique copy to go with the technical bit. Cheers
Technical SEO | Carlos-R
-
When to 301 a No1 ranking site to the new domain?
I have a site [company.com] that ranks number one for the products of my brand, but I'm moving all my efforts to a dedicated brand domain. The old site covered a number of small brands and we had no dedicated brand sites, but we now focus on just this one brand and it doesn't belong on the old company domain name. BRAND belongs on the new brand.com. Because of the age of the old company site, and because it had the first copy about the brand, it still ranks well for the brand product names, while the new site has some duplicate content issues that I'm in the throes of resolving. RANKS: Company.com is number one for all product names; Brand.com is nowhere for product brand names but top for the brand name (as I say, the product pages on this site have duplicate content issues, which is likely keeping them ranked low - Hades low). I would rather not maintain two websites, and I want to give brand.com every bit of available oomph, so should I at some point 301 the old company site to the new one? If so, is now the time? Thanks
Technical SEO | Brocberry
-
How to remove an entire directory from Google's cache
The old version of Webmaster Tools used to allow you to select whether to remove a single page from the index or an entire directory. http://www.canig.com/pageimages/submitremovalrequest.jpg How can I do this with the new Webmaster Tools? I can't find the option to remove an entire directory.
Technical SEO | vpahwa
-
Duplicate content affecting SEO rankings
We have one main site called buypropertyanywhere; it is a database that holds all the data for all our property websites. One of our most popular sites is housesalesbulgaria, which takes the data about Bulgarian property from buypropertyanywhere and displays it. The same goes for housesalesturkey, which takes the data about Turkish property and displays it. We think that because buypropertyanywhere and housesalesbulgaria have the same data, there is a lot of duplicate content, and we think this is affecting the SEO rankings for housesalesbulgaria. Google is looking at housesalesbulgaria as if it were a copy of buypropertyanywhere. Should we therefore SEO buypropertyanywhere solely and link it to housesalesbulgaria through the articles and content we put on the site? Thanks in advance for any advice.
Technical SEO | Feily
-
Site 'filtered' by Google in early July.... and still filtered!
Hi, Our site got demoted by Google all of a sudden back in early July. You can view the site here: http://alturl.com/4pfrj and you may read the discussions I posted in Google's forums here: http://www.google.com/support/forum/p/Webmasters/thread?tid=6e8f9aab7e384d88&hl=en http://www.google.com/support/forum/p/Webmasters/thread?tid=276dc6687317641b&hl=en Those discussions chronicle what happened, and what we've done since. I don't want to make this a long post by retyping it all here, hence the links. However, we've made various changes (as detailed), such as getting rid of duplicate content (use of noindex on various pages etc), and ensuring there is no hidden text (we made an unintentional blunder there through use of a 3rd party control which used CSS hidden text to store certain data). We have also filed reconsideration requests with Google and been told that no manual penalty has been applied. So the problem is down to algorithmic filters which are being applied. So... my reason for posting here is simply to see if anyone here can help us discover if there is anything we have missed? I'd hope that we've addressed the main issues and that eventually our Google ranking will recover (ie. filter removed.... it isn't that we 'rank' poorly, but that a filter is bumping us down, to, for example, page 50).... but after three months it sure is taking a while! It appears that a 30 day penalty was originally applied, as our ranking recovered in early August. But a few days later it dived down again (so presumably Google analysed the site again, found a problem and applied another penalty/filter). I'd hope that might have been 30 or 60 days, but 60 days have now passed.... so perhaps we have a 90 day penalty now. OR.... perhaps there is no time frame this time, simply the need to 'fix' whatever is constantly triggering the filter (that said, I 'feel' like a time frame is there, especially given what happened after 30 days). 
Of course the other aspect that can always be worked on (and oft-mentioned) is the need for more and more original content. However, we've done a lot to increase this and think our Guide pages are pretty useful now. I've looked at many competitive sites which list in Google and they really don't offer anything more than we do..... so if that is the issue it sure is puzzling if we're filtered and they aren't. Anyway, I'm getting wordy now, so I'll pause. I'm just asking if anyone would like to have a quick look at the site and see what they can deduce? We have of course run it through SEOMoz's tools and made use of the suggestions. Our target pages generally rate as an A for SEO in the reports. Thanks!
Technical SEO | Go2Holidays
-
URLs for news content
We have made modifications to the URL structure for a particular client who publishes news articles in various niche industries. In line with SEO best practice, we removed the article ID from the URL - an example is below:
http://www.website.com/news/123/news-article-title
http://www.website.com/news/read/news-article-title
Since this was done we have noticed a decline in traffic volumes (we have not yet assessed the impact on the number of pages indexed). Google has suggested that we need to include unique numerical IDs somewhere in the URL to aid spidering. Firstly, is this the policy for news submissions? Secondly (if the previous answer is yes), is this to overcome the obvious issue with the velocity and trend-based nature of news submissions resulting in false duplicate URL / title tag violations? Thirdly, do you have any advice on the way to go? Thanks. P.S. One final one (you can count this as two question credits if required): is it possible to check the volume of pages indexed at various points in the past? I.e., if you think the number of pages being indexed may have declined, is there any way of confirming this after the event? Thanks again! Neil
Technical SEO | mccormackmorrison