Pages not ranking - Linkbuilding Question
-
It has been about 3 months since we made some new pages with new, unique copy, but a lot of the pages (even though they have been indexed) are not ranking in the SERPs. I tested this by taking a long snippet of the unique copy from a page and searching for it on Google. I also checked the rankings using http://arizonawebdevelopment.com/google-page-rank
Which may not be accurate, I know, but would give some indication. The interesting thing was that for the unique copy snippets, sometimes a different page of our site, many times the home page, shows up in the SERPs. So my questions are:
- Is there some issue / penalty / sandbox deal with the pages that are not ranking? How can we check that?
- Or has it just not been enough time?
- Could there be any duplicate copy issue going on? Shouldn't be, as they are all well written, completely unique copy. How can we check that?
- Flickr image details - Some of the pages display the same set of images from Flickr. The details (filenames, alt info, titles) are pulled from Flickr and can be seen in the source code. It's a pretty large block of words, which is the same on multiple pages and uses a lot of keywords. Could this be considered duplication or keyword stuffing, and be causing this? If you think so, we will remove it right away. And then what do we do to improve re-indexing?
The reason I started this was because we have a few good opportunities right now for links, and I was wondering what pages we should link to and try to build rankings for. I was thinking about pointing one to /cast-bronze-plaques, but the page is not ranking. The home page, obviously, is the oldest page and is ranked the best. The cast bronze plaques page is very new.
- Would linking to pages that are not ranking well be a good idea?
- Would it help them to get indexed / ranking?
- Or would it be better to link to the pages that are already indexed / ranking?
- If you link to a page that does not seem to be indexed, will it help the domain's link profile? Will the link juice still flow through the site?
-
These two pages are similar but definitely not duplicates. I wouldn't worry about that being the issue. The first two answers in this thread have it right, you need to build links internally and externally to these new pages to help them out. You are indexed just fine, just need some link love.
Kate
-
Do I need to re-ask the question, or repost it? Is having SEOmoz review it like an escalation of the question?
Thanks
-
You are a pro member, so you get two questions per month. Make sure you provide a link to this thread for reference.
-
Thanks Richard - How do I get it to go up to someone at SEOmoz to confirm?
-
Page looks great by the way!
Yes, there is lots of duplicate content here. However, with the other page copy, I would think you would not get penalized.
I must admit, this should go up to someone at SEOmoz to confirm.
-
Hi Richard
I wanted to see if you could look at the links, and if you feel the Flickr code on those pages is a good idea or not?
Thanks
-
Thanks Richard
www.impactsigns.com/cast-bronze-plaques
compared to
-
Yes, please post a link.
I am going to say that as long as there is other content on the page, and not simply a redundant pulling of Flickr code, you will be fine.
If you are pulling the Flickr code and the page is just a recompilation of images in a different order, then yes, I would say a duplicate content issue will arise.
I think that answered your question?
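If it helps to put a number on it, here is a rough Python sketch (the page copy and Flickr block below are made-up stand-ins, not your actual pages) of how you could measure what share of a page's words the repeated Flickr block accounts for:

```python
# Rough sketch: estimate what share of a page's visible text comes from a
# repeated block (e.g. Flickr filenames/titles/captions). The two strings
# here are hypothetical stand-ins for real page copy and the Flickr block.

def block_share(page_text: str, block_text: str) -> float:
    """Return the fraction of words in page_text that the block accounts for."""
    page_words = page_text.split()
    block_words = block_text.split()
    if not page_words:
        return 0.0
    return len(block_words) / len(page_words)

sales_copy = "Our cast bronze plaques are made to order from your artwork."
flickr_block = "bronze plaque memorial plaque cast bronze sign custom plaque"
page = sales_copy + " " + flickr_block

share = block_share(page, flickr_block)
print(f"{share:.0%} of the page's words come from the repeated block")
```

If that number is very high on every page that carries the block, the pages start to look more alike to a crawler than their unique sales copy suggests.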
-
Richard
Thanks for your reply. It all makes sense. I was wondering if you could give me some detail about #4 (the Flickr code showing), as I want to be sure I was clear and that what we are doing is not harming us.
- So even though the images are used on multiple pages, and the code is pulling the alt tags, captions, and titles (this has actually helped us rank for some longer-tail keywords and have a lot of images show up in image search, which is good for us), and there are a lot of keywords in the code block, it would not penalize us in any way?
- I know there is not really a duplicate content penalty per se, and more of a filter, so for each query Google can choose which of our "duplicated" pages is most relevant. Would this be the same here?
- We have very well written, persuasive, and keyword-balanced on-page copy, but if you look at the source code, the percentage of words taken up by the Flickr image info is large compared to our sales copy. Would this be drowning out the keywords in the sales copy?
- Could I post a couple of URLs?
Thanks
Shabbir
-
Wow, I think you used up all your Q&A points on this post alone : )
-
No
-
Could be, but probably not
-
Doubt it.
-
No
1a) Yes, very much so. Link to it, blog it, tweet it, and post on Facebook and other social sites
2a) Yes
- Yes
- Make sure the page is listed in your XML sitemap and the updated sitemap is submitted to Google Webmaster Tools.
- Be sure to link to this page from strong pages on your site, or blog.
- Get outside pages linking to this page.
- Blog it, tweet it, Facebook it, etc.
I hope that helps.
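As a quick sanity check on the sitemap point above, here is a minimal sketch (the domain and URLs are hypothetical examples) that parses a sitemap and confirms the new page is actually listed before you resubmit it:

```python
# Quick sketch: verify a new page appears in your XML sitemap before
# resubmitting. The sitemap snippet and URLs below are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def urls_in_sitemap(sitemap_xml: str) -> set:
    """Extract all <loc> values from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(f"{{{SITEMAP_NS}}}loc")}

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/cast-bronze-plaques</loc></url>
</urlset>"""

new_page = "https://www.example.com/cast-bronze-plaques"
print(new_page in urls_in_sitemap(sitemap))  # True if the page is listed
```

A page that is missing from the sitemap can still be crawled through internal links, but listing it removes one more reason for slow discovery.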
-
-
I guess it really comes down to what key phrase you are trying to rank for. Adding new pages with unique copy doesn't mean they will rank automatically; apart from on-site factors, you need to look into external factors as well. This includes building links to the new pages or taking advantage of social signals (if this applies to your industry).
To see whether there is any duplicate copy issue, I recommend using http://www.copyscape.com; you can check whether there is any duplicate copy floating around on the net.
In regards to the Flickr images, it really depends on the alt tags and how they describe the images. I don't think there is a problem using the same images on different pages with the same alt tags, but if the alt tags are all keywords, that might be a problem.
In regards to link building, my recommendation is to link to the page that would benefit users the most, because apart from getting traffic in, I look into getting visitors to the most useful page to get them to convert, and I believe Google likes this more than just ranking the home page. So if you find the "not-ranking" page beneficial to users, I would link to it, and that will help get it indexed/ranking. One other thing you need to look into is the quality of the link: make sure it's relevant to your industry, because if they are just random links, Google might not pass value at all.
Hope this helps
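Besides Copyscape, you can do a back-of-the-envelope duplicate check locally by comparing word shingles between two blocks of copy. This Python sketch (with placeholder text, and only illustrating the general shingle-overlap idea, not Copyscape's actual method) scores two pages from 0 (no overlap) to 1 (identical):

```python
# Local, rough duplicate check: compare two blocks of copy by the overlap
# of their 3-word shingles (Jaccard similarity). Placeholder text below;
# this sketches the general technique, not how Copyscape works internally.

def shingles(text: str, n: int = 3) -> set:
    """Return the set of n-word sequences (shingles) in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str, n: int = 3) -> float:
    """Similarity from 0.0 (no shared shingles) to 1.0 (identical)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "custom cast bronze plaques made to order from your artwork"
page_b = "custom cast bronze plaques in any size shipped nationwide"

print(f"similarity: {jaccard(page_a, page_b):.2f}")  # closer to 1.0 = more duplicated
```

Running this across pairs of your new pages would flag any two whose copy overlaps far more than you expect.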
-
That is a whole lot of questions so let me do my best to sum it up for you.
Your new pages are not ranking because new pages don't just rank. The quality of your content helps Google know what phrases to rank your pages for. The links to that page determine its relevance and authority, or how high it will rank for those phrases.
Putting up new content just because does not guarantee any rankings. Are there internal links to these pages? Are they in your sitemap? Do they have any external inbound links coming to them?
Make sure you have internal links to these pages as well as external links to them. Make sure the content is more than just original and well written; it has to be optimized. Make sure your title tags are all unique and keyword rich. These types of basic SEO practices should be followed first and foremost. Then, if nothing is ranking like you think it should after 3 months, you can look at other things.
I would imagine that if they have been indexed but aren't ranking that they just need some optimizing and some link juice. That tends to get pages ranked pretty well.