Dealing with an Omitted Page
-
For my most competitive term, the wrong page ranks (and not well, either). The landing page I built for it has never shown up for that term except when I include the omitted results. The page that does rank is the category page above it. All that was fine, because neither page was all that great... BUT, I have completely rewritten the content for the landing page and added local-area pictures, local testimonials, and a video. So here's my question:
Should I put all that new content on the landing page that's been omitted, or tweak the page that ranks and put it there? To me it makes the most sense to put the content on the page that has been omitted, but I don't know how Google treats pages that have been omitted in the past. Will it have some sort of bias against the page because it was omitted so many times for that keyword? Or will it be treated just like any other page, so that if the content is good enough, it will rank just fine?
If anyone's dealt with this, then I'd love to hear all about it!
Thanks,
Ruben
-
Usually a page is omitted because it's very similar to another page on your site, and Google has determined that the other page is more important. Since the category page is probably higher up in your site architecture, it makes sense that Google would rank it instead. Now that you've put the effort into making your landing page more unique, I would give it a chance to rank. One thing you can do is find additional places on your site to point internal links at your landing page, and try to build links from other sites to that page as well. That helps tell Google, "hey, this page is important." Good luck!
-
It's hard to say without knowing the specifics and how these two pages measure up in importance, but I'd give the landing page a go. After a month or so, if it still hasn't started ranking as well as you'd like, consider tweaking the landing page content further. Assuming this landing page could really move some KPIs for you, it's a semi-risk worth taking.
Related Questions
-
Google Search Console Not Indexing Pages
Hi there! I have a problem I was hoping someone could help me with. In Google Search Console, my website does not seem to be indexed well. In fact, even after rectifying the problems that Moz's on-demand crawl pointed out, the pages still do not become "valid". Google has flagged some excluded pages; I have fixed some of the issues, but it doesn't seem to be helping. However, when I submitted the sitemap, it said the URLs were discoverable, so I'm not sure why they can be discovered but are not deemed "valid". I would sincerely appreciate any suggestions or insights on how to solve this issue. Thanks!
Algorithm Updates | Chowsey
-
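For anyone double-checking their setup while debugging this: "discovered" in Search Console only means Google found the URL, for example via a sitemap entry like the sketch below — crawling the URL and deeming it "valid" are separate steps. The URL and date here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-1/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```

A well-formed sitemap gets URLs discovered, but it cannot force indexing on its own.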
Have you ever seen a page get indexed from a website that is blocked by robots.txt?
Hi all, We use a robots.txt file and meta robots tags to block bots from crawling a website or specific pages. Mostly, robots.txt is used site-wide with the expectation that none of the pages will get indexed. But any page from the site can still be indexed by Google even when the site is blocked by robots.txt, because the crawler may find a link to the page somewhere on the internet, as stated here in the last paragraph. I wonder if this is really how some web pages got indexed. And if we use meta tags at the page level, do we still need to block via the robots.txt file? Can we use both techniques at the same time? Thanks
Algorithm Updates | vtmoz
-
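To illustrate the interaction the question describes: a robots.txt Disallow only blocks crawling, not indexing — and it also prevents Googlebot from ever seeing a page-level meta robots tag. A minimal sketch (the paths are hypothetical):

```
# robots.txt — stops compliant bots from CRAWLING these URLs,
# but an externally linked URL can still appear in the index:
User-agent: *
Disallow: /private/

# To keep a page OUT of the index, the page must remain crawlable
# so the bot can see this tag in its HTML <head>:
#   <meta name="robots" content="noindex">
```

So using both techniques on the same page is usually counterproductive: the robots.txt block hides the noindex tag from the crawler, and the URL can stay indexed.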
Is it still a rule that Google will only index pages up to three tiers deep? Or has this changed?
I haven't looked into this in a while; it used to be that you didn't want to bury pages beyond three clicks from the main page. What is the rule now for getting deep pages indexed?
Algorithm Updates | seoessentials
-
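The "three clicks" rule of thumb is really about click depth, which you can measure directly: a breadth-first search over your internal-link graph gives each page's distance from the homepage. A quick sketch — the site graph here is hypothetical:

```python
from collections import deque

def click_depth(links, start="home"):
    """BFS over an internal-link graph: returns each reachable
    page's minimum number of clicks from the start page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site structure: home -> category -> subcategory -> product
site = {
    "home": ["category-a", "about"],
    "category-a": ["subcat-a1"],
    "subcat-a1": ["product-1"],
}

print(click_depth(site))
# product-1 sits three clicks from home (depth 3)
```

Pages that only show up at depth 3+ in a crawl like this are the ones the old rule of thumb warned about burying.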
What do media queries have to do with the page layout update?
Who thinks the lack of media queries will have an impact on whether the page layout update affects a site?
Algorithm Updates | kimmiedawn
-
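For context: media queries are the CSS mechanism that adapts a layout to the viewport, which is why they come up in discussions of the page layout (top-heavy) update. A minimal hypothetical example — the class name is made up:

```css
/* Hypothetical: hide a wide ad sidebar on small screens so
   above-the-fold content is not pushed down on mobile. */
@media (max-width: 600px) {
  .ad-sidebar {
    display: none;
  }
}
```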
How much link juice does a site's homepage pass to inner pages, and how much does it influence inner page rankings?
Hi, I have a question regarding the power of internal links: how much link juice they pass, and how they influence search engine ranking positions. Take the example of an ecommerce store that sells kites.

Scenario 1
It can be assumed that it is easier for the kite store to earn links to its homepage by writing great content on its blog, as any blogger who links to the content will likely use the site name and homepage as anchor text. Following this through, a large number of high-quality backlinks will eventually point to the site's homepage from various high-authority blogs that love the content being posted on the site's blog. The question is: how much link juice does the homepage pass to the category pages, and from the category pages to the product pages, and what influence does this have on rankings? I ask because I have often seen strong ecommerce sites with very high DA or domain PR, but with no backlinks to a specific product or category page, ranking in the top 10 of search results for the respective category and product terms. It therefore leads me to assume that internal links must be a strong determinant of search rankings. Could it also be assumed that a site with a PR of 5 and no links to a specific product page would rank higher than a site with a PR of 1 but with 100 links pointing to that product page? Assume they are both trying to rank for the same product keyword and all other factors are equal, i.e. neither of them built spammy links or over-optimised anchor text.

Scenario 2
Does internal linking work both ways? In my example above I spoke about the homepage carrying link juice downward to the inner category and product pages. Can a powerful inner page carry link juice upward to category pages and then to the homepage? For example, say the blogger who liked the kite store's blog content linked directly to it from his site, and the piece was hosted at www.xxxxxxx.com/blog/blogcontentpiece. As authority links are built to this page from other bloggers linking to it, will it then pass link juice up to the main blog category page, and then to the kite site's homepage? And if a link with relevant anchor text is part of the blog content piece, will the link juice flowing upwards be stronger? I know the above is quite long-winded, but I couldn't find anywhere that explains the power of internal linking on SERPs. Look forward to your replies on this.
Algorithm Updates | sanj5050
-
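The intuition in both scenarios — equity flowing through internal links in either direction — can be illustrated with a toy PageRank calculation. This is a simplified sketch, not Google's actual algorithm, and the four-page kite-store graph is hypothetical:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: links maps each page to the pages it links to.
    Every page's score is split among its outlinks each round."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical kite-store graph: equity flows down (home -> category
# -> product) and back up (blog post and product link to home).
site = {
    "home": ["category", "blog-post"],
    "category": ["product", "home"],
    "product": ["home"],
    "blog-post": ["home", "category"],
}

ranks = pagerank(site)
print(sorted(ranks, key=ranks.get, reverse=True))
# "home" ends up with the largest share, yet every inner page
# inherits a positive score purely through internal links.
```

In this toy model a link upward from the blog post raises the homepage's score exactly the way a downward link raises a product page's — links pass equity in whichever direction they point.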
Not necessary to have keywords in the page? Do you agree?
I am being told by my SEO consultants that: "According to the present Google algorithm, it is not necessary to have keywords on the page. What is more important is that the content is relevant to the page, and whether visitors will stay on that page or not. If visitors stay on your site for a longer time, it adds a bonus to the site's ranking. So I think it is not necessary to add key phrases to the content." Do you agree?
Algorithm Updates | PegCorwin
-
Google Dropped 3,000+ Pages Due to 301 Moved !! Freaking Out !!
We may be the only people stupid enough to accidentally prevent Googlebot from indexing our site. In our htaccess file, someone recently wrote the following:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]

It's almost funny, because it was a rewrite that rewrote back to itself. We found in Webmaster Tools that the site could not be indexed by Googlebot because it could not detect the robots.txt file. We didn't have one before, as we didn't really have much that needed to be excluded, though we have added one now, for kicks really. The robots.txt file was never the problem with regard to the bot accessing the site; rather, it was the rewrite statement above that was blocking it. Not knowing what the deal was, we went into Webmaster Tools, then Health, and selected "Fetch as Google" — our way of manually requesting the site be re-indexed so we could see what was happening. After doing so, we clicked on Status, and it returned the following:

HTTP/1.1 301 Moved Permanently
Content-Length: 250
Content-Type: text/html
Location: http://www.mystie.com/
Server: Microsoft-IIS/7.5
MicrosoftOfficeWebServer: 5.0_Pub
MS-Author-Via: MS-FP/4.0
X-Powered-By: ASP.NET
Date: Wed, 22 Aug 2012 02:27:49 GMT
Connection: close

301 Moved Permanently — The document has moved here.

We fixed the screwed-up rewrite in the htaccess file, but now our issue is that all of our pages have been severely penalized in terms of where they now rank compared to just before the incident. We are essentially freaking out, because we don't know the real consequences, or if and how long it will take for the affected pages to regain their prior ranks. Typical pages went down anywhere between 9 and 40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on the drop in traffic we anticipate. This sucks, because these are people's lives, but then again a business must make money, and if you sell less you have to cut overhead, and the easiest cut is payroll. I'm on a team of four that works to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects could be significant if Google doesn't restore things. My question is: what would you guys do? Is there any way we can contact Google about such a matter? I've never seen such a thing. I'm sure the pages that are missing from the index will make their way back in, but what will their rank look like next time? And with that type of rewrite, has it permanently affected every page site-wide, including those that are still in the index but severely affected? Would love to see things bounce back quickly, but I don't know what to expect, and neither do my counterparts. Thanks for any speculation, suggestions, or insights of any kind!
Algorithm Updates | David_C
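For reference, a common non-looping pattern for this kind of www canonicalization in Apache mod_rewrite — a sketch, not necessarily the exact fix for the poster's server — is to redirect only when the host does not already begin with "www.":

```
RewriteEngine On
# Only redirect when the host does NOT already carry the "www." prefix
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [L,R=301]
```

Because the condition excludes hosts that already start with www., the redirected request can never match the rule a second time, so a loop is impossible by construction.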