Why is this page not being delivered in Google search results?
-
Hey folks,
Figured I would try to get an expert's insight on this.
In the Google search results for BLACK TITANIUM RINGS + TITANIUM-JEWELRY.COM, the page that I "think" should show up is this one:
http://www.titanium-jewelry.com/black-titanium-rings.html
However, it does not. Imho, this page is highly relevant. I used Rank Tracker here on seomoz.org, and the page is not even in the top 50 of Google's results.
Our 'About Black Titanium Rings' page ranks #2 (http://www.titanium-jewelry.com/about-black-titanium.html) but the /black-titanium-rings.html page doesn't even rank.
Any suggestions on what I could look at to figure out why this page is being penalized?
We are not under a manual penalty (anymore!).
Thanks!
Ron
-
Hi Sha!
Thanks AGAIN! I just followed your instructions and gave Google the 'nudge'. Now I'll keep my fingers crossed and see how the page ranks.
I'm new to SEOmoz, but it's great to be part of a group with experts like you!
-
Hey James,
Welcome to SEOmoz
It's a wonderful community and quite addictive too!
Here's a post that I made a while back that might help you find your way around ... a couple of extra resources in the comments too.
Look forward to seeing you around in the future.
Sha
-
Hi James,
No worries
To give Googlebot a little help, you can use the "Fetch as Googlebot" feature in Webmaster Tools to introduce the page to the crawler. Once the URL has been fetched, click the "submit to index" button to give it a head start.
Sha
-
Sha,
Thanks so much!! Seems so simple now, but I overlooked this. I switched it to index now, and hopefully Google will pick it up and start ranking it (high).
I REALLY APPRECIATE YOUR HELP!
-
Hi James,
I see this in the source code of the URL you referenced:
<meta content="noindex, nofollow" name="robots">
That might do it.
Sha
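For anyone who hits the same symptom, a quick way to spot a stray noindex is to pull the page's HTML (e.g. with urllib) and scan its meta tags. Here is a minimal sketch using only the Python standard library; it deliberately checks just the generic robots meta tag, not X-Robots-Tag response headers:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a document."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if (attr.get("name") or "").lower() == "robots":
                self.directives.append((attr.get("content") or "").lower())

def is_noindexed(html):
    """Return True if any robots meta tag in the HTML contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

if __name__ == "__main__":
    # The tag spotted on the page in question:
    print(is_noindexed('<meta content="noindex, nofollow" name="robots">'))  # True
```

If this returns True for a page you want ranked, the noindex has to come out before any "Fetch as Googlebot" nudge will help.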
Related Questions
-
Google spending majority of time on NAV bar vs. most important pages
Google is spending 37% of its crawl time on the NAV bar, which is on every page, and very little (1%) on the most important pages -- product pages (https://www.skinsafeproducts.com/tatcha-violet-c-brightening-serum-20-vitamin-c-10-aha) on www.skinsafeproducts.com. Does anyone know what is going on and how I can change this behavior?
Intermediate & Advanced SEO | akih
-
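One way to see where crawl budget actually goes is to tally Googlebot hits per path straight from the server's access logs. A minimal sketch, assuming combined log format with the user-agent as the final quoted field (the sample lines are made up):

```python
import re
from collections import Counter

# Request path and user-agent fields of a combined-format access log line.
LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*".*"(?P<agent>[^"]*)"$')

def googlebot_hits(lines):
    """Count requests per path for lines whose user-agent mentions Googlebot."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("path")] += 1
    return counts

if __name__ == "__main__":
    sample = [
        '66.249.66.1 - - [01/Jan/2020:00:00:01 +0000] "GET /nav HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
        '66.249.66.1 - - [01/Jan/2020:00:00:02 +0000] "GET /product/1 HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
        '203.0.113.9 - - [01/Jan/2020:00:00:03 +0000] "GET /nav HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    ]
    for path, n in googlebot_hits(sample).most_common():
        print(path, n)
```

If the counts confirm the nav URLs dominate, the usual levers are internal-link pruning and nofollow/robots rules on the low-value targets.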
Google Detecting Real Page as Soft 404 Error
We migrated our site from HTTP to HTTPS in Sep 2017, but after the migration I noticed soft 404s gradually increasing. Example of a soft 404 page: https://bit.ly/2xBjy4J These soft 404 error pages are real pages, yet Google still detects them as soft 404s. When I check the Google cache, it shows me the cache of the HTTP page. We've tried all possible solutions but are unable to figure out why Google is still indexing the HTTP pages and detecting the HTTPS pages as soft 404 errors. Can someone please suggest a solution or possible cause for this issue, or has anyone had the same issue in the past?
Intermediate & Advanced SEO | bheard
-
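A frequent cause of Google clinging to HTTP URLs after a migration is a missing or merely temporary (302) redirect. Assuming an Apache server (an assumption; nginx and IIS have equivalents), the usual .htaccess sketch is:

```apache
RewriteEngine On
# Send every HTTP request to the same path on HTTPS with a permanent 301.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

It's also worth confirming that canonical tags and the sitemap point at the HTTPS URLs, so Google gets one consistent signal.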
Sitemap Indexed Pages, Google Glitch or Problem With Site?
Hello, I have a quick question about our Sitemap Web Pages Indexed status in Google Search Console. Because of the drastic drop, I can't tell if this is a glitch or a serious issue. When you look at the attached image, you can see that under Sitemaps, Web Pages Indexed dropped suddenly on 3/12/17 from 6,029 to 540. Our Index Status shows 7K+ indexed. Other than product updates/additions and homepage layout updates, there have been no significant changes to this website. If it helps, we are operating on the Volusion platform. Thanks for your help! -Ryan
Intermediate & Advanced SEO | rrhansen
-
Links / Top Pages by Page Authority ==> pages shouldn't be there
I checked my site's links and Top Pages by Page Authority. What I found I don't understand, because the first 5-10 pages don't exist!! You should know that we launched a new site and rebuilt the static pages, so there are a lot of new pages, and of course we deleted some old ones. I refreshed the sitemap.xml (these pages are not in there) and uploaded it in GWT. Why do those old pages appear under the Links menu at Top Pages by Page Authority? How can I get rid of them? thx, Endre
Intermediate & Advanced SEO | Neckermann
-
Google+ Page Question
Just started some work for a new client. I created a Google+ page and a connected YouTube page, then proceeded to claim a listing for them on Google Places for Business, which automatically created another Google+ page for the business listing. What do I do in this situation? Do I delete the YouTube page and Google+ page that I originally made and then recreate them using the Google+ page that was automatically created, or do I just keep both pages going? If the latter is the case, do I use the same information to populate both pages and post the same content to both? That doesn't seem efficient or like the right way to handle this, but I could be wrong.
Intermediate & Advanced SEO | goldbergweismancairo
-
Empty search results labeled as Soft 404s?
I have a site with faceted search, but sometimes when someone drills down too far it ends up with no results. The page and its faceted navigation are still there. The site uses dynamic URLs for the faceted navigation, but Google is reporting these no-results pages as soft 404s. How should we handle these? Should we redirect them? Can we return 404 in the status code but still show the no-results page they are looking for? Thanks for your responses.
Intermediate & Advanced SEO | MarloSchneider
-
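On the "return 404 but still show the page" question: yes, the HTTP status line and the response body are independent, so the faceted page shell can render while crawlers see a true 404. A framework-free WSGI sketch (the markup is illustrative):

```python
def no_results_app(environ, start_response):
    """WSGI sketch: serve a friendly 'no results' page with a real 404 status.

    The status code and the body are set separately, so users still get the
    page they were looking for while crawlers stop treating it as soft 404.
    """
    body = (b"<html><body><h1>No results found</h1>"
            b"<p>Try removing a filter.</p></body></html>")
    start_response("404 Not Found", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

if __name__ == "__main__":
    # Fake a request to show the status line that would be sent.
    sent = {}
    body = b"".join(no_results_app({}, lambda status, headers: sent.update(status=status)))
    print(sent["status"])  # 404 Not Found
```

The same idea applies in any framework: render the normal template, but set the response status to 404 when the result set is empty.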
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like so: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but as for the actual act of blocking the pages while still getting the images picked up... is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona
-
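On the Googlebot vs. googlebot-image question: to my understanding, Googlebot-Image obeys the most specific matching group in robots.txt and falls back to the Googlebot group only when it has no group of its own. So you would block the page folder for Googlebot and give the image crawler an explicit empty Disallow (a sketch based on the folder in the question):

```
User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Disallow:
```

Keep in mind that robots.txt blocks crawling, not indexing; blocked page URLs can still surface in the SERPs without snippets, so a noindex on the pages themselves is the stronger tool if they remain crawlable.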
My page has fallen off the face of the earth on Google. What happened?
I have checked all of the usual things. My page has not lost any links or authority. It is not blacklisted, and there is no other obvious sign. What's going on? This just happened within the past 3 days.
Intermediate & Advanced SEO | Tormz