Can anyone help me understand why Google is marking a large number of my webpages as "Not Selected" when crawling my site?
-
When looking through my Google Webmaster Tools, I clicked into the advanced settings under Index Status and was surprised to see that Google has marked around 90% of the pages on my site as "Not Selected" when crawling. Please take a look and offer any suggestions.
-
Thank you. Thank you. Thank you. That makes so much sense. This is also the issue I am having with my communities and cities pages, which point at my http://luxuryhomehunt.com/homes-for-sale page.
Does that make sense?
-
Thanks for the response. The pop-up is running in JavaScript, and from what I have been told, search engines can still crawl pages as long as the opt-in runs in JavaScript. Typically a visitor lands on one of our landing pages, such as http://luxuryhomehunt.com/homes-for-sale/Longwood/alaqua-lakes.html, where they can find information about the specific community they are searching for; then, if they click on a listing, they are prompted to opt in.
Do you think there may be any correlation with my use (or non-use) of canonical tags? Another thing I was wondering is whether it has anything to do with my handling of pages 2, 3, 4, 5, etc. of a city or community with more than ten listings.
I am not sure why your connection would have been refused; I am currently running an XML sitemap generator, and maybe that had something to do with it. Either way, I am super grateful for your help and for looking at this. I am very new to SEO and trying to learn my way through as much as possible.
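On the pagination question above: the usual guidance is that page 2, 3, 4, etc. of a city or community should each declare a self-referencing canonical, not point back to page 1. A minimal sketch of that rule, assuming a hypothetical `?page=` URL scheme (your site may paginate differently):

```python
def canonical_tag(base_url: str, page: int) -> str:
    """Build a self-referencing rel=canonical tag for a paginated
    listing page: each page canonicalizes to itself, not to page 1."""
    url = base_url if page <= 1 else f"{base_url}?page={page}"
    return f'<link rel="canonical" href="{url}" />'

# Page 1 canonicalizes to the base URL; page 3 canonicalizes to itself.
print(canonical_tag("http://luxuryhomehunt.com/homes-for-sale/Longwood/alaqua-lakes.html", 1))
print(canonical_tag("http://luxuryhomehunt.com/homes-for-sale/Longwood/alaqua-lakes.html", 3))
```

If every paginated page instead canonicalizes to page 1 (or to one shared URL), Google will treat the deeper pages as duplicates, which shows up as "Not Selected."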
-
Hmm, I just tried to click on a listing from Google, but I was served a popup that required me to enter my contact information before I could access the site: http://luxuryhomehunt.com/view-property/40096215. Did you just add this popup? Since there is no way for users to view a listing without entering contact information, it is possible that the search engines are being blocked as well.
I also tried crawling the site with Screaming Frog SEO Spider and Xenu, but my connection was refused. I'm not sure if my IP was blocked or if the site is blocking crawlers, but my guess is the search engines may be having trouble accessing all of the pages on your site.
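One quick check you can run yourself is whether the site's robots.txt is blocking crawlers from the listing pages. A small offline sketch using Python's standard-library `robotparser`; the robots.txt body here is a hypothetical example policy, not the site's actual file:

```python
from urllib import robotparser

# Parse a robots.txt body directly (a made-up example policy)
# instead of fetching it over the network, so the check works offline.
rules = """
User-agent: *
Disallow: /view-property/
"""
rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A Disallow on /view-property/ would block every listing page:
print(rp.can_fetch("Googlebot", "http://luxuryhomehunt.com/view-property/40096215"))  # False
print(rp.can_fetch("Googlebot", "http://luxuryhomehunt.com/homes-for-sale"))          # True
```

To test the live file, point `rp.set_url(...)` at the real robots.txt and call `rp.read()` before `can_fetch`.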
At the very least, I'd recommend removing that popup since it's bad for user experience and may be causing problems with the search engines.
EDIT - I did some more digging and looked at the Google cache for one of your listings: http://webcache.googleusercontent.com/search?q=cache:L6LzTqj9gQUJ:luxuryhomehunt.com/view-property/40445850+&cd=6&hl=en&ct=clnk&gl=us&client=firefox-a. On that page, the rel="canonical" tag is set to http://luxuryhomehunt.com/view-property, which tells the search engines that all of your property listing pages share that one canonical URL. That explains why most of your pages are "Not Selected" per Google:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2642366
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139066