What Should We Do to Fix Crawled but Not Indexed Pages for Multi-location Service Pages?
-
Hey guys! I work as a content creator for Zavza Seal, a contractor based in New York, and we target 36+ cities in the Brooklyn and Queens area with several home improvement services. We were about 340 pages into our multi-location strategy, pairing each target city with each service we offer, when we noticed that 200+ of our pages were showing as "Crawled - currently not indexed" in Google Search Console.
Here's what I think we may have done wrong. Let me know what you think...
- We used the same page template for all pages. (We changed the content, sections, formatting, targeted keywords, and overall page strategy for areas with unique problems, trying to keep the user experience as distinct as possible so we wouldn't create duplicate content or look like we didn't care about our visitors.)
- We used the same featured image for all pages. (I know this is bad and wouldn't have done it myself, but hey, I'm not the publisher.)
- We didn't use rel=canonical tags to tell search engines that these pages were purpose-built for their areas.
- We didn't use alt text on images until about halfway through.
- A lot of the URLs don't contain the exact target keyword.
- The NAP info and Google Maps embed are in the sitewide footer, so we didn't repeat them in the body of the pages.
- We didn't include content about the history of the city or anything like that. (On some pages we did include content about historic buildings, a low water table, flood-prone areas, etc., if the area was known for that.)
We were thinking of redoing the pages from scratch and building a unique experience around each city, with testimonials, case studies, and content about problems common to property owners in the area. But I think they may be fixable with a self-referencing rel=canonical, the city-specific content added, and a unique featured image on each page. What do you think is causing the problem, and what would be the easiest way to fix it?
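If we go the rel=canonical route, I figured I could at least script a check that every page declares exactly one self-referencing canonical before we resubmit anything. A minimal sketch with Python's stdlib; the sample markup and URL here are invented for illustration, not our real pages:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical" and a.get("href"):
                self.canonicals.append(a["href"])

def has_self_canonical(html, page_url):
    """True only if the page declares exactly one canonical and it points at itself."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonicals == [page_url]

# Hypothetical city-page markup, for illustration only:
page = ('<html><head><link rel="canonical" '
        'href="https://example.com/spray-foam-jamaica-ny/"></head><body></body></html>')
print(has_self_canonical(page, "https://example.com/spray-foam-jamaica-ny/"))  # prints: True
```

If you can export the affected URLs from GSC, looping this over the fetched HTML would show immediately whether any canonical is missing, duplicated, or pointing at the wrong city.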
I knew the pages had to be unique, so I switched up the page strategy every 5-10 pages out of fear that duplicate content would creep in, because there's only so much you can say about, for example, "basement crack repair".
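On the duplicate-content fear, a rough way to quantify it is to diff the rendered text of two city pages: anything scoring near 1.0 is template-level duplication. A sketch using Python's difflib, with invented copy standing in for real pages:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Ratio in [0.0, 1.0]; values near 1.0 mean the two pages are near-duplicates."""
    return difflib.SequenceMatcher(None, a, b).ratio()

# Invented snippets: two templated city pages vs. one genuinely localized page.
jamaica = "We repair basement cracks in Jamaica, NY. Call us for a free estimate."
astoria = "We repair basement cracks in Astoria, NY. Call us for a free estimate."
localized = "Flood-prone blocks near Jamaica Bay often need exterior waterproofing first."

print(similarity(jamaica, astoria))    # close to 1.0: template-level duplication
print(similarity(jamaica, localized))  # much lower: distinct copy
```

Anything much above roughly 0.9 between two *different* city pages suggests the template is dominating whatever unique copy was added.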
Please let me know your thoughts. Here is one of the pages that is indexed, as an example: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/
Here is one like it that is crawled but not indexed: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/
I appreciate your time and concern. Have a great weekend!
Related Questions
-
Unsolved Multiple business locations
Hi, we run a Cardiff fencing company, but we also want to appear on Google for another city called Newport. Do we need two business addresses, or is it just a matter of creating two separate pages on our WordPress website for the two cities? We want to be on the first page of Google.
Moz Local | | sarahwalsh0 -
Unsolved Drop in organic rankings but we can't work out why
Hi, we are an SEO agency with over 10 years' experience. We have always used white hat methods, only ever built quality links, and written quality content marketing. We have consistently had good organic rankings in Google. The website recently moved back in the SERPs, not just a bit but massively. We can't work out why, and we can't explain it either. We are wondering if any SEO experts know why such moves happen, even when you follow white hat methods?
Moz Local | | sarahwalsh0 -
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap references the https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of HTTP/2 access working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
- GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.
Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go through HTTP/1.1 and not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs ... except the home page. It never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed, etc. than other pages which still get indexed in preference to it. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
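One way to keep an eye on what Googlebot actually negotiates is to tally the request lines straight from the access logs. A rough sketch assuming common/combined log format, where the server writes the protocol into the logged request line (Apache records h2 requests as "HTTP/2.0"); the sample log entries are fabricated:

```python
import re
from collections import Counter

# Matches the request section of a common/combined log line,
# e.g. "GET / HTTP/1.1" or "GET / HTTP/2.0".
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) (HTTP/[\d.]+)"')

def protocol_counts(log_lines, path="/"):
    """Tally which HTTP protocol versions were used to fetch `path`."""
    counts = Counter()
    for line in log_lines:
        m = REQUEST_RE.search(line)
        if m and m.group(1) == path:
            counts[m.group(2)] += 1
    return counts

# Fabricated log lines for illustration:
sample = [
    '66.249.66.1 - - [01/May/2024] "GET / HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [02/May/2024] "GET / HTTP/2.0" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [03/May/2024] "GET /about/ HTTP/1.1" 200 900 "-" "Googlebot/2.1"',
]
print(protocol_counts(sample))  # home page only: one HTTP/1.1 hit, one HTTP/2.0 hit
```

In practice you'd also filter on the Googlebot user agent (ideally with reverse-DNS verification) so CDN health checks and ordinary browsers don't pollute the counts.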
Technical SEO | | AKCAC1 -
How to optimize landing pages for local search?
I'm trying to understand how to optimize landing pages to appear in local search. For example, if someone in Chicago searches for "plumber", Yelp has a page "Top 10 Plumbers in Chicago." They are generating these pages for numerous business types and cities. I can't see anything on the page or metadata that indicates a geographic location or business type. What optimizations are they doing to get Google to know that it's a page for a specific city and type of business?
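One signal such pages often do carry is structured data rather than visible geo text. This is a hedged sketch of the kind of schema.org LocalBusiness JSON-LD a city/category landing page might embed, not a description of what Yelp actually ships; the business name and details are invented:

```python
import json

def local_business_jsonld(name, city, region, business_type="Plumber"):
    """Build a minimal schema.org LocalBusiness payload for a city landing page."""
    return {
        "@context": "https://schema.org",
        "@type": business_type,  # "Plumber" is a schema.org LocalBusiness subtype
        "name": name,
        "address": {
            "@type": "PostalAddress",
            "addressLocality": city,
            "addressRegion": region,
        },
        "areaServed": {"@type": "City", "name": city},
    }

payload = local_business_jsonld("Acme Plumbing", "Chicago", "IL")
print('<script type="application/ld+json">' + json.dumps(payload) + "</script>")
```

Markup like this won't make a page rank locally on its own, but it gives crawlers an explicit city and business-type signal even when nothing in the visible copy or metadata spells it out.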
Local SEO | | Tourizee0 -
Trying to rank homepage nationally and internal pages locally?
We are a finance brokerage in Australia. We operate in a specialist niche and in regional areas with low competition, but we have identified keywords that are very profitable to us yet seem to need a different approach strategy-wise. We specialise in agribusiness lending. We have been pretty scrappy with our SEO in the past, as it has always been done by me, and as a startup, as everyone knows, the jack of all trades can both help and hinder! To date we have done a lot of AdWords (and keyword research), so I have a fair idea of which keywords I am after. Some keywords are low competition and extremely profitable to us, but they differ in who our competitors are, how difficult they would be to rank for, and which strategy to use. For example, "agribusiness" is used by all the major banks. They provide agribusiness lending, but only via their own products; as brokers we can compare all products, and since agribusiness can be quite complex, this is a major point of difference for us and brings in a lot of new leads. My strategy for that keyword would be a national approach, since we advise in this space on a national scale, which has worked well via AdWords leads, but I would like to move away from relying solely on AdWords. Then there are keywords where we have also had some national success via AdWords, but the metrics suggest they are better targeted from a local perspective (a local regional town), e.g. "hobby farm loan", "rural finance", even "home loans" (when there is no other local competitor in a small town). As we have brokers in other regional towns, this also opens up the opportunity for internal pages with lots of local signals (NAP, authoritative outbound links, local keywords, social signals from local Facebook groups, etc.).
But can an internal page compete against a competitor's homepage? For example, I was going to set up an internal page for Toowoomba with info about that broker and lots of local signals. Or am I better off creating another site, e.g. brandname-toowoomba.com.au (still linked from my Toowoomba contact page), and focusing solely on local for that site (including internal pages to rank locally, e.g. "Toowoomba home loans")? The extra benefit is that I would then create another asset if I were to sell the region as a franchise (another discussion). So, my question is: can I mix my strategies without any issues, or should I create separate sites?
Local SEO | | AgLend0 -
Correct setup: One business, one website, two bricks and mortar locations
Hi all, we have a furniture business with two physical stores and one website, which can sell online, though we hardly sell anything that way; we're just about to start Google PLAs on a few products. Let's say our website is at nicefurniture.co.uk. We have nicefurniture.co.uk/first-location and nicefurniture.co.uk/second-location set up with all the store info, a Google Map, contact info, etc. These are linked from the footer on all pages.
Issue 1: I have been told conflicting things about how best to set up our Google My Business pages. One person says set the URL for both to our homepage, nicefurniture.co.uk; the other says point each GMB listing to its store location page, OR that we should buy www.furniture-first-location.co.uk as our URL and point the GMB listing to that. To me that doesn't make sense, as we'd be spreading our domain equity across a new domain, surely?
Issue 2: How do I get each GMB location to show up in Maps, etc. when people search for 'furniture location', or even 'sofa location' or 'dining table location'? Would I need to optimize the store pages on our site?
Issue 3: Years ago we did set up another URL, let's say nicefurniture-cornwall.co.uk, and on it there are about 500 pages, all with links pointing to our main website. Google Search Console for our main website lists this other site as one of our top linking domains. Does that mean that if we remove nicefurniture-cornwall.co.uk we'll risk a drop in rankings? Many thanks for your input.
Local SEO | | Bee1590 -
Backlinking for small service oriented websites
I am a newbie to the SEO world. We are a graphic/web design and development company that has been pulled into the SEO world. We work for a lot of clients that sense the need for a website but don't have the capital to invest in a large one. After we build simple sites for them, they come to us wanting to rank better in the SERPs. I can go through and do the basic meta information, but with small sites there isn't a lot you can do. I feel I need to learn link building but am completely lost. When I read about backlinks, a lot of people talk about blogging websites, not service-industry websites. Does anyone have tips for learning link building for small service companies that want to target their local area? Loggers, woodworking shops, landscape companies, cabinet shops, home cleaners, bulk food stores, etc. I'm interested in blog posts, tutorials to read/watch, software to help me out, etc. Thanks in advance for your help!
Local SEO | | Olvtr0 -
Question about Multi-Locale/Lang Sitemaps
If you have one site with multiple language and locale variations, how best should one approach the sitemaps? Here is what I believe the options to be:
- A sitemap_index.xml which includes all of the different lang/locale sitemaps on the site
- Create one main sitemap that includes the rel=alternate hreflang for every alternate page to the main US version
- Do the sitemap_index.xml for all the other sitemaps and also include the rel=alternate hreflang in those separate ones as well
I have these in this order because it goes from least to most work... thoughts, folks?
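Whichever structure you pick, the per-URL hreflang annotations look the same. Here is a minimal sketch of the single-sitemap option, built with Python's xml.etree; the URLs and locales are placeholders:

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML = "http://www.w3.org/1999/xhtml"
ET.register_namespace("", SM)
ET.register_namespace("xhtml", XHTML)

def build_sitemap(page_groups):
    """page_groups: list of {locale: url} dicts, one dict per logical page.
    Every <url> entry repeats the full set of xhtml:link alternates,
    including a self-referencing one, as Google's hreflang docs require."""
    urlset = ET.Element(f"{{{SM}}}urlset")
    for variants in page_groups:
        for _locale, url in variants.items():
            u = ET.SubElement(urlset, f"{{{SM}}}url")
            ET.SubElement(u, f"{{{SM}}}loc").text = url
            for alt_locale, alt_url in variants.items():
                ET.SubElement(u, f"{{{XHTML}}}link", {
                    "rel": "alternate", "hreflang": alt_locale, "href": alt_url,
                })
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([{
    "en-us": "https://example.com/en-us/pricing/",
    "fr-fr": "https://example.com/fr-fr/pricing/",
}])
print(xml_out)
```

For the third option, each per-locale sitemap would carry the same alternate set for its own URLs; the annotations don't change, only where they live.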
Local SEO | | DRSearchEngOpt0