Is this an example of bad doorway pages, or is it perfectly fine and helpful to users?
-
I'm asking because I want to do something similar. http://bit.ly/1puGXJu Imagine hundreds of pages like this, with the city names switched out.
Since the inventory on each page is different (each city has its own inventory), would these pages not be considered doorway pages, meaning Google will probably be fine with them?
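(As context for "hundreds of pages like this, with the city names switched out": one quick way to judge how templated such pages are is to compare the text of two of them directly. This is a hypothetical sketch using Python's standard-library difflib; the sample page text is invented for illustration, not taken from the linked site.)

```python
from difflib import SequenceMatcher

# Hypothetical template shared by every city page; only the
# city name is swapped out.
template = (
    "Find the best used cars in {city}. Browse our {city} inventory "
    "of sedans, trucks, and SUVs, updated daily."
)
page_a = template.format(city="Denver")
page_b = template.format(city="Boston")

# A ratio close to 1.0 means the two pages are near-duplicates
# apart from the substituted city name.
similarity = SequenceMatcher(None, page_a, page_b).ratio()
print(f"similarity: {similarity:.2f}")
```

In practice the pages in question also differ in their inventory listings, which would pull the ratio down; how far down is roughly the question of whether the pages are "doorway" thin or genuinely unique.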
-
So it doesn't look like Google is penalizing them; it's actually rewarding them.
I agree.
I think that most of the time, Google will filter pages like these. Some of the time, the entire website's rankings will be reduced.
But, sometimes stuff like this will get good rankings, and those good rankings might persist for a long time.
If a person decides to place that type of content on his website, he should know that it will probably have a short life in Google's search results and might cause his site to be penalized.
-
That's what I thought as well. However, this site has had these pages for over two years, and after the Penguin update last fall they hit the #1 spot for all of their keywords, where they have stayed for the last few months. So it doesn't look like Google is penalizing them; it's actually rewarding them.
-
I would call it spam. Very low quality spam.
From what I have seen, Google has been killing pages like these for about ten years.
Pages like these are usually indexed and persist in the SERPs for a short time, but then they are filtered from display, or their websites are hit by Panda, within a few weeks to a few months. Occasionally they persist longer, but even then they are almost always eventually filtered, or the entire website is penalized.
You can have a fantastic website, add a few folders of pages like these, and watch your rankings go into the tank.