Question about URL structure for a large real estate website
-
I've been running a large real estate rental website for the past few years, and on May 8, 2013 my Google traffic dropped by about 50%. I'm concerned that my current URL structure might be creating thin-content pages for certain rental type + location searches.
My current directory structure is:
domain.com/home-rentals/california/
domain.com/home-rentals/california/beverly-hills/
domain.com/home-rentals/california/beverly-hills/90210/
domain.com/apartment-rentals/california/
domain.com/apartment-rentals/california/beverly-hills/
domain.com/apartment-rentals/california/beverly-hills/90210/
etc.

I was thinking of changing it to the following:
domain.com/rentals/california/
domain.com/rentals/california/beverly-hills/
domain.com/rentals/california/beverly-hills/90210/

Note: I'd provide users the ability to filter their results by rental type; by default, all types would be displayed.
Another question - my listing pages are currently displayed as:
domain.com/123456

And I've been thinking of changing it to:

domain.com/123456-123-Street-City-State-Zip

Should I proceed with both changes, one or the other, neither, or something else I'm not thinking of?
Thank you in advance!!
-
Let me add, though - if you're already 301ing a ton of expired listings at large scale (in the thousands), I'd try to ease this in gradually. Maybe just 404 the new ones and then start switching over the backlog. I'm always hesitant to switch signals on thousands of pages at once.
-
This is a point of disagreement among many SEOs, but at that volume AND if people rarely link back to the individual property pages, I would lean toward 404s over 301s. It's just going to be more Google-friendly at that scope. The other option would be to develop some kind of permalink structure that you could re-use as properties change, but that really depends a lot on the logic of your site and can get pretty complex.
-
If you shoot me a PM I'll send you the site url.
After examining the larger rental sites, I decided to proceed with 301'ing all the /rental-type/ directories to /rentals/, because they all appear to rely on users filtering search results rather than pre-filtered search results via URLs. As we discussed previously, I think the pros outweigh the cons - but what do I know?!...
I definitely have a growing expired-listing 301 problem then - each day roughly 10k listings are removed, and their URLs 301 to the search results for the city the rental was located in. Should I switch the 301 to a 404 and serve the city search results the same as I do now?
I submitted a reconsideration request last week and received the "No manual spam actions found" message back.
-
It's really hard to advise without knowing more about the site, but consolidating the different types of rentals may be a good bet. If those search types are useful for visitors, then don't 301-redirect. I'd probably use rel=canonical here, or META NOINDEX those variants.
Inactive listings are tougher. If they don't attract links and won't become active again in the future, then I think 404s are ok. A very large number of 301s that grows rapidly over time can start to cause problems and raise some red flags. It's fairly rare, but it has happened.
Removing the cities with no data is a good bet. You could META NOINDEX those, if they aren't typically linked to. I find that NOINDEX is easier to reverse later than canonical or 301. It's not an exact science, I'm afraid, and it often depends on the size of the site and the crawl architecture.
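To illustrate both suggestions, here's a bare-bones sketch (the URLs are placeholders modeled on the structure you described, not your actual pages):

<!-- Option 1: rel=canonical in the <head> of a rental-type variant, pointing at the combined page -->
<link rel="canonical" href="http://domain.com/rentals/california/beverly-hills/" />

<!-- Option 2: META NOINDEX on thin variants or empty city pages; "follow" still lets crawlers follow the links -->
<meta name="robots" content="noindex, follow" />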
-
Should I 301 redirect the /rental-type/* directories into the single /rentals/* and allow users to filter rental type in the search results - or - keep those pages and rel=canonical them to /rentals/?
For listings that are no longer active (i.e., rented), should we 404 those URLs? We currently 301 them to the state/city search results the listing was located in.
Until an hour ago, our site also allowed people to navigate into every city within every state, whether we had rentals in those cities/states or not. I've removed all of those pages and 301'd the URLs to the main state pages, which only display the cities where we have rentals. That change removed about 1,500 unique URLs.
Thank you again for being so helpful!! I actually tried PM'ing you but your username wouldn't come up.
-
I try not to over-interpret toolbar PR, but 500K indexed URLs for a PR5 site is, on the surface, likely to create problems for you. Best-case, your ranking ability is diluted across way too many pages. Worst case, you could encounter something on the scale of Panda.
Either way, at that scale, clean-up really can help. It is not an easy process - it takes time, and even best practices usually have to be adjusted to match the site structure and Google's reactions to your changes. For a site that size, it's really hard to give you quick and easy answers to where to start, but if there are reasonable ways to consolidate large numbers of "thin" pages, then I'd definitely consider that.
-
Thanks for the insight Dr. Meyers!!
Here's a little more information - my site's homepage is a PR5, I have roughly 225k rental listings, and Google has indexed roughly 500k URLs - a combination of search results and listing pages.
I proceeded with changing the listing URL structure from "domain.com/123456" to "domain.com/property/987-street-city-state-zip-123456" and 301'd the old format to the new. I know this probably had nothing to do with my traffic drop, but it's a change I've been planning to make, and I figured there's no better time than now.
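In case it helps anyone else, here's a rough sketch of how an ID-to-slug 301 like that can be wired up with an Apache rewrite map - this assumes mod_rewrite, the map file path and contents are hypothetical, and note that RewriteMap has to live in the server or vhost config, not .htaccess:

# listing-slugs.txt (hypothetical): one "ID new-path" pair per line, e.g.
#   123456 property/987-street-city-state-zip-123456
RewriteEngine On
RewriteMap listingslug "txt:/etc/apache2/listing-slugs.txt"

# Capture the mapped path into %1, then 301 the bare numeric URL to it
RewriteCond ${listingslug:$1} ^(.+)$
RewriteRule ^/([0-9]+)/?$ /%1 [R=301,L]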
My hunch is that my search result pages are the thin-content culprits, because I have them set up 2 ways:
- domain.com/rentals/state/city/ which returns all listings that match the search location
- domain.com/apartment-rentals/state/city/ which returns all apartment listings that match the search location
It's completely possible to produce 2 very similar search results (albeit with different title, h1, etc.) via these 2 search URLs. Do you think I should 301 the /rental-type/state/city/ to /rentals/state/city/? If needed, I can privately send you my site's URL.
Glad you mentioned pagination - all 2nd+ page result pages include the following meta tag:
<meta name="robots" content="noindex" />
And the on-page pagination links look like this (2nd page of results shown):
<div id="pagination">
  <a href="?pos=0&min=0&max=999999&beds=0&baths=0&pets=&pics=&sortby=min_rent&orderby=asc" rel="prev">« prev</a>
  <a href="?pos=0&min=0&max=999999&beds=0&baths=0&pets=&pics=&sortby=min_rent&orderby=asc">1</a>
  <span class="selected_page">2</span>
  <a href="?pos=20&min=0&max=999999&beds=0&baths=0&pets=&pics=&sortby=min_rent&orderby=asc">3</a>
  <a href="?pos=30&min=0&max=999999&beds=0&baths=0&pets=&pics=&sortby=min_rent&orderby=asc">4</a>
  <a href="?pos=40&min=0&max=999999&beds=0&baths=0&pets=&pics=&sortby=min_rent&orderby=asc">5</a>
  <a href="?pos=20&min=0&max=999999&beds=0&baths=0&pets=&pics=&sortby=min_rent&orderby=asc" rel="next">next »</a>
</div>
Do you see any issues with this setup?
I've also made a few other changes since my last message:
- used linkdetox.com to analyze my backlinks and submitted a disavow request for the "toxic" ones (the disavow file format is sketched after this list)
- purchased a "Site Audit" from Alexa and it came back with a 96/100 score
- contacted a recommended SEO firm and they want $5k per month for 6 months to fix my problem
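For anyone following along, the disavow file Google accepts is just a plain-text list - a rough sketch (these domains and URLs are made up):

# Comment lines start with a hash
# Disavow every link from this domain
domain:spammy-directory-example.com

# Or disavow a single URL
http://www.another-example.com/paid-links-page.html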
-
Unfortunately, other than being 99% sure there was an algorithm update around May 9th (dubbed "Phantom" by some folks), and even having seen it hit a former client, we have very few clues about what it actually did. Some folks have suggested it was "Panda-like" in which case thin content could be a culprit.
It's really tough to tell without seeing the site and the scope of the problem, but doubling up all of your rental pages could absolutely create problems, especially when you pair that with geographic searches and drill-downs. A couple of things I'd dig into before you completely change your structure:
(1) What's the scope of the doubling up, relative to your entire index size?
(2) Are there other culprits, such as search sorts and filters in play?
(3) Have you managed pagination (most likely with rel=prev/next, but there are other options - see the sketch at the end of this post)? With all of these geographic folders, you might have a ton of paginated search.
I think reducing your index size could be beneficial, but I'd make sure that the rental pages are the primary culprit first. I don't think the property URL change would help that much. It's a nice-to-have, but it wouldn't impact Panda or cause you major problems with Google the way it is. It's just slightly less user-friendly and slightly less keyword-targeted. I'd deal with the thin content first.
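On point (3), here's a minimal sketch of head-level rel=prev/next - this would go in the <head> of page 2 of a paginated search (the URL and ?pos values are placeholders modeled on a typical setup, not your actual parameters):

<!-- Tells Google this page is part of a series: page 1 precedes it, page 3 follows -->
<link rel="prev" href="http://domain.com/rentals/california/anytown/?pos=0" />
<link rel="next" href="http://domain.com/rentals/california/anytown/?pos=20" />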
-
No, you don't need to submit a reconsideration request if you haven't received anything. Chances are you got hit by a combination of Penguin and Panda. They may have just refreshed one of those updates on the 8th. It looks like no one really knows exactly what it was. Because you're changing your link structure around, check your Webmaster Tools 404 errors to make sure nothing is buggy.
If you added the /rental-type/ directories to target those keywords, and the pages for regular rentals, condo rentals, and townhouse rentals in, say, Baltimore are each unique, then don't bother changing your structure around. You're better off optimizing the pages further if they need it, then checking the pages linking to you to see if something has happened to them. If you have links from someone caught selling links, you wouldn't have seen a penalty, but their links wouldn't pass as much SEO juice.
Yes, URLs like this:
domain.com/rental/123456-123-Street-City-State-Zip
are better than:
domain.com/123456
I'd make that change right away if you're just using an ID to reference properties.
-
Thank you to both of you for your prompt replies.
It appears there was some type of Google change on May 8, but according to Matt Cutts it wasn't Penguin related:
http://searchengineland.com/if-that-was-a-google-update-you-felt-googles-not-confirming-it-158925

My concern with splitting the rental-type results across multiple directories is that I could be creating a lot of thin-content pages:

domain.com/townhouse-rentals/maryland/baltimore/

as opposed to:

domain.com/rentals/maryland/baltimore/

I should note that the /rentals/state/city/ URLs currently exist and work on my site; I added the /rental-type/state/city/ URLs a few years ago to leverage the keyword in the directory name, title & H1 tags. My site did perform quite well with that structure for multiple years. If I did make the change, I would 301 the /rental-type/* directories to /rentals/*.
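If I do go that route, the redirect itself looks simple enough - a rough sketch for an .htaccess file at the web root, assuming Apache with mod_rewrite (the rental-type names here are just the ones I've mentioned, not my full list):

# 301 every /rental-type/... URL into the single /rentals/... tree
RewriteEngine On
RewriteRule ^(home|apartment|condo|townhouse)-rentals/(.*)$ /rentals/$2 [R=301,L]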
I've purchased the Alexa site audit and Screaming Frog software to analyze my site. Google Webmaster Tools doesn't report any site issues and I haven't received any messages from Google. Should I submit a reconsideration request?
As for the listing URLs, in my original message I mistyped the proposed directory - I meant:
domain.com/rental/123456-123-Street-City-State-Zip

-
With a 50% decline, chances are you are being stalked by either a bird or a bear. Heck, they could be teaming up to chase you around.
My recommendation is that you do nothing to your site until someone has conducted a full audit, and it is key that the person conducting it knows what indicators to watch for in your site's history regarding Panda and Penguin.
I tend to see much more unnoticed Panda hate than unnoticed Penguin hate, and many people have told me they were hit by a bird when closer examination revealed the real culprit was a bamboo-loving bear.
-
It's really best not to change your URL structure around. If you really need to, then definitely make sure you have 301 redirects pointing from all of the old links to the new ones.
The permalink keywords in the middle don't carry as much weight as they used to. Using /home-rentals/ vs. /rentals/ won't immediately relate the pages to those keywords anymore. With that in mind, set your structure based on the different sections of your site so they don't conflict, rather than inserting keywords. For example: "domain.com/search/california/" doesn't conflict with "domain.com/category/california/"
I'd need to see your pages to give you a better response on the last question. With permalinks, it's always good to match your page title with the page's main keyword. So if the title is 123 Street Ave, then the link should be /slug/123-street-ave/, where the slug is whatever descriptive keyword fits that type of page - /search/, /category/, or no slug at all.
That doesn't answer your question about the SEO decline, though. Chances are you've been affected by the recent Penguin 2.0 update. I'd start by checking your links and seeing if any of those sites got hit. Also check your Webmaster Tools and see if any notices have popped up.