Query results being indexed and providing no value to real estate website - best course of action?
-
Hi friends, I have a real estate website with thousands of these kinds of query-result pages indexed: http://search.myrealestatewebsite.com/l/43453/New_York_City_Rentals?per=100&start=159
What would be the best course of action to ensure those do not get indexed, as most provide no value whatsoever?
1. I'm limited in what I can do in the IDX, but I believe I can modify the URL parameters for the website in Webmaster Tools? Is that correct? What would my parameter settings look like?
2. I have a Webmaster Tools account for the main website and also one for the subdomain. Which one should I submit the URL parameters to, or both?
-
Ideally, you'd set the meta robots tag on those pages to noindex,follow. This removes them from the index while still allowing link juice to flow from all of those pages to the pages in your main navigation.
If you cannot modify the <head> section of those pages, then, at a minimum, you could tell Webmaster Tools to ignore the per and start parameters (specify that each parameter merely sorts or paginates the content on the page). You'd then end up with just one page indexed per city, which is probably a lot better than where you are now.
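If the IDX template does expose the page head, the tag would look like this (a minimal sketch; the exact template hook depends on the IDX vendor):

```html
<!-- Inside the <head> of each query-results page:
     keeps the page out of the index but lets link equity flow -->
<meta name="robots" content="noindex, follow">
```

If the head is out of reach but server configuration is, the equivalent HTTP header is `X-Robots-Tag: noindex, follow`.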
Related Questions
-
Mixed branches / targeting countries results in the SERP
hi all, I have one hard nut to crack and I would like to kindly ask you for any ideas / help 🙂
Local Website Optimization | execom99
We have a website localized into multiple languages, targeting different countries. For example:
domain.com/int/ ... the English default; in Search Console, International Targeting / Country has "Target users in" set to "Unlisted"
domain.com/uk/ ... English for the UK, "Target users in" set to "United Kingdom"
domain.com/de/ ... German content for Germany, "Target users in" set to "Germany"
... etc. Each country-specific branch has its own sitemap.xml covering approximately 50-60% of that branch's pages, and for the most important product pages we have set hreflang (rel="alternate") annotations. There are some issues in the sitemaps that we are fixing (e.g. "no return tags" errors), so my assumption is that Google may not be using the sitemaps, and therefore the hreflang annotations (which are part of the sitemaps) are not being applied. For example, one branch can show 150 pages submitted and only 30 indexed in the sitemap detail of Search Console. The problem is that when I search for a product name from Germany (google.de with a German IP through a VPN, in the browser's incognito mode), I receive mixed results. Our product names and technology terms are rather English, e.g. "cloud protection", and that is also the phrase a German user would search for. But in the SERP the user gets results from our domain.com/uk and even from our domain.com/sg, which is completely wrong. Is there a way to really prevent this?
thanks
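For reference, hreflang annotations in a sitemap must be fully reciprocal: every URL in the group has to list every other variant (plus itself), or Search Console reports the "no return tags" error mentioned above. A minimal sketch using the domains from the question and a hypothetical /some-product/ path:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://domain.com/de/some-product</loc>
    <!-- Every variant, including this page itself, plus an x-default -->
    <xhtml:link rel="alternate" hreflang="de-DE" href="https://domain.com/de/some-product"/>
    <xhtml:link rel="alternate" hreflang="en-GB" href="https://domain.com/uk/some-product"/>
    <xhtml:link rel="alternate" hreflang="x-default" href="https://domain.com/int/some-product"/>
  </url>
</urlset>
```

The /uk/ and /int/ sitemap entries would each carry the same three links; any one-sided entry invalidates the whole group for that page.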
Discourage search engines from indexing this site AFTER a site launch
Hi, I unticked "Discourage search engines from indexing this site" a few months before the initial release of my website. I don't want to be found by search engines until the official release (still a few months away). Do you think that ticking this box again will harm the website's long-term ranking or have any other repercussions? Do you have any additional advice for avoiding being temporarily ranked before the official release, without harming the website in the SERPs? Thanks for your answers.
Local Website Optimization | Juvo
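For what it's worth, that WordPress checkbox simply emits a robots meta tag on every page; roughly:

```html
<!-- What "Discourage search engines from indexing this site" adds to each page -->
<meta name="robots" content="noindex, nofollow">
```

Keeping it ticked until launch (or password-protecting the staging site, which is stricter) is the usual approach; once the tag is removed at launch, pages are re-crawled and can rank normally.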
Which is the best, ".xx" or ".com.xx" in general and for SEO?
Hi, I'm working for a digital marketing agency that gets traffic from different countries. We are planning to make a different website for each country. What is the best SEO practice for choosing between ".xx" and ".com.xx" domains for Spain, Mexico, Chile, Colombia and Peru?
Local Website Optimization | NachoRetta
I think the shorter ccTLD is always better, for example ".es" rather than ".com.es".
How best to clean up doorway pages: 301 them or noindex,follow?
Hi Mozzers, I have what are classed as doorway pages on my website. These have historically been location-specific landing pages for some of our categories, but from speaking to a number of different webmasters, the general consensus is that they are against Google's guidelines, so I may be getting punished for having them. My options are: I can 301 the pages back to their original category pages, which would conserve some link juice and pass it back to the respective category page. Or I can set them to noindex,follow; I'm not sure what happens to link value in that case. Which would be best? Some of the pages currently rank fairly well for some of the locations, so I am getting traffic from them, but I also suspect an algorithmic penalty for having them, so how best do I clean these up? Also, by cleaning up the site structure, would I see any benefit, or will I have to wait for a new Panda update/refresh? I thought the Panda refresh doesn't use a new dataset. Thanks, Pete
Local Website Optimization | PeteC12
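If the 301 route is chosen, a pattern-based rule can map all the location pages back to their category in one go. A hypothetical Apache sketch (the URL pattern is illustrative, not taken from the question):

```apache
# .htaccess: 301 every /services-in-<city>/ doorway page back to the category page
RewriteEngine On
RewriteRule ^services-in-[a-z-]+/?$ /services/ [R=301,L]
```

A 301 consolidates most of the link value at the category page, whereas noindex,follow leaves the pages live and crawlable but unindexed.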
Omitted results for city queries for the same brand on different subdomains?
I've noticed on a few occasions where two subdomains share the same brand and are also attempting to rank for phrases specific to one city - the stronger subdomain tends to send the other subdomain to the "omitted search results" for those city specific queries. The subdomains do tend to have some duplicate content that they share but if the two pages on the different subdomains are unique for the search phrase in question wouldn't Google choose to surface both results? Or is this a question of domain diversity in the SERPs where the 2 results would just be too similar since they share the same root domain and have topically similar content? I've seen cases where they can share the first page of results but more often than not it seems that one is sent to the "omitted results". Any thoughts on strategy in this situation? The companies being described end up wanting to rank for the same city because they both serve a portion of the city in case anyone is wondering.
Local Website Optimization | GSO
Best way to remove spammy landing pages?
Hey Mozzers, We recently took over a website for a new client of ours and discovered that their previous webmaster had been using a WordPress plugin to generate 5,000+ mostly duplicated local landing pages. The pages are set up more or less as "Best (service) provided in (city)" I checked Google Webmaster Tools and it looks like Google is ignoring most of these spammy pages already (about 30 pages out of nearly 6,000 are indexed), but it's not reporting any manual webspam actions. Should we just delete the landing pages all at once or phase them out a few (hundred) at a time? Even though the landing pages are mostly garbage, I worry that lopping off over 95% of a site's pages in one fell swoop could have other significant consequences. Thanks!
Local Website Optimization | BrianAlpert78
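If the decision is to delete them outright, returning 410 Gone (rather than a plain 404) tells Google the removal is deliberate and tends to drop the URLs from the index faster. A hypothetical Apache sketch, assuming the plugin used a "best-<service>-in-<city>" URL pattern (the pattern is illustrative):

```apache
# .htaccess: serve 410 Gone for the removed auto-generated landing pages
RewriteEngine On
RewriteRule ^best-[a-z-]+-in-[a-z-]+/?$ - [G,L]
```

Since only ~30 of the ~6,000 pages are indexed, removing them all at once is unlikely to have the impact that deleting 95% of an indexed site would.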
Rebranding a Website to a new Domain Name
Local Website Optimization | Mark_Ch
Hi All, I'm looking to rebrand my current website to a new domain name. In short, the current website has outgrown its potential. The domain name is not memorable, nor is it attracting a wider audience. I will create my new website and 301 redirect the old website to the new one, hence passing SEO value.
Google Places: Having spoken to Google, they tell me that I can simply change the URL in Google Places to the new URL.
Articles on my current website: I have a number of rich content articles on my current website; can I simply create my new website and copy & paste these previously written articles?
Google+, Twitter, Facebook, etc.: What should I do with the accounts associated with the current website?
Any other useful information would be much appreciated. Regards, Mark
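The 301 mentioned above is typically done host-wide, so every old URL maps 1:1 to its counterpart on the new domain. A sketch with placeholder domain names:

```apache
# .htaccess on the old domain: redirect every path to the new domain 1:1
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-brand\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-brand.com/$1 [R=301,L]
```

Path-preserving redirects (rather than pointing everything at the new homepage) are what carry the article-level SEO value across.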
International Site Geolocation Redirection (best way to redirect while still allowing Google's bots to index the site)
I have a client with an international website. The site currently does IP detection and redirects you to the subdomain for your country. They have so far only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/ Google is not indexing the Australian website or its pages; I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US 'coming soon' page is being properly indexed. So, I would like to know the best way to implement a geolocation redirect without creating a splash page for selecting a location. User-friendliness is most important (so we don't want cookies etc.). I have seen the great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me the best method for the redirection, except at about 10:20 where it tells me what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none give the best method, and some use slightly different examples... I need US visitors to see the US coming-soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots doesn't really help. Appreciate your answers. Cheers, Lincoln
Local Website Optimization | LincolnSmith
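One workable pattern for the situation above is to run the IP redirect only on the entry host (www) and never on au.domain.com itself, so crawlers and direct visitors can always reach the Australian pages; a "switch country?" banner replaces forced redirection there. A sketch for the www host, assuming Apache with mod_geoip (which exposes GEOIP_COUNTRY_CODE); the domains are from the question, everything else is illustrative:

```apache
# On www.domain.com only: send Australian visitors to the AU site.
# au.domain.com itself performs no IP redirect, so bots can crawl it freely.
# 302 (not 301) because the redirect is geographic and temporary.
RewriteEngine On
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^AU$
RewriteRule ^$ https://au.domain.com/ [R=302,L]
```

Googlebot crawls mostly from US IP addresses, so any forced geo redirect placed on au.domain.com itself would keep those pages out of the index, which matches the behaviour described above.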