Duplicate content issue with dynamically generated URLs
-
Hi,
For those who followed my previous question, I have a similar one regarding dynamically generated URLs.
From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. Six results are presented per page, and the user can then go to the next page.
I know I should probably rewrite URLs such as this one: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean=
but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, since there are already pages for each individual listing.
What is my solution for this? Nofollow these pages? Block them through robots.txt?
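(For reference, a robots.txt rule blocking these parameterized result URLs might look like the sketch below; Googlebot treats Disallow values as URL prefixes, so this pattern is an assumption based on how the filter URLs above are built.)

```
# Sketch only: block crawling of listing.html whenever it carries
# a query string (filters/pagination), while leaving the bare
# /listing.html page crawlable
User-agent: *
Disallow: /listing.html?
```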
-
When I looked at your site, changing the criteria changed the listings on the page, so each page was unique. However, I'm guessing 100% of the listings can be accessed by just clicking through the pages of results without changing the criteria?
If you decide the best approach is to block the different versions of the search results pages, I would consider using the rel=canonical link tag to specify the canonical (main) version of the page.
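As a sketch of that approach (assuming you want every filtered and paginated variant to consolidate to the main listing page, which is a judgment call), each results page would carry a canonical link element in its <head>:

```html
<!-- Served on listing.html?pageNo=1&selType=...&selCity=... etc. -->
<head>
  <link rel="canonical" href="http://www.selectcaribbean.com/listing.html" />
</head>
```

Note that unlike a robots.txt block, this still lets Google crawl the variants; it only hints which URL should be indexed.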
Related Questions
-
Duplicated Content on WordPress Mobile & Desktop Versions: Is It Bad for SEO?
Hello, I use a WordPress theme that displays mobile and desktop versions separately. The problem is that if you use tools like Screaming Frog, or if you look at the code (view source), you can detect duplicated content. But if you're browsing on mobile you will only see the content I created for the mobile version, and likewise on desktop you will only see the desktop content. Is this creating an SEO problem? If it is, please let me know why and whether it has a solution. Thanks in advance.
On-Page Optimization | AlphaRoadside
Content Mismatch
Hi, I've added my app to Search Console, and it reports 480 content mismatch pages. How can I solve this problem?
On-Page Optimization | Silviu
Duplicate Page Content: What to Do?
Hello guys, I have some duplicate pages detected by Moz. Most of the URLs come from a user registration process, so they all look like this: www.exemple.com/user/login?destination=node/125%23comment-form What should I do? Add this to robots.txt? If so, how? What's the command to add in Google Webmaster Tools? Thanks in advance! Pedro Pereira
On-Page Optimization | Kalitenko2014
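(If robots.txt is the route chosen here, a minimal sketch, assuming all of these registration URLs live under /user/login, would be:

```
# Sketch: block crawling of login/registration URLs
User-agent: *
Disallow: /user/login
```

There is no "command" to add in Google Webmaster Tools as such; robots.txt lives at the site root, though the tool can test the rules for you.)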
Duplicate Page Content
Hi, I am new to the Moz Pro community. I got the message below for many of my pages. We have a video site, so all content on each page except the video link would be different. How can I handle such pages? Can we place AdSense ads on these pages?
Duplicate Page Content: Code and content on this page looks similar or identical to code and content on other pages on your site. Search engines may not know which pages are best to include in their index and rankings. Common fixes for this issue include 301 redirects, using the rel=canonical tag, and using the parameter handling tool in Google Webmaster Central. For more information on duplicate content, visit http://moz.com/learn/seo/duplicate-content.
Please help me understand how to handle this. Regards
On-Page Optimization | Nettv
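(To illustrate the 301-redirect fix mentioned in that warning, an Apache .htaccess rule, with hypothetical paths, might look like:

```apache
# Hypothetical paths: permanently redirect a duplicate URL
# to the preferred version of the page
Redirect 301 /videos/duplicate-page.html /videos/preferred-page.html
```

For video pages that must remain separate URLs, the rel=canonical tag is usually the better fit than a redirect.)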
Similar URLs
I'm making a site of LSAT explanations. The content is very meaningful for LSAT students; I'm less sure the URLs and headings are meaningful to Google. I'll give you an example. Here are the URLs and headings for two separate pages:
http://lsathacks.com/explanations/lsat-69/logical-reasoning-1/q-10/ (LSAT 69, Logical Reasoning I, Q 10)
http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10/ (LSAT 69, Logical Reasoning II, Q10)
There are two logical reasoning sections on LSAT 69. The first URL is for question 10 from Section 1; the second URL is for question 10 from the second LR section. I noticed that google.com only displays 23 URLs when I search "site:http://lsathacks.com". A couple of days ago it displayed over 120 (i.e. the entire site).
1. Am I hurting myself with this structure, even if it makes sense for users?
2. What could I do to avoid it?
I'll eventually have thousands of pages of explanations. They'll all be very similar in terms of how a human would categorize them, e.g. "LSAT 52, logic games question 12". I should note that the content of each page is very different, but the URL, title, and h1 are similar.
Edit: I could, for example, add a random keyword to differentiate titles and URLs (but not the H1). For example:
http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10-car-efficiency/ (LSAT 69, Logical Reasoning II, Q 10, Car efficiency)
But the URL is already fairly long as is. Would that be a good idea?
On-Page Optimization | graemeblake
How do you avoid duplicate content when you sell products produced by other manufacturers?
I have a packaging product site, and they sell products from various manufacturers. What can we do with the product detail pages? As of now, the client has copy-pasted content straight from the "About" sections on the manufacturers' sites. Obviously, those manufacturers want my client to sell their products, and the products need to be described. How much of a no-no is this copy-pasting, and how can I fix it?
On-Page Optimization | lhc67
I have one page on my site... but still get duplicate name and content errors.
I have only the index.html page. My domain has a permanent 301 to the root. Why am I getting duplicate problems? I only have one page, the index.html.
On-Page Optimization | one4u2see
Does a 301 generate organic content?
I manage the domain name www.jordanhundley.com. Right now it is 301-redirected to www.jordanhundley.net, where I hosted the content for almost 18 months. At this point you can only see the redirect script if you view the source (Ctrl+U) on the .com domain. Does Google read the content beyond the script? Is the 301 website getting juice from the targeted domain? This is the script I'm using:

<html>
<head>
<title>Jordan Hundley</title>
</head>
<frameset rows="100%,*" border="0">
<frame src="http://www.jordanhundley.net" frameborder="0" />
</frameset>
<noframes></noframes>
</html>

On-Page Optimization | mPloria