Strategies for revising my duplicate content?
-
New to SEO and SEOmoz. I tried searching for this first and I'm sure it's on here but I could not find it.
I have a site that markets fishing charters in a few dozen cities. Up to now I was relying on PPC and using each city page as a landing page of sorts. Each city page is very similar (there are only so many ways to write about a type of fish or fishing). What would be the recommended way to optimize this, keeping in mind that the duplicate information we provide on each page seems to be important to people?
Site is www.vipfishingcharters.com
Thanks!
-
No, because I'm guessing each page has a title tag specific to the location; if you merge two pages together you'll end up with one title tag rather than two.
Personally, I would have a page that lists all the locations, called locations.php, then link to each city from that page. If you build a few links to the city pages, you might find they get crawled more often.
So you'd have:
vipfishingcharters.com/locations/boca-raton-fishing-charters/
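Not part of the original answer, but to illustrate the URL pattern above, here is a minimal Python sketch that builds location URLs in that format from a city list. The city names and the slug rules are assumptions for illustration only:

```python
import re

def city_slug(city, activity="fishing-charters"):
    """Build a slug like 'boca-raton-fishing-charters' from a city name."""
    base = re.sub(r"[^a-z0-9]+", "-", city.lower()).strip("-")
    return f"{base}-{activity}"

# hypothetical city list for the charter site
cities = ["Boca Raton", "Fort Lauderdale", "West Palm Beach"]
urls = [f"/locations/{city_slug(c)}/" for c in cities]
print(urls[0])  # /locations/boca-raton-fishing-charters/
```

A locations page would then simply loop over that list and link each URL, which also gives every city page a crawlable internal link.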
-
Wow, thanks for all this feedback, guys. Yes, I sensed that the best solution is to write original content, but it's true: there really are only a finite number of ways to explain what a sailfish is. I guess I can reword, but how does Google's comparison engine work?
Assuming I could rewrite all the text, would it help to create separate file names for the images on each page (I am not going to have 40 different pictures of sailfish)? Does this matter at all?
Searches for my site nearly always include a city name along with them so my original logic was to create a "landing page" for every city. Right now nearly 90% of my traffic is CPC but I'm trying to change that.
Would it make more sense to consolidate cities and group them on a smaller number of pages or does this then kill the localized search?
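On the image-filename question above: if the same master photos are reused across city pages, one option is to generate descriptive, city-specific filenames at build time. A small Python sketch, assuming hypothetical file names and paths:

```python
def localized_image_name(subject, city, ext=".jpg"):
    """Turn 'sailfish' + 'Boca Raton' into 'sailfish-boca-raton.jpg'."""
    city_part = "-".join(city.lower().split())
    return f"{subject}-{city_part}{ext}"

# One master photo could then be copied per city, e.g.:
# shutil.copy("img/sailfish.jpg", "img/" + localized_image_name("sailfish", "Boca Raton"))
print(localized_image_name("sailfish", "Boca Raton"))  # sailfish-boca-raton.jpg
```

Whether renamed copies of identical images help rankings at all is debatable; the filename and alt text matter mainly for image search, so this is a minor signal next to unique page copy.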
-
Hi Noah!
If there is to be a page for each location then there should be something useful to say about each location too.
Consider what would actually be useful to users specifically searching for the page about their area.
As well as the content about the activities you offer (which, as previous answerers have said, should be unique for each page and show you to be the 'fishing guy') I would suggest:
- Information about how to get to each location
- Local places to stay if people are to travel for the fishing activities
- Anything that makes that particular city unique/interesting in terms of the experience you offer
That type of information that is specific about each location will be unique to each landing page.
As previous answerers have also suggested, customer testimonials and other forms of user generated content would be great.
-
Nice... great idea. Get enthusiastic customers to share, and to send a photo if they caught a nice one. They will probably link to it from their Facebook page if their photo is on a fishing boat with a monster fish.
-
Great suggestions, William.
If you want to be the "fishing guy" you have to earn it.
-
You could start by collecting customer testimonials: have a page with the location they fished and their comment, and collect them there. Each one will be unique, can be tied to the beach they fished, and has the added benefit of instilling trust in potential new business.
Send the customer an email the day after the fishing trip with a link to the form.
That's just one way; I'm sure there are others.
-
First off, you have a very nice website, and I am also an avid angler. I agree with EGOL that you can build unique content for different cities. For one of my sites I promote the bass industry, and I can target areas via lake location in different metros. Do you have only one launch point for your service, or many? Another thing you could do is partner with tackle shops within your target areas, then reference them in your article along with the metro, making that page specific to the location you are targeting.
Or you could develop a contest for the best fishing photo in that metro. All in all, there are many ideas on how you can get listed for these types of metro keywords related to your industry.
One person you could learn from is Wil Reynolds, as he spoke about this stuff in one of his seminars: http://www.youtube.com/watch?feature=player_detailpage&v=hSQ0DZdSDMI#t=368s
You can also reference his latest blog post on SEOmoz:
http://www.seomoz.org/blog/never-worry-about-an-algorithm-update-again-a-history
All very good stuff, and welcome to SEOmoz; there are a ton of experts in here to help.
Hope this gave you some ideas and got your gears turning.
-
Each city page is very similar (there are only so many ways to write about a type of fish or fishing). What would be the recommended way for optimizing this, keeping in mind the duplicate information we provide on each page seems to be important to people.
I am going to be straight and honest.
Write unique content for each city. Just do it.
(there are only so many ways to write about a type of fish or fishing)
I don't believe this... get to writing. If you want to be the "fishing guy" you should be able to do this no sweat. It just takes time.
I know you will not like my answer. But you know I am right.