SEO Best practice for competitions
-
I am considering running a competition and wanted to get some feedback on SEO Best Practice.
We will have a unique competition URL; once the competition ends, it will be 301'd to the home page.
Every entrant will be given a unique URL for the competition to share; if someone enters using their URL, they get an extra ticket. This means we will create a large number of new unique URLs over a short period of time, but the pages will all have the same content. Is this potentially bad for duplicate content? Any advice? Perhaps a canonical tag on all unique competition entrant URLs?
Any other considerations?
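For reference, the canonical tag in question would go in the `<head>` of every entrant URL and point at the one shared competition page (the domain and paths here are illustrative, not the actual site):

```html
<!-- On each entrant page, e.g. https://example.com/competition/alice -->
<head>
  <link rel="canonical" href="https://example.com/competition/" />
</head>
```

This tells Google the entrant pages are duplicates of the main competition URL, consolidating signals there instead of spreading them across thousands of near-identical pages.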
-
Agreed, I'd approach this as a customer acquisition and brand awareness project as well, especially given that the website would have a short lifespan and SEO is a long-term investment.
If this were me, I'd optimize the website for the competition name and use social networks to build awareness (e.g. Facebook, Twitter, Pinterest, etc.).
-
Robert
I completely understand what you are trying to do. It can and will work. However, I'd suggest you look into your current backlink profile and see whether there is a diverse mix of natural links. Also, don't overdo any one kind of link-building tactic. This is not a 100% bait-and-switch, but I'll ask you this: if SEO did not exist, would you still do it? Think of this as a customer acquisition and brand awareness technique that also helps with SEO.
-
How about creating dynamic links that are rewritten to redirect to the one main page and just swap variables?
I.E.
Link shared = site.com/username --> which is actually site.com/user.php?u=username --> sets a cookie with "referral = username" --> redirects to the main "Registration" (or other) page --> check/read the cookie and apply variables as needed.
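A minimal framework-free Python sketch of that flow (the helper names, the registration URL, and the cookie lifetime are illustrative assumptions, not the actual implementation):

```python
# Hypothetical sketch of the referral-redirect flow (names and URLs are illustrative).
from urllib.parse import urlsplit, parse_qs

REGISTRATION_URL = "https://site.com/register"

def rewrite(shared_url):
    """Map the shared pretty URL site.com/<username> to the real handler URL."""
    username = urlsplit(shared_url).path.strip("/")
    return f"https://site.com/user.php?u={username}"

def handle(handler_url):
    """Read the username, set the referral cookie, and redirect to registration."""
    query = parse_qs(urlsplit(handler_url).query)
    username = query.get("u", [""])[0]
    headers = {
        # Remember who referred this visitor (7-day lifetime, illustrative)
        "Set-Cookie": f"referral={username}; Path=/; Max-Age=604800",
        # Send everyone to the single registration page -> no duplicate content
        "Location": REGISTRATION_URL,
    }
    return 302, headers
```

In practice the pretty-URL rewrite would live in an .htaccess/nginx rule and the cookie/redirect step in the user.php handler; the point is that every shared URL collapses into one real registration page, so there is only one page of content for Google to see.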
That would solve the duplicate content issue while keeping track of referrals.
Cheers,
Oleg
Related Questions
-
Which is the best way to see some immediate SEO benefits, and how long does it usually take to boost your DA?
I am on a mission to improve our DA and generally move up the ladder in the SERPs. Is there anything I could do straight away to attract the biggest gains? Thank you
Intermediate & Advanced SEO | Tobywith
-
When does it make sense to make a meta description longer than what's considered best practice?
I've seen all the length recommendations and understand the reasoning: descriptions get cut off in the search results. But I've also noticed that Google will "move" the meta description if the search term the user is using appears in the cached version of the page. So I have a case where Google is indexing the pages but not caching the content (at least not yet). We see the meta description just fine in the Google results, but we can't see the content cache when checking the Google cached version. **My question is:** In this case, why would it be a bad idea to write a slightly lengthier (but still relevant) meta description, with the intent that one of the terms in that description could match the user's search terms and the description would "move" to highlight that term in the results?
Intermediate & Advanced SEO | navidash
-
Link Anchor Text - Best Practice?
Moz - Open Site Explorer using the following setup:
Tab: "Inbound Links"
Show: "all"
From: "Only Internal"
I have run a number of random tests and have noticed the following results in the link anchor text:
[No Anchor Text]
company name
website url
Home
etc.
What is the best practice and naming convention to be used?
Regards, Mark
Intermediate & Advanced SEO | Mark_Ch
-
Best practice to avoid cannibalization of internal pages
Hi everyone, I need help from the best SEO guys regarding a common issue: the cannibalization of internal pages by each other. Here is the case: let's say I run the website CasualGames.com. This website provides free games, as well as articles and general presentations about given categories of casual games. For instance, for the category "Sudoku Games", the structure will be:
Home page of the game: http://www.casualgames.com/sudoku/
Free sudoku game listings (around 100 games listed): http://www.casualgames.com/sudoku/free/
A particular sudoku game: http://www.casualgames.com/sudoku/free/game-1/
A news page about sudoku games: http://www.casualgames.com/sudoku/news/title
The problem is that these pages seem to "cannibalize" each other. Explanation: in the SERPs, for the keyword "Casual Games", the home page doesn't rank well, and some specific sudoku game pages (among the 100 games) rank better although they are sub-pages of the category. The same goes for the news pages: a few rank better than the category page. I am kind of lost. Any idea what would be the best practice in this situation? THANKS a LOT.
Guillaume
Intermediate & Advanced SEO | laboiteac
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Bad user experience
The search pages are (probably) stealing rankings from our real landing pages
Webmaster Notification: "Googlebot found an extremely high number of URLs on your site" with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I'm looking forward to your answer!
Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
-
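For reference, the tag being discussed looks like this on each internal search results page:

```html
<meta name="robots" content="noindex, follow">
```

One point in favour of the meta tag for already-indexed pages: a robots.txt Disallow stops Googlebot from recrawling those URLs, so it would never see the noindex instruction and the pages could linger in the index.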
XML Sitemaps for Message Boards / Forums - Best Practices?
I'm working with a message board that has been around for 10+ years and never taken SEO best practices into consideration. They recently started seeing mobile URLs show up in regular results, which they don't want. I'm recommending they implement multiple sitemaps to properly indicate to Google how to crawl the site and what to index. I've never dealt with a site this large, so I'm not sure of the best practices. They have a HUGE community and new URLs are created every second. Doing a site: search returns "About 12,100,000" URLs. What are some best practices / the best way to approach sitemaps for a site of this size?
Intermediate & Advanced SEO | MichaelWeisbaum
-
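At that scale the usual pattern is a sitemap index file pointing at many child sitemaps (the sitemaps.org protocol caps each child at 50,000 URLs; the domain and file names below are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://forum.example.com/sitemaps/threads-0001.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://forum.example.com/sitemaps/threads-0002.xml</loc>
  </sitemap>
  <!-- ...one child sitemap per batch of up to 50,000 thread URLs... -->
</sitemapindex>
```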
Best Practice for Inter-Linking to CCTLD Brand Domains
Team, I am wondering what people recommend as best SEO practice for inter-linking to language-specific brand domains, e.g.:
amazon.com
amazon.de
amazon.fr
amazon.it
Currently I have 18 CCTLDs for one brand in different languages (no DC). I am linking from each content page to each other language domain, providing a link to the equivalent content in a separate language on a different CCTLD domain. However, with Google's discouragement of site-wide links, I am reviewing this practice. I am tending towards making the language redirects on each page JavaScript-driven and starting to link only from my home page to the other pages with optimized link titles. Does anyone have any thoughts/opinions on this topic they are open to sharing?
/Thomas
Intermediate & Advanced SEO | tomypro
-
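One alternative to visible site-wide links worth knowing about: hreflang annotations in the `<head>` of each page, which tell Google about the language equivalents without any user-facing links (the paths below are illustrative):

```html
<link rel="alternate" hreflang="en" href="https://www.amazon.com/some-page" />
<link rel="alternate" hreflang="de" href="https://www.amazon.de/some-page" />
<link rel="alternate" hreflang="fr" href="https://www.amazon.fr/some-page" />
<link rel="alternate" hreflang="it" href="https://www.amazon.it/some-page" />
```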
Server cache and SEO
I have a question about server caches and SEO. For example, take www.chanel.com.cn: the server is in the US, and the site uses China Cache to improve access speed for local Chinese users. Do you think this will work for search engine spiders too? When a spider is crawling the website, does it crawl the content from the US server or from the China cache? What's the best practice for this kind of server-side SEO? Thanks, Boson
Intermediate & Advanced SEO | topchinaseo
-