SEO Best practice for competitions
-
I am considering running a competition and wanted to get some feedback on SEO best practice.
We will have a unique competition URL; once the competition ends, it will be 301-redirected to the home page.
Every entrant will be given a unique URL for the competition to share, and if someone enters using their URL, they get an extra ticket. This means we will create a large number of new unique URLs over a short period of time, but the pages will all have the same content. Is this potentially bad in terms of duplicate content? Any advice? Perhaps a canonical tag on all unique competition entrant URLs?
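If the entrant pages do stay as separate crawlable URLs, the canonical tag the question suggests would have every variant point back at the one main competition page. A minimal sketch of what each entrant page would emit (the domain and paths here are hypothetical, not from the thread):

```python
def canonical_tag(competition_url: str) -> str:
    """Build the canonical link tag that every entrant page emits,
    pointing at the single main competition page so search engines
    consolidate the duplicate variants into one URL."""
    return f'<link rel="canonical" href="{competition_url}" />'

# site.com/alice, site.com/bob, ... all render the identical tag:
tag = canonical_tag("https://site.com/competition")
print(tag)  # <link rel="canonical" href="https://site.com/competition" />
```

The canonical is a hint rather than a directive, so a redirect-based approach (as suggested further down the thread) is the more robust option if the entrant pages don't need to exist at all.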
Any other considerations?
-
Agreed, I'd approach this as a customer acquisition and brand awareness project as well, especially given that the website would have a short life span and SEO is a long-term investment.
If this were me, I'd optimize the website for the competition name and use social networking to build awareness (e.g. Facebook, Twitter, Pinterest, etc.).
-
Robert
I completely understand what you are trying to do. It can and will work. However, I'd suggest you look into your current backlink profile and check whether there are diverse kinds of links, natural links. Also, don't overdo any one link building tactic. This is not a 100% bait and switch, but I'll ask you this: if SEO did not exist, would you still do it? Think of this as a customer acquisition and brand awareness technique that also helps your SEO.
-
How about creating dynamic links, rewritten so they all redirect to one main page and just swap variables?
For example:
Link shared = site.com/username --> which is actually site.com/user.php?u=username --> sets a cookie ("referral=username") --> redirects to the main "Registration" (or other) page --> that page checks/reads the cookie and applies the variables as needed.
That would solve the duplicate content issue while still keeping track of referrals.
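Oleg's flow could be sketched roughly like this, as two framework-agnostic helpers (the `/register` path, cookie name, and lifetime are illustrative assumptions, not from the thread):

```python
def handle_shared_link(path: str) -> dict:
    """Handle a shared entrant link like /alice: remember the referrer
    in a cookie and redirect to the single registration page, so no
    duplicate entrant pages are ever served to crawlers."""
    username = path.lstrip("/")
    return {
        "status": 302,
        "Location": "/register",
        # one-week cookie so the referral survives a delayed signup
        "Set-Cookie": f"referral={username}; Path=/; Max-Age=604800",
    }

def read_referrer(cookie_header: str):
    """On the registration page, read the referral cookie back so the
    extra ticket can be credited to the referring entrant."""
    for part in cookie_header.split(";"):
        name, _, value = part.strip().partition("=")
        if name == "referral":
            return value
    return None
```

Because every shared URL answers with a redirect rather than a rendered page, there is only one indexable destination, which sidesteps the duplicate content question entirely.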
Cheers,
Oleg