Where to learn how best to promote content?
-
So I've created some really good content (with help from Egol and Peter here on moz.com), and now I need to promote it: get it in front of authoritative sites so that they will hopefully write about it and link to it.
I erroneously figured it would be fairly easy. I contacted two writers at a high-level industry blog/magazine that had previously mentioned us in the press, and sent them an email inviting them to check it out and let me know what they thought. No response. They probably get deluged.
Anyway, I can't afford to pay a marketing company to promote it. Where can I learn how best to do this myself? The content isn't going to help anyone if no one sees it...
Thanks for any leads!
Ron
-
Hi Ron,
A good tactic is to find out where people are already talking about, or asking questions about, the topics your content answers. A good way to do this is to use Google to find forums and online communities discussing the very area your content relates to. Pointing people in those forums and communities to your content can raise brand awareness and get people reading and sharing it (provided it is genuinely high quality or answers their questions).
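If you want to systematize that forum hunt, you can script the query strings using Google's standard advanced search operators. A minimal sketch, assuming a hypothetical topic; the keyword list and patterns below are illustrative examples, not a definitive set:

```python
# Build Google queries for finding forums and communities that
# discuss a given topic, using common advanced-search operators.
# The topic keywords below are hypothetical examples.

TOPIC_KEYWORDS = ["vintage fountain pens", "pen restoration"]

# Each pattern targets a different kind of community page.
QUERY_PATTERNS = [
    '"{kw}" inurl:forum',             # URLs containing "forum"
    '"{kw}" intitle:forum',           # page titles containing "forum"
    '"{kw}" site:reddit.com',         # subreddit discussions
    '"{kw}" "powered by vbulletin"',  # classic forum-software footer text
]

def build_queries(keywords):
    """Return one Google query string per keyword/pattern pair."""
    return [p.format(kw=kw) for kw in keywords for p in QUERY_PATTERNS]

if __name__ == "__main__":
    for query in build_queries(TOPIC_KEYWORDS):
        print(query)
```

Paste each generated query into Google and note which communities recur across results; those are usually the ones worth joining and contributing to.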
Another great way to promote content is through social media. Doing this well usually requires a strong following and regular, quality content updates, but when you do it well, it is one of the most effective platforms for content promotion.
All the best,
Gav.
-
Thanks so much. You have a second career as a content marketing consultant! Heck, I'd hire you!
-
This site does sell a physical product.
The homepage has three vertical columns. The right column is entirely content: how to use the products, how to select them, product comparisons, how to repair them, history, printable instructions, and videos. The first link in the top navigation goes to a huge page of free information, the last link goes to a blog, and the next-to-last link goes to a YouTube channel.
Make your site the "go-to place" for your product niche and you will get a lot of traffic, receive fewer questions by email, and kill your competitors in the long tail.
I run AdSense on every page too, but block my hard-core competitors' ads.
-
Hi Egol
That's an interesting angle. It seems that putting these links on the homepage, where your retail shoppers see and click them, would hinder product sales conversions, since it could distract them from their shopping purpose. Does your site sell a physical product?
-
Here's what I do with new content:
- Publish it on my website.
- Promote it on my homepage and on relevant pages on my website.
Usually, I do nothing beyond that. The page initially ranks deep in the SERPs but starts climbing over time. Slowly the traffic grows; a few of my visitors share it, and that brings in more visitors. Within several months the page is visible in relevant SERPs and pulling in increasing amounts of traffic.
Rarely, I will contact a blogger or two who regularly point at interesting content.
Usually, I write additional articles that are closely related but compete for entirely different keywords. These start pulling in traffic, and as the collection grows, all of the pages rise in the SERPs.
Most people don't have the patience for that. But instead of spending time on promotion, I am preparing another page of content. After doing this for a few years, I have a lot of traffic and a site where most pages I publish rank well within a few months.