Getting Authority Social Blogs to Index
-
We have a few authority blogs that I manage to help increase our brand awareness and build authority for our website. We have Blogspot, WordPress, Tumblr, and Typepad. Our content gets a summary syndicated to our authority blogs with an attribution link back to the original post. I also manually check them once a month to make sure everything looks good and the content syndicated correctly. I even add unique content to these blogs once in a while. I recently realized that the majority of the pages are not indexing. I added the blogs to our GSC and Bing Webmaster Tools accounts and submitted the sitemaps. This was done on December 11th; as of now, some pages have indexed in Google, and Bing says the sitemaps are still pending...
- Blogspot - 32/100 pages indexed
- WordPress - 34/81 pages indexed
- Tumblr - 4/223 pages indexed
- Typepad - 3/63 pages indexed
Can anyone help me figure out why I can't get Google to index more pages, or why Bing isn't processing the sitemaps in a timely manner?
-
Thank you!!!! I've printed out your responses and am applying your suggestions to my 2018 strategy. I've been using similar methods, but this really breaks it down and gives me what I need to make an actual organized game plan going forward.
-
"What tool or strategy did you use to determine link prospects?"
BuzzStream is a really good tool; for me it's a really good CRM for keeping my link prospects in order, but it's not even close to being a decent "link-prospect generator." Please don't get me wrong, BuzzStream is a nice tool and I use it regularly to organize my link prospects, but I don't generate them with it; I just use it to follow up on them.
In order, these are the better tools for that:
- Bing
Then you have
- Semrush
- Majestic
- Ahrefs
There is no magic tool, at least not one that I know of. I use the APIs for all my tools (Semrush, Majestic, Ahrefs, and so on) to collect data, then organize it and repeat the process over and over again. At the beginning it looks like a chaotic process, but once you do it over and over you start to recognize the patterns. It is a repetitive, tedious, and time-consuming process, which is why I created my own script.
And based on my experience, the best SEOs do the same (they create their own framework). In fact, this is how Moz was born; it started as the SEOmoz agency.
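To make that concrete, here is a minimal sketch of what such a collection script might look like in Python. The endpoint, parameters, and CSV response format are placeholders made up for illustration; Moz, Semrush, Majestic, and Ahrefs each have their own real APIs with different auth schemes and response shapes.

```python
import csv
import io

import requests

# Hypothetical endpoint and key, for illustration only -- each tool
# (Moz, Semrush, Majestic, Ahrefs) has its own real API with
# different auth, URLs, and response formats.
API_URL = "https://api.example-seo-tool.com/v1/backlinks"
API_KEY = "YOUR_API_KEY"


def fetch_backlinks(target_url):
    """Pull backlink rows for one target URL and return them as dicts."""
    response = requests.get(
        API_URL,
        params={"key": API_KEY, "target": target_url, "limit": 1000},
        timeout=30,
    )
    response.raise_for_status()
    # Assume the endpoint returns CSV, like most SEO-tool exports do.
    return list(csv.DictReader(io.StringIO(response.text)))


def collect(targets, out_path="backlinks.csv"):
    """Run every target through the API and merge the rows into one CSV."""
    rows = []
    for url in targets:
        rows.extend(fetch_backlinks(url))
    if rows:
        with open(out_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)


if __name__ == "__main__":
    collect(["https://www.ssa.gov/news/cola/"])
```

The point is the loop, not the endpoint: pull the same data for every target, normalize it into one table, and rerun it on a schedule, which is what makes the pattern-spotting possible.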
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Hi Lindsay, I'm glad I was useful and brought something positive.
In my case, I use Moz, Semrush, Majestic, Ahrefs, and Raven Tools. All of them are really good tools.
How do you determine how many links you will need to rank on the first page?
Well, in that case you have two options: the manual way, which is hard and slow but very accurate, and the easy, fast way. In my case I use the second for quick research like yours.
THE MANUAL WAY
With Moz
1. Select the keyword, in this case, we will use social security increase
2. Go to Moz Pro > Keyword Explorer > SERP Analysis
3. See full analysis and Export CSV
4. In that case, you will have the first 10 results for that specific keyword
5. Moz will give you these numbers:
Monthly Volume 1.7k-2.9k
Difficulty 62
Organic CTR 61%
Priority 61
6. Take every URL and run an audit with Open Site Explorer
In this case, the first result will be https://www.ssa.gov/news/cola/
Domain Authority 94
Page Authority 78
It has 120 Root Domains
It has 462 Total Links
Make a deep analysis:
Link Source
- External Links
- Internal Links
Link Type
- Follow
- No Follow
As an example
- Target > this page
- Link Source > Only External
- Link Type > Only Follow
Repeat the process over and over again until you get the job done
You can use Excel to collect the data, or you can download the CSV.
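As an illustration, here is a short pandas sketch that merges a folder of those CSV exports and applies the External/Follow filter described above. The column names are assumptions; check them against the actual export headers.

```python
import glob

import pandas as pd

# Merge every exported CSV in the exports/ folder into one table.
frames = [pd.read_csv(path) for path in glob.glob("exports/*.csv")]
links = pd.concat(frames, ignore_index=True)

# Column names are assumptions -- verify against the real export headers.
# Keep only external, followed links, mirroring the filter settings above.
external_follow = links[
    (links["Link Source"] == "External") & (links["Link Type"] == "Follow")
]

# Count unique linking root domains per target page.
summary = external_follow.groupby("Target URL")["Linking Root Domain"].nunique()
print(summary.sort_values(ascending=False))
```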
With Semrush
1. Select the keyword, in this case, we will use social security increase
2. Go to Semrush > Keyword Analytics > Organic Search Results > Export
3. Go to Semrush > Keyword Analytics > Keyword Difficulty Tool
Difficulty 90.72
Volume 590
4. Once you have downloaded all the URLs on Semrush (Top 10)
5. Analyze every one with Semrush
6. Go to Semrush > Domain Analytics and again collect the data in Excel
With those numbers, you will have the answer to your question.
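One hedged way to read "the answer" off those numbers is to look at the spread of referring domains across the current top 10 and aim to match at least the weaker half of it. A small sketch, assuming a top10.csv export with a referring_domains column (both names are illustrative):

```python
import csv
import statistics

# Assumes a top10.csv with one row per ranking URL and a
# "referring_domains" column -- both names are illustrative.
with open("top10.csv", newline="") as f:
    counts = sorted(int(row["referring_domains"]) for row in csv.DictReader(f))

# A rough target: match the weaker half of the current top 10.
print("Minimum referring domains in top 10:", counts[0])
print("Median referring domains in top 10:", statistics.median(counts))
```

Numbers like Ahrefs' "~32 websites" come from that tool's own model, so treat any estimate like this as a floor, not a guarantee.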
Keep in mind all those top-10 pages are big websites, so you will not beat them in this world or any other world, or even in any other dimension.
But you can use Moz, Semrush, and Long Tail Pro to find some interesting long-tail keywords that are easy to rank for, if you do your homework and rewrite the content to be as memorable as you can. (I'm not a copywriter, so I have someone on my team for that, but based on my experience a good article can cost you $20.)
Find 10 or 20 keywords, focus on them, create outstanding content around those keywords, find link prospects, and try to reach out to them. At the end of the day you will have a sustainable SEO strategy (long-term SEO), not something where you pull a trick today and are gone from the search results tomorrow.
NOTE: I run these tasks as an automated process (using the APIs from Moz, Semrush, Majestic, etc.).
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Yes, very nice work, Roman, thank you! I really appreciate your research and well-thought-out response.
Using your example...
I don't have Ahrefs; we use SEMrush. Pretty sure they have the same features overall. I also use Long Tail Pro, Moz, Majestic, etc.
How did you determine this---> "You'll need backlinks from ~32 websites to rank in top 10 for this keyword"
Also, what tool or strategy did you use to determine link prospects? Were these the backlinks of those ranking for the keyword? We have BuzzStream; it's a great tool for link prospecting as well.
Regarding adding lists, infographics, statistics, etc.: that's on my Q1 to-do list for sure. We just hired an in-house developer/designer who's going to help me with this.
Thank you again!
-
Nice work, Roman.
What a generous and informative reply.
-
EGOL is right, so I want to add some value for you from my point of view (a personal opinion based on my experience).
This is what I would do in your case
- Forget your blogs
- Analyze the articles on your main website's blog
- Get some useful keywords for those posts
- Build a link-prospect list for your posts
- 20 good links pointing to a single article can give you more traffic than your entire network combined.
Let's take this article from your site as an example:
https://www.medicarefaq.com/blog/social-security-benefit-increase-announced-2018/
Let's take social security increase as the main keyword and see some numbers from Ahrefs:
- Volume 4,300
- Traffic potential 31,000
- Keyword difficulty 27/100
- You'll need backlinks from ~32 websites to rank in top 10 for this keyword
So with this information in mind, I would create a list of link prospects.
I did 10 minutes of research and got 150 link prospects with this Blog Ranking Criteria (there is a rough filtering sketch at the end of this answer):
- NOT A PBN
- Domain Authority
- Non-spun Content
- Unique IPs
- Free or Paid
- Type Of Link Given
- Indexed in Google
These are some examples of what I found:
findhealthtips.com > PA 31
diabeticlifestyle.com > PA 38
bhma.org > PA 39
Another good tip is to rewrite the article and turn it into the most memorable article related to _social security increase_, with lists, infographics, and statistics, maybe some videos of relevant people talking about the topic, and internal and external links to related content.
I'm pretty sure that if you do that, it will give you more brand awareness, mentions, authority, and traffic than your whole content network.
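As promised above, here is a rough pandas sketch of how that checklist might be applied in bulk. The prospects.csv file and its column names are hypothetical stand-ins for whatever your tools export, and the PA threshold of 30 is a judgment call, not a rule.

```python
import pandas as pd

# Hypothetical prospects.csv whose columns mirror the checklist above;
# the file and column names are illustrative, not from any specific tool.
prospects = pd.read_csv("prospects.csv")

passed = prospects[
    ~prospects["is_pbn"]                    # NOT A PBN
    & (prospects["page_authority"] >= 30)   # authority floor (judgment call)
    & ~prospects["spun_content"]            # Non-spun Content
    & prospects["indexed_in_google"]        # Indexed in Google
].drop_duplicates(subset="ip_address")      # Unique IPs

passed.to_csv("prospects_filtered.csv", index=False)
print(f"{len(passed)} prospects passed the checklist")
```

Deduplicating on IP address is the cheap way to enforce the unique-IPs criterion; spotting PBNs and spun content still takes manual review, so treat those columns as the output of that review rather than something a tool hands you.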
-
"Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index."
I can see your point on this... and obviously that's the case with mine since only the content that's unique seems to be staying indexed.
"Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain."
Yes, I usually only focus on our main domain. Constantly looking for high authority links and guest blog opportunities.
-
Great answer!
-
"It's not considered duplicate or thin content because you're giving an attribution link back to the original content."
Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index.
"Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?"
Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain.
-
The strategy applied here is from SerpSpace syndication networks. From my experience, this strategy works well. It's not considered duplicate or thin content because you're giving an attribution link back to the original content. The Blogspot pages that have been indexed have shown a positive effect on the SERP rankings of the pages they link to within a day of indexing.
"And, the blogspot site will not pass any linkvalue to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com."
Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?
-
I visited a few of the post pages on the blogspot site. These pages appear to be simply the first few sentences from the medicarefaq.com website. I don't think that Google will like these pages because they are: A) signposts for medicarefaq.com, B) duplicate content of medicarefaq.com, and C) thin content.
For the blogspot site to be an asset, the content needs to be unique and substantive. And, the blogspot site will not pass any link value to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com.
If these were my websites, I would put all of my time into medicarefaq.com and stop working on sites that merely link to it.