Will a PDF Pass PageRank?
-
I created a PDF - will it pass PageRank?
-
Adding more...
The buy buttons from some shopping carts will work in .pdf documents. So if you write one with a parts list, you can place a buy button beside each part to make it really easy for the reader to purchase.
Also, type your domain name into the .pdf. That way, if people print it and later want to get back to your website, you might get a navigational query. Typing the URL where the document can be found might do the same thing.
-
Very interesting discussion indeed. I wonder if "creating a workpath" for text would make it easier to read. But I guess an OCR is going to have a difficult time associating the actual text with the link regardless.
-
We make our links very obvious. I would not try to hide them because I want them clicked (it is hard to monetize a .pdf but easy to monetize an .html page - so I want the visitor to get onto my .html pages).
You can lock .pdf documents so that they cannot be edited. Then other webmasters are free to post them on their own domain and give me backlinks. Of course, they could rewrite their own, just as any other content can be spun or rewritten.
-
I'll help you by adding one.
... And a thumbs up well spent.
-
What an interesting topic. Has anyone done any testing on the effectiveness of PDF vs. HTML resources, and whether Google treats anchor text in PDFs the same way?
-
Thanks a ton EGOL, I have been looking around for more info on this subject for quite a while. What are your thoughts on how to create the links? Is it considered a black-hat tactic to place invisible links in those PDFs? My thinking here is that I know competitors will start stealing our PDF documents to use on their own websites. I was thinking of placing invisible links on some key phrases that point to product pages, so that when competitors upload our PDFs to their sites, we get backlinks from their websites. Does that make sense, and does it seem like a viable strategy or a potentially penalizing one?
-
Thank you. I really like this subject and enjoyed preparing that answer!
-
Wow, way to give an absolutely excellent answer. I wish I could give more than 1 thumbs up!
-
Links in .pdf documents will be displayed in your Google Webmaster Tools backlinks report, they will accumulate PageRank (I have some PR6 .pdf documents), and they will pass PageRank.
It is a good idea to place links into .pdf documents that you give away on the web not only for pagerank reasons but also to give users an easy link to visit your site for more information. Think about usability when you create .pdf documents in the same ways that you think about usability for your website.
Also, if you complete the "properties" attributes of .pdf documents you can give them a title that will appear in the SERPs like a title tag on an .html webpage. I get lots of traffic from the SERPs that come straight into my .pdf documents and then click a link in the document that takes them to a relevant page on my website.
Finally... in addition to .pdf documents, you can also get viable backlinks and clickthroughs from .ppt (PowerPoint), .xls (Excel), and other types of files. Consider allowing other webmasters to include them on their site. That way they can bring you links from other domains.
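The two things the answer above recommends, a clickable link inside the document and a /Title in its properties, both live in the PDF's object structure. As a minimal sketch of where they sit, here is a hand-built one-page PDF with a link annotation and an info-dictionary title. The URL, title, and text are placeholders; in practice you would let a PDF library or your authoring tool write these for you.

```python
def build_pdf(title: str, url: str, link_text: str) -> bytes:
    """Build a minimal one-page PDF with a URI link and a /Title."""
    # Content stream: draw the visible link text near the top of the page.
    content = f"BT /F1 12 Tf 72 720 Td ({link_text}) Tj ET".encode()

    objects = [
        # 1: document catalog
        b"<< /Type /Catalog /Pages 2 0 R >>",
        # 2: page tree with a single page
        b"<< /Type /Pages /Kids [3 0 R] /Count 1 >>",
        # 3: the page, carrying the link annotation
        b"<< /Type /Page /Parent 2 0 R /MediaBox [0 0 612 792] "
        b"/Resources << /Font << /F1 4 0 R >> >> "
        b"/Contents 5 0 R /Annots [6 0 R] >>",
        # 4: a standard base font (no embedding needed)
        b"<< /Type /Font /Subtype /Type1 /BaseFont /Helvetica >>",
        # 5: the content stream drawing the link text
        b"<< /Length %d >>\nstream\n%s\nendstream" % (len(content), content),
        # 6: the link annotation - the part crawlers can read as a hyperlink
        b"<< /Type /Annot /Subtype /Link /Rect [72 710 300 730] "
        b"/Border [0 0 0] /A << /S /URI /URI (%s) >> >>" % (url.encode(),),
        # 7: info dictionary - /Title can surface like a title tag in SERPs
        b"<< /Title (%s) >>" % (title.encode(),),
    ]

    out = bytearray(b"%PDF-1.4\n")
    offsets = []
    for i, body in enumerate(objects, start=1):
        offsets.append(len(out))  # byte offset of "i 0 obj" for the xref
        out += b"%d 0 obj\n%s\nendobj\n" % (i, body)

    # Cross-reference table: one 20-byte entry per object.
    xref_pos = len(out)
    out += b"xref\n0 %d\n" % (len(objects) + 1)
    out += b"0000000000 65535 f \n"
    for off in offsets:
        out += b"%010d 00000 n \n" % off
    out += (b"trailer\n<< /Size %d /Root 1 0 R /Info 7 0 R >>\n"
            b"startxref\n%d\n%%%%EOF\n" % (len(objects) + 1, xref_pos))
    return bytes(out)

# Hypothetical usage - names and URL are made-up examples:
pdf_bytes = build_pdf("Acme Widget Parts List",
                      "https://www.example.com/widgets",
                      "More info: www.example.com")
```

The point of the sketch is simply that the link (object 6) and the title (object 7) are plain, declared structures, which is why crawlers can pick them up just as they do an `<a href>` and a `<title>` tag in HTML.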
-
From what I have seen it's a little unclear, and it would largely depend on how you create the PDF. Provided your PDF was created with real text (and is not made up of a bunch of images), then, if PDFs are crawled, at least the text stands a chance of ranking in the first place. (You can restrict a Google search by file type, including PDF, so one would assume they rank in their own right if tagged and stored as text.)
Whether a PDF will pass rank or not? I would suggest it should - provided it actually ranks itself in the first place (sounds obvious, I know).
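The text-vs-images distinction above can be eyeballed with a crude raw-bytes heuristic: a text-based PDF declares /Font resources and uses BT...ET text blocks, while a scanned one is mostly /Image XObjects. This is only a hedged sketch - compressed object streams hide these tokens, so a negative result means "inspect further with a real parser", not proof the file is image-only.

```python
def looks_text_based(pdf_bytes: bytes) -> bool:
    """Rough check that a PDF's raw bytes suggest real, crawlable text.

    Text-based PDFs declare /Font resources and draw text inside
    BT...ET blocks. Scanned PDFs are mostly /Image XObjects. Tokens
    inside compressed object streams are invisible to this check.
    """
    return b"/Font" in pdf_bytes and b"BT" in pdf_bytes

# Hypothetical usage:
# with open("whitepaper.pdf", "rb") as fh:
#     print(looks_text_based(fh.read()))
```

If this comes back False for a document you exported yourself, the exporter probably rasterized the pages, and the text will not be available for crawling or for passing anchor text.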
-
Depending on how it's uploaded onto the page, the page can still build links to it and gain in authority and trust. The content of the PDF likely can't be crawled by the engines, though.