Best practice to have gated white paper indexed by Google
-
Our main website's white paper page has an image and brief description of each white paper. Once you click a white paper, you are redirected to a form to access the gated content. Once you complete that form, you are redirected to the white paper PDF, which is hosted on a subdomain (HubSpot).
Because of this, I do not believe our website is getting "credit" for the keywords/content on these pages. Any suggestions on how we can allow the search engines to crawl this content while still keeping it gated? As I understand it, a subdomain cannot help or hurt the main domain (aside from critical crawler issues).
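One pattern often suggested for this kind of setup (an assumption on my part, not something confirmed in this thread) is to keep an indexable HTML summary of each white paper on the main domain and tell crawlers not to index the raw PDFs on the HubSpot subdomain, so whatever ranking signals exist consolidate on the main-domain pages. A hypothetical Apache config for the PDF subdomain (assuming Apache with mod_headers enabled) might look like:

```apache
# Hypothetical sketch for the PDF subdomain: keep the gated PDFs out of
# the index so the main-domain landing pages that describe them collect
# the ranking credit. Requires mod_headers.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

The same `X-Robots-Tag: noindex` header can be set by most CDNs and hosts; whether it is the right trade-off depends on whether you want the PDFs themselves to rank at all.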
Thank you
-
I think I have the same question. I was just going to ask it less intelligently. Our gated content always takes the shape of a PDF form on Hubspot.
What are the best practices for getting some SEO benefits from the Case Studies and white papers that we gate?
(In case anyone is interested, here's a landing page where you can see what I mean: https://www.cockroachlabs.com/case-studies/mux/)
Thanks everyone!
-
A similar question: we have informational, content-rich brochures in PDF form on HubSpot. HubSpot does not allow PDFs to be added to the sitemap. Is there another way to get these indexed and searchable?
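If HubSpot won't include the PDFs in its own sitemap, one workaround (a sketch, not HubSpot-specific advice) is to generate a small supplemental sitemap by hand and submit it separately in Google Search Console; the sitemap protocol allows PDF URLs like any other URL. The URLs below are hypothetical placeholders:

```python
# Minimal sketch: build a supplemental sitemap listing PDF URLs so it can
# be submitted to Google Search Console separately from the main sitemap.
# The URLs passed in below are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_pdf_sitemap(pdf_urls):
    """Return a sitemap.xml document (as a string) listing the given URLs."""
    ET.register_namespace("", NS)  # emit the sitemap namespace as the default
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url in pdf_urls:
        url_el = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url_el, f"{{{NS}}}loc").text = url
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

if __name__ == "__main__":
    print(build_pdf_sitemap([
        "https://cdn.example.com/brochures/product-overview.pdf",
    ]))
```

The resulting file just needs to be reachable on a domain you have verified in Search Console.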
Related Questions
-
Google is showing erroneous results on SERPs page
Hello, all. In April, two months ago, we caught a hack on a client's website. It created about 40 pages in what looked to be a black-hat link tactic. We removed the pages, resubmitted the sitemap.xml (it reprocessed), and ran the site through Screaming Frog to confirm all the pages were gone, but the forty pages still show up in the search results for a site search. We have both the www and non-www versions of the site claimed and have set a preference. Nothing is awry with the robots.txt. We're not really sure what to do to resolve it. We asked Google to recrawl (fetch) the site. I'm not sure what's going on with it. The website's name is fortisitsolutions.com. The site search bringing up the pages from the hack is: site:www.fortisitsolutions.com Any ideas?
On-Page Optimization | Cazarin-Interactive
Should I block google indexing "search.php"
My question is: I have a search page on our website where you can search by date, number of people staying, and so on. I am just wondering, should I block this in robots.txt? We already have pretty URLs for searching by county and by town. I cannot see any benefit of having e.g. "search/search.php?sp_dateFrom=16%2F12%2F2015&sp_dateTo=23%2F12%2F2015&sec_drop%5B%5D=727&spesh_town_id=764&q=&occupants=5&bedrooms=3&submit=SEARCH#search" indexed. Would I be correct in doing this?
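For the situation described above, a minimal robots.txt sketch (assuming the search script really does live under /search/ and the pretty county/town URLs live on other paths) would be:

```text
# Hypothetical robots.txt sketch: stop crawlers fetching the
# parameterized internal search results while leaving the
# pretty county/town URLs open to crawling.
User-agent: *
Disallow: /search/search.php
```

One caveat worth knowing: Disallow blocks crawling, not indexing, so any of these URLs already in the index may linger for a while before dropping out.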
On-Page Optimization | McCaldin
Tags or Categories: Which is best?
I seem to have duplicate-content issues on my website due to tags and categories having the same title. If I were to delete one of the two, which would be better to delete? Thanks!
On-Page Optimization | IoanSaid
Tags: to index or to noindex, that is the question
Sorry for the silly Shakespearean introduction, but what are people's opinions on leaving tag clouds indexable or non-indexable by the search engines? I tend to leave them non-indexable at the moment. I use WordPress for blogging, as do most others.
On-Page Optimization | onlinemediadirect
Getting pages indexed by Google
Hi SEOmoz, I relaunched a site back in February of this year (www.uniquip.com) with about 1 million URLs. Right now I'm seeing that Google is not going past 110k indexed URLs (based on sitemaps). Do you have any tips on what I can do to make the site more likeable to Google and get more URLs indexed? All the part pages can be browsed to by going to: http://www.uniquip.com/product-line-card/suppliers/sw-a/p-1 I've tried to make the content as unique as possible by adding random testimonials and random "related part numbers"; see here: http://www.uniquip.com/id/246172/electronic-components/infineon/microcontrollers-mcu/sabc161pilfca Do I need to wait longer and be more patient with Google? It just seems like I'm only getting a few thousand URLs per day at the most. Would it help if I implemented a breadcrumb on all part pages? Thanks, -Carlos
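For context on the sitemap side of a site this size (general sitemap-protocol facts, not advice specific to uniquip.com): each sitemap file is capped at 50,000 URLs, so a million-URL site needs a sitemap index pointing at multiple child sitemaps, each of which can then be monitored separately in Search Console. The file names below are hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index sketch: each child sitemap may hold up
     to 50,000 URLs, so ~1M URLs means at least 20 child files. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.uniquip.com/sitemap-parts-1.xml</loc></sitemap>
  <sitemap><loc>https://www.uniquip.com/sitemap-parts-2.xml</loc></sitemap>
  <!-- ...one entry per child sitemap... -->
</sitemapindex>
```

Splitting by section (e.g. one child sitemap per supplier) also makes it easier to see which parts of the site Google is skipping.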
On-Page Optimization | caneja
Should Google index a search result?
Hi, I have a website with about 1,000 articles. Each article has one or more keywords/tags, so I display these keywords on the article page and link each one to the internal search engine (like a tag cloud). The search engine lists all articles with the same keyword and creates a result page. This result page is indexed by Google. The search result contains the title of the article, a short description (150-300 chars.), and a link to the article. So Google believes that there are about 5,000 pages instead of 1,000 because of the links to the search result pages. The old rule for me was: more pages in Google = better. But is this still true nowadays? Would a "noindex, follow" be better on these search result pages? (Is there a way to tell Google that this is a search result page?) Best wishes, Georg.
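To answer just the mechanical part of the question above (how the tag is written, not whether it is the right call for this site): a "noindex, follow" is expressed with a robots meta tag in the head of each internal search result page, e.g.:

```html
<!-- Sketch: emitted in the <head> of internal search result pages so
     crawlers still follow the links through to the articles but drop
     the result page itself from the index. -->
<meta name="robots" content="noindex, follow">
```

There is no dedicated "this is a search result page" signal beyond this; noindex is the standard way to keep such pages out of the index while letting their links pass crawl paths.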
On-Page Optimization | GeorgFranz
How to get Google Images traffic?
How do you get Google Images traffic? Take a look at traxnyc.com and suggest what we can improve. Thanks in advance.
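Not advice specific to traxnyc.com, but the usual baseline for Google Images is descriptive file names, alt text, and explicit dimensions. A hypothetical markup sketch:

```html
<!-- Hypothetical sketch: descriptive file name and alt text give image
     search something to match on; width/height avoid layout shifts. -->
<img src="/images/14k-gold-diamond-pendant.jpg"
     alt="14k gold diamond pendant with rope chain"
     width="600" height="600">
```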
On-Page Optimization | DiamondJewelryEmpire