Problem with printer-friendly version.
-
For one of our clients' sites, most of the backlinks point to the printer-friendly version of a page. I recommended that he use a canonical tag on the printer-friendly version, pointing to the main page.
Luckily, while searching I came across this post at http://www.seomoz.org/q/solving-printer-friendly-version
The solution recommended was this -
<link type="text/css" rel="stylesheet" media="print" href="our-print-version.css">
My questions are -
1. What should I write in place of our-print-version.css? Should it be print.css?
2. Where do I place this code? In which file?
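For reference, a minimal sketch of how such a print stylesheet is typically wired up. The file name print.css and the selectors below are placeholders for illustration, not taken from the thread:

```html
<!-- In the <head> of the page: -->
<link type="text/css" rel="stylesheet" media="print" href="print.css">

<!-- print.css would contain rules like the following, which hide
     screen-only elements and simplify styling when printing: -->
<style media="print">
  nav, .sidebar, .comments { display: none; }
  body { font: 12pt serif; color: #000; background: #fff; }
</style>
```

Because of the media="print" attribute, browsers apply these rules only when the page is printed (or print-previewed); on screen the page is unaffected.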
-
Correct.
Often a site will refer to numerous CSS files. There are tools which will combine multiple CSS files into a single file and properly compress the files to optimize them for page speed.
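As a sketch of what that combining step looks like in the markup (file names here are hypothetical):

```html
<!-- Before: three separate stylesheet requests -->
<link rel="stylesheet" href="reset.css">
<link rel="stylesheet" href="layout.css">
<link rel="stylesheet" href="typography.css">

<!-- After combining and minifying into one file: -->
<link rel="stylesheet" href="site.min.css">

<!-- The print stylesheet can stay separate; its media="print"
     attribute keeps it from blocking on-screen rendering: -->
<link rel="stylesheet" media="print" href="print.css">
```

Fewer stylesheet requests generally means faster page loads, which is the Page Speed benefit being described.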
-
Thanks once again for the clarification.
"The only question is whether changes need to be made to optimize the code from an SEO or Page Speed perspective."
You mean to say the CSS code should reside in an external file and be linked from the page, to minimize on-page code?
-
Do I need to 301 the printer-friendly page?
No. Your site's visitors need to access the printer-friendly page. If you add a 301, no one will be able to view the print-friendly page.
I should also clarify: if your site currently offers a print-friendly page and it works, then your programmer has already taken care of the issue from a website functionality perspective. The only question is whether changes need to be made to optimize the code from an SEO or Page Speed perspective.
-
"It would need to be accessible and declared on the printer friendly version page"
That's what I was looking for. I will ask the designer to declare this file in the printer-friendly version page. So, the solution will be:
We place this code in the printer-friendly version page:
<link type="text/css" rel="stylesheet" media="print" href="print.css">
print.css will contain the CSS code for the print-format pages, and print.css will be a separate file.
Do I need to 301 the printer-friendly page?
-
How the CSS is presented is up to your web designer. It could be part of the site's main CSS, or in a separate file. It would need to be accessible and declared on the printer-friendly version page.
As part of speed optimizations, all CSS files may be condensed into a single file.
-
Thanks a lot Ryan.
"CSS declarations are made in the <head> of your HTML document"
I was not sure; that's why I asked. Should this declaration be made in the printer-friendly version page?
-
Hi Atul.
I looked at the Q&A response link you offered. I will try to offer some clarifications:
Where do I place this code? In which file?
CSS declarations are made in the <head> of your HTML document.
What should I write in place of our-print-version.css?
The name of the file that contains the CSS code for your print-format pages.
For one of our clients' sites, most of the backlinks point to the printer-friendly version of a page. I recommended that he use a canonical tag on the printer-friendly version, pointing to the main page.
Your recommendation is sound, and I agree with it.
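For completeness, the canonical-tag recommendation endorsed here would look something like this on the printer-friendly page (the URLs below are placeholders):

```html
<!-- In the <head> of the printer-friendly page,
     e.g. http://www.example.com/article-print.html: -->
<link rel="canonical" href="http://www.example.com/article.html">
```

This tells search engines to consolidate the backlinks pointing at the printer-friendly URL onto the main version of the page, while visitors can still reach and print the printer-friendly page normally.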