rel="author", Google+, and your picture in article page SERPs
-
Hello,
Could someone explain the easiest way to use Google+ and rel="author" to claim the articles written by us and get our picture beside them in the Google SERPs?
site: nlpca(dot)com
-
Here's a great SEOMoz post on the subject:
http://www.seomoz.org/blog/authorship-google-plus-link-building
And if you're using a Wordpress CMS: http://www.devonwebdesigners.com/3278/relauthor-step-by-step-for-wordpress/
Best of luck, I'm in the process of implementing this myself. And as others have noted, you can test whether it's working through GWT (Google Webmaster Tools).
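For reference, the basic markup is just a link from each article to your Google+ profile carrying rel="author" (the profile URL below is a placeholder; use your own numeric profile ID):

```html
<!-- Byline link in the article pointing to the author's Google+ profile.
     The numeric profile ID here is a placeholder, not a real profile. -->
<a rel="author" href="https://plus.google.com/000000000000000000000">
  +Author Name
</a>
```

The links have to point at each other (article to profile via rel="author", profile back to the site via the "Contributor to" section on Google+) before the photo can appear in the SERPs.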
-
I have implemented this on my Grass Roots SEO site, which seems to be working, just by using the following code:
<a rel="author" href="https://plus.google.com/107417726658928760708">+Martin Evans</a>
My Google+ profile also shows my site as a site I contribute to.
More info can be found at http://en.forums.wordpress.com/topic/google-publisher-code-with-badge-and-1-button?replies=1#post-790448 if you plan to do this on a WordPress site.
-
Here's a great walk-through; I just did it on our site and it worked well.
http://www.virante.com/blog/2012/01/08/how-to-show-your-author-photo-in-google-search-results/
Just make sure to use Google's rich snippet testing tool to check your work before submitting it through Google's microdata submission form.
Also, here's Google's official walk-through: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1408986
Related Questions
-
Paginated category pages still showing in Google
Despite our blog using rel="next" and rel="prev", we're still finding paginated pages getting impressions in Google, suggesting they are taking up unnecessary crawl budget. An example is: https://www.theukdomain.uk/seo/page/2/ What steps would you recommend I take to most benefit my site's SEO? Thanks, Sam
Intermediate & Advanced SEO | sjefferies
-
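For reference, the pagination markup under discussion sits in the <head> of each paginated page; a minimal sketch for the middle page of a three-page series (URLs are illustrative) looks like:

```html
<!-- On page 2 of a paginated category: declare the previous
     and next URLs in the series -->
<link rel="prev" href="https://www.example.com/seo/page/1/">
<link rel="next" href="https://www.example.com/seo/page/3/">
```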
How long will old pages stay in Google's cache? We have a new site that is two months old, but we are seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later we are still seeing old pages in Google's cache index. So how long should I tell the client it will take for them all to be removed from search?
Intermediate & Advanced SEO | Liamis
-
Google blocks certain articles on my website!
Hello, I have a website with more than 350 unique articles. Most of them are crawled by Google without a problem, but I've found that certain articles are never indexed by Google. I tried rewriting them, adding fresh images, and optimizing them, but it got me nowhere. Lately, I rewrote one of those articles and tried Fetch and Render in Google Webmaster Tools; can you tell me if there is anything to do to fix that?
Intermediate & Advanced SEO | Evcindex
-
Article page canonicalization
Hey there, A client rents all kinds of party articles, like plates, bowls, etc. Currently, all his article pages have canonicals pointing to their parent category pages, supposedly to have any page value flow to these category pages (which are much more relevant for SEO). Does anyone agree with this method? I think noindex,follow would be a better measure to prevent Google from indexing all these 'low value' article pages. Besides, a canonical should indicate that page A and page B are (almost) identical, which they most certainly are not in this case. What are your thoughts?
Intermediate & Advanced SEO | Adriaan.Multiply
-
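For reference, the two approaches being weighed up look roughly like this on an article page (URLs are illustrative):

```html
<!-- Current setup: canonical from the article page to its
     parent category page -->
<link rel="canonical" href="https://www.example.com/category/plates/">

<!-- Suggested alternative: keep the article page out of the index
     while still letting crawlers follow its links -->
<meta name="robots" content="noindex,follow">
```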
Prioritise a page in Google / why is a well-optimised page not ranking
Hello, I'm new to Moz Forums and was wondering if anyone out there could help with a query. My client has an ecommerce site selling a range of pet products, most of which have multiple items in the range for different size animals, i.e.
[Product name] for small dog
[Product name] for medium dog
[Product name] for large dog
[Product name] for extra large dog
I've got some really great rankings (top 3) for many keyword searches such as
'[product name] for dogs'
'[product name]'
But these rankings are for individual product pages, meaning the user is taken to a small dog product page when they might have a large dog, or vice versa. I felt it would be better for the users (and for conversions and bounce rates) if there was a group page which showed all products in the range, which I could target with the keywords '[product name]' and '[product name] for dogs'. The page would link through to the individual product pages. I created some group pages in autumn last year to trial this and, although they are well-optimised (score of 98 on Moz's optimisation tool), they are not ranking well. They are indexed, but way down the SERPs. The same group page format has been used for the PPC campaign and the difference to the retention/conversion of visitors is significant. Why are my group pages not ranking? Is it because my client's site already has good rankings for the target term and Google does not want to show another page of the site and muddy results? Is there a way to prioritise the group page in Google's eyes? Or bring it to Google's attention? Any suggestions/advice welcome. Thanks in advance, Laura
Intermediate & Advanced SEO | LauraSorrelle
-
Rel canonical for HTTP and HTTPS pages
My website has a login that has HTTPS pages. If visitors don't log in, they are given an HTTP page that is similar, but slightly different. Should I use a rel canonical for these similar pages, and how should that be set up: HTTP to the HTTPS version, or the other way around? Thank you, Joey
Intermediate & Advanced SEO | JoeyGedgaud
-
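For reference, one common setup (assuming the HTTPS version is the one you want indexed; the URL is illustrative) is to put the same canonical tag in the <head> of both the HTTP and HTTPS versions:

```html
<!-- Same tag on both the http:// and https:// versions of the page,
     so Google consolidates signals onto the HTTPS URL -->
<link rel="canonical" href="https://www.example.com/members/">
```

Note that a canonical assumes the two pages are near-duplicates; if the logged-out HTTP page differs substantially, Google may treat it as a separate page regardless.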
Why is Google Ranking the Umbrella Category Page when Searching for Sub-Categories Within that Umbrella Category?
I have an e-commerce client who sells shoes. There is a main page for "Kids" shoes, and then right under it on the top-navigation bar there is a link to "Boys Shoes" and "Girls Shoes." All 3 of these links are on the same level - 1 click off the home page. (And linked to from every page on the website via the top nav bar). All 3 are perfectly optimized for their targeted term. However, when you search for "boys shoes" or "girls shoes" + the brand, the "Kids" page is the one that shows up in the #1 position. There are sitelinks beneath the listing pointing to "Girls" and "Boys." All the other results in Google are resellers of the "brand + girls" or "brand + boys" shoes. So our listing is the only one that's "brand + kids shoes." Our "boys" shoes page and "girls" shoes page don't even rank on the 1st page for "brand + boys shoes" or "brand + girls shoes." The only real difference is that "kids shoes" contains both girls and boys shoes on the page, and then "boys" obviously contains boys' shoes only, "girls" contains girls' shoes only. So in that sense there is more content on the "kids" page. So my question is - WHY is the kids page outranking the boys/girls page? How can we make the boys/girls pages be the ones that show up when people specifically search for boys/girls shoes?
Intermediate & Advanced SEO | FPD_NYC
-
I have removed over 2000+ pages but Google still says I have 3000+ pages indexed
Good afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided that he would make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I have set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still have over 3k indexed pages (originally I had 3700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering how quickly, in your opinion, you think all these changes will impact my SERP rankings? Look forward to your responses!
Intermediate & Advanced SEO | apogeecorp