Google Authorship: page versus post
-
I have a WordPress.com blog and have recently set up Google Authorship (at least I think I have).
If I add a new post, what happens to the old post in terms of authorship?
Is the solution opening a new page for each article?
If so, does the contributor link in Google+ pick up all pages if you only have the home link?
Many thanks
-
Hi Stefania!
Yes, you have correctly installed the code snippet for Google Authorship across your site using a footer widget, and linked to your blog from your personal Google+ profile. You can keep this setup as long as this remains a one-author blog. Google Webmaster Tools has a structured data testing tool that is very useful for determining if Authorship is working for any page. It shows that Authorship is working for your first blog post, and gives you a nice preview of what the search result will look like if Google decides to show your author photo with it - see http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fjijil.pasadena2shop.com%2F.
Remember that correctly setting up Google Authorship does not guarantee that your author photo will show in search results. Focusing on creating high-quality content on both your blog and Google+, and becoming an authority on the topics you write about, however, will increase your chances of standing out in search results with Google Authorship.
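For reference, the two-way link that makes Authorship work is a link from the site to the author's Google+ profile, plus a "Contributor to" link back from the profile. A sketch of the site-side markup (the profile ID below is a placeholder, not an actual profile):

```html
<!-- On the blog (e.g. in the footer widget): link to the author's Google+ profile.
     Replace the numeric ID with the real profile ID. -->
<a href="https://plus.google.com/112345678901234567890?rel=author">Stefania on Google+</a>

<!-- The reciprocal link is added on the Google+ profile itself, under
     "Contributor to" in the About section, pointing at the blog's URL. -->
```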
Please let me know if you have any questions, and best of luck with your new blog!
Christy
-
Hi Christy
The link to the blog is http://jijil.pasadena2shop.com/
Thanks in advance for any help you can give.
-
Hi Stefania, are you able to share the link to your blog?
Related Questions
-
Does Google make continued attempts to crawl an old page once it has followed a 301 to the new page?
I am curious about this for a couple of reasons. We have all dealt with a site that switched platforms, didn't plan properly, and now has thousands of crawl errors. Many of the developers I have talked to have stated very clearly that the .htaccess file should not be used for thousands of single redirects. I figured that if I only needed them there temporarily it wouldn't be an issue. I am curious: once Google follows a 301 from an old page to a new page, will it stop crawling the old page?
Intermediate & Advanced SEO | RossFruin
-
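On the .htaccess concern: if the old and new URLs follow a predictable pattern, thousands of one-off redirect lines can often be collapsed into a single rule, which sidesteps the bloat the developers are warning about. A hedged sketch (the path pattern here is hypothetical):

```apache
# Hypothetical: old URLs like /old-shop/widget-123.html map to /products/widget-123
RewriteEngine On
RewriteRule ^old-shop/(.+)\.html$ /products/$1 [R=301,L]
```

Only truly irregular one-to-one mappings need individual lines, and for very large sets those are usually better kept in a RewriteMap file or handled at the application level than inline in .htaccess.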
Redirecting thin-content city pages to the state page: 404s or 301s?
I have a large number of thin-content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    if (/* this city page should be removed */) {
        header("HTTP/1.0 404 Not Found");
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have little to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | rriot
-
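For comparison, if the 301 route is chosen, the snippet above would drop the 404 status and send a single permanent redirect instead; a minimal sketch (the removal check is hypothetical, as in the original):

```php
<?php
// Hypothetical check, as in the original snippet.
if ($city_page_was_removed) {
    // One 301 response: no 404 status mixed with a Location header.
    header("Location: http://example.com/state-level-page", true, 301);
    exit();
}
```

The third argument to header() sets the response code, so the status and the Location header are sent as one consistent response rather than a 404 body carrying a redirect.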
Why is Google ranking me higher for pages that aren't optimised for keywords than those that are?
I am finding that our homepage and other pages are being ranked higher for keywords that we have optimised other pages for. E.g. for the keyword "luxury towels", Google ranks our homepage http://www.towelsrus.co.uk at 20, and the page I am trying to rank for it, http://www.towelsrus.co.uk/sport-spa/luxury-towels/catlist_fnct498.htm, is nowhere to be seen. Why is this, and is this why our position for certain keywords fluctuates? How do I remedy this problem?
Intermediate & Advanced SEO | Towelsrus
-
Google+ Authorship for Multi-Author Company Blogs
Can a company's Google+ page be designated as the author of web content (as can be done with individuals) so that the COMPANY comes up as the author in the web results? Is it preferable for company bloggers to create individual Google+ profiles and be listed as the author of the posts that they write? Or rather is it a smarter move to create a company persona (under the guise of a real person) and have all authorship be attributed to that personal Google+ profile. AuthorRank is going to become more and more important to Google's algorithm. As bloggers write for a company, if they are listed as the author of the work, they create trust for their own personal brand. If and when this employee leaves, this equity is presumably taken with them instead of remaining with the company. Is this assumption correct? How are companies dealing with this potential issue?
Intermediate & Advanced SEO | AnthonyMangia
-
Blocking Pages via Robots.txt: Can Images on Those Pages Be Included in Image Search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like:

    User-agent: Googlebot
    Disallow: /community/photos/

Can I disallow Googlebot specifically, rather than using User-agent: *, so that Googlebot-Image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while still getting the images picked up... is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona
-
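On whether Googlebot-Image can still pick up the photos: Google's crawlers obey the most specific matching user-agent group in robots.txt, and Googlebot-Image only falls back to the Googlebot rules when it has no group of its own. So a sketch along these lines should keep the pages out of the main crawl while leaving the images reachable:

```
User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Allow: /community/photos/
```

Note that blocking the pages from Googlebot also hides the surrounding text (captions, alt context) that Image Search uses to understand the photos, so this is a trade-off rather than a free win.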
Why are new pages not being indexed, while old pages (now blocked in robots.txt) remain in the index?
I currently have a site that was recently restructured, causing much of its content to be reposted and creating new URLs for each page. To avoid duplicates, all of the existing pages were added to the robots.txt file. That said, it has now been over a week - I know Google has recrawled the site - and when I search for term X, it is still the old page that is ranking, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to the robots.txt a few months ago, yet I think they are all still appearing in the index. Anyone got any ideas about why this is happening, and how I can get my new pages indexed?
Intermediate & Advanced SEO | corp0803
-
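Worth noting for this scenario: pages blocked in robots.txt can no longer be recrawled, so Google tends to keep serving its last-known copy of them, which matches the behaviour described above. A noindex directive only works if the page stays crawlable, e.g.:

```html
<!-- The old page must NOT be blocked in robots.txt,
     or Googlebot never sees this directive. -->
<meta name="robots" content="noindex">
```

The alternative that both removes the old URL and passes its signals to the replacement is a 301 redirect from each old URL to its new one, with neither URL blocked in robots.txt.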
How to make Google forget my pages?
Hello all! I've decided to delete many pages from my website which had poor content. I've made a PHP 301 redirect from all these old pages to a single page (not the home page, a deep page). My problem is that this modification was made a week ago and my position in the SERPs has crashed... What can I do? I believe that I'll climb back up once Google sees that these pages don't exist anymore, but it could take a long time 😞 (these pages are in the Google cache with a date older than my modification date). I've read somewhere that I should put a link to the destination page (where the old pages are 301 redirected) but I don't understand how it could help... Can someone help me? Tell me what I've done wrong... These pages were very poor and I've deleted them in order to boost the overall quality of my site... It should help me in the SERPs, not penalize me...
Intermediate & Advanced SEO | B-CITY
-
Category Pages - Canonical, Robots.txt, Changing Page Attributes
A site has category pages as such: www.domain.com/category.html, www.domain.com/category-page2.html, etc. This is producing duplicate meta descriptions (page titles have page numbers in them, so they are not duplicates). Below are the options we've been thinking about:
a. Keep meta descriptions the same except for adding a page number (this would keep internal juice flowing to products listed on subsequent pages). All pages have unique product listings.
b. Use canonical tags on subsequent pages and point them back to the main category page.
c. Robots.txt on subsequent pages.
d. ?
Options b and c will orphan or french-fry some of our product pages. Any help on this would be much appreciated. Thank you.
Intermediate & Advanced SEO | Troyville
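On option (b): a canonical tag pointing a subsequent page back to the main category page would look like the snippet below (using the question's hypothetical URLs). Note that it tells Google to treat page 2 as a duplicate of page 1, which is what risks leaving the products listed only on later pages with weaker internal discovery:

```html
<!-- Placed in the <head> of www.domain.com/category-page2.html -->
<link rel="canonical" href="http://www.domain.com/category.html">
```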