Which address do I use for citations?
-
Hello,
When I created my Google Places listing, I entered my full address, but once the listing was activated I noticed that Google Places was displaying a shortened, abbreviated version of it. So when it comes to creating citations for my listing, should I use the address Google Places generated or the long version of my address? I've heard that when building citations, the address needs to be identical across the board. I hope this makes sense.
Thanks!
-
My pleasure!
-
Awesome! Thank you so much!
-
Hi Stewart,
So, in your hypothetical example, the only difference is W vs. West. This is the type of thing that I don't believe matters at all to Google. It is my opinion that they know that W = West, so I would not be concerned about citation consistency in this regard. To be on the safe side in building new citations, you can choose to build them with the 'W', as Google typically uses this abbreviation, but if you have old citations that use 'West', I wouldn't be worried about this. What would be a problem is if you were missing a suite number or something like that. But small differences like this one do not appear to make any difference. Hope this helps!
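To illustrate the W-vs-West point, here is a minimal Python sketch of the kind of normalization that makes the two forms equivalent. The abbreviation map and function are hypothetical examples for illustration only, not Google's actual matching logic.

```python
# Minimal sketch (not Google's actual logic): both address forms collapse
# to the same normalized string, which is why "W" vs. "West" rarely matters.
import re

# Hypothetical, partial abbreviation map for illustration only.
ABBREVIATIONS = {
    "west": "w", "east": "e", "north": "n", "south": "s",
    "street": "st", "avenue": "ave", "suite": "ste",
}

def normalize_address(address: str) -> str:
    """Lowercase, strip punctuation, and collapse common abbreviations."""
    words = re.sub(r"[^\w\s]", "", address.lower()).split()
    return " ".join(ABBREVIATIONS.get(word, word) for word in words)

print(normalize_address("123 West Fake Street, Fakeville, CA 12345"))
# -> 123 w fake st fakeville ca 12345
print(normalize_address("123 W Fake St Fakeville, CA 12345"))
# -> 123 w fake st fakeville ca 12345
```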
-
Hi Miriam,
So when I created my Google Places listing I entered the full address, e.g. 123 West Fake Street, Fakeville, CA 12345, and once it was activated the Google listing started showing it as 123 W Fake St Fakeville, CA 12345. So my question is: when it comes to creating citations, which address should I use, the full address or the abbreviated version? Hope that makes sense.
-
Hi Stewart,
Are you able to be more specific? I want to be sure I'm understanding exactly what you mean by Google abbreviating your address. It could be nothing to worry about or something concerning, depending on what you are describing. Thanks!
-
Can you elaborate on this "Google Places < Google+ Page"?
-
The short answer? Yes. Grab the same address Google Places is using. On a side note, you have created your Google+ page, haven't you? It took me the longest time to realize that Google Places < Google+ Page.
http://moz.com/blog/finding-and-building-citations-like-an-agency
Hope this helps.
Related Questions
-
SEO implications when changing the business function while using the same domain name
Hi everyone, I'd like some advice on the following situation, please: if a client wishes to start a new business (an advertising agency) using a domain name that they previously used for a completely different business (selling hats & t-shirts), is there anything you can do to "clean" the domain so that the previous indexing of the domain does not affect the new business and gives the client a fresh start? Any help or advice would be greatly appreciated. Regards and thanks.
Technical SEO | | Republica0 -
How do I use robots.txt to block areas on a page?
Hi, Across the category/product pages on our site there are archive/shipping info sections, and the text is always the same. Would this be treated as duplicate content and be harmful for SEO? How can I alter robots.txt to tell Google not to crawl that particular text? Thanks for any advice!
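One thing to keep in mind: robots.txt rules apply to whole URLs, not to sections of text within a page. As a minimal sketch, assuming the repeated archive/shipping text lived at its own URL paths (the paths below are hypothetical), the rules could be tested with Python's standard library like this:

```python
# Sketch: robots.txt blocks whole URLs, not parts of a page.
# The Disallow paths below are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /shipping-info/
Disallow: /archive/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/shipping-info/"))       # False
print(parser.can_fetch("*", "https://example.com/product/blue-widget"))  # True
```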
Technical SEO | | LauraHT0 -
Does Google Still Pass Anchor Text for Multiple Links to the Same Page When Using a Hashtag? What About Indexation?
Both of these seem a little counter-intuitive to me, so I want to make sure I'm on the same page. I'm wondering if I need to add #s to my internal links when the page I'm linking to is already: a.) in the site's navigation b.) in the sidebar. More specifically, in your experience, do the search engines only give credit to (or mostly give credit to) the anchor text used in the navigation and ignore the anchor text used in the body of the article? I've also found a couple of folks in here mentioning that content after a hashtagged link isn't indexed. Just so I understand this: a.) if I were to use a hashtag at the end of a link as the first link in the body of a page, does that mean the rest of the article won't be indexed? b.) if I use a table of contents at the top of a page and link to places within the document, will only the areas of the page above the table of contents be indexed/crawled? Thanks ahead of time! I really appreciate the help.
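For context on the second part of the question, the fragment (everything after the #) is handled client-side and is not sent to the server, so fragment links generally resolve to the same document URL. A quick illustration with Python's standard library; the URLs are hypothetical:

```python
# Fragments (#section) are stripped client-side, so these links all point
# at the same document URL. The URLs below are hypothetical.
from urllib.parse import urldefrag

links = [
    "https://example.com/guide",
    "https://example.com/guide#pricing",
    "https://example.com/guide#faq",
]

for link in links:
    url, fragment = urldefrag(link)
    print(url, "| fragment:", fragment or "(none)")
# All three print the same base URL: https://example.com/guide
```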
Technical SEO | | Spencer_LuminInteractive0 -
How do I address low-quality/duplicate content issues for a job portal?
Hi, I want to optimize my job portal for maximum search traffic.
Problems: (1) Duplicate content: the portal takes jobs from other portals/blogs and posts them on our site, and sometimes employers provide the same job posting to multiple portals and we are not allowed to change it, resulting in duplicate content. (2) Empty content pages: we have a lot of pages which can be reached by filtering for multiple options, like IT jobs in New York; if there are no IT jobs posted in New York, the result is a blank page with little or no content. (3) Repeated content: each job listing page includes information about the company, so if a company has 1,000 jobs listed with us, 1,000 pages carry exactly the same "about the company" wording.
Solutions implemented: we have rel=prev/next for pagination, plus self-referencing canonical tags on each page. Even when pages are filtered with additional parameters, our system strips off the parameters and outputs the correct URL for both the rel=prev/next tags and the self-referencing canonical (a sketch of this appears below). For duplicate content, given the volume of job listings that come in each day, it's impossible to create unique content for each one. We try to make the initial paragraph (at least 130 characters) unique; however, we use a template system for each job, so a similar pattern can be detected after even 10 or 15 jobs. Sometimes we also take wordy job descriptions and convert them into bullet points; if bullet points are already available, we take only a few and re-shuffle them.
Can anyone provide additional pointers to improve my site in terms of on-page or technical SEO? Any help would be much appreciated. We are also thinking of no-indexing or deleting old jobs once they cross X number of days. Do you think this would be a smart strategy? Should I no-index the empty listing pages as well? Thank you.
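As a small illustration of the "strip the parameters for the self-referencing canonical" step described above, here is a minimal Python sketch; the URL and parameter names are hypothetical, not the portal's actual implementation.

```python
# Minimal sketch: build a self-referencing canonical URL by dropping
# filter/tracking parameters from the requested URL (parameter names are
# hypothetical examples).
from urllib.parse import urlsplit, urlunsplit

def canonical_url(requested_url: str) -> str:
    """Return the URL with the query string and fragment stripped."""
    parts = urlsplit(requested_url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

url = "https://example.com/jobs/new-york/it?sort=date&page=2"
print(canonical_url(url))  # https://example.com/jobs/new-york/it
# The canonical tag would then be rendered as:
# <link rel="canonical" href="https://example.com/jobs/new-york/it">
```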
Technical SEO | | jombay3 -
Is the SEOmoz on-page factor "Appropriate Use of Rel Canonical" working properly?
I have a WordPress site with a rel canonical plugin. The rel="canonical" href attribute is there, and the URL in it works and goes to the actual page. So why does SEOmoz keep giving the warning: Appropriate Use of Rel Canonical?
Technical SEO | | CurtCarroll0 -
Is it possible to use anything besides a 302 redirect when you're redirecting someone to log in?
Hopefully this makes sense. I am working on a site that uses a 302 redirect for logins, as in it goes from a profile page to the login page via a redirect. Most of the time I see sites use a meta refresh for this, but in this case I wasn't sure. Obviously when I run a crawl diagnostic I'm getting a lot of errors, as in over 100. I know there is no link juice passed with this, but I was just wondering what other people thought about using 302s for logins? Thanks
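For reference, here is a minimal sketch of the 302-to-login pattern being described, using Python's built-in http.server; the paths and handler logic are hypothetical, not the site's actual code.

```python
# Minimal sketch: requests for a profile page get a 302 pointing at /login.
# Paths are hypothetical examples.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/profile"):
            self.send_response(302)                 # temporary redirect
            self.send_header("Location", "/login")  # where the visitor is sent
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"OK")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```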
Technical SEO | | kateG12980 -
Using a third-party server to host site elements
Hi guys, I have a client who has recently been experiencing a great deal more traffic to their site. As a result, their web development agency has given them a server upgrade to cope with the new demand. One thing they have also done is put all website scripts, CSS files, images, and downloadable content (such as PDFs) onto a third-party server (Amazon S3). Apparently this was done so that my client's server now just handles the page requests, and all other elements are grabbed from the Amazon S3 server. So basically, this means any HTML content and web pages are still hosted through my client's domain, but all other content is accessible through an Amazon S3 server URL. I'm wondering what SEO implications this will have for my client's domain. While all pages and HTML content are still accessible through their domain name, each page is of course now making many server calls to the Amazon S3 server through external URLs (s3.amazonaws.com). I imagine this will mean any elements sitting on the Amazon S3 server can no longer contribute value to the client's SEO profile, because that content is not physically part of their domain anymore. However, what I am more concerned about is whether all of these external server calls are going to have a negative effect on the pages' value overall. Should I be advising my client to ensure all site elements are hosted on their own server, and therefore accessible through their domain? Hope this makes sense (I'm not the best at explaining things!)
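To make the setup concrete, here is a rough Python sketch that classifies a page's asset references as same-domain or external (such as s3.amazonaws.com); the HTML snippet and hostnames are hypothetical examples, not the client's actual pages.

```python
# Rough sketch: classify a page's asset references as same-domain or
# external (e.g. s3.amazonaws.com). The HTML below is a hypothetical example.
from html.parser import HTMLParser
from urllib.parse import urlparse

SITE_HOST = "www.example.com"

class AssetCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        # Collect anything referenced via src or href attributes.
        for name, value in attrs:
            if name in ("src", "href") and value:
                self.assets.append(value)

html = """
<link rel="stylesheet" href="https://s3.amazonaws.com/client-bucket/site.css">
<img src="https://s3.amazonaws.com/client-bucket/logo.png">
<a href="https://www.example.com/contact">Contact</a>
"""

collector = AssetCollector()
collector.feed(html)
for asset in collector.assets:
    host = urlparse(asset).netloc
    label = "same domain" if host == SITE_HOST else "external"
    print(f"{label}: {asset}")
```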
Technical SEO | | zealmedia0