Universal Search vs Local Organic
-
Hi,
My website has high rankings in universal SERPs. However, in my city's organic search results, competitors' websites that don't even show up in universal SERPs rank higher than mine. I'm not sure what I'm doing wrong.
Thanks for any insight.
-
Hi Zihe, I'm a little over a year late, but in case you are still watching, I'm thinking you need more local keywords on the page and in your titles? If you are ranking universally but not locally (organic), that might be the issue?
Just a thought. Cheers!
~BB
-
Thanks for the advice, but my question is about organic SERPs, not local or combined listings.
-
Make sure your Google Places page is set up correctly and you are targeting the best keywords.
Get people to leave you reviews on your Places page and create lots of local citations.
There are a few great tools that can help you do this:
https://www.whitespark.ca/local-citation-finder/
Hope this helps....
Matthew
-
Do the other sites list a physical address on their pages? Do you?
Do you have a Google Places listing?
Hope this helps
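One common way to make that physical address explicit to search engines is schema.org LocalBusiness markup. None of the posters above mention it, so treat the sketch below as an assumption rather than their advice; the business name, address, and URL are hypothetical placeholders.

```python
import json

def local_business_jsonld(name, street, city, region, postal_code, url):
    # Build a schema.org LocalBusiness object; every value passed in is a placeholder.
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "url": url,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal_code,
        },
    }
    # Wrap the JSON-LD in a script tag ready to paste into the page template.
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

if __name__ == "__main__":
    print(local_business_jsonld(
        "Example Local Co.",           # hypothetical business name
        "123 Main St",                 # hypothetical street address
        "Springfield", "IL", "62701",  # hypothetical city, region, postal code
        "https://www.example.com/",    # hypothetical site URL
    ))
```

The generated block would sit alongside the visible address on the page; which fields actually matter depends on the business, so this is only a starting point.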
Related Questions
-
HTTP vs. HTTPS - duplicate content
Hi, I have recently come across a new issue on our site, where HTTPS and HTTP titles are showing as duplicates. I read https://mza.seotoolninja.com/community/q/duplicate-content-and-http-and-https; however, since HTTPS is now a ranking factor, I'm wondering whether blocking it can really be a good thing. We aren't in a position to roll out HTTPS everywhere, so what would be the best thing to do next? I thought about implementing canonicals. Thank you
Intermediate & Advanced SEO | BeckyKey -
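A quick way to sanity-check the duplicate-title reports in the question above is to fetch both protocol versions of a page and see where each one's rel=canonical points. This is only a rough sketch: example.com stands in for the real site, and it assumes the third-party requests library.

```python
import re
import requests  # third-party: pip install requests

LINK_TAG = re.compile(r"<link[^>]+>", re.IGNORECASE)

def canonical_of(url):
    # Very naive HTML scan: find a <link> tag whose rel is "canonical" and
    # return its href. Good enough for a spot check, not for production.
    html = requests.get(url, timeout=10).text
    for tag in LINK_TAG.findall(html):
        if re.search(r'rel=["\']canonical["\']', tag, re.IGNORECASE):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.IGNORECASE)
            return href.group(1) if href else None
    return None

# example.com is a placeholder for the page being checked.
for url in ("http://www.example.com/", "https://www.example.com/"):
    print(url, "->", canonical_of(url))
```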
How do we better optimize a site to show the correct domain in organic search results for the location the user is searching in?
For example, chicago-company.com has the same content as springfield-company.com, and I am searching for a general non-brand term (e.g. "utility bill pay") while located in Chicago. How can we optimize chicago-company.com to ensure that the Chicago site's results appear in top positions over the Springfield site?
Intermediate & Advanced SEO | aelite -
Search Results not Updating (Title, Description, and URL)
Issue: I recently discovered that my site was accessible over both HTTP and HTTPS. The site has used a rel canonical tag pointing to the HTTP version, but Google+ was pointing to HTTPS. The title, description, and URL shown in the results for the homepage are HTTPS, while other pages are HTTP, and so on.

Steps taken to resolve — this week I did the following:
1. 301'd all non-checkout pages to the HTTP version.
2. Switched the Google+ URL to the HTTP version and added a new post with an HTTP link to the homepage.
3. Used Webmaster Tools to request a recrawl and reindex of the site.
4. Resubmitted the XML sitemap.

No luck... the site is still not updating. Any advice would be greatly appreciated. Thanks all! Site is Here
Intermediate & Advanced SEO | AhlerManagement -
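Before waiting on Google to catch up in a situation like the one above, it can help to confirm that the redirects really answer with a 301 and point where you expect. A small sketch with placeholder URLs, assuming the third-party requests library:

```python
import requests  # third-party: pip install requests

# Placeholder URLs standing in for the pages that were redirected to HTTP.
pages = [
    "https://www.example.com/",
    "https://www.example.com/some-page/",
]

for url in pages:
    # Don't follow the redirect; just report the status code and Location
    # header so each HTTPS URL can be checked for a 301 to its HTTP twin.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code, resp.headers.get("Location"))
```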
PageRank vs. Domain Authority
A website has PR 5, but in OSE I see that its Domain Authority is only 26. I've also checked, and the domain was registered in 2009. Is this normal?
Intermediate & Advanced SEO | ditoroin -
Organic Rankings for the US & Australia
I have a site that is ranking well for competitive keywords in the US, but I would like to have it rank in Australia as well. Although there's no direct correlation, I'm running large AdWords campaigns in both countries. I've read that I should write localized content for each region, but I'm not sure that is as effective as it used to be. I've also read to use location markup and microformats. Any feedback would be greatly appreciated. Thank you in advance
Intermediate & Advanced SEO | NickMacario -
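For the US/Australia question above, one commonly used form of "location markup" is hreflang annotations. The question doesn't say how (or whether) the two versions of the site are separated, so the URL structure below is purely hypothetical:

```python
# Hypothetical URL structure for the US and Australian versions of the site.
alternates = {
    "en-us": "https://www.example.com/us/",
    "en-au": "https://www.example.com/au/",
    "x-default": "https://www.example.com/",
}

def hreflang_tags(alternates):
    # One <link rel="alternate"> element per locale, intended for the <head>
    # of every page in the set (each version should list all alternates).
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags(alternates))
```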
Local ranking (keyword) strategies
Hello SEOmozers, I've been working on improving all components of my SEO skills for the past six months. I have definitely had some great victories and some gray defeats. My newest challenge is local ranking for a home improvement company. My target is to rank them in Google's local results within the top 7. I have managed to do so, but only for one keyword, "windows and doors CITY". My campaign, in terms of anchor text, has a wide variety of long-tail and short-tail keywords; I have not concentrated on the keyword above. My question is: how do I go about ranking this website in the local results for all the other keywords ("windows CITY", "window replacement CITY", etc.)? What I don't understand is how Google picks which keywords to rank the website locally for, and which ones to ignore. Any information will be well received. Cheers, Nikster
Intermediate & Advanced SEO | thenikster -
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot, so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl those URLs and confirm the content was indeed removed (as opposed to just recrawling the site and not finding the content anywhere). This made a lot of sense to me and also struck a personal chord.

Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the steps below:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we always removed from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).

When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way. I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda.

So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and that we continue to suffer from Panda because of it? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?

Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
-
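For reference, the 410 approach discussed in that thread can be served at the application level. This is only an illustrative sketch (a Flask catch-all route with made-up paths), not the poster's actual setup:

```python
from flask import Flask, Response  # third-party: pip install flask

app = Flask(__name__)

# Made-up paths standing in for content that has been deliberately removed.
REMOVED_PATHS = {
    "/old-thin-page-1",
    "/old-thin-page-2",
}

@app.route("/<path:page_path>")
def serve(page_path):
    if "/" + page_path in REMOVED_PATHS:
        # Explicit 410 Gone: the URL stays crawlable, and the response is
        # unambiguous about the content having been removed.
        return Response("This content has been permanently removed.", status=410)
    # Placeholder for whatever normally renders the page.
    return Response("Normal page would render here.", status=200)

if __name__ == "__main__":
    app.run(port=5000)
```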
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About six months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back into the index, by focusing attention and resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions

Specifically, I'm concerned that (a) we're blocking the flow of link juice, and that (b) by preventing Google from crawling the full depth of our search results (i.e. pages beyond page 1), we may be making our site wrongly look 'thin'. With respect to (b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, and so on, but we have yet to find 'the fix'... Thoughts?

Kurus
Intermediate & Advanced SEO | kurus
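One low-risk way to audit restrictions like the ones described above is to run the robots.txt rules through Python's standard-library parser and test the pagination and sort-order URLs in question. The rules and URLs below are invented for illustration, and note that urllib.robotparser does plain prefix matching rather than Google's wildcard extensions:

```python
import urllib.robotparser

# Invented rules standing in for the restrictions on search-result variants.
robots_txt = """\
User-agent: *
Disallow: /search/page/
Disallow: /search/sort/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Invented URLs: a base results page plus pagination and sort-order variants.
test_urls = [
    "https://www.example.com/search/widgets",
    "https://www.example.com/search/page/2/widgets",
    "https://www.example.com/search/sort/price/widgets",
]

for url in test_urls:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, verdict)
```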