Questions created by Etna
How to Properly Add Simple Review Schema to Your Website For a Review Pulled from a Third-Party Site
Hello, I'd like to pull the content of a review from a third-party site and put it on my client's website. My plan is to add review schema to this content, but I want to make sure everything I'm doing is white hat before I implement it. Can someone please tell me if the following is okay to do? For example:
I'd like to pull an entire review from Yelp and put it on my website. I would link out to the review on Yelp and then give credit to Yelp in the publisher section of the schema. If I give credit to the name of the reviewer and where the review is being pulled from (in this case Yelp) on both the actual website as well as in the schema, is this white hat and something I could implement? Also, is "simple review" markup the correct markup to use in this scenario?
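Something along these lines is what I'm picturing (a rough sketch with a placeholder business, reviewer, and rating, not markup that is live anywhere yet):

```html
<!-- Sketch only: the business, reviewer, rating, and review text are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": {
    "@type": "LocalBusiness",
    "name": "Example Client Business"
  },
  "author": {
    "@type": "Person",
    "name": "Jane D."
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "5",
    "bestRating": "5"
  },
  "reviewBody": "Excerpt of the review text pulled from Yelp...",
  "publisher": {
    "@type": "Organization",
    "name": "Yelp",
    "url": "https://www.yelp.com/"
  }
}
</script>
```

The visible page would show the same reviewer name and a link back to the review on Yelp, so the on-page credit and the markup match.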
Reference: https://developers.google.com/search/docs/data-types/reviews
Thanks in advance for your help!
Reviews and Ratings | Etna
Hreflang tag on links to alternate language site
Hey everyone! In the interest of trying to be brief, here's the situation in my favorite form of communication, bullet points!

- Client has two sites; one is in English and one is in Japanese
- Each site is a separate URL, no sub-domains or sub-pages
- Each main page on the English version of the site has a link to the homepage of the Japanese site
- Site has decent rankings overall, with room for improvement from page 2 to page 1
- No hreflang tags currently used in links to the Japanese version from the English version

Given that the site isn't really suffering for most rankings, would this be helpful to implement on the English version? Ideally, I'd like each link to point to the corresponding Japanese page on the same subject, but in the interim it seems like identifying to Google that the page on the other side is in a different language might be helpful to the user, and maybe help those page-two rankings creep a little higher to page one. Thanks for reading, I appreciate your time.
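P.S. The kind of annotation I have in mind would go in the <head> of each page rather than on the anchor tags themselves, something like this (made-up URLs, and ideally each English page would reference its true Japanese equivalent rather than just the Japanese homepage):

```html
<!-- Sketch with made-up URLs; each page references its own translated equivalent -->
<link rel="alternate" hreflang="en" href="https://www.example.com/services/" />
<link rel="alternate" hreflang="ja" href="https://www.example.jp/services/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/services/" />
```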
Intermediate & Advanced SEO | Etna
Call Tracking, DNI Script & Local SEO
Hi Moz! I've been reading about this a lot more lately, and it doesn't seem like there's a method that Google (or other search engines) would consider to be "best practice". The closest I've come to getting some clarity are these Blumenthals articles: http://blumenthals.com/blog/2013/05/14/a-guide-to-call-tracking-and-local/ and the follow-up piece from CallRail: http://blumenthals.com/blog/2014/11/25/guide-to-using-call-tracking-for-local-search/.

Assume a similar goal: keep using an existing phone number that has a solid foundation in the local search ecosystem, while gaining the ability to track how many calls come to an average SMB organically (not from PPC or another paid platform) directly from the website. For now, let's also assume we're not interested in screening the calls or evaluating customer interaction with the staff. I would love to hear from anyone who has implemented DNI call tracking on a website. Were there negative effects on local SEO? Did the value of the information (# of calls/month) outweigh any local search conflicts?

If I were deploying this today, it seems like the blueprint for including the DNI script while mitigating the risk of losing local search visibility might go something like this:

- Hire a reputable call-tracking service; ensure the DNI numbers will match the geographic area code and be "clean" numbers
- Insert the DNI script on key pages of the site
- Maintain the original phone number (non-DNI) in the footer, within the Schema markup, and on the Contact page of the site
- ??
- Profit

Ok, those last two bullet points aren't as important, but I would be curious where other marketers land on this issue, as I think there's not a general consensus at this point. Thanks everyone!
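P.S. In case it helps, here's a rough sketch of what I'm picturing for that third bullet point (placeholder business name and phone numbers, and not an actual vendor snippet, since each call-tracking service ships its own script):

```html
<!-- Sketch only: placeholder business and phone numbers.
     The original local number stays hard-coded in the schema and the footer;
     the vendor's DNI script would only rewrite the element it is pointed at. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example SMB",
  "telephone": "+1-555-201-0100",
  "url": "https://www.example.com/"
}
</script>

<!-- Key landing pages: this visible number is the one the DNI script swaps -->
<p>Call us today: <span class="dni-number">(555) 201-0100</span></p>

<!-- Footer and Contact page: original number, left alone by the DNI script -->
<footer>Example SMB &middot; (555) 201-0100 &middot; 123 Main St, Anytown</footer>
```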
Local Website Optimization | Etna
Recovering from Black Hat/Negative SEO with a twist
Hey everyone, This is a first for me, and I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.

**Scenario:** In the process of designing a new site for a client, we discovered that his previous site, although it had decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely there was a vulnerability somewhere that allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104 and so on. These dynamic pages ended up being malware with a trojan horse our servers recognized and subsequently blocked access to. We have since helped them remedy the vulnerability and remove the malware that was creating these crappy dynamic pages. Another automated program appears to have been recently blasting spam links (mostly comment spam and directory links) at these dynamically created pages at an incredibly rapid rate, and is still actively doing so. Right now we're looking at a small business website with a touch over 500k low-quality spammy links pointing to malware pages on the previously compromised site.

**Important:** As of right now, there's been no manual penalty on the site, nor a "This site may have been compromised" marker in the organic search results for the site. We were able to discover this before things got too bad for them.

**Next steps?** The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and then potentially slap a penalty on the site. The main questions I have are:

- Should we report this proactively to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1)
- Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795)
- Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered?
- Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?

Would love some input or examples from anyone who can help, thanks in advance!
White Hat / Black Hat SEO | Etna
Disavow Links & Paid Link Removal (discussion)
Hey everyone, We've been talking about this issue a bit over the last week in our office, and I wanted to extend it out to the Moz community and see if anyone has some additional perspective. Let me break down the scenario:

We're in the process of cleaning up the link profile for a new client, which contains many low-quality SEO-directory links placed by a previous vendor. Recently, we made a connection with a webmaster who controls a huge directory network. This person found 100+ links to our client's site on their network and wants $5/link to have them removed. The client was not hit with a manual penalty, so this clean-up could be considered proactive, but an algorithmic 'penalty' is suspected based on historical keyword rankings.

**The Issue:** We can pay this ninja $800+ to have him/her remove the links from the directory network and hope it does the trick. When we talk about scaling this tactic, we run into some ridiculously high numbers if we were to provide the same service to multiple clients.

**The Silver Lining:** The disavow links file. I'm curious how effective creating one around the 100+ directory links could be, especially since the client hasn't been slapped with a manual penalty.

**The Debate:**

- Is putting a disavow file together a better alternative to paying for crappy links to be removed?
- Are we actually solving the bad link problem by disavowing, or just patching it?
- Would choosing not to pay ridiculous fees and submitting a disavow file for these links be considered a "good faith effort" in Google's eyes (especially considering no manual penalty has been assessed)?
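P.S. For what it's worth, the disavow file in question would be short and mostly domain-level entries, something along these lines (made-up domains, obviously):

```text
# Sketch with made-up domains: directory network links we could not get
# removed without paying the per-link fee
domain:cheap-seo-directory-one.example
domain:cheap-seo-directory-two.example
# Individual URLs can also be listed on their own lines
http://some-web-directory.example/listings/client-site.html
```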
Intermediate & Advanced SEO | Etna
Why is my m-dot site outranking my main site in SERPs?
My client has a WordPress site and a Duda mobile site that we inherited. For some reason their m-dot site is ranking on page 1 of Google for their top keywords instead of the main site, which is much more robust. The main site might rank beyond page 5 while the generic home page of their m-dot site appears on page 1. Does anyone have any idea why this might be happening?
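One thing I still need to verify is whether the two versions are annotated as alternates of each other at all. My understanding is that for separate mobile URLs the desktop pages should point to their m-dot equivalents and the m-dot pages should canonical back, something like this (made-up URLs, just a sketch of the pattern):

```html
<!-- On the desktop (www) version of a page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/services/" />

<!-- On the matching m-dot page -->
<link rel="canonical" href="https://www.example.com/services/" />
```

If those annotations are missing or pointing at the wrong pages, that could be part of the answer, but I'd love to hear other theories too.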
Intermediate & Advanced SEO | Etna
SERP Question - Site showing up for national term over local term recently
Hey Moz, This has been happening to me with a couple of clients recently and I wanted to kick it out to the community and see if anyone else has experienced it and might be able to shed some light on why. (Disclaimer: Both clients are in the elective healthcare space)
**Scenario:** The client's site is optimized for a fairly competitive "procedural keyword + location" phrase. Historically, the site had been ranking on the first page for a while until it suddenly dropped off for that query. At the same time, the page now ranks on the first page for just the procedural term, without the location modifier (obviously much more competitive than the term with the location modifier). Searches on Google were set to the city in which the client is located. Not that I'm complaining, but this seems a little weird to me. Has anyone seen a similar situation? If so, any theories about what might have caused it?

TL;DR - Site ranked on the 1st page for "keyword + location modifier" historically; it now ranks on the 1st page for "keyword" only and is not found for "keyword + location modifier".
Algorithm Updates | Etna
Mini sitelinks in local-pack?
Recently, after performing a search, I noticed one result in the "7-pack" included several sitelink-type links. It stood out among the others to me, and I was curious whether this was schema or perhaps Google playing around with their local results. I'll include a screenshot as an example; any insights or links to articles discussing this topic would be appreciated. Thanks Moz!
Algorithm Updates | Etna