Spammy Structured Data Markup Removal
-
Hi there,
I'm in a weird situation and I'm wondering if you can help me.
Here we go: we had some of our developers implement structured data markup on our site, and they obviously did not know what they were doing. They messed up our results in the SERPs big time, and we wound up with a manual penalty because of it. We removed the markup and got rid of the penalty (phew); however, we are now stuck with two issues.
We changed the URLs of some pages, so the old URLs are now dead pages that redirect to the newer version of the same page. However, two things have happened:
a) For some reason, two of the old dead pages still come up in the Google SERPs, even though it has been over six weeks since we changed the URLs. We made sure we aren't linking to the old versions of the URLs anywhere on our site.
b) Those two old URLs are showing up in the SERPs with the old spammy markup. There is nowhere to remove the markup from, because those pages no longer exist, so obviously the markup code isn't anywhere on the site anymore.
We need a solution for getting the markup out of the SERPs.
We thought of one idea that might help: create new pages at those old URLs, make sure there is nothing spammy on them, and tell Google not to index them. Hopefully that will get Google to de-index those URLs.
Is this a good idea? If so, is there anything I should know about or watch out for? Or do you have a better one for me?
Thanks so much
-
Thanks so much
I'll try that right away
-
Yes, just create one you can call 301-sitemap.xml and submit it to Google Webmaster Tools. Keep it separate from your full sitemap; that way, once those pages have been removed from the Google SERPs, you can simply delete it without affecting your normal sitemap.
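If it helps, here is a minimal sketch of what generating that file could look like in Python. The two URLs are placeholders, not your actual pages; swap in your real dead URLs:

```python
# Sketch: write a minimal "301-sitemap.xml" listing only the dead URLs.
# The URLs below are placeholders -- substitute your real old URLs.
dead_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

entries = "\n".join(
    f"  <url>\n    <loc>{url}</loc>\n  </url>" for url in dead_urls
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("301-sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

You would then submit 301-sitemap.xml in Google Webmaster Tools alongside your normal sitemap.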
-
Thanks for your answer.
Should I create a sitemap with only the dead pages, and then have two sitemaps?
Let me know, please.
-
Hi Yosepgr,
One thing I'd like to clarify, IMO, is that devs need SEO guidance on how to implement schema. Sometimes people just request a schema implementation and then wait for the devs to do it. I'm not saying that's your case, but we, as SEOs, should provide technical guidance on how to implement it correctly.
That being said, I had a similar case in the past, and what I did was create a sitemap including just the dead URLs. That way, I forced Google to crawl them and see that they now redirect to the new versions.
After doing so, ensure that your redirect is actually a permanent redirect (301). You can check that easily with Screaming Frog by crawling those URLs in list mode, or get the Ayima plugin for Chrome and visit the URLs to see what the header responses look like. Make sure the redirect is a 301 and resolves in a single step (if possible).
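If you'd rather script that check than use Screaming Frog or the Ayima plugin, here's a small sketch using Python's third-party requests library (the URL is a placeholder; swap in one of your old URLs):

```python
import requests  # third-party: pip install requests

# Placeholder -- substitute one of your old, redirected URLs.
old_url = "https://www.example.com/old-page-1"

response = requests.get(old_url, allow_redirects=True, timeout=10)

# response.history holds each hop of the redirect chain, in order.
for hop in response.history:
    print(hop.status_code, hop.url)
print(response.status_code, response.url)
```

Ideally you see exactly one 301 hop followed by a 200 at the new URL; a 302, or a chain of several hops, is worth fixing.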
It may take a while for Google to digest the changes, but you shouldn't be worried about the schema: if Google penalizes a site for spammy markup, it penalizes only the pages containing that markup, and those pages are now dead and removed from your site.
I hope this helps!
e
-
Hey there,
It's definitely not a good idea to redo the old URLs. Have you submitted the site to be reindexed? Make sure you update your sitemap if needed (and/or your robots.txt) and resubmit them to Google, then wait. Any additional changes might confuse Google even more. Make sure to 301 the old pages to the new ones.
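As a side note, what that 301 looks like depends entirely on your stack. Purely as an illustration, here is a sketch for a site that happened to run on Flask; the paths are made up, and on Apache or nginx you would configure the equivalent redirect in the server config instead:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping: point each old URL straight at its final new URL
# so the redirect resolves in a single 301 hop.
LEGACY_REDIRECTS = {
    "/old-page-1": "/new-page-1",
    "/old-page-2": "/new-page-2",
}

@app.route("/<path:path>")
def legacy_redirect(path):
    target = LEGACY_REDIRECTS.get("/" + path)
    if target:
        return redirect(target, code=301)
    # Catch-all sketch; a real app would route its live pages first.
    return "Not Found", 404
```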
If you still need help with the schema code, drop me a PM.
Have a great day
Andy