Spammy Structured Data Markup Removal
-
Hi there,
I'm in a weird situation and I'm wondering if you can help me.
Here we go: we had some of our developers implement structured data markup on our site, and they obviously did not know what they were doing. They messed up our SERP results big time, and we wound up with a manual penalty. We removed the markup and got the penalty lifted (phew); however, we are now stuck with two issues.
We changed the URLs of some pages, so the old URLs are now dead pages redirecting to the newer versions of the same pages. However, two things have happened:
a) For some reason, two of the old dead pages still come up in the Google SERP, even though it has been over six weeks since we changed the URLs. We made sure we aren't linking to the old URLs anywhere on our site.
b) Those two old URLs are showing up in the SERP with the old spammy markup. There is nowhere to remove the markup from, because those pages no longer exist, so that markup code isn't anywhere on the site anymore.
We need a solution for getting the markup out of the SERP.
We thought of one idea that might help - create new pages for those old URLs, make sure there is nothing spammy on them, and tell Google not to index them - hopefully, that will get Google to de-index those pages.
Is this a good idea? If yes, is there anything I should know about or watch out for? Or do you have a better one for me?
Thanks so much
-
Thanks so much
I'll try that right away
-
Yes, just create one - you can call it 301-sitemap.xml - and submit it to Google Webmaster Tools. Keep it separate from your full sitemap: once those pages are removed from the Google SERPs, you can simply delete it without affecting your normal sitemap.
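For reference, a one-off sitemap like this can be minimal. A hedged sketch is below, following the standard sitemap protocol; the URLs are hypothetical placeholders for your two old addresses.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical old URLs that now 301-redirect to their new versions -->
  <url><loc>https://www.example.com/old-page-one</loc></url>
  <url><loc>https://www.example.com/old-page-two</loc></url>
</urlset>
```

Save it at the site root (e.g. /301-sitemap.xml) and submit it alongside, not inside, your regular sitemap.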
-
Thanks for your answer.
Should I create a sitemap with only the dead pages, and then have two sitemaps?
Let me know, please.
-
Hi Yosepgr,
One thing I would like to clarify is that developers need SEO guidance on how to implement schema. Sometimes people just request a schema implementation and then wait for the devs to do it. I'm not saying that's your case, but we, as SEOs, should provide technical guidance on how to implement it correctly.
That being said, I had a similar case in the past, and what I did was create a sitemap including just the dead URLs. This way I forced Google to crawl them and see that they now redirect to the new versions.
After doing so, ensure that your redirect is actually a permanent redirect (301). You can check that easily with Screaming Frog by crawling those URLs in list mode, or by getting the Ayima plugin for Chrome and visiting the URLs to see what the header response looks like. Ensure the redirect is a 301 and, if possible, resolves in just one step (no redirect chains).
It may take a while for Google to digest the change, but you shouldn't worry about the schema: if Google penalizes a site for spammy markup, it penalizes only the pages containing that markup, and those pages are now dead and removed from the site.
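If you'd rather script the check than use a crawler or browser plugin, a minimal sketch using only Python's standard library is below. It issues a single request without following redirects and reports the status code and Location header; the URLs you pass in would be your own old addresses.

```python
# Minimal sketch: report the redirect status and Location header for a URL
# without following the redirect, using only the Python standard library.
import http.client
from urllib.parse import urlparse

def redirect_info(url):
    """Return (status_code, location_header) for a single HEAD request.

    Redirects are NOT followed, so a 301 with its Location target is
    visible directly - exactly what you want when auditing old URLs.
    """
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location
```

Anything other than a 301, or a Location that itself redirects again (a chain), is worth fixing.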
I hope this helps!
e
-
Hey there,
Re-creating the old URLs is definitely not a good idea. Have you submitted the site to be reindexed? Make sure you update your sitemap if needed (and/or robots.txt) and resubmit these to Google, then wait. Any additional changes might confuse Google even more. Make sure to 301 the old pages to the new ones.
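For the 301s themselves, a hedged sketch is below, assuming an Apache server with .htaccess support; the paths are hypothetical placeholders for your two old and new URLs. (On other servers the equivalent directive differs, e.g. nginx uses `return 301`.)

```apache
# Hypothetical permanent redirects in an Apache .htaccess file (mod_alias)
Redirect 301 /old-page-one /new-page-one
Redirect 301 /old-page-two /new-page-two
```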
If you still need help with the schema code, drop me a PM.
Have a great day
Andy