How to fix this issue?
-
I redesigned my website, moving from Wix to plain HTML.
The URLs have now changed from
http://www.spinteedubai.com/#!how-it-works/c46c
to
http://www.spinteedubai.com/how-it-works.html
The same goes for all other pages. How can I fix this? Both versions of each page are currently indexed in Google.
-
Hi Alexander,
While there are some server-side technical things you can do to force a 404 error for a given URL, the best thing to do is remove the content in question from your server. At the very least, this should return a 404 status code when you attempt to visit the URL that once housed the content. Ideally, if you can configure a custom 404 page that is more user-friendly, that's even better.
Now, depending on how your server is configured, there may be instances when a URL should produce a 404 error, but doesn't. I only bring this scenario up as a possibility because it's something I am currently dealing with on one of the sites I manage.
In any case, you may need to work closely with your server administrator or Web developer to achieve what you need. Most likely, it's just a matter of removing the old content from the server. Hope that helps!
Dana
-
How can I set up a 404 error? What are the steps?
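For reference, on a typical Apache server the basic steps would be: (1) delete the old file so the URL genuinely has nothing to serve, and (2) point the server at a friendly error page. A minimal .htaccess sketch (assuming Apache, with 404.html as a hypothetical page you create yourself):

```apache
# Serve /404.html (with a real 404 status) for any URL that no longer exists.
ErrorDocument 404 /404.html
```

Note that the value should be a local path, not a full URL: if you use a full http:// URL, Apache responds with a redirect and the status code becomes 302 rather than 404.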
-
Hi Alexander,
It looks like you've implemented the canonical tags properly. It can, however, take Google a very, very long time (sometimes years) to remove old content. If you really want the old page/URL out of Google's index, the best and quickest way to achieve that is to make sure the old page produces a proper 404 status code, then use GWT's Remove URL tool to request that Google remove it from their index. This still isn't immediate, but I've seen URLs removed in as little as a week using this method. Hope that helps!
Dana
-
Hi Alexander,
You can either 301 redirect the old page http://www.spinteedubai.com/#!how-it-works/c46c to the new page http://www.spinteedubai.com/how-it-works.html,
or you can set up a rel=canonical tag if it's the same content and you want to keep the old URL.
You would then have to either wait or use this tool to remove the URL: https://www.google.com/webmasters/tools/removals
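A minimal .htaccess sketch of that 301, assuming Apache with mod_rewrite. One caveat: the #!how-it-works/c46c fragment is never sent to the server, so the old Wix URL is only visible server-side in the _escaped_fragment_ form that Google's AJAX crawler requests:

```apache
# The old Wix URL /#!how-it-works/c46c is crawled by Google as
# /?_escaped_fragment_=how-it-works/c46c; match that query string and 301 it.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=/?how-it-works/c46c$
# The trailing "?" strips the query string from the target URL.
RewriteRule ^$ http://www.spinteedubai.com/how-it-works.html? [R=301,L]
```

For regular visitors the fragment stays in the browser, so any redirect for them would have to happen in JavaScript on the old page.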
Related Questions
-
Issue with GA tracking and Native AMP
Hi everyone, We recently pushed a new version of our site (winefolly.com), which is completely AMP native on WordPress (using the official AMP for WordPress plugin). As part of the update, we also switched over to https. In hindsight we probably should have pushed the AMP version and HTTPS changes in separate updates.

As a result of the update, the traffic in GA has dropped significantly despite the tracking code being added properly. I'm also having a hard time getting the previous views in GA working properly. The three views are:

Sitewide (shop.winefolly.com and winefolly.com)
Content only (winefolly.com)
Shop only (shop.winefolly.com)

The sitewide view seems to be working, though it's hard to know for sure, as the traffic seems pretty low (like 10 users at any given time) and I think it's just picking up the shop traffic. The content-only view shows maybe one or two users and often none at all. I tried a bunch of different filters to track only the main site's content views, but in one instance the filter would work, then half an hour later it would revert to no traffic. The filter is set to custom > exclude > request uri with the following regex pattern:

^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/.

Testing the filter, it strips out anything not related to the main site's content, but when I save the filter and view the updated results, the changes aren't reflected. I did read that there is a delay in the filters being applied and that only a subset of the available data is used, but I just want to be sure I'm adding the filters correctly. I also tried setting the filter to predefined > exclude > host equal to shop.winefolly.com, but that didn't work either. The shop view seems to be working, but its tracking code is added via Shopify, so it makes sense that it would continue working as before.

The first thing I noticed when I checked the views is that they were still set to http, so I updated the URLs to https. I then checked the GA tracking code, which is added as a JSON object in the Analytics settings of the WordPress plugin. Unfortunately, while GA seems to be recording traffic, none of the GA validators seem to pick up the AMP tracking code (added using the amp-analytics tag), despite the JSON being confirmed as valid by the plugin.

This morning I decided to try a different approach and add the tracking code via Google's Tag Manager, as well as adding the new https domain to Google Search Console, but alas no change. I spent the whole day yesterday reading every post I could on the topic but was not able to find a solution, so I'm really hoping someone on Moz will be able to shed some light on what I'm doing wrong. Any suggestions or input would be very much appreciated. Cheers,
Chris (on behalf of WineFolly.com)
Technical SEO | winefolly
Pages Crawled Per Day Has Gone Drastically Down, is it a Google issue?
Hello Experts, In Search Console's Crawl Stats, pages crawled per day is going down day by day, i.e. from 400,000 pages per day it has dropped to about 200,000 over the last 15 days. So where is the issue? Am I doing something wrong, or is it an issue on Google's end? Thanks!
Technical SEO | Johny12345
How to fix duplicate content errors with a GoDaddy site
I have a friend who uses a free GoDaddy template for his business website. I ran his site through Moz Crawl Diagnostics, and wow - 395 errors, mostly duplicate content and duplicate page titles. I dug further and found the site was doing this: URL: www.businessname.com/page1.php and the duplicate: businessname.com/page1.php. Essentially, the duplicate is missing the www, and it does this two hundred times. How do I explain to him what is happening?
Technical SEO | cschwartzel
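For what it's worth, the standard fix is a site-wide 301 from the bare domain to the www host, which collapses all of the duplicates with one rule. A minimal .htaccess sketch (assuming Apache with mod_rewrite; businessname.com stands in for the real domain):

```apache
# 301 every non-www request to the same path on the www host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^businessname\.com$ [NC]
RewriteRule ^(.*)$ http://www.businessname.com/$1 [R=301,L]
```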
Crawl issue
Hi, I have a problem with crawl stats. Crawls only return 3k pages while my site has 27k pages indexed (mostly duplicate-content pages). Why such a low number of pages crawled? Any help is more than welcome. Dario. PS: I have more campaigns in place; might that be the reason?
Technical SEO | Mrlocicero
Drupal issue
Hi SEOmozzers again, One of my clients uses Drupal (CMS) and I have an issue when editing any pages. I access the edit section of the page, try to insert a meta description tag, save, and view the page source, and NO meta description tag appears!! Why is that? Is there a specific setting that I need to change? Under Meta Tags, Drupal apparently likes to add canonical tags by default (unless I can tweak one of the settings), and I would like to remove them. The weird thing is that even when no canonical tag is set, when viewing the page source I can still locate a canonical tag. Is there a setting that allows me to remove the canonical by default? Thank you mozzers :)
Technical SEO | Ideas-Money-Art
301 redirect issues
Hi all, I'm hoping someone will be able to help me with an extremely frustrating problem with 301 redirects in .htaccess. Basically I'm trying to redirect some old pages (from our old website) that still rank to the new equivalent. For example:

Old URL: www.domain.com/frames/news/company-news/news-reader.php?newsStoryID=395
New URL: www.domain.com/news/article-title

I've tried the simple redirect:

Redirect 301 /frames/news/company-news/news-reader.php?newsStoryID=395 http://www.domain.com/news/article-title

But this doesn't work. I've also tried:

RewriteEngine on
RewriteCond %{QUERY_STRING} ^newsStoryID=395$
RewriteRule ^/news-reader.php$ http://www.domain.com/news/article-title/? [L,R=301]

Could anyone help? I've followed lots of tutorials that all match the above, but it just doesn't work! The only other thing within the htaccess file is from WordPress for pretty permalinks:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

Many thanks in advance!
Technical SEO | EclipseLegal
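Two things stand out here, for what it's worth: the Redirect directive never matches query strings, and in per-directory (.htaccess) context the path handed to RewriteRule has no leading slash, so ^/news-reader.php$ can never match. A sketch of a rule that follows from that (domain and paths are the poster's placeholders):

```apache
# Match the query string in a RewriteCond; RewriteRule patterns ignore it.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^newsStoryID=395$
# No leading slash in .htaccess context; the trailing "?" drops the query string.
RewriteRule ^frames/news/company-news/news-reader\.php$ http://www.domain.com/news/article-title/? [R=301,L]
```

This would also need to sit above the WordPress block so it runs before the catch-all index.php rule.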
301 Redirect Issue
I'm having an issue with 301 redirects. Let's see if I can verbalize my thoughts on this one... We just recently moved our site to WordPress. One of our new 301 commands is redirecting oursite.com/news to oursite.com/blog. However, there are other links from our previous site that look like oursite.com/news/XYZ, and the issue is that, because WordPress structures its links differently, that URL is not equivalent to oursite.com/blog/XYZ. Instead, it might look something more like oursite.com/blog/yaddayadda/XYZ. Does that make sense? The issue is that when I find an old link of ours on Google that looks something like oursite.com/news/XYZ or oursite.com/news/ABC, it automatically replaces "news" with "blog". When I try to go in manually and redirect anything that says "/news/XYZ" to "/blog/yaddayadda/XYZ", it still doesn't work; it still just replaces "news" with "blog". Wow, I realize that might not make sense to anyone, but if it does - please advise!! Thanks!!!!
Technical SEO | EntrustSEO
OnPage Issues with UTF-8 and ISO-8859-1
Hi guys, I hope somebody can help me figure this out. On one of my sites I set the charset to UTF-8 in the content-type meta tag. The file itself is also UTF-8. If I type German special characters like ä, ö, ß and the like, they get displayed as a tilted square with a question mark inside. If I change the charset to ISO-8859-1, they are displayed properly in the browser, but services like Twitter still have the issue and stop "importing" content once they reach one of those special characters. I would like to avoid having to HTML-encode all on-page content, so my preference would be using UTF-8. You can see it in action when you visit this URL, for example: http://www.skgbickenbach.de/aktive/1b/artikel/40-minuten-fußball-reichen-nicht_1045?charset=utf-8 Remove the ?charset parameter and the charset is set to ISO-8859-1. Hope somebody has an answer or can push me in the right direction. Thanks in advance and have a great day all. Jan
Technical SEO | jmueller
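One thing worth checking in a case like this: the charset declared in the HTTP Content-Type header takes precedence over the meta tag, so if the server announces ISO-8859-1 while the file and meta tag say UTF-8, browsers and consuming services like Twitter will decode the bytes wrongly. A minimal Apache sketch to align the header:

```apache
# Declare UTF-8 in the HTTP Content-Type header; this overrides any meta tag,
# so it must match the encoding the files are actually saved in.
AddDefaultCharset UTF-8
```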