How do I fix this issue?
-
I redesigned my website, moving from Wix to plain HTML. The URLs changed from
http://www.spinteedubai.com/#!how-it-works/c46c
to
http://www.spinteedubai.com/how-it-works.html
and the same applies to all the other pages. How can I fix this? Both versions of each page are currently indexed in Google.
-
Hi Alexander,
While there are server-side techniques to force a 404 error for a given URL, the best thing to do is remove the content in question from your server. At the very least, this ensures a 404 status code is returned when anyone visits the URL that once housed the content. Ideally, if you can configure a custom 404 page that is more user-friendly, that's even better.
Depending on how your server is configured, there may be instances where a URL should produce a 404 error but doesn't. I only mention this scenario because it's something I'm currently dealing with on one of the sites I manage.
In any case, you may need to work closely with your server administrator or web developer to achieve what you need. Most likely, it's just a matter of removing the old content from the server. Hope that helps!
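(For example, on an Apache server both points above can be handled in the site's .htaccess file. This is only a sketch — it assumes Apache with mod_alias, a friendly 404.html page at the site root, and the old-page path below is a placeholder:)

```apache
# Sketch of an .htaccess setup (assumes Apache with mod_alias enabled).
# Serve a friendlier custom page for any URL that returns a 404.
ErrorDocument 404 /404.html

# Force a 404 status for a specific retired URL, even if a file still
# exists there. "/some-old-page.html" is a placeholder path.
Redirect 404 /some-old-page.html
```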
Dana
-
How can I add a 404 error? What are the steps?
-
Hi Alexander,
It looks like you've implemented the canonical tags properly. It can, however, take Google a very, very long time (sometimes years) to remove old content. If you really want the old page/URL out of Google's index, the best and quickest way is to make sure the old page produces a proper 404 status code, then use Google Webmaster Tools' Remove URL tool to request that Google drop it from the index. This still isn't immediate, but I've seen URLs removed in as little as a week using this method. Hope that helps!
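(For reference, a canonical tag pointing an old page at its new counterpart would look something like this in the old page's `<head>` — a sketch using the how-it-works URL from the question:)

```html
<link rel="canonical" href="http://www.spinteedubai.com/how-it-works.html" />
```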
Dana
-
Hi Alexander,
You can either 301-redirect the old page http://www.spinteedubai.com/#!how-it-works/c46c to the new page http://www.spinteedubai.com/how-it-works.html,
or you can set up a rel=canonical tag if it's the same content and you want to keep the old URL.
You would then either wait, or request removal of the URL here: https://www.google.com/webmasters/tools/removals
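(One caveat on the 301 option: everything after the `#!` is a URL fragment, which the browser never sends to the server, so an .htaccess rule cannot match the hashbang URL directly. It can, however, match the `?_escaped_fragment_=` form Google uses when crawling these AJAX-style URLs. A minimal sketch, assuming an Apache server with mod_rewrite enabled:)

```apache
# Hypothetical .htaccess sketch (assumes Apache with mod_rewrite).
# The "#!how-it-works/c46c" fragment never reaches the server; Google
# crawls the page as /?_escaped_fragment_=how-it-works/c46c instead.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=how-it-works/c46c$
RewriteRule ^$ /how-it-works.html? [R=301,L]
```

Each old page would need its own rule (or a generalized pattern), and human visitors arriving via a `#!` link would still need a small JavaScript redirect on the page itself, since the server never sees the fragment.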