Redirects and sitemap aren't showing
-
We had a malware hack and spent three days trying to get Bluehost to fix things. Since they made their changes, two things are happening:
1. Our XML sitemap cannot be created (https://www.caffeinemarketing.co.uk/sitmap.xml); we have tried external tools as well.
2. We had 301 redirects from the http (www and non-www versions) and the https (non-www version) throughout the whole website to https://www.caffeinemarketing.co.uk/ and its subsequent pages.
Whilst the redirects seem to be happening, when you go into tools such as https://httpstatus.io, every version of every page returns only a 200 code, whereas before they were showing the 301 redirects.
Have Bluehost messed things up? Hope you can help
thanks
-
I agree with what effectdigital said. It looks like everything is in place, and the non-www and http versions of the website are redirecting to the https-www version of the site.
-
That attachment shows that non-HTTPS and non-WWW URLs are being 301 redirected to the HTTPS-WWW version(s). That's what you want, right? From your screenshot it seems like it is working how you want.
Just so you know, when you put one architecture into Screaming Frog (e.g. you put in HTTP with no WWW), it doesn't limit the crawl to that specific architecture. If the crawler is redirected from non-WWW non-HTTPS to HTTPS with WWW, then the crawler will carry on crawling THAT version of the site.
If you wanted to crawl all of the old HTTP-non-WWW URLs, you would need to give SF the full list in list mode and alter the crawler's settings to 'contain' it to just the URLs you entered. I'm pretty sure you would then see that most of the HTTP-non-WWW URLs are redirecting properly, as they should be.
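If you want to sanity-check the redirects without Screaming Frog, a short script can do the same job. This is a minimal sketch of my own (not from the original thread): it requests each legacy URL without following redirects, so you see the first status code and the Location header the server actually returns. The URL list is only an example — substitute the real legacy HTTP/non-WWW addresses.

```python
# Sketch: check what each legacy URL returns without following redirects.
# The URLs below are illustrative; replace them with your own list.
import requests

legacy_urls = [
    "http://caffeinemarketing.co.uk/",
    "http://www.caffeinemarketing.co.uk/",
    "https://caffeinemarketing.co.uk/",
]

for url in legacy_urls:
    # allow_redirects=False keeps the first response, so a healthy setup
    # should show a 301 here along with the HTTPS-WWW Location header.
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(url, response.status_code, response.headers.get("Location"))
```

If these print 301 with the correct Location, the redirects are fine and httpstatus.io was most likely reporting the final destination rather than the first hop.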
As for the XML thing, it's very common, especially for people using Yoast. I think Yoast is really good, by the way, but for some reason, on some hosting environments, the XML sitemap starts blank-rendering. Most of the time hosting companies say they can't fix it and that it's Yoast's fault, but I don't really believe that. If a file (e.g. sitemap.xml) cannot be created, it's more likely they went in via FTP and changed some file read/write permissions, and because the site is now more locked down, the XML cannot be created any more. If you were hacked by malware, they were likely over-zealous when locking your site back down, and that is causing problems for your XML feed(s).
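One way to test that theory, assuming you have SSH or can run a script on the server (a rough sketch of my own; the document-root path is hypothetical), is to check whether the account WordPress runs under can still write to the folder where sitemap.xml should live:

```python
# Sketch: inspect permissions on the document root and any existing sitemap.
import os
import stat

docroot = "/home/youruser/public_html"   # hypothetical path - adjust to your site
sitemap = os.path.join(docroot, "sitemap.xml")

# Show the permission bits on the document root, e.g. 0o755 or a locked-down 0o750.
mode = stat.S_IMODE(os.stat(docroot).st_mode)
print(f"{docroot} permissions: {oct(mode)}")

# os.access answers for the user running this script; if that is the same user
# PHP/WordPress runs as, a False here explains why Yoast cannot write the file.
print("Directory writable:", os.access(docroot, os.W_OK))
if os.path.exists(sitemap):
    print("Existing sitemap writable:", os.access(sitemap, os.W_OK))
```

If the directory turns out not to be writable for the web-server user, that is something to put in front of Bluehost, since they almost certainly changed it during the clean-up.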
-
See attachment.
-
Hi, are you able to please interpret this for me? It looks like the non-www versions are showing as the https://www version with a 200; the home page looks like the only 301?
-
Hi Carrie,
For your 301 redirects at the root level, it sounds like the .htaccess file has changed on the server. Can you try validating those other http and non-www versions of the website through other tools like Screaming Frog? If you're still getting 200 response codes, I would advise raising the issue with Bluehost, as this is something they can fix.
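For reference, a typical set of .htaccess rules that forces every request to the HTTPS-WWW version with a single 301 looks something like the block below. This is a generic illustration, not a copy of what is (or was) on your Bluehost server, so treat it as a reference point for what the file should roughly contain rather than something to paste in blindly.

```apache
# Generic example of forcing HTTPS + WWW with one 301 hop - an illustration,
# not the rules that were originally on the hacked site.
RewriteEngine On

# Redirect anything that is not HTTPS, or not on the www host, to https://www.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```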
As for the XML sitemap, do you mean that you're unable to upload a file to that location? Have you tried sFTP?