Domain added to the URL
-
Hi,
I am having a problem with my WordPress site http://www.food-and-garden.com. It seems the domain www.food-and-garden.com is being appended to URLs, for example http://www.food-and-garden.com/recipe-items/blini-lumpfish-roe/www.food-and-garden.com
Right now I have 249 of these 404 errors. That's a lot!
I found a Q&A similar to this, and as far as I understand, it has to do with relative links. I guess somewhere on my site there is href="www.food-and-garden.com" instead of href="http://www.food-and-garden.com"
My question is, how do I find the broken link?
Thank you!
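To illustrate what I mean, here is how a link without http:// gets resolved by a browser (a small sketch using Python's standard library, not anything from my actual site):

```python
# An href without a scheme ("http://") is treated as a relative path,
# so the browser appends it to the current page's URL.
from urllib.parse import urljoin

page = "http://www.food-and-garden.com/recipe-items/blini-lumpfish-roe/"

broken = urljoin(page, "www.food-and-garden.com")        # missing http://
fixed = urljoin(page, "http://www.food-and-garden.com")  # absolute link

print(broken)  # http://www.food-and-garden.com/recipe-items/blini-lumpfish-roe/www.food-and-garden.com
print(fixed)   # http://www.food-and-garden.com
```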
-
Hey there,
This is Michael on the Moz Help Team! Sorry to hear about the trouble here. To see the source of those errors, you'll need to export your full Crawl Diagnostics CSV by clicking the "Email CSV" button at the bottom of the issues overview. Once the report generates, it will be emailed to you; open it in something like Excel. In the first column, search for the URL of the page you are looking for. When you find the correct row, look in the column labeled Referrer (column AM). That is the URL of the page where our crawlers first found the target URL, so visiting it will show you where the broken link lives.
Hope that helps!
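If you'd rather script that lookup than scroll through the spreadsheet, here's a minimal sketch. The column names ("URL", "HTTP Status Code", "Referrer") are assumptions on my part; match them against the headers in your actual export:

```python
import csv
import io

def broken_links(csv_file):
    """Return (broken URL, referrer) pairs for every 404 row.

    Column names are assumed -- adjust them to the real CSV headers.
    """
    pairs = []
    for row in csv.DictReader(csv_file):
        if row.get("HTTP Status Code") == "404":
            pairs.append((row.get("URL"), row.get("Referrer")))
    return pairs

# Tiny stand-in for the exported file:
sample = io.StringIO(
    "URL,HTTP Status Code,Referrer\n"
    "http://www.food-and-garden.com/a/www.food-and-garden.com,404,"
    "http://www.food-and-garden.com/a/\n"
)
for url, referrer in broken_links(sample):
    print(url, "<- linked from", referrer)
```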
-
You will need to open the CSV in a spreadsheet program (Excel, or upload it to Google Docs). From there, go to the referral column; it will tell you which page the error is linked from. Then edit the place on that page where the broken link occurs, or check whether a widget on the page is throwing the error.
-
Hi, I did the site myself, but I am very much a newbie to WordPress (or any other kind of system). I have downloaded the CSV but don't know what to look for.
All I see is: http://www.food-and-garden.com/second-kitchen-garden/www.food-and-garden.com,false,false,true,false,false,false,false,false,false,false,false,false,false,false,false,false,false,false,false,false,404 : Received 404 (Not Found) error response for page.,1,0,0,0,,,,404,1,,76,,55,0,"",,http://www.food-and-garden.com/second-kitchen-garden/,"",Error attempting to request page
How do I go on from here?
Sorry about my poor English. I am Danish and not used to handling this kind of problem in English.
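The referrer is buried near the end of the row you pasted. A rough way to pull it out without counting columns is to treat every field that looks like a URL as a candidate: the first is the broken page, the next is the page linking to it (a sketch; the exact column order can vary between exports):

```python
import csv
import io

# The row exactly as pasted from the Crawl Diagnostics CSV:
row_text = (
    'http://www.food-and-garden.com/second-kitchen-garden/www.food-and-garden.com,'
    'false,false,true,false,false,false,false,false,false,'
    'false,false,false,false,false,false,false,false,false,false,false,'
    '404 : Received 404 (Not Found) error response for page.,'
    '1,0,0,0,,,,404,1,,76,,55,0,"",,'
    'http://www.food-and-garden.com/second-kitchen-garden/,"",'
    'Error attempting to request page'
)

# Parse the line as CSV, then keep only the URL-shaped fields.
fields = next(csv.reader(io.StringIO(row_text)))
urls = [f for f in fields if f.startswith("http")]

print("broken URL: ", urls[0])  # the 404 page
print("linked from:", urls[1])  # the page carrying the bad link
```

So in this row, the page to edit is http://www.food-and-garden.com/second-kitchen-garden/.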
-
Karin,
We had this same problem on our WordPress site; it came from improperly coded links on pages that had certain modules active (we fixed it by making the links absolute). You can use Moz's Crawl Diagnostics tool to email yourself a CSV, look for the errors, and go across to the referral column to find which page has the erroneous link. From there you can edit your content to fix it.
-
Hi Ryan. It seems Xenu Link Sleuth only runs on Windows, but I am going to try Screaming Frog and see if anything turns up. Thanks!
-
Hi Karin. Try out the crawl tool Xenu Link Sleuth. It will show you broken links like these, along with the page on which they're located.