Login webpage blocked by robots
-
Hi, the SEOmoz crawl diagnostics show that this page:
www.tarifakitesurfcamp.com/wp-login.php is blocked (noindex, nofollow).
Is there any problem with that?
-
thanks!
-
Unless the login page contains information that's relevant to your users (i.e. it's there for your private use), it's probably a good idea not to index it!
-
Nope, that's perfectly fine, since that's your WordPress login page.
If you're linking to the page from anywhere on your site (which you really shouldn't be), you could update the meta robots tag to (noindex, FOLLOW), but since it looks like the page has no links, it shouldn't be necessary.
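For reference, the tags in question would look something like this in the page's head (a minimal sketch; the exact markup WordPress outputs on wp-login.php can vary by version):

    <meta name="robots" content="noindex, nofollow">  <!-- what the crawl is reporting -->
    <meta name="robots" content="noindex, follow">    <!-- the variant suggested above -->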
Related Questions
-
Is it urgent to have fewer than 100 internal links on a webpage?
Hi, Our website is set up so that our top menu is on every page, which means every page is going to have around the same number of internal links (225-ish). Is this an issue that needs to be fixed for our pages to rank, or is it only a recommendation that doesn't really impact SEO that much? If it is the only issue listed for a particular page, is there another reason that page might not be ranking even though it has a 99 score? Or is it because of having 225 internal links? I have many product pages on my website that have a 99 Page Optimization score, with the only recommendation being a note that says not to have too many internal links. My understanding is that internal links are defined as any URL on a page pointing to another part of the same root domain/site. So, for example, my page https://www.twowayradiosfor.com/Motorola-CP185-p/cp185-lkp.htm has 225 internal links in its source code. Where do I go to fix this issue if I need to get below 100 internal links? Do I erase the links, or set up a nofollow tag? I appreciate any help or guidance. Thank you! Austin
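For context, the "nofollow tag" option mentioned above would look something like this on an individual link (a minimal sketch; the URL is just a placeholder):

    <a href="/example-category/" rel="nofollow">Example category</a>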
On-Page Optimization | | AllChargedUp | 2
-
Blocking internal search results
Hello Everyone, Does anyone know how I can block Google from indexing internal search results? Thanks. Ryan
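A common approach is to disallow the search URL pattern in robots.txt, along these lines (a minimal sketch; the /?s= pattern assumes WordPress-style search URLs, so adjust it to your own URL structure):

    # robots.txt - keep internal search results out of the crawl
    User-agent: *
    Disallow: /?s=
    Disallow: /search/

A noindex meta robots tag on the search results template is another option.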
On-Page Optimization | | RyanUK | 0
-
Two Robots.txt files
Hi there, Can somebody please help me? One of my client's sites has two robots.txt files (please see below). One file blocks a few folders, and the other one blocks all search engines completely. Our tech team tells me that due to some technical reasons they use the second one, which is placed inside the server where search engines are unable to see it.
www.example.co.uk/robots.txt - blocks a few folders
www.example.co.uk/Robots.txt - blocks all search engines
I hope someone can give me the help I need on this one. Thanks in advance! Cheers, Satla
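Based on that description, the two files presumably look something like this (the folder names are hypothetical):

    # www.example.co.uk/robots.txt - blocks a few folders
    User-agent: *
    Disallow: /folder-one/
    Disallow: /folder-two/

    # www.example.co.uk/Robots.txt - blocks all search engines
    User-agent: *
    Disallow: /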
On-Page Optimization | | TrulyTravel | 0
-
Description tag not showing in the SERPs because page is blocked by Robots, but the page isn't blocked. Any help?
While checking some SERP results for a few pages of a site this morning, I noticed that some pages were returning this message instead of a description tag: "A description for this result is not available because of this site's robots.txt". The odd thing is that the page isn't blocked in the robots.txt. The page is using the Yoast SEO plugin to populate meta data, though. Has anyone else had this happen, and do you have a fix?
On-Page Optimization | | mac2233 | 0
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
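For illustration, blocking an entire secondary site would mean serving a robots.txt like this from the secondary domain's root (a minimal sketch; note that this blocks crawling, which is not quite the same thing as removing pages from the index):

    # robots.txt on a secondary (duplicate) site
    User-agent: *
    Disallow: /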
On-Page Optimization | | JohnHuynh | 0
-
I have more pages in my sitemap being blocked by the robots file than I have being allowed to be crawled. Is Google going to hate me for this?
I'm using some rules to block all pages which start with "copy-of" on my website, because people have a bad habit of duplicating new product listings to create our refurbished, surplus, etc. listings for those products. To avoid Google seeing these as duplicate pages I've blocked them in the robots file, but of course they are still automatically generated in our sitemap. How bad is this?
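For reference, a blocking rule like the one described would presumably look something like this (assuming the duplicated URLs start with "copy-of" at the site root):

    User-agent: *
    Disallow: /copy-of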
On-Page Optimization | | absoauto | 0
-
Webpage with around 200+ internal links per page
Hi, I have a client that I recently took on, and I ran an SEOmoz crawl on the website. A minor error was that:
1. Web pages have more than 200 internal links. The reason for this is that the client offers a service to every place in the world, and then has links to contact us/about us and those kinds of pages. Is this something we need to strongly avoid?
2. The title tags are all over the suggested length by 1-2 letters. Again, is this really bad?
On-Page Optimization | | Prestige-SEO | 0
-
What's the best practice for implementing a "content disclaimer" that doesn't block search robots?
Our client needs a content disclaimer on their site. This is a simple "If you agree to these rules then click YES; if not, click NO", and clicking NO pushes you back to the home page. I have this gut feeling that this may cause an upset with the search robots. Any advice? R/ John
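One crawler-friendly pattern is to keep the page content in the served HTML and layer the disclaimer on top as a dismissible overlay, since robots don't click buttons and would still see the content underneath. A minimal sketch with hypothetical markup:

    <!-- Overlay shown on load; the page content below it stays in the HTML for crawlers -->
    <div id="disclaimer-overlay">
      <p>If you agree to these rules, click YES. If not, click NO.</p>
      <button onclick="document.getElementById('disclaimer-overlay').style.display='none'">YES</button>
      <button onclick="window.location.href='/'">NO</button>
    </div>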
On-Page Optimization | | TheNorthernOffice79 | 0