DMOZ help
-
So yesterday I got a DMOZ editor account. I would like to know if Google indexes the editor profile pages on DMOZ:
http://www.dmoz.org/public/profile?editor=
Here are some examples:
http://www.dmoz.org/public/profile?editor=thehelper
http://www.dmoz.org/public/profile?editor=raph3988
http://www.dmoz.org/public/profile?editor=skasselea
I would like to know if it is worthwhile to build up this page so it will pass link juice. And can anyone tell me how frequently Google crawls for new editor profiles (if that's even possible to know)?
-
Hello,
I wouldn't bet on it, but there's no harm in trying.
-
You can confirm this yourself.
First, do a Google search for site:http://www.dmoz.org/public/profile?editor=
You'll see the meta descriptions aren't shown in the results. Instead, Google puts a default message, with a link to this page: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449 - check that out. Note the paragraph:
"While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results."
So whilst they may appear in Google's index (and indeed the OSE one) because of the links pointing to them, the content isn't crawled at all (by any spiders that obey robots.txt).
-
Oh yes, he is correct - good call Neil. I had no idea that the robots.txt would be publicly accessible. I've actually never seen a site with their robots.txt visible... I guess it's the "open source"...
-
Can anyone confirm this?
-
Take a look at their robots.txt - http://www.dmoz.org/robots.txt
They disallow the /public and /editors subfolders. The editor pages, whilst indexed by Google, aren't crawled. So whilst the location of the pages themselves is indexed (because of links to those pages), the contents of the pages aren't. That obviously includes any links on those pages too.
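You can sanity-check this behaviour yourself with Python's standard urllib.robotparser - here's a minimal sketch, with the relevant Disallow rules fed in manually (the live robots.txt may change, so treat the rules below as an illustrative assumption, not a copy of the current file):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules matching the disallowed subfolders discussed above;
# fed in directly rather than fetched, since the live file may change.
rules = """\
User-agent: *
Disallow: /public
Disallow: /editors
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A polite spider checks before fetching a URL:
profile = "http://www.dmoz.org/public/profile?editor=thehelper"
print(parser.can_fetch("*", profile))                       # False - blocked, so the content is never crawled
print(parser.can_fetch("*", "http://www.dmoz.org/Arts/"))   # True - ordinary category pages are fine
```

Any crawler that honours robots.txt will get False for the profile URL, which is exactly why the page's content (and its outbound links) never make it into the index.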
For this reason, I don't agree with Reload Media. For me, there's no point expending any effort promoting the page for link equity benefit.
The fact they show good authority on OSE is something of an anomaly. They can accrue authority (and indeed Google PR) from their inbound links; however, they are a bit of a dead end, because no actual content is indexed.
-
Hi Raphael,
Well done on getting an editor account. Remember, with great power comes great responsibility.
Yes, they do get indexed. The way to check this is to Google the URL in quotes, i.e. "http://www.opensiteexplorer.org/links?site=www.dmoz.org%2Fpublic%2Fprofile%3Feditor%3Dthehelper"
Some of those editor pages have great Authority. http://www.opensiteexplorer.org/links?site=www.dmoz.org%2Fpublic%2Fprofile%3Feditor%3Dthehelper
If it's related to your niche, then would be worth pursuing.
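As an aside, the %2F/%3F sequences in that OSE link are just the dmoz profile URL percent-encoded so it can ride inside the ?site= query parameter. A quick sketch with Python's standard library shows the round trip, using the URLs from this thread:

```python
from urllib.parse import quote, unquote

dmoz_url = "www.dmoz.org/public/profile?editor=thehelper"

# Percent-encode everything (safe="" also encodes the slashes)
# so the whole dmoz URL fits inside OSE's ?site= parameter.
encoded = quote(dmoz_url, safe="")
ose_url = "http://www.opensiteexplorer.org/links?site=" + encoded
print(ose_url)

# Decoding recovers the original URL exactly.
assert unquote(encoded) == dmoz_url
```

Handy if you want to build these lookup links for a batch of editor profiles rather than encoding them by hand.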
Hope that helps
Iain - Reload Media
-
Using http://pro.seomoz.org/tools/on-page-keyword-optimization you can check individual pages. In the keyword field I put "thehelper" and in the URL field I put http://www.dmoz.org/public/profile?editor=thehelper... So it seems like it does get indexed :)