To answer your question with at least a hint of factual background, I propose to start by looking at the timeline of the creation of man, info and other documentation systems.
The first man page was written in 1971 using troff (nroff was not around yet), at a time when working on a CRT-based terminal was not common and printing manual pages was the norm. Man pages use a simple linear structure and normally give a quick overview of a command, including its command-line options/switches.
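To make that "simple linear structure" concrete, here is a minimal man page source in the classic man(7) troff macros; the hello command and its -n option are made up for illustration:

```troff
.\" Minimal man page sketch for a hypothetical "hello" command.
.\" Render with e.g.: nroff -man hello.1 | less
.TH HELLO 1 "January 1971" "Examples" "User Commands"
.SH NAME
hello \- print a friendly greeting
.SH SYNOPSIS
.B hello
.RB [ \-n ]
.I name
.SH DESCRIPTION
Prints a greeting for
.I name
to standard output.
.TP
.B \-n
Do not print the trailing newline.
```

The fixed sequence of sections (NAME, SYNOPSIS, DESCRIPTION, ...) is exactly the quick linear overview described above.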
The info command actually processes output generated from Texinfo typesetting syntax. Texinfo had its initial release in February 1986, a time when working on a text-based CRT was the norm for Unix users, while graphical workstations were still exclusive. The .info output from Texinfo provides basic navigation of text documents and, from the outset, had a different goal: providing complete documentation (for the GNU Project). Things like the use of the command and its command-line switches are only a small part of what a Texinfo file for a program contains.
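For contrast, here is a Texinfo source sketch for the same hypothetical hello program; the node/menu structure is what gives the generated .info file its navigation (running makeinfo hello.texi would produce hello.info):

```texinfo
\input texinfo
@c Hypothetical Texinfo source, for illustration only.
@setfilename hello.info
@settitle Hello Manual

@node Top
@top Hello

This manual documents the hypothetical @command{hello} program.

@menu
* Invoking hello::  Command-line options.
@end menu

@node Invoking hello
@chapter Invoking hello

@command{hello} prints a greeting; in a complete manual, the
invocation details would be only one small chapter among many.

@bye
```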
Although there is overlap, the (Tex)info system was designed to complement the man pages, not to replace them.
HTML and web browsers came into existence in the early 90s and relatively quickly replaced text-based information systems such as WAIS and Gopher.
Web browsers utilised the by then available graphical systems, which allow for more information (like underlined text for a hyperlink) than text-only systems do. As the functionality info provides can be emulated in HTML and a web browser (possibly after conversion), browser-based systems allow for greater ease of navigation (or at least require less experience/learning).
HTML was expanded and could do more things than Texinfo can. So for new projects (other than GNU software) a whole range of documentation systems has evolved (and is still evolving), most of them generating HTML pages. A recent trend for these is to make their input (i.e. what the human documenter has to provide) human-readable, whereas Texinfo (and troff) is geared more towards efficient processing by the programs that transform it.¹
info was not intended to be a replacement for the man pages, but it might have replaced them if the GNU software had included an info2man-like program to generate the man pages from a (subset of a larger) Texinfo file.
Combine that with the fact that fully utilising the facilities that systems like Texinfo, (La)TeX, troff, HTML (+CSS) and reStructuredText provide takes time to learn, and that some of those are arguably easier to learn and/or more powerful, and there is little chance of market dominance for (Tex)info.
¹ E.g. reStructuredText, which can also be used to write man pages.
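As a sketch of that: docutils ships an rst2man converter, and a reStructuredText source along the following lines (the hello command is hypothetical) can be turned into a man page with something like rst2man hello.rst hello.1:

```rst
.. Hypothetical reStructuredText man page source, for illustration.

=====
hello
=====

-------------------------
print a friendly greeting
-------------------------

:Manual section: 1

SYNOPSIS
========

``hello [-n] NAME``

DESCRIPTION
===========

Prints a greeting for NAME to standard output.
```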
There's a list on Wikipedia, which includes the following: khelpcenter relies on info2html, which could also be used to enable reading info files with any browser. However, the converted pages lack tons of useful features, like search and access to the index; even if, like me, you find the info implementation of those features lacking, they are still better than nothing.