Humanist Discussion Group, Vol. 14, No. 502.
Centre for Computing in the Humanities, King's College London
<http://www.princeton.edu/~mccarty/humanist/>
<http://www.kcl.ac.uk/humanities/cch/humanist/>
[1] From: Randall Pierce <rpierce@jsucc.jsu.edu> (13)
Subject: information
[2] From: "Osher Doctorow" <osher@ix.netcom.com> (36)
Subject: Re: 14.0498 amounts of information
--[1]------------------------------------------------------------------
Date: Mon, 20 Nov 2000 07:15:25 +0000
From: Randall Pierce <rpierce@jsucc.jsu.edu>
Subject: information
While in graduate school I studied Interpersonal Classroom Interaction.
It was a method to increase student participation in the learning
process. Willard McCarty's comments are very apropos to this technique.
Through gestures and pertinent questions, the teacher was to generate
information exchange and development. This type of Socratic Method
encouraged the student to take a fact and expand it into new areas of
inquiry. This could be quantified and measured with appropriate
descriptive symbols. How would you determine how many information "bits"
an exchange generated? Often the original concept was soon almost lost
in "cognitive connections" to other fields. A datum in history might
give rise to its political and sociological implications. Are these to be
counted as one "byte", or several? I think this concept gives a
different view of the quantification of units of information. Randall
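In Shannon's engineering sense a "bit" measures reduction of uncertainty
rather than a count of facts: an answer drawn from N equally likely
alternatives carries log2(N) bits. A minimal sketch of that calculation in
Python, with made-up answer probabilities standing in for a classroom
exchange:

    import math

    # Shannon entropy in bits: H = -sum(p * log2(p)).
    # The probability lists below are purely illustrative,
    # not measurements of any real exchange.
    def entropy_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A question with four equally likely answers "generates" 2 bits...
    print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
    # ...while a nearly foregone conclusion generates far less.
    print(entropy_bits([0.9, 0.05, 0.05]))         # ~0.569

On this measure a datum branching into several fields would count not in
"bytes" but by how much each new connection reduces the hearer's
uncertainty.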
--[2]------------------------------------------------------------------
Date: Mon, 20 Nov 2000 07:16:15 +0000
From: "Osher Doctorow" <osher@ix.netcom.com>
Subject: Re: 14.0498 amounts of information
WM is correct in asking for clarification on information. Knowledge as used
in humanities and science has almost nothing to do with information/entropy
as used in engineering and computers. In fact, even information and entropy
are so filled with questions that only a (humanist) philosopher could begin
to examine their logical ontology. It would probably require
demystification of the whole field of information/entropy, which
philosophers are generally afraid of and engineers generally protect as
their private domain, somewhat as politicians do in politics. In
logic-based probability (LBP) we study knowledge, or, to include everything
in the field, knowledge-information-entropy (KIE). An immediate
question for philosopher/humanists is: what can be done about the
contradictory/paradoxical behavior of engineering information near rare
events? You did not know about that? Neither does most of the world.
Engineers use logarithmic information and entropy, which blow up at and
near probability-zero (very rare) events (the logarithm there tends to
negative infinity). KIE resolves the problem in those regions by replacing
logarithmic information with its geometric reflection (technically, about
the main diagonal y = x), called exponential KIE or, more precisely,
negative exponential KIE. This KIE does not behave paradoxically at or
near very rare, rare, or even probability-zero events.
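The divergence, and the proposed repair, can be seen numerically: the
logarithmic self-information -ln(p) grows without bound as p approaches 0,
while its reflection about y = x, the negative exponential exp(-p), stays
bounded between exp(-1) and 1 on [0, 1]. A minimal sketch in Python;
reading the "negative exponential KIE" as exp(-p) is an assumption drawn
from the reflection description above:

    import math

    # Logarithmic self-information -ln(p): diverges as p -> 0.
    def log_information(p):
        return -math.log(p)

    # Reflection of -ln(p) about the diagonal y = x is exp(-p):
    # bounded on [0, 1], equal to 1 at p = 0. (An assumed reading
    # of the "negative exponential KIE" described above.)
    def neg_exp_kie(p):
        return math.exp(-p)

    for p in (0.5, 1e-3, 1e-6, 1e-12):
        print(f"p={p:.0e}  -ln(p)={log_information(p):8.3f}  "
              f"exp(-p)={neg_exp_kie(p):.6f}")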
You don't need any more technical information to start exploring KIE, but
if you want it, look at
abstracts of my 48 papers at http://www.logic.univie.ac.at, Institute for
Logic of the University of Vienna (select ABSTRACTS, then select BY AUTHOR,
then select my name), or read my recent paper just published in Quantum
Gravity, Generalized Theory of Gravitation, and Superstring Theory-Based
Unification, Eds. B. N. Kursunoglu, S. L. Mintz, and A. Perlmutter, Kluwer
Academic/Plenum: New York, 2000. By the way, latent variable theory in
psychological/educational measurement and research, and
validity/reliability theory, have some concepts quantitatively similar to
knowledge.
Osher Doctorow