Is Information Some Kind of Data?
We discuss here problems and ideas presented in the following papers: Floridi, “Is
Information Meaningful Data?” (2002), Menant, “Information and Meaning” (2002), and
Roederer, “On the Concept of Information and its Role in Nature” (2002). The aim is to
separate and determine such concepts as genuine information, false information or
misinformation, disinformation, and pseudoinformation.
In the interesting paper “Is Information Meaningful Data?” (2002), Luciano Floridi
discusses the problem of whether information is any meaningful data or whether meaningful
data must have additional properties to be information.
This discussion has three aspects that need clarification. The first is methodological,
dealing with the scientific basis of the discussion. The second is ontological, related
to the essence and nature of information. The third is information-theoretical, aimed
at elucidating the scope of this theory.
The first issue demands an explication of the grounds for this discussion of the meaning of
the term “information.” Many assume that there is something called information and that
we have to disclose what it is. This is the common-sense approach, which puts names
ahead of things, attributing real-life phenomena to words.
In contrast to this, the scientific approach makes words correspond to real-life phenomena.
According to it, the question is not what information is, but what it is adequate to call by the
name “information.” We begin with a cluster of situations to which the word “information”
is usually related. The task of science is to explain these situations, forming, if
necessary, a concept with the name “information.” So, the problem is not to explain how
people use this word, but to form a reasonable scientific content for it.
For example, people divided all objects that give light and are observed in the sky into
three categories: the Sun, the Moon, and stars. These categories were built on the
observed size of these celestial bodies and the intensity of their light. Later, when
astronomers learned more about the movements of these bodies, they discovered that some of
the objects that were called stars have a different nature, and they introduced a new category
with a new name, “planet.” Much later, physicists found that the Sun is also a star.
However, if we only discussed how people use the words “Sun” and “stars,” we would never
be able to come to this conclusion.
In a similar way, many experts, including Floridi, now call some kind of enhanced data
(meaningful, meaningful and true, organized, etc.) by the name “information.” They believe
that information does not depend on interaction. However, information is an objective
phenomenon, for which it is more productive to investigate how it exists and functions
than to deliberate on the meaning that different people assign to the word. Comprehensive
analysis of the phenomenon has made it possible to discover that information has a very
different nature from data and knowledge. In contrast to the widespread opinion, the general
theory of information (Burgin, 2002) suggests that if we compare data with substance, then
information relates to data (and knowledge) in the world of structures as energy relates to
substance in the physical world. This correlates with the opinion of Warner (1996) that
data need to be manipulated to give information. In this context, to speak about
representation of information, as Roederer does, looks similar to discussing representation
of energy. Like energy, information may be produced, stored, extracted,
transformed/processed, sent, and received, but not represented.
It is interesting that while Floridi assumes that meaning is a necessary attribute of
information, Menant (2002) writes that “it is quite natural that information and meaning
are different things.” From this, Menant derives that there is meaningful and meaningless
information. At the same time, the principles of information theory given in Floridi’s
paper imply that the concept of “meaningless information” is as meaningless as the concept
of “false information.” The general theory of information supports the idea of the existence
of meaningless information and makes it possible to develop this concept further, relating to
it quantitative estimations of meaning. The basis of such estimation is the semantic measure
of information. Shannon’s quantity of information gives an example of such a measure.
Many examples show that empirically it is more relevant to treat only information for a
system, i.e., the relative form of information. Objective information, which is independent
of any receptor, is only a theoretical construct, modeled by operators in system state
spaces (Burgin, 1994; 1995).
At the same time, if we assume, following Floridi, that some kind of meaningful data is
information, then we have to admit that he is right in writing that the principle of genetic
neutrality (GN) “supports the possibility of information without an informed subject, to
adapt a Popperian phrase. Meaning is not (at least, not only) in the mind of the user.”
However, if data only contain information as a distinct essence, then it is also possible
to have information without “an informed subject.” The acceptor of information may be, for
example, an artificial system such as a computer. At the same time, according to the general
theory of information, to speak about “objective” information without an acceptor is
similar to discussing “objective” molecules without matter.
Floridi (2002) discusses the possibility of the existence of false information. To do this in
an adequate scientific manner, it is necessary to take into account three important issues: the
historical context, the multifaceted approach to reality, and the personal context.
First, the dichotomic approach, which is based on classical two-valued logic and rigidly
divides any set into two parts, gives only a very approximate image of reality. A much better
approximation is achieved through the multifaceted approach based on multivalued logics,
fuzzy reasoning, and linguistic variables.
Second, the problem has to be treated in a historical or temporal context, i.e., we must
consider time as an essential parameter.
Third, there is a personal issue in this problem.
Let us consider the second point first. Discussing whether false information is a kind
of information, Floridi gives vivid examples based on such terms as “false constable” and
“forged banknote.” The fallacy of this argumentation is that these and other terms
(constable, banknote, signature, etc.), which are used to illuminate the situation with the
term “information,” are names of artificial objects. These names were defined as the
objects they designate were created and began functioning in society. These objects, together
with their names, depend on social agreement.
In contrast to this, information is a natural phenomenon, which exists both in nature and
in society. People do not build or determine information as a general phenomenon. They only
generate, produce, discover, store, transform, process, collect, send, and receive multiple
instances of information. People have to discover what information is, as they discovered
before what atoms and molecules are.
Returning to the first point of the discussion about false information, we can see that
in such a cognitive process the dichotomic approach, which separates all objects into two
groups, A and not-A, is not efficient. Thus, if we take the term “false information,” then,
given a statement, it is not always possible to tell whether it contains genuine or false
information. To show this, let us consider the following statements:
1. “Pi is equal to 3.”
2. “Pi is equal to 3.1.”
3. “Pi is equal to 3.14.”
4. “Pi is equal to 3.1415926535.”
5. “Pi is equal to (4/3)^4.”
According to the definition of pi and our contemporary knowledge that it is a
transcendental number, all these statements contain false information. In practice, they are
all true, but with different exactness. For example, (4) is truer than (1). Nevertheless, in the
ancient Orient the value of pi was frequently taken as 3, and people were satisfied with this
value (Eves, 1983). Archimedes found that pi is approximately equal to 3.14. For centuries,
students and engineers used 3.14 as the value of pi and had good practical results. Now
calculators and computers allow us to operate with much better approximations of pi, but
nobody can give the exact decimal value of this number.
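The graded truth of these five statements can be made concrete by computing how far each approximation is from pi. The following sketch assumes that statement 5 denotes the ancient value (4/3)^4 = 256/81; the labels are ours, not the author's:

```python
import math

# Approximations of pi from the five statements in the text.
# Statement 5 is taken to be (4/3)**4 = 256/81, the well-known
# ancient approximation (an assumption about the garbled original).
approximations = {
    "statement 1": 3.0,
    "statement 2": 3.1,
    "statement 3": 3.14,
    "statement 4": 3.1415926535,
    "statement 5": (4 / 3) ** 4,
}

# Each statement is "false" in the strict sense, yet each has a
# definite degree of exactness, measured by its absolute error.
for name, value in approximations.items():
    error = abs(math.pi - value)
    print(f"{name}: value = {value:.10f}, absolute error = {error:.2e}")
```

Running this shows, for instance, that statement (4) is closer to pi than statement (1) by about nine orders of magnitude, which is exactly the sense in which one statement is "truer" than another.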
Another situation from the history of science helps us better understand the problem of
false information. The famous Greek philosophers Leucippus (fl. 445 B.C.) and
Democritus (460-360 B.C.) suggested that all material bodies consist of small particles,
which were called atoms. “In reality,” said Democritus, “there are only atoms and the void.”
We can ask whether this idea contains genuine or false information. From
the point of view of those scientists who lived after Democritus but before the fifteenth
century, it contained false information. This was grounded in the fact that those scientists
were not able to go sufficiently deep into matter to find atoms.
However, the development of scientific instruments and experimental methods made it
possible to discover the micro-particles that were and are called atoms. Consequently,
it is now a fact accepted by everybody that all material bodies consist of atoms. As a result,
people now assume that the idea of Leucippus and Democritus contains genuine information.
This shows how people’s comprehension of what is genuine information and what is
false information changes with time. Interesting examples of such situations in the history
of mathematics are given by Lakatos (1976) and Kline (1980).
All these examples give evidence of the necessity to treat false information as we treat
negative numbers, and not to discard pseudoinformation, just as we do not reject the utility
of such a number as 0. The history of mathematics demonstrates that understanding that 0 is
a number, and a very important one, demanded a great deal of hard intellectual effort from
European mathematicians when they received knowledge about 0 from India through Arab
mathematicians.
Turning to the third point of the discussion about false information, related to the
personal issue, let us consider other examples from the history of science, since we are
studying information by scientific methods.
In his lectures on optics, Newton developed a corpuscular theory of light. According to
this theory, light consists of small moving particles. At approximately the same time,
Huygens and Hooke built a wave theory of light. According to their theory, light is a wave
phenomenon. Thus, we can ask which of them gave genuine information and which gave
false information. For a long time, both theories competed. The answer to our question thus
depended on whether the respondent was an adherent of Newton’s theory or of the theory of
Huygens and Hooke. However, for the majority of people who lived at that time, both
theories provided pseudoinformation, because these people were not acquainted with them.
A modern physicist would say that both theories contain genuine information. So, the
distinction between genuine and false information in some body of knowledge depends on
the person who evaluates this knowledge.
Thus, we have seen that the problem of false information is an important part of the
whole problem of information, and we need more developed scientific methods to treat
these problems in an adequate manner.
The general theory of information makes it possible to explicate the essence and meaning
of genuine information, false information or misinformation, disinformation, and
pseudoinformation in a consistent network of concepts that represent specific phenomena
in the real world.
The basis of this theory is a system of principles. Here we need only three of them. A
reader can find the other principles of the general theory of information in (Burgin, 1994;
1995; 2001; 2002; 2002a).
Ontological Principle O1. It is necessary to separate information in general from
information (or a portion of information) for a system R. In other words, empirically, it is
possible to speak only about information (or a portion of information) for a system.
Why is this so important? The reason is that all conventional theories of information
assume that information exists as something absolute, like time in Newtonian
dynamics. Consequently, this absolute information may be measured, used, and
transmitted. In some abstract sense this is true, but in practice, or as scientists say,
empirically, it is not so.
Definition 1. The system R is called the receiver of the information I.
Ontological Principle O1 correlates with the idea of Roederer (2002) that interaction
plays a very important role for information. In other words, there is no information without
interaction of the carrier of information with the receiver of information. However, it is
possible to speak of information not only when we have both a sender and a recipient,
because the recipient can extract information from a carrier even when the carrier does not
send it. Besides, even if information gives some image of a pattern from a sender, this
correspondence is not necessarily one-to-one.
Definition 2. A subsystem IF(R) of the system R is called an infological system of R if
IF(R) contains infological elements.
Infological elements are different kinds of structures (Burgin, 1997). Let us take as a
standard example of infological elements knowledge, data, images, ideas, fancies,
abstractions, beliefs, etc. If we consider only knowledge and data, then the infological
system is the system of knowledge, which is called in cybernetics a thesaurus.
When R is a material system, its infological subsystem IF(R) consists of three
components: a material component, which is a system of physical objects; a functional
structure realized by the material component; and the system of infological elements. For
example, the material component of the infological subsystem of a human being is her/his
brain. The corresponding functional structure is her/his mind. The infological elements in
this case are the knowledge of the individual. Another example of an infological system is
the memory of a computer. Such a memory is a place in which data and programs are stored.
Ontological Principle O2a. Information in the strict sense or, simply, information for
a system R, is everything that changes the infological system IF(R) of the system R.
This implies that for a complex system there are different kinds of information. Each
type of infological system determines a specific kind of information. For example,
information that causes changes in the system of knowledge is called cognitive
information. All papers from the Conference, including (Floridi, 2002; Menant, 2002; and
Roederer, 2002), consider only cognitive information. In particular, Floridi (2002) restricts
his study to declarative semantic information.
Any system can have an infological subsystem. Consequently, in contrast to the opinion
of Roederer (2002), information is important both for the biotic and the abiotic worlds.
Information enters the non-living physical world even without living beings.
At the same time, Roederer defines information as the agent that mediates the
correspondence between features or patterns in the source system A and changes in the
structure of the recipient B. This definition strongly correlates with the definition from the
Ontological Principle O2a. Taking such infological system as genetic memory, we come to
the concept of biomolecular information considered by Roederer (2002).
Menant (2002) bases his approach to meaning on information as the basic phenomenon.
Different kinds of information and infological systems induce the existence of different
types of meaning in the sense of Menant. It is possible to consider meaning for a given
infological system. A mathematical theorem has a different meaning for a student, a
professional mathematician, and an engineer.
According to the ontological principles, information causes changes. Consequently, it is
natural to assume that the measure of information is determined by the results caused by
the reception of the corresponding portion of information. This is formulated in the first
axiological principle.
Axiological Principle A1. A measure of information I for a system R is some measure
of changes caused by I in R (for information in the strict sense, in the infological system
IF(R)).
This principle implies that a unique measure of information exists only for an
oversimplified system. Any complex system R with a developed infological subsystem
IF(R) has many parameters that may be changed. So, such systems demand many different
measures of information in order to reflect the full variety of these systems’ properties, as
well as of the conditions in which these systems function. Uncertainty elimination (which is
measured by Shannon’s quantity of information) is only one of the possible changes that are
useful to measure.
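As an illustration of one such measure, uncertainty elimination can be quantified as the reduction in Shannon entropy of the receiver's state of knowledge. The following is a minimal sketch; the two probability distributions are invented for illustration and are not from the source:

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# The receiver's uncertainty about a question before and after
# receiving a portion of information I (hypothetical distributions).
before = [0.25, 0.25, 0.25, 0.25]  # four equally likely answers
after = [0.5, 0.5, 0.0, 0.0]       # two answers have been ruled out

# One possible measure m(I): the amount of uncertainty eliminated
# in the receiver's infological system.
m_I = entropy(before) - entropy(after)
print(f"uncertainty eliminated: {m_I} bits")  # → uncertainty eliminated: 1.0 bits
```

This measure captures only one kind of change in IF(R); as the principle states, a complex system warrants many other measures alongside it.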
Now we can give exact definitions of genuine information, false information or
misinformation, disinformation, and pseudoinformation.
Let us consider such an infological system as a thesaurus, or knowledge system, T, and fix
some measure m of information that reflects changes of T.
Definition 3. A portion of information I is called genuine information with respect to
the measure m if m(I) > 0.
To make genuine information match its usual understanding, we take as m such a measure
as the correctness of knowledge or the validity of knowledge.
Definition 4. A portion of information I is called false information or misinformation
with respect to the measure m if m(I) < 0.
Thus, we have misinformation when its acceptance makes our knowledge less correct.
It is interesting that there is no direct correlation between false information and
meaningless information. Bloch, in his book “Apology of History” (1961), gives examples
in which false information was meaningful for people, while genuine information was
meaningless for them.
Definition 5. A portion of information I is called pseudoinformation with respect to
the measure m if m(I) = 0.
Definition 6. A portion of information I is called disinformation with respect to the
measure m if it is misinformation with respect to the measure m.
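Definitions 3-5 can be summarized as a classification by the sign of the measure m(I). A short sketch, where the function name and the sample measure values are ours, purely for illustration:

```python
def classify(m_of_I: float) -> str:
    """Classify a portion of information I by the sign of its
    measure m(I), following Definitions 3-5."""
    if m_of_I > 0:
        return "genuine information"
    if m_of_I < 0:
        return "false information (misinformation)"
    return "pseudoinformation"

# Hypothetical measure values for three portions of information,
# e.g., with m taken as the change in correctness of knowledge.
print(classify(0.7))   # → genuine information
print(classify(-0.3))  # → false information (misinformation)
print(classify(0.0))   # → pseudoinformation
```

Note that the classification is always relative to the chosen measure m: the same portion of information may be genuine with respect to one measure and pseudoinformation with respect to another.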
The principles that Floridi introduces in his paper might be useful for the investigation of
different kinds of information. However, for better understanding, it would be useful to
explain the non-standard symbols used by Floridi in the description of the principles, as
well as to give an informal interpretation and examples of those principles.
To conclude, it is necessary to remark that Floridi and other modern researchers
consider only cognitive information. At the same time, the general theory of information
introduces other types of information (Burgin, 2001).
1. Bloch, M. (1961) Apologie pour l’Histoire ou Métier d’Historien, Paris
2. Burgin, M.S. (1994) Evaluation of Scientific Activity in the Dynamic Theory of
Information, Science and Science of Science, No. 1, pp. 124-131
3. Burgin, M.S. (1995) The Algorithmic Approach in the Dynamic Theory of
Information, Notices of the Russian Academy of Sciences, v.342, No. 1, pp. 7-10
4. Burgin, M.S. (1996) Information as a Natural and Technological Phenomenon,
Informatization and New Technologies, No. 1, pp. 2-5 (in Russian)
5. Burgin, M.S. (1997) Fundamental Structures of Knowledge and Information, Academy
for Information Sciences, Kiev (in Russian)
6. Burgin, M. (2001) Information in the Context of Education, The Journal of
Interdisciplinary Studies, v. 14, Fall 2001, pp. 155-166
7. Burgin, M. (2002) The Essence of Information: Paradoxes, Contradictions, and
Solutions, Electronic Conference on Foundations of Information Science (FIS 2002)
8. Burgin, M. (2002a) The Essence of Information: A Multifaceted Image of Information,
Electronic Conference on Foundations of Information Science (FIS 2002)
9. Eves, H. (1983) An Introduction to the History of Mathematics, Saunders College Publishing
10. Floridi, L. (2002) Is Information Meaningful Data? Electronic Conference on
Foundations of Information Science (FIS 2002)
11. Kline, M. (1980) Mathematics: The Loss of Certainty, New York, Oxford University Press
12. Lakatos, I. (1976) Proofs and Refutations, Cambridge, Cambridge University Press
13. Menant, C. (2002) Information and Meaning, Electronic Conference on Foundations of
Information Science (FIS 2002)
14. Roederer, J.G. (2002) On the Concept of Information and its Role in Nature, Electronic
Conference on Foundations of Information Science (FIS 2002)
15. Warner, T. (1996) Communication Skills for Information Systems, London, Pitman
Department of Mathematics
University of California, Los Angeles