Semantics of Information

Key Points from a Conceptual Labyrinth

“Information is a conceptual labyrinth”: so begins “Semantic Conceptions of Information”[1] in the online Stanford Encyclopedia of Philosophy.  Reference should be made to this excellent background material, which provides foundations for the conceptual pillar of Information as it is used throughout this website.

Here we highlight some key points from this encyclopedia entry that are particularly relevant to the idea that innovation makes information valuable.

  1. The General Definition of Information (GDI) specifies that information consists of data that is well formed, according to an appropriate syntax, and that acquires meaning through the semantic interpretation of this well-formed data.

This general definition provides the necessary flexibility for what constitutes information.  The originating data can be anything that stands out from an entirely uniform background.  Data may be written in books, but even a blank page is data if it stands out from other pages that are written.  Data may appear in the form of pictures, sounds or signals – anything that might be perceived, though it does not need to be perceived to be data.  Each of these forms of data has a specific syntax: broadly, the rules that enable the data to be understood as information.  That understanding must also be meaningful according to the semantics of the chosen system.

Essentially, information is a broad and abstract concept and should not be constrained to take a particular form: linguistic, pictorial or otherwise.

  2. Environmental information: Two systems a and b are coupled in such a way that a’s being (of type, or in state) F is correlated to b’s being (of type, or in state) G, so that a carries for the information agent the information that b is G.

Exemplified by the growth rings that signify the age of a tree or the colour of litmus paper indicating acidity (pH) of a fluid, environmental information can have a meaning that is independent of any agent able to interpret that information.

Throughout this website, a sequence of discrete events or outcomes is generically the source of environmental information (a) in different systems.  These events may be interpreted directly.  Otherwise, an abstract and virtual simulation of the same process (b) can be correlated to (a), using a least squares method for example, and this virtual environment can then provide a semantic insight into the system that is not available through direct access to the system’s environmental information – much as litmus indicates acidity or alkalinity.
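As an illustration of correlating a virtual simulation (b) to a sequence of observed outcomes (a) by least squares, the following sketch fits a simple linear model to tree growth-ring data.  The data values, the choice of a linear model and all variable names are assumptions made purely for illustration; they are not part of the original text.

```python
import numpy as np

# Hypothetical environmental data (a): yearly growth-ring widths of a tree,
# in millimetres -- made-up values for illustration only.
observed = np.array([2.1, 2.4, 2.0, 2.6, 2.9, 2.7, 3.1, 3.0])
years = np.arange(len(observed))

# Virtual simulation (b): a simple linear growth model, width = m*year + c.
# Fitting m and c by least squares correlates the simulation to the data.
m, c = np.polyfit(years, observed, deg=1)
simulated = m * years + c

# The fitted model now carries a semantic insight about the system -- its
# average growth per year -- that is not stated directly in the raw widths.
print(f"estimated growth per year: {m:.3f} mm")
print(f"correlation of simulation with data: {np.corrcoef(simulated, observed)[0, 1]:.3f}")
```

The least-squares fit here stands in for whatever correlation method suits the system at hand; the point is only that the simulation, once correlated, can be interrogated in place of the raw environmental data.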

Correlation is not in itself an indicator of value.  Simulations can have high parametric flexibility to replicate complex behaviours – artificial neural networks are one example.  Value should be related both to the quantity of information in a message and to its veracity.  This takes us to a third relevant issue in information theory.

  3. The Mathematical Theory of Communication (MTC) developed by Shannon and Weaver[2] provides a means to quantify the amount of information in a message in terms of the probability of an outcome: the lower the probability, or the higher the randomness, the greater the amount of information the outcome carries.

This rather counter-intuitive theory was explained by its founder, Claude E. Shannon: the more surprising an outcome is, the more informative it is.  MTC deals only with the quantity of information that can be communicated through physical channels of specified capacity.

Nevertheless, the concept can be extended to apply to semantic information through an inverse relation principle (IRP), whereby as the probability of a proposition, event or outcome decreases, the amount of semantic information it carries increases.
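The quantitative core of this idea can be made concrete with Shannon’s measure of self-information, or surprisal: an outcome of probability p carries −log₂ p bits.  A minimal sketch:

```python
import math

def surprisal(p: float) -> float:
    """Shannon self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

# The inverse relation: as probability falls, information rises.
for p in (0.9, 0.5, 0.1, 0.01):
    print(f"p = {p:>4}: {surprisal(p):.2f} bits")
```

A certain outcome (p = 1) carries zero bits; halving the probability of an outcome adds exactly one bit to its information content.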

The IRP leads to the unfortunate Bar-Hillel-Carnap paradox: a statement that is a contradiction, and therefore impossible, conveys the maximum amount of information.

As far as innovation is concerned, this is an issue.  “The goods will arrive tomorrow and the goods will not arrive tomorrow” hardly conveys information that is useful.  Such an outcome can be avoided by insisting that the original statement or proposition be true.  Semantic content that is not true is then not information but misinformation, and contradictions are excluded.
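The paradox can be seen numerically: a contradiction has probability zero, and the inverse relation then assigns it unbounded information.  A small sketch, in which treating log 0 as infinite follows the standard limiting convention:

```python
import math

# A contradiction is impossible, so its probability is zero.
p_contradiction = 0.0

# The inverse relation assigns it -log2(0) bits: log2(0) is undefined,
# and in the limit the information content diverges to infinity --
# the Bar-Hillel-Carnap paradox.
try:
    info = -math.log2(p_contradiction)
except ValueError:
    info = math.inf
print(info)
```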

The above points can be demonstrated with the following example of three weather predictions:

  1. It will rain in Manchester this year
  2. It will rain in Manchester this week
  3. It will rain in Manchester at 2pm today

This example appears in the works of Sir Karl Popper, who is cited as the first philosopher to have advocated the IRP explicitly.  It is part of Popper’s rationale for scientific discovery[3], through which the most audacious and improbable outcomes, used to challenge the veracity of a scientific theory, are able to provide the strongest support for that theory if verified.  Conversely, the confirmation of a quite obvious outcome, or even worse a tautology, provides little that advances scientific discovery.

For Popper, the truth of a scientific theory is always contingent on the next series of experiments designed to test whether the theory is flawed.  Negative findings are important, as a single negative instance is sufficient to sink a theory that has been supported by many other positive observations.  Discounted theories are then consigned to the archives of scientific history.

Conversely, the provisional endorsement of a scientific theory should be awarded by a peer group of experts in the field after their concerted efforts to test it to destruction have been to no avail.  Popper’s rationale for scientific discovery has provided strong foundations for an evolutionary approach to the creation of objective knowledge.  Through its evolutionary nature, this approach can also be adapted to deal with innovation, in which provisional truth is replaced by the softer concept of an appreciation of the value of the associated information[4].

As the most surprising information can be considered the most informative, it is potentially also the most valuable when describing an innovative concept.  The third and least likely of the three predictions above would be the most valuable to a resident of Manchester planning to go shopping.  Communicating what is highly probable, as in prediction 1, is practically worthless.
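Under assumed, illustrative probabilities for the three predictions (no actual values are given in the text), the inverse relation ranks their information content as follows:

```python
import math

# Assumed, illustrative probabilities -- not given in the text -- for the
# three predictions about rain in Manchester.
predictions = [
    ("it will rain this year",    0.999),
    ("it will rain this week",    0.80),
    ("it will rain at 2pm today", 0.05),
]

# Self-information in bits: the less likely the prediction, the more
# informative -- and potentially the more valuable -- it is.
bits = {claim: -math.log2(p) for claim, p in predictions}
for claim, b in bits.items():
    print(f"{claim:<28} {b:.3f} bits")
```

Under these assumptions the near-certain yearly prediction carries a tiny fraction of a bit, while the 2pm prediction carries several bits, matching the intuition that it is the most useful to the shopper.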



[1] Floridi, Luciano, “Semantic Conceptions of Information”, The Stanford Encyclopedia of Philosophy (Spring 2015 Edition), Edward N. Zalta (ed.).

[2] Shannon, C. E. and Weaver, W., 1949, The Mathematical Theory of Communication, Urbana: University of Illinois Press; reprinted 1998 with a foreword by Richard E. Blahut and Bruce Hajek.

[3] Popper, K. R., 1934, The Logic of Scientific Discovery (as Logik der Forschung; English translation, Routledge, 1959).

[4] We have considered the application of Popper’s rationale for scientific discovery to the translation of science into innovations in: When Science Meets Innovation: a new model of research translation and A Popperian Analysis of Translation in Biological Science.



