Friday, March 21, 2008

The Basic Elements of Nature: Matter, Energy, and Information



I found this article interesting and I need to reread it because I was only able to skim through most of it. What is the universe made up of? Some say matter and energy. Others mention "intelligence" in addition to matter and energy. This article analyses this very subject and points to information as one of the fundamental aspects of "it all." Perhaps information is directly linked to the term "intelligence." Anyhow, it's a worthwhile read, although I don't expect most people will get through or "get" most of it in one reading.


The Basic Elements of Nature: Matter, Energy, and Information

‎“Evidently nature can no longer be seen as matter and energy alone. Nor can all her ‎secrets be unlocked with the keys of chemistry and physics, brilliantly successful as ‎these two branches of science have been in our century.‎

‎“A third component is needed for any explanation of the world that claims to be ‎complete. To the powerful theories of chemistry and physics must be added a late ‎arrival: a theory of information.‎

‎“Nature must be interpreted as matter, energy, and information.” – Jeremy C Campbell. ‎Journalist and author of Grammatical Man: Information, Entropy, Language and Life, ‎Harmondsworth, Middlesex, UK: Penguin Books, 1984:16 (Reprint).‎

The Universe = Matter + Energy

Nature = Matter + Energy + Information

Many previous civilisations, traditions, and philosophies ventured to define the ‎basic building blocks of the universe. Most used a set of archetypal, classical ‎elements to explain patterns in nature. The Greek version of these ideas ‎dates from pre-Socratic times and persisted throughout the Middle Ages and ‎into the Renaissance, deeply influencing European thought and culture; but ‎the concept is far older in the Far East, and was widely disseminated in India ‎and China, where it forms the basis of both Buddhism and Hinduism, ‎particularly in an esoteric context.‎

The Greek philosopher, statesman, poet, religious teacher, and physiologist Empedocles (circa 490-430 BCE) asserted that matter consists of four elements – Earth, Air, Fire, and Water – a theory that was later supported and embellished by the ancient Greek philosopher and scientist Aristotle (384-322 BCE). This concept influenced the philosophical basis for the next advance in the science of matter, namely alchemy.

Plato mentions the elements as of pre-Socratic origin, a list created by ‎Empedocles. Empedocles called these the four ‘roots’; the term ‘element’ ‎‎(stoicheion) was used only by later writers.‎

The Greek classical elements represent the realms of the cosmos (the ‎‎‘ordered universe’) in Greek philosophy, science, and medicine, wherein all ‎things exist and whereof all things consist. The ancient Greek word for ‎element (stoicheion) literally meant ‘syllable’, the basic unit from which a word ‎is formed.‎

In Taoism, there is a similar system of elements, which includes Metal and Wood but excludes Air; Air is replaced with the non-element chi (mental/spiritual energy?), which is a force (Force = Mass x Acceleration) or energy (Energy = Force x Displacement) rather than an element.

In Chinese philosophy, the Universe consists of Heaven and Earth, Heaven ‎being made of chi and Earth being made of the five elements (i.e. ‎matter/energy).‎

The Pancha Mahabhuta, or ‘five great elements’, of Hinduism are Prithvi or ‎Bhumi (Earth), Ap or Jala (Water), Agni or Tejas (Fire), Vayu or Pavan (Air or ‎Wind), and Akasha (Aether). ‎

Hindus believe that God used Akasha (Aether) to create the other four traditional elements, and that the Akashic record, the knowledge of all human experience (information), is imprinted in it.

In the Pali literature, the mahabhuta (‘great elements’) or catudhatu (‘four ‎elements’) are Earth, Water, Fire, and Air. Pali is a Middle Indo-Aryan dialect ‎or Prakrit.‎

In Ayurveda it is believed that everything in this universe is made up of five great elements or building blocks. These are Earth, Water, Fire, Air, and Aether. Ayurveda, or Ayurvedic medicine, is an ancient system of health care that is native to the Indian subcontinent. The word Ayurveda is a compound of the Sanskrit word Ayur, which means ‘life’ or ‘life principle’, and the Sanskrit word Veda, which refers to a system of ‘knowledge’. Thus, Ayurveda roughly translates as the ‘knowledge of life’.

Japanese traditions use a set of elements called the go dai, literally ‘five great’. ‎These five are Earth, Water, Fire, Wind (Air?), and Void (Aether?).‎

Qabalists also use a five-fold scheme (Aether, Air, Water, Fire, and Earth) that dates at least as far back as the time of Empedocles, and probably even to Noah’s Ark.

These basic elements of the ancients are the stuff of our physical universe – ‎matter and energy.‎

The modern periodic table of the chemical elements and the understanding of ‎combustion (fire) can be considered successors to these early models. The ‎current standard periodic table contains 117 confirmed elements as of ‎October 16, 2006 (while element 118 has been artificially synthesised). ‎Ninety-four (94) of these chemical elements (i.e. plutonium and below) occur ‎naturally on Earth.‎

However, general systems theorists often speak of matter, energy, and ‎information as fundamental categories. For example James G Miller’s living ‎systems theory is based on the idea that cells, organs, organisms, groups, ‎corporations, nations, and supranational organisations all process matter, ‎energy, and information. (Living Systems. New York: McGraw-Hill, 1978.)‎

Note that the James G Miller of living systems theory is the behavioural scientist James Grier Miller, not to be confused with the physicist of the same name, the Albert Gordon Hill Professor of Physics, Biomedical Engineering, and Medicine in the Department of Biomedical Engineering at Washington University in St. Louis, Missouri, USA, whose research focuses on the physics of anisotropic, inherently inhomogeneous media and whose systematic studies of the anisotropic properties of the heart have led to fundamentally new insights.

Living systems theory, created by the American biologist James Grier Miller (1916-2002), is an offshoot of Ludwig von Bertalanffy’s (1901-1972) general systems theory and was intended to formalise the concept of ‘life’. According to Miller’s original conception, as spelled out in his magnum opus Living Systems, a ‘living system’ must contain each of 20 ‘critical subsystems’, which are defined by their functions and visible in numerous systems, from simple cells to organisms, countries, and societies. In Living Systems, Miller provides a detailed look at a number of systems in order of increasing size, and identifies his subsystems in each.

Miller held that systems exist at eight ‘nested’ hierarchical levels: cell, organ, organism, group, organisation, community, society, and supranational system.

At each level, a system invariably comprises 20 critical subsystems. Each of these processes either matter/energy or information, except for the first two, which process both matter/energy and information (a simple sketch of this classification appears a little further below):

1. Reproducer
2. Boundary

The processors of matter/energy are:

1. Ingestor
2. Distributor
3. Converter
4. Producer
5. Storage
6. Extruder
7. Motor
8. Supporter

The processors of information are:

1. Input transducer
2. Internal transducer
3. Channel and net
4. Timer (added later)
5. Decoder
6. Associator
7. Memory
8. Decider
9. Encoder
10. Output transducer

In biological systems (e.g., cells, organs, and organisms), matter and energy ‎are so closely related that they are often treated as one entity – matter/energy. ‎
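Miller’s grouping above can be kept straight with a simple data structure. The sketch below is only an illustrative way of recording his classification (the function name and the check it performs are my own invention, not Miller’s); it follows the three lists exactly: two subsystems process both matter/energy and information, eight process matter/energy, and ten process information.

# An illustrative sketch (Python) of Miller's 20 critical subsystems,
# grouped as in the three lists above.
CRITICAL_SUBSYSTEMS = {
    "matter/energy and information": ["reproducer", "boundary"],
    "matter/energy": ["ingestor", "distributor", "converter", "producer",
                      "storage", "extruder", "motor", "supporter"],
    "information": ["input transducer", "internal transducer", "channel and net",
                    "timer", "decoder", "associator", "memory", "decider",
                    "encoder", "output transducer"],
}

def is_complete_living_system(subsystems_present):
    # Miller's criterion: a living system must contain all 20 critical subsystems.
    required = {name for group in CRITICAL_SUBSYSTEMS.values() for name in group}
    return required <= set(subsystems_present)

print(sum(len(group) for group in CRITICAL_SUBSYSTEMS.values()))  # 20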

A social organisation such as a corporation/business processes matter (e.g., ‎by transforming raw materials into finished products – goods and services), ‎energy (including the fuel and electricity needed to operate machines and ‎heat buildings), and information (e.g., strategies, budgets, personnel records, ‎customer orders, advertising messages, and financial records).‎

Although matter/energy has been the subject of scientific investigation for ‎several hundred years, a scientific conception of information is relatively new. ‎

A variety of definitions of information have been proposed. The American electrical engineer and mathematician Claude Shannon (1916-2001) defined information as “a reduction of uncertainty”. Gregory Bateson (1904-1980) defined information as “that which changes us”, or “the difference that makes a difference”. (Mind and Nature, New York: Bantam Books, 1988:72.)
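Shannon’s ‘reduction of uncertainty’ can be made concrete with his own measure: an event of probability p carries -log2(p) bits of information, and the entropy of a source is the average of that quantity. The few lines below are only an illustration of the idea (the probabilities are invented for the example), not a reconstruction of Shannon’s paper.

import math

def surprisal_bits(p):
    # Information gained, in bits, on learning that an event of probability p occurred.
    return -math.log2(p)

def entropy_bits(probabilities):
    # Shannon entropy: the average uncertainty, in bits, of a source.
    return sum(p * surprisal_bits(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))    # a fair coin: 1.0 bit of uncertainty per toss
print(entropy_bits([0.99, 0.01]))  # a heavily biased coin: about 0.08 bits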

A crucial point is that information, unlike matter/energy, is a function of the ‎observer (mind). For example, the same message may have different ‎meanings for different people. Although information requires the perception of ‎a difference, the difference will require a matter/energy carrier (e.g., a page in ‎a book, electrical circuits in a computer, or sound waves in air – collateral ‎energy). ‎

In addition, cognition requires a nervous system (formed out of matter/energy). You can send information over the Internet, by fax, or by heliograph – you cannot do the same with matter. When you fax a document, the information is transferred, not the physical document.

Where then does information come from? Information is neither matter, nor ‎energy. Moreover, as the American philosopher and mathematician, William ‎Dembski (1960- ), correctly inferred, “Information is sui generis. Only ‎information begets information.” This is Dembski’s Law of the Conservation of ‎Information. (Intelligent Design, Downers Grove, Illinois: InterVarsity Press, ‎‎1999:183.) Information is the product of mind.‎

Thinking is a complex, internal, mental process that takes information (data organised within some context) as input and creates new information, allowing organisms to model (or map) their world and so to deal with it effectively according to their goals, plans, ends, and desires. Thinking integrates that information with previously learned material and may result in knowledge.

Concepts akin to thought are sentience, consciousness, idea, and imagination. ‎Problem solving, decision-making, planning, information integration, and ‎analysis are five kinds of thinking. ‎

Information systems take data (facts, figures, pictures, numbers, symbols, letters, codes, et cetera) and process the raw data into useful information for decision-making and control purposes.

A fact, or figure, in isolation has no meaning at all. If your body temperature is ‎‎39 degrees Centigrade, you might feel very ill and feverish, but if you also know that the ‎average body temperature of humans is 37 degrees Centigrade, you can now be certain that ‎you have a serious fever and help is needed.‎
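The same example can be written as a few lines of code: the reading on its own is mere data, and it only becomes information once it is related to a second datum, the known average (the two-degree threshold below is an arbitrary choice for illustration).

NORMAL_BODY_TEMP_C = 37.0  # the second piece of data: the average human body temperature

def interpret_temperature(reading_c):
    # Relate a raw reading (data) to a baseline to produce a meaningful statement (information).
    difference = reading_c - NORMAL_BODY_TEMP_C
    if difference >= 2.0:
        return f"{reading_c} degrees C is {difference:.1f} above normal: a serious fever."
    if difference > 0.5:
        return f"{reading_c} degrees C is mildly elevated."
    return f"{reading_c} degrees C is within the normal range."

print(interpret_temperature(39.0))  # the example above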

Information is data that are related to each other in a specific way so as to make sense to some decision-maker (a mind). Thus, you need at least two pieces of data to create one piece of information. Information is a term with many meanings depending on context, but it is as a rule closely related to such concepts as meaning, knowledge, instruction, communication, representation, and mental stimulus. Information is organised data (raw facts and figures) in a specific context. Information is a non-physical, immaterial entity completely unrelated to matter/energy.

‎“Thought can be about pigs or coconuts, but there are no pigs or coconuts in ‎the brain; and in the mind, there are no neurons, only ideas of pigs and ‎coconuts.” (Bateson, Mind and Nature, New York: Bantam Books, 1988:205) ‎

Consider, for example, Phillip Johnson’s (1940- ) so-called basic points ‎regarding information (Defeating Darwinism by Opening Minds, Downers ‎Grove, Illinois: InterVarsity Press,1997:75): ‎

• ‘First, life consists not just of matter (chemicals) but of matter and information. [In fact, matter, energy, information, and programs (another kind of information).]

• ‎‘Second, information is not reducible to matter, but it is a different kind of ‎‎‘stuff’ altogether. A theory of life thus has to explain not just the origin of ‎the matter but also the independent origin of the information. ‎

• ‎‘Third, complex, specified information of the kind found in a book or a ‎biological cell cannot be produced either by chance or at the direction of ‎physical and chemical laws.’ [A case of, nothing can be made out of ‎nothing – information is obligatory.] ‎

Energy is defined as the capacity to do work. Work is possible with, or without, information.

Things like zeroes {0} and ones {1} do not exist, except in mind. What you see here are only symbols of the concepts, not the ‘things’ themselves – and they have no energy, but they can carry information! Information is in essence only patterns in matter/energy.

When I see the leaves falling around me, I know winter is approaching ‎‎(information). My brain (matter/energy) uses collateral energy in the process ‎of thinking, but the energy is not part of the information. Information is the stuff ‎of mind; matter/energy (including the brain) is physical. Information is never ‎physical, but it can be, and usually is, represented in some physical form.‎

Now, Bremermann’s limit (proposed by the German-American physicist Hans-‎Joachim Bremermann, 1926-1996) states a relationship between matter and ‎information. ‎

In addition, a link between matter and energy had been proposed by Albert ‎Einstein (1879-1955). ‎

Furthermore, a connection between energy and information had been ‎described by the Hungarian-American physicist Leo Szilard (1898-1964). ‎

With an association established between matter and information, it appeared ‎that the contributions of Einstein, Szilard, and Bremermann imply that matter, ‎energy, and information, on the level of atoms, are related. (Physical ‎Relationships among Matter, Energy, and Information, by Stuart A. Umpleby.)‎

Bremermann’s limit is the maximum computational speed of a self-contained ‎system in the material universe. It is derived from Einstein’s mass-energy ‎equivalency and the Heisenberg Uncertainty Principle, and is approximately 2 ‎x 10^47 bits per second per gram. This value (limit) is important when designing ‎cryptographic algorithms, as it can be used to determine the minimum size of ‎encryption keys or hash values required to create an algorithm that could ‎never be cracked by a brute-force search.‎

For example, a computer the size of the entire Earth, operating at Bremermann’s limit, could perform approximately 10^75 mathematical computations per second. If we assume that a cryptographic key can be tested with only one operation, then a typical 128-bit key could be cracked in about 10^-37 seconds. However, a 256-bit key would take about two minutes to crack. Using a 512-bit key would increase the cracking time to roughly 10^71 years.
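Those figures can be checked with a few lines of arithmetic. The sketch below simply takes the round estimate of 10^75 operations per second quoted above as given, so the results are order-of-magnitude only.

EARTH_OPS_PER_SEC = 1e75    # an Earth-sized computer at Bremermann's limit (estimate above)
SECONDS_PER_YEAR = 3.156e7

for key_bits in (128, 256, 512):
    seconds = 2**key_bits / EARTH_OPS_PER_SEC
    print(f"{key_bits}-bit key: {seconds:.1e} seconds ({seconds / SECONDS_PER_YEAR:.1e} years)")

# Approximate results: a 128-bit key falls in ~3.4e-37 s, a 256-bit key in ~116 s
# (about two minutes), and a 512-bit key in ~1.3e79 s, i.e. roughly 4e71 years.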

However, how are these basic categories – matter, energy, and information – ‎related? Einstein presented the well-known relationship between matter and ‎energy, namely: E = m.c^2. Indeed, physicists now regard matter as another ‎form of energy – i.e. matter/energy.‎

Around 1900, the German theoretical physicist who originated quantum theory, ‎Max Planck (1858-1947), observed that electromagnetic energy is not emitted ‎over a continuous range but rather in bundles or quanta, the energies of ‎which are proportional to the frequency of the radiation. The expression ‘E = ‎h.f’ means that the energy of a photon is proportional to its frequency. The ‎constant ‘h’ is called Planck’s constant and ‘f’ is frequency. Planck’s constant ‎also appears in the equation that defines the uncertainty in observing ‎subatomic particles. ‎
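As a quick worked example of E = h.f (the frequency below is just a representative value for green light), the energy of a single photon follows directly from Planck’s constant:

H = 6.626e-34     # Planck's constant, joule-seconds
f_green = 5.6e14  # approximate frequency of green light, hertz

E_photon = H * f_green  # E = h.f
print(f"Energy of one green photon: {E_photon:.2e} J")  # about 3.7e-19 J (roughly 2.3 eV)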

The Uncertainty Principle was formulated by the German physicist and philosopher Werner Heisenberg (1901-1976), who discovered a way to formulate quantum mechanics in terms of matrices. The Uncertainty Principle states that no experiment can be devised that simultaneously fixes the momentum and the position of a particle with unlimited precision; the two can be known only within a momentum-position range whose limit is set by Planck’s constant.

Szilard showed that there is a relationship between information and energy. ‎Szilard recognised that Maxwell’s demon (proposed by the Scottish physicist ‎James Clerk Maxwell, 1831-1879) would require information in order to sort ‎high and low energy particles. He demonstrated that the act of measuring the ‎velocity of gas molecules would produce more entropy than the sorting ‎process would remove.‎
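The energy scale of that information-energy link is usually quoted as kT ln 2 per bit, the figure associated with Szilard’s engine (and later with Landauer’s principle). The lines below merely evaluate it at room temperature, to show how small, yet non-zero, the physical cost of a bit is.

import math

K_BOLTZMANN = 1.381e-23  # Boltzmann's constant, joules per kelvin
T_ROOM = 300.0           # room temperature, kelvin

energy_per_bit = K_BOLTZMANN * T_ROOM * math.log(2)  # k*T*ln(2)
print(f"Energy associated with one bit at 300 K: {energy_per_bit:.2e} J")  # about 2.9e-21 J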

Bremermann suggested that there is an upper bound on the rate at which symbols can be processed by matter: they can be processed at speeds not exceeding Bremermann’s limit of about 10^47 bits per gram per second.

Bremermann’s limit is derived from the equations E = m.c^2 and E = h.f, when ‎one photon is considered equivalent to one bit. That is, combining the ‎relationship between matter and energy with a relationship between energy ‎and information yields a new relationship between matter and information, at ‎least at the atomic level. ‎
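That combination can be made explicit in a few lines, using standard values for the speed of light and Planck’s constant; the result, about 1.4 x 10^47 bits per gram per second, agrees in order of magnitude with the figure quoted earlier.

C = 2.998e8    # speed of light, metres per second
H = 6.626e-34  # Planck's constant, joule-seconds
m = 1e-3       # one gram, expressed in kilograms

# E = m*c^2 joules of rest energy; treating each photon (E = h*f) as one bit,
# that energy supports at most m*c^2 / h distinguishable events per second.
limit_bits_per_second = m * C**2 / H
print(f"Bremermann's limit for one gram: {limit_bits_per_second:.2e} bits per second")  # ~1.36e47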

The English psychiatrist, William Ross Ashby (1903-1972), used ‎Bremermann’s limit in pointing out the dramatic physical impossibility of some ‎pattern recognition strategies used in the early days of artificial intelligence. ‎He urged more attention to how the human brain functions.‎

Alan Mathison Turing (1912-1954) was one of the great pioneers of the computer field and artificial intelligence. He inspired the now-common terms ‘Turing machine’ and ‘Turing test’. As a mathematician, he applied the concept of the algorithm to digital computers – he was the first person to realise that programs and information can be handled in exactly the same manner in computers, i.e. both programs and information are software.

His research into the relationships between machines and nature created the ‎field of artificial intelligence. His intelligence and foresight made him one of ‎the first to step into the information age. ‎

Turing helped pioneer the concept of the digital computer. The Turing ‎Machine that he envisioned is essentially the same as today’s multi-purpose ‎‎(i.e. programmable) computers. He described a machine that would read a ‎series of ones and zeros from a paper tape – what a brilliant brainwave. ‎

To me, DNA is matter/energy; it is somewhat like the program strips (paper ‎tapes) that the old computers used to employ. (Punched cards were originally ‎developed in 1804 as an aid to textile production by a Frenchman, Joseph ‎Marie Jacquard [1752-1834]). These ones and zeros described the steps that ‎needed to be done to solve a particular problem or perform a certain task. The ‎Turing Machine would read each of the steps and perform them in sequence, ‎resulting in the proper answer.‎

This concept was revolutionary for the time. Most computers in the 1950s were designed for a particular purpose or a limited range of purposes. What Turing envisioned was a machine that could do anything (it was a programmable computer), something that we take for granted today. The method of instructing the computer was very important in Turing’s concept.

He essentially described a machine that knew a few simple instructions – ‎today these instructions are saved on one or more ROM chips in personal ‎computers. Making the computer perform a particular task was simply a ‎matter of breaking the job down into a series of these simple instructions. This ‎is identical to the process programmers go through today. He believed that an ‎algorithm could be developed for almost any problem. The hard part was ‎determining what the simple steps were to be and how to break down the ‎larger problems.‎
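The idea of a machine that knows only a few simple instructions and works through a tape of symbols can be shown in miniature. The toy simulator below is purely illustrative (the rule table and tape are invented for the example); it increments a binary number on the tape, which is exactly the sort of job Turing’s machine breaks into elementary read, write, and move steps.

# A toy Turing machine that adds 1 to a binary number written on the tape.
# Each rule maps (state, symbol read) to (symbol to write, head movement, next state).
RULES = {
    ("inc", "1"): ("0", -1, "inc"),   # 1 plus a carry becomes 0; the carry moves left
    ("inc", "0"): ("1",  0, "halt"),  # 0 plus a carry becomes 1; finished
    ("inc", "_"): ("1",  0, "halt"),  # ran off the left edge; write the new top bit
}

def run(tape_string):
    tape = ["_"] + list(tape_string)  # pad the tape with one blank cell on the left
    head = len(tape) - 1              # start at the rightmost digit
    state = "inc"
    while state != "halt":
        symbol, move, state = RULES[(state, tape[head])]
        tape[head] = symbol
        head += move
    return "".join(tape).lstrip("_")

print(run("1011"))  # prints 1100, i.e. 11 + 1 = 12 in binary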

If we consider all the possible carriers of information, it is clear that the ‎relationship between matter and signal is not fused and definite. The ‎relationship depends on the material in which a pattern appears. That is, a ‎pattern or set of differences can be observed at the atomic level (where ‎Bremermann’s limit applies), in molecules (DNA), cells (neurons), organs (the ‎brain), groups (norms), and society (culture).‎

Both Szilard and Bremermann used the term ‘information’. However, because ‎of the complexities introduced by having to specify one or more observers ‎‎(mind), the term ‘information’ is not an elementary concept. ‎

Difference denotes the elementary building block of data, signals, or ‎information. Therefore, when dealing with physical foundations, Stuart A ‎Umpleby (Department of Management Science, The George Washington ‎University, Washington) believes it is preferable to speak in terms of matter, ‎energy, and difference. ‎

According to Umpleby, ‘difference’ is a physical entity that can be noted by an ‎observer (mind). Drawing a ‘distinction’ is a purposeful act that creates two ‎categories.‎

Scientists today understand phenomena related to matter/energy more ‎thoroughly than phenomena related to information. Perhaps reflecting on the ‎physical relationships among matter, energy, and information can help natural ‎scientists and social scientists understand better the nature of their disciplines. ‎

Efforts to apply the methods of the natural sciences to social systems have ‎led some people to conclude that matter and energy relationships are the ‎appropriate subjects of attention for social scientists. However, in social ‎systems, distinctions are essential. Gregory Bateson made this point as ‎follows, “… my colleagues in the behavioural [social] sciences have tried to ‎build the bridge to the wrong half of the ancient dichotomy between form ‎‎[mind] and substance [body]. The conservative laws for energy and matter ‎concern substance [matter/energy] rather than form [information]. But mental ‎process, ideas, communication, organization, differentiation, pattern, and so ‎on, are matters of form rather than substance. Within the body of ‎fundamentals, that half which deals with form has been dramatically enriched ‎in the last thirty years by the discoveries of cybernetics and systems theory.” ‎‎(Steps to an Ecology of Mind. New York: Ballantine, 1972:xxv-xxvi.)‎

However, mystics go even further and propose that something is needed to ‎bind the four basic elements (Air, Water, Fire, and Earth – matter/energy) ‎together in infinite proportions and combinations in order to construct the ‎manifest universe. Something is also needed to keep the elements from ‎melding and turning the universe into one big glob of mush. They say that, ‎that something is Spirit. Furthermore, they also profess that the Manifest Spirit ‎comprises Mind and Matter/Energy, and without Mind, there is no information.‎

The Book of Formation or Sepher Yetzirah (translated by Knut Stenring, and published by Ibis Press in Berwick, Maine, 2004:22), describes it as follows:

1. First, there only existed ‘the Spirit of the Living Elohim’.
2. Elohim created Air from Spirit.
3. Then Elohim created Water from Air. He poured snow over the Air and it became Earth.
4. After Elohim had created Water, Elohim then proceeded to create Fire from Water.
5. In the next six steps, God created space and the ultimate illusion, time.

Was this then the cosmologists’ Big Bang, the mystics’ Sundering? ‎


Willie Maartens
http://authorsden.com/williemaartens ‎
