Monday, May 23, 2005

Terminology A-Z of Internet Systematics

The previous posting (13/5/2005) introduced a long list of terms that need some
definition. That is the goal of the present posting: Internet Computing Paradigm, von Neumann computing/programming paradigm, Software Crisis, Proof Data, Meta-computing, Meta-mechanical process, Metasystem Transitions in the computer.


We start with these terms as they appear in the defining text about Internet Systematics.



Internet Computing Paradigm
The term should also include communications. The global Internet, viewed as a single entity, is made of computers, communication lines
and software, so it admits a kind of picture, and that picture is our main concern here. It is in fact a moving picture, in which every frame shows some new component emerging. One way to understand the picture is to interpret it with established concepts, those one is familiar with from the field of Computer Science. Whether such a picture is valid or not will take quite some blogging to show, so please be patient, read on and comment; hopefully you will be rewarded. It is a moving picture because the Net has been evolving from the days of the ARPANET, to the CATENET, to the days of the INTERNETWORK, to the WWW (Internet II for some) and perhaps to the presently awaited SEMANTIC WEB.



Von Neumann computing/programming paradigm
John Backus, in his 1977 Turing Award lecture "Can Programming Be Liberated from the von Neumann Style?",
proposed an alternative software construction technology. It was a paradigm shift away from the
so-called von Neumann computer/programming architecture and its corresponding programming languages. There are two characteristic features:

Architecture: CPU,BUS,MEMORY (structural description)
Operation: FETCH, EXECUTE, REPEAT (functional description)
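As an illustration, the two descriptions above can be combined in a toy interpreter (a sketch in Python, not any real instruction set): a single memory holds both program and data, and the CPU loops over FETCH, EXECUTE, REPEAT.

```python
# A toy illustration of the von Neumann cycle: program and data share
# one memory, and a single accumulator CPU loops FETCH -> EXECUTE -> REPEAT.

def run(memory):
    """Interpret a tiny instruction set held in `memory` itself."""
    acc = 0          # the single accumulator register
    pc = 0           # program counter
    while True:
        op, arg = memory[pc]      # FETCH the next instruction
        pc += 1
        if op == "LOAD":          # EXECUTE it ...
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory
        # ... and REPEAT until HALT

# Cells 0-3 hold code, cells 4-6 hold data: compute 7 + 35 into cell 6.
program = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
           4: 7, 5: 35, 6: 0}
result = run(program)
# result[6] now holds 42
```

The single path between CPU and memory, traversed one word at a time by this loop, is exactly the "bottleneck" discussed below.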

Innovative in the 50s, but a kind of "bottleneck" for scaling by the 80s; hence the need to move on with new concepts and throw away the established "imperative" programming style. Technology such as Structured Programming was considered not innovative enough. The new characteristics?

Architecture: control of massive parallelism by static programming concepts
Operation: reliable software incorporating mathematical proof technology



Software Crisis
David Turner referred to the problem of the software crisis in the late 70s. By this term he meant that the cost of hardware was dropping sharply while the cost of software kept rising. This implied an inability to exploit cheap computing power (due to Moore's law) to build massively parallel, innovative computer architectures. It also implied an inability to deploy reliable software, where a package would be sold together with its "proof to specs".
Turner also saw that Functional (or Applicative) Programming might solve the software crisis. In fact, this vision is still alive today under the same banner.
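To make the contrast concrete, here is a small sketch (in Python rather than a functional language) of the same computation written first in the word-at-a-time imperative style Backus criticized, then in an applicative style built from whole-value operators and no assignment statements:

```python
from functools import reduce
import operator

# Imperative (von Neumann) style: a word-at-a-time loop mutating state.
def sum_of_squares_imperative(xs):
    total = 0
    for x in xs:
        total = total + x * x   # repeated assignment to one memory cell
    return total

# Applicative style in Backus's spirit: the program is a composition of
# whole-value operators (map, reduce); no mutable state appears.
def sum_of_squares_applicative(xs):
    return reduce(operator.add, map(lambda x: x * x, xs), 0)

# Both compute 1 + 4 + 9 = 14 on [1, 2, 3].
```

The applicative version is built by combining functions rather than sequencing state changes, which is what makes it amenable to the algebraic reasoning and proof technology mentioned above.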

Although the field of Functional Programming contributed important concepts such as the virtual machine for software portability, automatic storage management (aka garbage collection) and so on, it has not fulfilled its vision. In the meantime, other technological
phenomena emerged that gave different answers.
Free Software provided solutions such as cheap supercomputers built from GNU/Linux clusters to harness massive parallelism, and "enough eyeballs to make all bugs shallow" to beat software costs.



Proof Data
Surely the subject discussed here is not the easiest to follow; there is a lot of
subjective material. I had this problem in the first years the idea of "Internet
Systematics" appeared in front of me. Following "net happenings" by Sackmann, whenever I saw something that seemed to fit my picture of the Net, or indeed my kind of approach to the problem, I marked it down as a piece of evidence for the "theory" I was building.
It was from the field of Theoretical Computer Science
that my idea of associating Internet computing with a "paradigm shift" of the von Neumann model got its first "proof", at least as an utterance. There is a definite number of such "proof data" concerning Internet Systematics. As the number grew, it gave the final push to go public.

Meta-computing
This term is a bit overloaded; the grid community, for example, uses it too. Here the term is used as defined by Valentin Turchin, the creator of the Russian LISP called REFAL.
Turchin helped me understand the field of Functional Programming more deeply. He gave me
the idea of "evolution" in software, and he provided a practical demonstration of such a system. Turchin's concept of the Metasystem Transition is central to all of his important works. If a software system includes even one step of self-modification, then it exhibits the desirable property of a Metasystem Transition. According to Turchin, the long-term goal of programming is to be able to induce the machine to perform a series of such advances, rather than a throw-away, one-hop improvement. This helped me form a different formulation of the Metasystem Transition in order to interpret Net advances (not in the computer, as done by Turchin, but in the domain of joint efforts of man-machine interactions).
I consider Internet Systematics simply an application of Turchin's ideas to the new phenomenon of the Global Network comprising machines, lines and users.
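A toy sketch of a single such step of self-modification (in Python; the names and the technique of textual code generation are illustrative assumptions, not Turchin's REFAL supercompiler): a general program is specialized with respect to part of its input, producing a new resident program with the interpretive loop removed.

```python
# One toy "metasystem transition" step: the system generates an improved
# version of one of its own programs. A general power function is
# specialized (partially evaluated) for a known exponent -- the spirit,
# not the substance, of Turchin's supercompilation.

def power(x, n):
    """General program: computes x**n by repeated multiplication."""
    result = 1
    for _ in range(n):
        result *= x
    return result

def specialize_power(n):
    """Self-modification step: emit source code for a new function in
    which the loop over `n` has been unfolded away entirely."""
    body = " * ".join(["x"] * n) if n > 0 else "1"
    src = f"def power_{n}(x):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)          # install the improved program
    return namespace[f"power_{n}"], src

cube, source_text = specialize_power(3)
# cube(2) == power(2, 3) == 8, but cube contains no loop at all
```

A series of such advances, each producing the next as output, is what distinguishes Turchin's long-term goal from a one-hop improvement.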



Meta-mechanical process
This is the last term to define in version 1.0 of our A-Z terminology.
Turchin is a strong believer in the constructive approach to Mathematics. He follows Hilbert in mechanizing the Ancient Greek notion of proof, and Gödel in showing its limits. He extends Turing's idea of mechanizing the symbol shifting of the mathematical procedure (called a Turing machine, or computation) by considering a wider time interval than that in which the machine works autonomously. He brings in the user of the mathematical machinery in order to formulate a finitist definition of infinity. The idea here is to reconstruct all foundational ideas in Mathematics from a simple symbol-shifting (or bit-flipping) mechanism.
He calls such a joint venture a "meta-mechanical process": a Turing machine initiates and maintains a process while interacting with its human operator.
I borrowed this definition to formulate the idea of "Net evolution" as a particular kind of Metasystem Transition.


Metasystem Transitions in the computer
Metasystem Transitions in the computer constitute an approach to Artificial Intelligence, according to Turchin. However complex the construction, it is made out of evolvable systems that change through simple symbol shifting. Turchin discovered such an evolution pattern in Nature and in the construction of Science. The demonstration of it is the so-called Supercompiler software. There is another link between Turchin and the Internet: his profile is maintained on the net.

Meta-artificial intelligence
The term giving the site its name (meta-artifical.blogspot.com) denotes a new concept extending that of artificial intelligence, the slogan under which computing advances took place. Coined by J. McCarthy in the 50s, the latter term was assigned the meaning described by Alan Turing: computing as a game-playing paradigm in which the machine executes chess moves. Here, the first statement to shed light on the nature of the novel entity was 'the net is the computer', but we went a little further and interpreted Net advances as the replacement of man-machine steps by novel components. In any case, the term 'meta-artificial intelligence' was chosen to indicate that we have a new concept in front of us. Such a claim remains to be established, of course.

Friday, May 13, 2005

What is 'Internet Systematics'

It is an evolving thinking pattern concerning an interpretation of the Internet Computing Paradigm. It is the result of observation of, as well as participation in, the process of building the global Internet over the last 15 years.

The background stage has been that of dislocating the von Neumann computing paradigm, a prominent line of thinking in Computer Science in the 80s, in order to solve the software crisis problem.

The starting date may be considered the early 90s, when European research was advancing the idea of the national research network. It was the issue of choosing between the ISO/OSI model and the IETF/TCPIP model that fired questions such as 'what is a network?' and 'which is the winning model?'

The above questions fused with the ongoing line of thinking above to give birth to Internet Systematics, a term coined in 1999. The core concept is the type of systems that make up the global Internet and how they evolve.

Why talk about it now ?

Because enough proof data has been collected about it that it will be useful to connect this line of thinking to other important contemporary domains, such as that of the IRTF's end-2-end group, which seeks the successor to the next-generation Internet architecture.

It seems to us that a conceptual picture of the global Internet and its evolution is rather useful in general, so we will proceed with its publication.

Another reason is that it has taken us quite a long time to understand where Internet Systematics leads. Why is it important to talk about it? OK, it is a kind of Internet model, but what is its use? Why should one be interested in the concept of Meta-computing, created and demonstrated by Valentin Turchin, whose works serve as our reference model?

The answer to the above is that we firmly believe the world is staging the process of Meta-artificial Intelligence, the successor to Turing's artificial intelligence that was staged at the closed premises of Bletchley Park in the 40s, a term that had enormous consequences for our lives. Perhaps something similar awaits this new term.

It is only very recently that we have been able to comprehend net-automation advances, our basic theoretical result, as the definition of a new kind of process: man-machine intelligence evolution. Here again, Turchin's concept of a meta-mechanical process has been the inspirational force.


Internet Systematics is informationally quite complex; it needs an advanced medium to be communicated. The presently used 'interface' will be the front-end of a more sophisticated medium based on wiki technology, currently under construction.

Note: whatever is in bold needs further blogging and referencing, please await further communication on it.