Sunday, February 19, 2012

let's think "Computational Thinking"

Around 2009, on finding the term PNA, I thought that I had finally managed to define "myThinking" (the shortest description I can find for the time being). The key word that supported the conclusion above was "pattern", in the realm of Network comprehension.

After a period of assimilation and of reading the book "Patterns in Network Architecture: A Return to Fundamentals", I re-examined "myThinking" as a kind of alternative introduction to John Day's subject matter, Network architecture. This was not ad hoc, as years ago I had pursued the same goal by researching the term "network" and had named the problem to be solved "WITI: What Is The Internet".

But the linking to PNA/RINA, although fruitful and logical, left me with the sense that something is missing in relation to the content worked out within the so-called "myThinking". I do see John Day's point about producing clean, consistent definitions of basic network concepts and moving to clean implementations, but "meta-artificial" has a more general direction.

My recent arrival at the term "Computational Thinking" provided an explanation for the above sense. I am embarking on the task of assimilating CT, looking deeper into it. The goal of defining CT (a hot issue for the National Academies in the USA) offers me a new opportunity to re-vitalize "myThinking", slide away from PNA (RINA), and steer it towards firm self-definition and perhaps contribute to the established "what is CT?" dialogue.

Tuesday, February 07, 2012

Computational Science and Internet Systematics

Let me prologue this post by stating that my degree (1983) from St. Andrews University is in Computational Science. I once asked the Head about the vision encapsulated in the term "computational". Prof. Jack Cole referred to knowledge somehow more general than the usage of specific computing tools. The department did not keep this naming, perhaps being unable to work it out further.

In a previous post titled John's Apocalypse I expressed a "eureka moment" concerning my goal of Understanding Networking. I had to see Jon Crowcroft's review of John Day's book, which states that the book shows us (Internauts) how we arrived at the present state of the Net, as a criterion of validity, and then analyze my response.

John Day constructs an architecture which elucidates the basic concepts of Networking (address, name, directory, application) that the currently running architecture of the Internet failed to consolidate systematically, in order to avoid hasty patching and ad-hoc solutions: the cause of the big problems of scaling, performance, security, and mobility.

So, for some time I thought that the journey starting around the end of the 80s was almost finished and that I should be getting in line to co-march with RNA, PNA, RINA.

But ...

recently I have come across a new area of work called Computational Thinking, which sheds new light on my work, both on the grand problem "Understanding Networking" (late 90s) and on the older problem "New Computing paradigm" (beginning of the 80s) that my research degree hooked me into.

In this blog I use terms such as Meta-artificial, Meta-mechanical process, Internet Systematics, network theory (NET8) etc., but one realization stands out: I am not quite sure about the actual logic of the corpus of ideas, concepts, schemes, research notes etc. that I have produced so far.

Initially I thought I had a "Network theory" (*) that tells fundamental things about the Net.

Next, working further down this line and locating on my radar "Knowledge communication as the killer application on the Internet", I thought that since my base is Turchin's systems approach, my set of knowledge had better be named "Internet Systematics".

Of course the set itself became an object of investigation, soon enough.

In addition,

John Day's RINA excludes Application Architecture, and my NET8 model wants to ponder social networking, for example. What does it mean to have billions of minds thinking closer together? A new Automation Quantum is perhaps being cooked. I conclude that although I appreciate RINA deeply, I cannot stop there; there has been some thinking that escapes RINA's lid.

Well, while thinking along these lines I came across Jeannette Wing's vision called Computational Thinking, and I quickly realized there is common ground: she thinks about automating abstractions, about scaling computing education, about general concepts related to computing that engender innovation and problem solving.

Let me finish this note by quoting this blog's definition:

This blog is working out a picture of the evolving Net; it tries to formulate basic concepts that explain the nature of this global machine and its applications in contemporary Society. It tries to comprehend the ongoing (net) construction process by examining analogies drawn from the processes that built its ancestor entities: the computer, the operating system, the programming language and the global telecom.

Stay tuned...

(*) Network theory here means concepts such as Automation Quantum, Virtual Von Neumann Machine, Meta-mechanical transitions (pls see older posts).

Wednesday, January 26, 2011

Network Theory

Network Theory has been the holy grail of the activity described in this blog. I coined the term NET8 (pronounced Net Theta) to designate this research around 1998.

Meta-artificial is the name of this blog-site, coined as soon as I became aware that my research about Net modeling was getting somewhere.

I picked the word "meta" to indicate the vision that the Net (as, we might say, the Computer of Alan Turing did) gives rise to a new kind of Artificial Intelligence. I thought so being influenced by the work of V.F. Turchin under the term Metacomputing, since the Turing/Von Neumann Computer gave rise to the vision of A.I.

Surely, there is some confusion with these terms. Also, I coined the term Internet Systematics, which became the name of this blog-site, because I thought of it as my first research result. It signaled the discovery of a systems approach to modeling the Net, a goal I set out to solve when confronted in the late 80s by two competing network technologies, OSI and TCP/IP (I was the national rep in the EU's COSINE and RARE activities in Networking).

I began the research problem looking for principles, very much in the spirit of what John Day calls Patterns in Network Architecture: A Return to Fundamentals, but I took a different path because my initial conditions were very different: I lacked networking experience. My quest for fundamentals did not have the solid ground he was standing upon. My statement of the problem was fixed as a by-product of my work in Declarative Programming (or Functional, or Applicative, or ...); otherwise it would have been total nonsense. I said to myself: I used DP to understand what a Programming Language is (via Denotational Semantics), so why not pursue the same course to understand the Net? (This is a very linear version of the thinking process that took place over a long period.)

But networking experience gradually accumulated as I was involved in large-scale Network building projects and joined the global community of Internet developers (for example the RIPE community, the CERN community, the EBONE community etc.), participating in network working groups.

After several years of participatory observation I managed to build a Model of Networking which I describe by the term Virtual Von Neumann Recursive Architecture (VVNRA; the term "recursive" replaced my initial term "sequence" or "chained" under Day's influence), motivated by the need to understand the "whole beast" I was handling.

The VVNRA result was a kind of dead end: I did not know how to discuss it with the community. Which community? The engineers? The computer scientists? The general systems people? The power users?

I did try with a professor and talked to some researchers at my work, but had no luck (perhaps I did not pursue it properly).

I concluded that the result I had was some kind of pattern, an evolution pattern of the Net. One day, googling the terms "pattern", "architecture", "internet", I found John Day's RINA (note: initially his term was PNA, Patterns in Network Architecture). It provided legitimization for my usage of the concept of recursion, and I recalled my 1999 result on the issue of Net modeling.

The same can be said about the concept of Virtualization, a key feature of my Model above. In recent times we all know the importance this concept has for Network developments. At the time I had located it as something vital that stemmed from the way the Net worked. I did not move any further than the assurance that I had constructed a Picture of the Net (see posts of earlier years here).

A theory needs predictions, and I made some, no doubt (for example the early detection of the importance of multimedia, the early WWW success, and information collection as the killer Net application).

An important result was the mapping of the Network's evolution phases (see older postings here), detecting the Strowger Switch as the initial phase, along with the principle of the Quantum of Automation (embodying automation; see older postings in this blog).

In fact the model Virtual Von Neumann Recursive Architecture resulted as an abstraction over the above evolution path.

John Day, in his analysis of the fundamentals, comes to IPC, Inter-Process Communication (*); in fact he talks about Layered IPC, where the lowest layer is the wires.

In the diagram above, APPL 1 is one application wishing to communicate with another one, APPL 2, using an IPC facility. In fact the APPL layer is very like the IPC layer: just processes talking to each other. So there is one recursive layer that unfolds with respect to scope, bandwidth, QoS:

IPC n over IPC n-1 over IPC n-2 over ....

This is very similar to mine:

VVNRA n embedded over VVNRA n-1 embedded over VVNRA n-2 embedded over ....
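Since the two stacks above carry the whole argument, here is a minimal sketch of the recursion in Python. The class and names are mine, purely illustrative (not from RINA or NET8): each layer treats the layer below it as its transport, so sending is literally a recursive call downward until the "wires" are reached.

```python
class Layer:
    """One rank of a recursive layering: IPC n, or a VVNRA phase."""

    def __init__(self, name, below=None):
        self.name = name      # e.g. "IPC 2"
        self.below = below    # the layer this one runs over (its transport)

    def send(self, payload):
        # Wrap the payload with this layer's framing, then recurse downward.
        framed = f"[{self.name}]{payload}"
        if self.below is None:
            return framed     # the wires: nothing further to recurse into
        return self.below.send(framed)

wires = Layer("wires")
ipc1 = Layer("IPC 1", below=wires)
ipc2 = Layer("IPC 2", below=ipc1)

print(ipc2.send("data"))  # [wires][IPC 1][IPC 2]data
```

The point of the sketch is only that "n over n-1 over n-2" is one structure repeated, not a fixed stack of different layers.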

I thought of this in 1999: the Net at phase N-1 becomes the vBUS for the Net at phase N.

How come?

Think of the Net prior to DNS, with manual naming via FTP of HOSTS.txt from the NIC, and so on (see older posting).

So DNS is the introduction of some automation to the Net,

and what is automation? Well, the Turing machine thing?

Can I model onto it?

Difficult. So move on to the next nearest thing, the "Neumann/Turing design".

Let me make it an abstract concept, as the diagram CPU-BUS-MEM shows.

Can I map? Well, let's see:

A Quantum of Automation occurs with DNS, which seems to form a Virtual Von Neumann Architecture: the machinery running the bind code etc. is the vCPU loaded with this code, the ZONE servers are its vMEM, and its vBUS is the existing Net, used to propagate the new phase.

Most Net advances fit this pattern, going back to the manual switching rooms automated by Mr. Strowger for the specific purpose of throwing out the intermediaries, namely the phone ladies who stole his business away.

BGP Autonomous systems did a similar task for core routing.

MIME offered Multimedia to Email,

ARCHIE offered Networked Resources (via anon FTP)

WWW improved the "FTP + multimedia" manual facility and so on.

My diagram recurses over the BUS of the VN architecture:

(*) Bob Metcalfe, 1972: Networking is IPC.

Sunday, May 03, 2009

John's apocalypse

Several years ago (20?), I coined the term NET8 (pronounced "Net theta") to mean network theory; it later mutated to INTERNET SYSTEMATICS as I became more proficient with the Metasystem Transition (+). This is how I perceived my attempt to understand the emerging Net, which I happened to confront as a Computer Scientist in the mid-80s (I blog my learning path slowly). Previous postings here indicate how the associated thinking developed alongside the practical task of building operational networks.

A couple of years ago, I read a paper by David Turner (my mentor into Functional Programming) on the significance of the theory of computing; the works initiated by Alonzo Church and Alan Turing that enable us to avoid the babel of Intel-computing vs Motorola-computing and Java-computing vs C-computing by leading us to a convergence of the corresponding concepts.

This somehow supported a further unfolding of my "naive" NET8 plan (a quest for fundamentals in computer networks).

A recent book by John Day, "Patterns in Network Architecture", published in 2008 by Prentice Hall, came just in time, as I was wondering about my next step.

The first thing it does for me is to fill the gap in my appreciation of the case of OSI (due to my lack of the necessary experience and of all-round factual data concerning OSI). I know now that the "harm" I detected is confined to the OSI protocols, but the OSI model is undeniably "good"; for example, it was the first to start the thread of separating the Application from the Protocol.

It also brought memories of MINITEL, CYCLADES, TRANSPAC, TELMAT SM90 my French initial-connection to Networking (due to Nicolas Malagardis of IRIA, INRIA).

It also gave me an immensely powerful insight for distinguishing Engineering things from Science ones. I discovered many things in common with the book, like "General Systems Theory" (von Foerster for John Day, Valentin Turchin in my case).

Now I can understand that nothing beats experience. I have been a very simple seer in the same direction for the past 20 years, "seeking patterns" that understand the Network in Computer Science terms, but I stayed at the periphery of the WITI problem (another coinage of mine: What Is The Internet). I never thought that the whole Net thing we have needs replacing. I somehow thought that the "onion layers" (an OS concept!) of the evolutionarily attached Quanta of Automation (see previous postings) were the fittest that survived, optimal in the given circumstances for the Network ecology. John Day brings the realities that point to "quick fixes" (*) like the infamous NAT (Network Address Translation).

John Day is a master and a seer (his term); he states:
In Computer Science we build what we measure

he sees too much Engineering and little if any Science in the Internet phenomenon, because network things worked "off-the-shelf" from the start of the ARPANET's unfinished Demo. There have been opportunities to rectify the situation, but the inertia of success, vested interests and mainly lack of knowledge in the wider IETF community (!!!) stood in the way. "Group-thinking can be dangerous," warns John Day; as an example he points to CLNS. (I did underline its prospects as a reporter in 1994, but only by accident, influenced by MTR's (M. Rose) ISODE package.)

I could not have given such a focus to my research (my proud achievement of 1994, "OSI can be harmful", apparently only tells half the truth). John Day says he did not even plan for a book, but only for stock-taking of the lessons he had gained until then (2007-8), in order to understand the Network independent of politics, religion and the constraints of technology (i.e. in Science terms). Gradually he saw patterns forming, and he worked out how to yield them.

In my case I thought of "theory" (again naively of course, as I was extracting abstractions, not avoiding embarrassment at times) because the Science style of Declarative Programming concepts conditioned my participatory observation. The above diagram shows a pattern I saw related to Network architecture, an early attempt to codify my thoughts.

I will get to a comparative consideration between RNA (RINA) and NET8, but the most important issue on my mind is the 6th in Boston, where the PSOC meets (the Louis Pouzin Society; he invented datagrams); here is a note concerning how to clean-slate the Net.

I wonder how many colleagues in Greece and Europe know that the datagram is a French doing? (Pls leave a note if you can.) In fact the French PTT (^) destroyed CYCLADES, the first datagram network!!!, and developed TRANSPAC instead. I built the first X.25 link to it at 4.8K for Greece in 1987, collaborating with the Greek PTT (OTE).

I could not understand some negative comments the book got on Farber's list and waited for clarification. The list runs in a very tight, self-controlled spirit, so nothing came. But happily the Internet channel is amazing: in the Internet Protocol Journal, Jon Crowcroft of UCL wrote a very good review of the book. For example, he writes:

I found the book extremely readable and enjoyable, and although I might argue with some of the opinions in the book, I think that this is just more evidence that I should recommend the book to anyone interested in knowing why we are where we are in networking, and being better informed about where we should go next.

So my new discovery got a validation. I have followed such a course all along the NET8 path, i.e. waiting for "proofs", because I realise my weak position in handling such a big issue as Network Theory. This is why at times I give some sort of thumbs-up sign to myself for encouragement.

(+) I found MST a way out of explaining to myself the fact that the Functional/Applicative/Declarative model of computing did not conquer the world, as I had finished my research slot in that area. So I was looking for "killer application environments" in which to utilize the great works that C. Strachey had initiated. When I met ASN.1 (also LOTOS for a while), having entered networking for adventure in the meantime, it started ticking in the back of my mind that "the Net is a new kind of machine (SUN Microsystems declared the infamous 'the Network is the Computer' phrase) that the F/A/D model can be applied to". The next tick was with the Active Networks of Farber, so I joined his previous IP mailing list (F/A/D means Functional/Applicative/Declarative), the best spin-off of my Marco Polo-like journey. Finally, the puzzle is solved; this is why I use the title apocalypse above. Declarative networking exploits the separation of the What from the How of protocol design. The "What" brings policy issues; the "How" brings mechanism. So TCP and UDP, for example, are in terms of "How" the same protocol, and in terms of "What" different specification policies. Layering should shift 90 degrees left to separate things horizontally, because fundamentally there is only one layer that recurses across different scopes, bandwidth demands etc., John Day says. But recursion (i.e. Kurt Goedel's recursive functions) is the beginning of Computing, the ultimate of fundamentals. I went as far back (with my Automation Quantum as a fundamental concept of a model for the Net) as the Turing Machine, so it is not a bad go, I guess, for a Net newbie in 1999. Further, my VVNA (Virtual Von Neumann Architecture) does use the idea of a recursive phase in the Net's evolution path (please see previous postings, otherwise I seem cryptic here; in any case do query me ...).
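The What/How remark in the footnote above can be illustrated with a toy sketch. The mechanism/policy split here is my own framing, not actual TCP or UDP code: one sending mechanism (the "How"), parameterized by two different policy specifications (the "What").

```python
def transport(data, policy):
    """The 'How': a single sending mechanism, parameterized by policy."""
    packet = {"data": data, "reliable": policy["reliable"]}
    if policy["reliable"]:
        # A reliable policy asks the mechanism to retransmit until acknowledged.
        packet["ack_required"] = True
    return packet

# The 'What': two different policy specifications over the same mechanism.
tcp_like = {"reliable": True}    # TCP-style delivery policy
udp_like = {"reliable": False}   # UDP-style delivery policy

print(transport("hello", tcp_like))
print(transport("hello", udp_like))
```

In this reading, changing a protocol means changing a policy object, while the mechanism (one layer, recursing) stays the same.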

(*) the phrase from the BEATLES' LP Sergeant Pepper, "I'm fixing a hole where the rain gets in, and stops my mind from wandering ...", is appropriate to the situation

(^) a peer of our Greek OTE; I helped plant the ISP OTENET there in 1995, under the belief "first we take Manhattan from within", as the L. Cohen song goes.

Friday, April 03, 2009

Internet evolution problem

I would like to say that Internet Systematics (IS), as a framework of ideas about the nature of the Network, set its controls toward the issue of Internet evolution quite early in its course of unfolding.

Certainly IS was not influenced by the problems that have emerged over the good years of its tremendous growth (scaling, multimedia, informational resources, routing, security) but only by my need to understand its fundamentals. I concentrated on its whole-system aspects because this was the only available paradigm I possessed (distributed computing as an implementation of Functional Programming (FP)) at the time of my job assignment (+).

Getting into development tasks and community participation, real experience was gradually acquired and provided the real ground to walk on.

The whole-system approach (aka the macroscopic model) triggered the evolution cognitive alarm. Turchin's MSTT provided the reference model I tried to use as a modeling mechanism.

The fascinating thing about MSTT was that it made me understand the problem of why the PC made the scene instead of the LISP machine (*) (built by Thinking Machines Inc.), but most importantly it gave me a stairway to start ascending towards the idea of a "network theory". I coined the name "NET8 (Net theta)" to signal the intention.

For some time now, grand-scale evolution has been stated as a problem: not just the next killer application that evolves the Network, but change/replacement/re-design at the scale of the whole wide system. No doubt the following gathering will produce some interesting threads:

Reinventing the Internet - Can We and How Would We?

It has been said that no one engineered the Internet; rather it was cobbled together over time to address real and emerging needs. That cobbling built on the work of those who laid the foundations and necessarily required compromises to maintain compatibility. This panel will explore the hard problems challenging the Internet as we know it today, what we might do differently and how best to realize the next evolution of the Internet. The panelists have been asked to address what they see as the security challenges that must be addressed to ensure the continued viability of the Internet and how best to respond to those challenges.

Moderator: David Farber, Distinguished Career Professor of Computer Science and Public Policy

Panelists: Steve Crocker, CEO of Shinkuro, Inc.
Larry Roberts, Chairman of Anagran, Inc.
Paul Mockapetris, Chief Scientist and Chairman of the Board at Nominum, Inc.
Guru Parulkar, Executive Director of the Clean Slate Internet Design Program and Consulting

(*) I started my research on FP from G.Steele's student paper about parallel garbage collection.

Friday, February 06, 2009

web science on ACM

I noticed the resemblance between "web science" and "internet systematics" in a post here in Feb 2007; I have just noticed a more developed version by the same authors in a recent ACM publication. It is titled "Web science: an interdisciplinary approach to understanding the web". The text on the front of the publication, as well as the graphic, indicate to me at least that the two terms are amazingly close. I read: model the web as a whole, keep it growing and understand its continuing social impact; a SYSTEMS APPROACH (caps by me) in the sense of "systems biology" is needed if we are to be able to understand and engineer the future web.

I will study the article and write to the authors about "internet systematics"; I hope that Sir Tim Berners-Lee will remember our co-existence in the NIR/USIS working group report. I will also confess that I did not follow the path he offered, unlike colleague Peter Flynn, because I chose to sort out the first batch of CISCO routers in Greece (1 AGS, 2 MGS, 4 IGS), thinking that, being a software person, there was time to catch up. In fact I did forecast in 1992 to the ARIADNE NOC personnel the success of WWW over GOPHER, although the latter was king with a 3000% growth rate against WWW's 300% (measured on NSFNET),

but global devs flew fast, and I never made it to the devs team!


Thursday, December 11, 2008

intra-disciplinary in systems thinking

In previous postings I use the term Internet Systematics to denote an approach very similar to Systems Thinking as given in the full review of the field by C. Francois. A comment on a previous posting underlined the problem: what is a systems approach?

George Adamopoulos, the constant postmaster (*), located a relevant paper, a very welcome comment that keeps me going in this "systems" direction.

The title was "A bootstrap is cooking", signaling the feeling that a legitimization of the work is on the way. I am close, but not quite there. How can you express something to Systemists when you are not certified in their full language, and how do you introduce Systems thinking to Networkers without going into details that surely do not interest them, and for what purpose?

The best I have done so far is the book The Phenomenon of Science, a really simple introduction, though it uses the term cybernetics. But my real work is almost a redux of the book, for example Automation Quantum or Network Transition instead of Turchin's Metasystem Transition (see Wikipedia), so it amounts to talking about two evolutionary cases.

Systems, General Systems, Cybernetics is really an inter-disciplinary approach applied across Science fields like Biology, Economy etc.

Internet Systematics (or Systemics) seems to constitute an intra-disciplinary systems approach applied across the complexity layers of the Network phenomenon. Hmm?

Googling the term leads to a PhD program at MIT!

Intra-disciplinary knowledge areas

The intradisciplinary computing knowledge areas are organized into the three I's: interaction, informatics, and infrastructure.

Interaction refers to topics related to the combined action of two or more entities (human or computational) that both affect one another and work together when facilitated by technology. It in turn encompasses several subtopics relating to how people and technology interact and interface.

Informatics is the study of computational/algorithmic techniques applied to the management and understanding of data-intensive systems. It focuses on the capture, storage, processing, analysis, and interpretation of data. Topics include primarily algorithms, complexity, and discovery informatics.

Infrastructure comprises aspects primarily related to hardware, software (both system software and applications), communications technology, and their integration with computing systems through applications. The focus is on the best organization of these elements to provide optimal architectural solutions. It includes, on the hardware side, system-level design (e.g., for system-on-a-chip solutions) and their building block components. On the software side, it covers all aspects of systems and applications software development, including specification and design languages and standards; validation and prototyping, and multi-dimensional Quality-of-Service management; software product lines, model-driven architectures, component-based development, and domain-specific languages; and product estimation, tracking, and oversight. The communications subtopic includes sensor networks and protocols, as well as active networks, wireless networks, mobile networks, configurable networks, and high speed networks as well as network security and privacy, quality of service, reliability, service discovery, and integration and interworking across heterogeneous networks. At the system level there are issues related to conformance and certification; system dependability, fault tolerance, verifiable adaptability, and reconfigurable systems; and real-time, self-adaptive, self-organizing, and autonomic systems.

(*) The Constant Gardener is a recent John le Carré title that I think best describes Adamo's consistent attention to Network Administration, a noble profession and discipline still off the radar of many Universities and Research Centers.