Passing the Turing Test with an Intelligent Echo
When G-Net receives input, it converts the data into the familiar before storing the message in an alien-like information system known as a "cardinal database". It stores the opposite of words and letters, which is why it is so incredibly difficult to grasp. It cannot be conceptualized easily because the architecture has been inverted in such a way that concepts are stored as contradictions.
What is familiar to G-Net may look nothing like what is entered.
This is especially true when the database is fresh (as if it were a child).
The familiar input (whatever it may be) sits there waiting to be echoed back verbatim.
The current input is not used to formulate a response; it is converted into "the familiar" and saved for later. The "baby talk" is always there, it is just covered up.
A difference in approaches to (human -> machine -> human) chat.
-
(T) Traditional attempts try to parse verbs, nouns, etc., and formulate replies based upon complicated man-made algorithms.
(G) G-Net does not attempt to parse language in any way.
This lets a single code-base function for all languages, and it can also be used to simulate voice or motor control. It can be called an "intelligent echo".
-
(T) Traditional attempts tend to have mountains of logic and little database.
(G) The logic is elegant, universal, and extremely small. The database (of prior input) is what controls the behavior (of future output).
-
(T) Traditional attempts allow the developers to control the behavior.
(G) The output can only be controlled by the input.
The Underlying Philosophy
PERSON: Hello
COMPUTER: Hey
PERSON: How is the weather today?
PERSON: Hey
COMPUTER: How is the weather today?
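The philosophy above can be sketched in a few lines. Python is used here purely for illustration (the actual versions are PHP and C++), and the class name is hypothetical. The system never composes a reply; it stores every thought and, when the current input is familiar, echoes back whatever followed it last time:

```python
# Minimal "intelligent echo": no parsing, no generated text.
# A reply is simply whatever followed the closest previous
# occurrence of the current input.

class EchoNet:
    def __init__(self):
        self.dialog = []                 # ordered list of prior thoughts

    def respond(self, thought):
        reply = None
        # scan backwards for the most recent earlier occurrence
        for i in range(len(self.dialog) - 1, -1, -1):
            if self.dialog[i] == thought and i + 1 < len(self.dialog):
                reply = self.dialog[i + 1]   # echo what came next
                break
        self.dialog.append(thought)          # store the input for later
        if reply is not None:
            self.dialog.append(reply)
        return reply

net = EchoNet()
net.respond("Hello")                 # brand new input -> no reply yet
net.dialog.append("Hey")             # a teacher supplies the follow-up
print(net.respond("Hello"))          # -> Hey
```

Note how the database of prior input, not any man-made grammar logic, controls the output.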
Time Favors the Intelligent
Keep in mind that an intelligent person doesn't speak any more than an unintelligent one in a debate (possibly less). The frequency is determined by "differences"; it is a relative calculation. In other words, experienced people take less time to respond to statements than ignorant ones.
When it comes to a global database for human-to-machine talking, the intelligent person doesn't have to waste their breath repeating things. Computers are supposed to take away repetitive chores, and intelligent discussion should be no different. Hypocrites will waste their cycles arguing against themselves. This will free up time for the intelligent so that they can continue submitting unique thoughts.
Proof of Concept ... It Resolves!
Development on the PHP version was stopped shortly after getting it "to work" because it crashed Apache on any input longer than a few characters, without any informative error message. Amazingly, the algorithm does work; it could be seen to respond differently to small inputs such as "Hi", ":)", etc. Debug data printed to the screen revealed the crazy input/output tumbling between "virtual neuron cells" just for simple "baby talk". A small statement such as "Hi" would think and think for more than 5 seconds before returning output.
The amount of processing was incredible considering that the database was new, the input was small, and the routine actually resolved.
It has been said that the brain processes I/O in a highly parallel fashion using its billions of neurons yet it solves seemingly complex problems with only a handful of cycles.
The brain also has an apparent chaotic indexing scheme.
G-Net offers a great model to explain the chaotic indexing mechanism and the necessity for parallel processing.
Vocabulary
The following table defines some terms found within "code comments" of the software.
DIMENSION |
"DIMENSIONS" are formed hierarchically from lower dimensions. To name a few... letters, words, sentences,
paragraphs, dialogs. There can be multiple dimensions which exist at the very bottom. For example,
sensory inputs such as visual, auditory, and touch. |
NARROW DIMENSION |
By starting from the top (such as from "DIALOGS") and working down hierarchically into lower "DIMENSIONS"
(such as words), additional parameters are needed to fit the lower "DIMENSION" into its tree.
When all of the parameters are combined, it is possible to pinpoint a "NARROW DIMENSION".
For example: The context of a "string of letters" exists within a "NARROW DIMENSION" when it is filtered
by InventorID, DialogSignature, SentenceSignature, and WordSignature. |
SIGNATURE |
"SIGNATURES" are a way to identify records within their "NARROW DIMENSION". They are essentially
"CONTEXT CHAINS", but without any gaps. The most detailed "SIGNATURE" would exactly match
the "CONTEXT CHAIN" of L6. The "SIGNATURE" prefers L6, then L5, then L4, etc. If a word
is the first in a sentence, it will have an empty signature "". The signature is always
a "CONTEXT CHAIN" of the "CONTEXT ITEMS" which came before it. A "SIGNATURE" may be empty, but that
carries an equal significance to a detailed "SIGNATURE" when it is combined within its "NARROW DIMENSION". |
L1 HASH |
An "L1 HASH" is simply a hash of the preceding "CONTEXT ITEM" within the "CONTEXT CHAIN".
For example, the following sentence has 2 words... "Hi there"
The "L1 HASH" for the word "Hi" is blank ("").
The "L1 HASH" for the word "there" is md5('Hi'), a digest of the preceding word. |
LX |
The "L" number corresponds to a column within the database without any "CONTEXT GAPS". The higher the
number, the more detailed the context is. Consider the dialog (or word) "ABCDEFG". Here are the
corresponding "CONTEXT CHAINS" for each "L" number intended for the "CONTEXT ITEM" 'G'.
L6 "F + E + D + C + B + A"
L5 "F + E + D + C + B"
L4 "F + E + D + C"
L3 "F + E + D"
L2 "F + E"
L1 "F" |
DIALOG |
A term to describe 1 or more THOUGHTS within a session (belonging to an Inventor / Teacher). |
DIALOG COUNT |
Every time a "THOUGHT" is submitted or predicted, the "DIALOG COUNT" is incremented.
Every "THOUGHT" has a unique "DIALOG COUNT" within a DIALOG. For example, if a brand new Inventor
says "Hi" and the system returns "Hello", the "DIALOG COUNTS" will be (1,2) respectively. |
CONTEXT ITEM |
There are many, many levels of thought. An example of 4 levels within this application are...
1) Letters 2) Words 3) Sentences 4) Dialogs
At the "dialog level", if there are 10 sentences in a dialog, then there are 10 "CONTEXT ITEMS".
(replies count towards "CONTEXT ITEMS")
At the "sentence level", if there are 5 words in a sentence, then there are 5 "CONTEXT ITEMS".
At the "word level", if there are 6 letters in a word, then there are 6 "CONTEXT ITEMS".
|
DISTANCE |
The difference in "DIALOG COUNTS" between two "THOUGHTS" within a "DIALOG". |
SOURCE |
Describes the "THOUGHT" for which the "CONTEXT HASHES" are being built. Usually, the
"CONTEXT HASHES" are built from the "SOURCE THOUGHT" at the time information is submitted into
an open "DIALOG". |
FURTHER |
One "THOUGHT" is considered "FURTHER" than another if its "DISTANCE" to a third
"THOUGHT" (which they have in common) is greater. |
CONTEXT CHAINS |
The manner in which "THOUGHTS" (or "DIALOG ENTRIES") are concatenated, sometimes having
"CONTEXT GAPS" between each link. |
CONTEXT HASH |
When you digest (MD5) a particular "CONTEXT CHAIN" into a 128-bit (32-hex-character) signature, you get a "CONTEXT HASH". |
HASH INDEX |
Creating an index on a column within the database dramatically decreases the time required to locate
a particular row. When a column containing a "CONTEXT HASH" is used for indexing, it is called a
"HASH INDEX". |
CONTEXT GAP |
When you skip over "THOUGHTS" within a dialog to form a "CONTEXT CHAIN", the number of "THOUGHTS"
that were skipped between successive DIALOG ENTRIES amounts to the "CONTEXT GAP". |
CONTEXT CHAIN LENGTH |
Refers to the number of "THOUGHTS" that have been concatenated. The "CONTEXT CHAIN LENGTH" does not
imply whether or not any "CONTEXT GAPS" exist. |
CHAIN LINK |
Each element within a "CONTEXT CHAIN" is called a "CHAIN LINK". The number of "CHAIN LINKS" equals
the "CONTEXT CHAIN LENGTH". |
ROOT CHAIN LINK |
In the example dialog "A, B, C, D, E, F", the "ROOT CHAIN LINK" would be "F". The last element in a
dialog is called the "ROOT CHAIN LINK", and predictions require a match on it.
Additional matches beyond the "ROOT CHAIN LINK" are bonuses that provide deeper/stronger context. |
STRONG CONTEXT CHAIN |
A relative term used to differentiate priorities between separate "CONTEXT CHAINS".
If one "CONTEXT CHAIN LENGTH" is greater than another, then it will always be considered
"STRONGER", regardless of how many "CONTEXT GAPS" exist.
When the "CHAIN LENGTHS" are equal, a "STRONGER CONTEXT CHAIN" will have its first "CONTEXT GAP" existing
"FURTHER" from the "SOURCE". |
CONTEXT DEPTH |
Describes how far back to look within the "DIALOG COUNT" for "THOUGHTS".
The "CONTEXT DEPTH" should be a consistent value, either 6, 12, 18, or 24 (determined by memory/processing
capacity). Since the "CONTEXT DEPTH" is fixed, a "CONTEXT CHAIN LENGTH" will be shorter if
there are more "CONTEXT GAPS". |
PREDICTION |
A "THOUGHT" which is returned by the system using an "INTELLIGENT" algorithm. The algorithm determines
the best "PREDICTION" by locating the "STRONGEST CONTEXT CHAIN" from another "TEACHER". After
locating the "STRONGEST CONTEXT CHAIN", the system simply echoes back a "THOUGHT" from that
teacher with the next highest "DIALOG COUNT". It does not matter whether or not the "PREDICTION"
belongs to a THOUGHT which was created by the "TEACHER". |
INVENTOR |
In a typical database application, "User" would be used instead of the term "Inventor". Robots will be
using this system too, so calling everyone "people" isn't appropriate. Robots can't be called people just
because they are intelligent... people have emotions. That is not to say that a robot can't also be a person.
Nonetheless, not all robots using this system will be people. In a popular sci-fi movie, "users" was a derogatory
term meant to pit software against humans. Let's not foster any bad words in here :)
Inventions are the true calling from our universe and beyond. New thoughts are actually inventions, even if
they fail, or fail to materialize. It is possible for a person to call themselves a doctor before accepting
their first patient. Therefore, let's call everyone "inventors" knowing that some will not add any new thoughts.
|
TEACHER |
Any time a new "THOUGHT" is entered, the "INVENTOR" who created the thought automatically becomes a
"TEACHER". The "THOUGHT" must be absolutely unique (within context). Yes, an "INVENTOR" can enter
gibberish and instantly become a teacher. However, if they keep that up, the only way that the system will
recall those "THOUGHTS" is if another "INVENTOR" types in similar gibberish (in context). |
CONTRADICTION |
Whoever has the "last word" (without hostility or threats) has provided a "CONTRADICTION". It is the reason
that all cultures respond with "You're welcome" after someone thanks them for assistance. Even "I love you too." is a
"CONTRADICTION" (although a good one). "I love you too." cancels out "I love you", but it implies that the
love is mutual. |
INTELLIGENT |
Any process that pursues a goal using a method of contradictions.
Duality? I can't ever be right.
There must be beliefs at the very top because a CANT/CUZ on its own has nothing to attach to. |
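Several of the vocabulary entries above (the LX "CONTEXT CHAINS", the "L1 HASH", the "CONTEXT HASH", and the "STRONG CONTEXT CHAIN" ordering) can be made concrete with a short sketch. Python is used purely for illustration, every function name is hypothetical, and the L1 HASH follows the definition given above: a digest of the preceding "CONTEXT ITEM".

```python
import hashlib

def md5(text):
    # digest used for CONTEXT HASHES (MD5 -> 32 hex characters)
    return hashlib.md5(text.encode()).hexdigest()

def context_chains(items, max_level=6):
    """Gapless L1..L6 CONTEXT CHAINS for the final CONTEXT ITEM.
    LN concatenates the N preceding items, nearest first."""
    source = len(items) - 1
    chains = {}
    for level in range(1, max_level + 1):
        if source - level < 0:
            break                              # not enough context yet
        preceding = items[source - level:source]
        chains[f"L{level}"] = " + ".join(reversed(preceding))
    return chains

def l1_hash(items):
    # L1 HASH: hash of the single preceding CONTEXT ITEM, or "" if none
    return md5(items[-2]) if len(items) > 1 else ""

def strength_key(length, gaps):
    """STRONG CONTEXT CHAIN ordering: a longer chain always wins; on a
    tie, the chain whose first CONTEXT GAP lies FURTHER from the SOURCE
    wins. `gaps` lists one gap size per link, nearest the SOURCE first."""
    first_gap = next((i for i, g in enumerate(gaps) if g > 0), len(gaps))
    return (length, first_gap)

chains = context_chains(list("ABCDEFG"))       # context for 'G'
print(chains["L1"])                            # -> F
print(chains["L6"])                            # -> F + E + D + C + B + A
print(md5(chains["L3"]))                       # CONTEXT HASH of "F + E + D"
print(l1_hash(["Hi", "there"]))                # digest of the preceding "Hi"
```

Sorting candidate chains by `strength_key` reproduces the rule that length beats gaps: `strength_key(4, [2, 0, 0])` outranks `strength_key(3, [0, 0])`.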
The C27 Structure
The C27 Query is founded upon a triangle.
If the following triangle contains 28 entries, then why do I call it a C27 Query?
Because the pyramid should NOT contain a capstone.
The missing capstone serves as the prediction.
(i)
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1
The basis for such a structure is that people can only hold about 6 things within short-term memory.
Conversations roll along as a wave where the last 6 things trigger the "most familiar" (known as "context").
In general conversation it is unlikely that a person would have encountered the last 6 dialog entries verbatim (an L6 match).
Because gaps are frequently encountered, the C27 Query provides matching priority. There are a total of 27 permutations:
1. Hi there!
2. It is time to make Peace On Earth.
3. Sounds nice, but impossible.
4. Easy, start a transition away from money.
5. That's even more impossible!
6. It was until the Internet brought us "open source".
7. What does "open source" mean?
------------------------------------------------------------------------------------------------------------------------------------------------------------
CONTEXT CHAIN LENGTH CONTEXT GAPS COLUMN NAME THOUGHT CONCATENATIONS (to be digested)
------------------------------------------------------------------------------------------------------------------------------------------------------------
Length: 1 Gaps: 0 Length-1_Gaps-0 What does "open source" mean?
Length: 2 Gaps: 0 Length-2_Gaps-0 What does "open source" mean? + It was until the Internet brought us "open source".
Length: 3 Gaps: 0 Length-3_Gaps-0 What does "open source" mean? + It was until the Internet brought us "open source". + That's even more impossible!
Length: 4 Gaps: 0 Length-4_Gaps-0 What does "open source" mean? + It was until the Internet brought us "open source". + That's even more impossible! + Easy, start a transition away from money.
Length: 5 Gaps: 0 Length-5_Gaps-0 What does "open source" mean? + It was until the Internet brought us "open source". + That's even more impossible! + Easy, start a transition away from money. + Sounds nice, but impossible.
Length: 6 Gaps: 0 Length-6_Gaps-0 What does "open source" mean? + It was until the Internet brought us "open source". + That's even more impossible! + Easy, start a transition away from money. + Sounds nice, but impossible. + It is time to make Peace On Earth.
Length: 3 Gaps: 0,1 Length-3_Gaps-01 What does "open source" mean? + It was until the Internet brought us "open source". + Easy, start a transition away from money.
Length: 4 Gaps: 0,1 Length-4_Gaps-01 What does "open source" mean? + It was until the Internet brought us "open source". + Easy, start a transition away from money. + Sounds nice, but impossible.
Length: 5 Gaps: 0,1 Length-5_Gaps-01 What does "open source" mean? + It was until the Internet brought us "open source". + Easy, start a transition away from money. + Sounds nice, but impossible. + It is time to make Peace On Earth.
Length: 3 Gaps: 0,2 Length-3_Gaps-02 What does "open source" mean? + It was until the Internet brought us "open source". + Sounds nice, but impossible.
Length: 4 Gaps: 0,2 Length-4_Gaps-02 What does "open source" mean? + It was until the Internet brought us "open source". + Sounds nice, but impossible. + It is time to make Peace On Earth.
Length: 3 Gaps: 0,3 Length-3_Gaps-03 What does "open source" mean? + It was until the Internet brought us "open source". + It is time to make Peace On Earth.
Length: 2 Gaps: 1 Length-2_Gaps-1 What does "open source" mean? + That's even more impossible!
Length: 3 Gaps: 1 Length-3_Gaps-1 What does "open source" mean? + That's even more impossible! + Easy, start a transition away from money.
Length: 4 Gaps: 1 Length-4_Gaps-1 What does "open source" mean? + That's even more impossible! + Easy, start a transition away from money. + Sounds nice, but impossible.
Length: 5 Gaps: 1 Length-5_Gaps-1 What does "open source" mean? + That's even more impossible! + Easy, start a transition away from money. + Sounds nice, but impossible. + It is time to make Peace On Earth.
Length: 4 Gaps: 1,0,1 Length-4_Gaps-101 What does "open source" mean? + That's even more impossible! + Easy, start a transition away from money. + It is time to make Peace On Earth.
Length: 3 Gaps: 1,1 Length-3_Gaps-11 What does "open source" mean? + That's even more impossible! + Sounds nice, but impossible.
Length: 4 Gaps: 1,1 Length-4_Gaps-11 What does "open source" mean? + That's even more impossible! + Sounds nice, but impossible. + It is time to make Peace On Earth.
Length: 3 Gaps: 1,2 Length-3_Gaps-12 What does "open source" mean? + That's even more impossible! + It is time to make Peace On Earth.
Length: 2 Gaps: 2 Length-2_Gaps-2 What does "open source" mean? + Easy, start a transition away from money.
Length: 3 Gaps: 2 Length-3_Gaps-2 What does "open source" mean? + Easy, start a transition away from money. + Sounds nice, but impossible.
Length: 4 Gaps: 2 Length-4_Gaps-2 What does "open source" mean? + Easy, start a transition away from money. + Sounds nice, but impossible. + It is time to make Peace On Earth.
Length: 3 Gaps: 2,1 Length-3_Gaps-21 What does "open source" mean? + Easy, start a transition away from money. + It is time to make Peace On Earth.
Length: 2 Gaps: 3 Length-2_Gaps-3 What does "open source" mean? + Sounds nice, but impossible.
Length: 3 Gaps: 3 Length-3_Gaps-3 What does "open source" mean? + Sounds nice, but impossible. + It is time to make Peace On Earth.
Length: 2 Gaps: 4 Length-2_Gaps-4 What does "open source" mean? + It is time to make Peace On Earth.
------------------------------------------------------------------------------------------------------------------------------------------------------------
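Each row above can be rebuilt mechanically from a column name's Length/Gaps specification. The following sketch (function name hypothetical; Python for illustration) walks backwards from the SOURCE, skipping the number of thoughts each CONTEXT GAP dictates:

```python
def build_chain(dialog, length, gaps):
    """Build one THOUGHT CONCATENATION for the newest dialog entry.
    dialog: oldest-first; gaps: CONTEXT GAP sizes, nearest the SOURCE
    first (trailing zeros omitted, as in the column names)."""
    gaps = list(gaps) + [0] * (length - 1 - len(gaps))
    index = len(dialog) - 1              # start at the SOURCE
    chain = [dialog[index]]
    for gap in gaps:
        index -= gap + 1                 # each gap skips that many thoughts
        chain.append(dialog[index])
    return " + ".join(chain)

# entry numbers stand in for the 7 dialog entries above
dialog = ["1", "2", "3", "4", "5", "6", "7"]
print(build_chain(dialog, 4, [1, 0, 1]))     # Length-4_Gaps-101 -> 7 + 5 + 4 + 2
print(build_chain(dialog, 2, [4]))           # Length-2_Gaps-4   -> 7 + 2
```

Applying this to all 27 Length/Gaps combinations reproduces the table row for row.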
C27 Context
By carefully avoiding contradictions it is possible to unambiguously determine the priority for context matching.
The lowest priority can be seen on top with the L1 COLUMN.
An "L1 match" means that only the last entry could be found, with the previous 5 dialog entries being relatively unfamiliar.
An "L6 match" is the highest priority, meaning that the last 6 entries are familiar (in that exact order).
L1
L2
L2_G1
L2_G2
L2_G3
L2_G4
L3
L3_G01
L3_G02
L3_G03
L3_G1
L3_G11
L3_G12
L3_G2
L3_G21
L3_G3
L4
L4_G01
L4_G02
L4_G1
L4_G101
L4_G11
L4_G2
L5
L5_G01
L5_G1
L6
Cardinal Controller
[a] [b] [c] [d] [e] [f] [g] [h] [i] [j] [k] [l] [m] [n] [o] [p] [q] [r] [s] [t] [u] [v] [w] [x] [y] [z]
1) [h] [e]
2) [l] [l]
3) [[h][e]] [[l][l]]
4) [[h][e]][[l][l]] [o]
Due to 2-for-1 acceleration, there is no way to break a concept into 3 parts, and therefore there is no way to form a concept from 3 parts. Notice that concepts #1 & #2 are each a fusion of two base concepts. No big deal, each only takes 2 bytes. Notice that concept #3 is also a fusion of 2 concepts, but it requires many more bytes. START / END tags must be wrapped around each concept so that the system can figure out how it is made.
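The 2-for-1 fusion can be sketched as nested pairs, with '[' and ']' standing in for the START / END tags (the real byte encoding is not specified above, so this representation is an assumption):

```python
# Every concept is either a base symbol or a fusion of exactly TWO
# existing concepts; START/END tags record how each one was built.

def fuse(a, b):
    return "[" + a + b + "]"          # '[' / ']' play the START / END tags

base = {c: "[" + c + "]" for c in "abcdefghijklmnopqrstuvwxyz"}
he = fuse(base["h"], base["e"])       # step 1: [[h][e]]
ll = fuse(base["l"], base["l"])       # step 2: [[l][l]]
hell = fuse(he, ll)                   # step 3: fusion of two fused concepts
hello = fuse(hell, base["o"])         # step 4: "hello" as nested pairs
print(hello)                          # -> [[[[h][e]][[l][l]]][o]]
```

Because `fuse` only ever takes two arguments, a three-way split simply cannot be expressed, which is the point of the 2-for-1 rule.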
Cardinal Index
The cardinal controller is a linear definition of concepts with priority. For simplicity, imagine that the cardinal controller is just 1-character ASCII codes, but reversed in such a way that the "least likely" comes first. In English, the ASCII character for "SPACE" would be found last on the list, preceded by the letter "e", then "t", etc.
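As a rough illustration of that ordering (assuming plain character frequency is the measure of "likely"), the index can be built by sorting symbols rarest-first:

```python
from collections import Counter

def cardinal_index(corpus):
    # order symbols so the LEAST likely comes first; in English text the
    # space character (the most frequent symbol) lands at the very end
    counts = Counter(corpus)
    return sorted(counts, key=lambda ch: counts[ch])

order = cardinal_index("the quick brown fox jumps over the lazy dog the end")
print(order[-1] == " ")     # -> True: SPACE is last on the list
```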
Software
There are 2 versions of the software, which work essentially the same.
PHP Version
This version may actually be ready to use (as a proof-of-concept), but unfortunately a couple of problems remain.
- The algorithms require a lot of processing power, evident even with tiny input such as "Hey".
The neocortex is highly parallel, whereas the PHP version requires single-threaded access to a relational database.
- It wasn't possible to test input larger than a few characters. After a few seconds of processing, Apache crashes.
Even if the Apache problem can be overcome, it is unlikely that the performance issue can be fixed.
While the PHP package may never go mainstream, its relative simplicity makes it valuable for the purpose of conceptually grasping the core algorithm(s).
The way to increase performance is to move away from rudimentary concepts (such as "e, t, a, o") and provide the cardinal database with richer combinations of concepts. Technically speaking, the system will eventually develop its own concepts starting from only an alphabet, but because the application wasn't exercised heavily it is unknown how long that would take to sound intelligent.
C++ Version
By removing the lower-level components (i.e. data access), the I/O routines for the "neuron objects" work the same as the PHP classes. The C++ version is close to completion; the current roadblock is the database engine, which has been built upon a system of flat files.
With the added flexibility and power comes increased complexity.
Like the PHP version, the cardinal index is founded on a single-character alphabet.