Visualization of the Technology
GpuScript
GpuScript is what enables GEM (Geometric Empirical Modeling) AI by turning one
laptop into a supercomputer. Below is a video of a Julia set fractal written in
GpuScript running on a laptop. Each frame renders in 32 microseconds, about
63,000 times faster than the same computation on a CPU.
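GpuScript's own syntax is not reproduced here, but the computation behind the video is easy to sketch. Below is the per-pixel Julia iteration in plain C# (the constant c, the resolution, and the iteration cap are arbitrary choices); on the GPU, each pixel runs as its own thread, which is where the speedup comes from.

using System;

// Illustrative Julia set iteration in plain C#, not GpuScript source.
// On the GPU, each pixel of the image would run as one concurrent thread.
class JuliaDemo
{
    const int Width = 256, Height = 256, MaxIter = 100;
    const float CRe = -0.8f, CIm = 0.156f; // a classic Julia constant

    static int JuliaIterations(float zRe, float zIm)
    {
        int i = 0;
        // Iterate z = z^2 + c until |z| > 2 or the cap is reached.
        while (i < MaxIter && zRe * zRe + zIm * zIm <= 4f)
        {
            float re = zRe * zRe - zIm * zIm + CRe;
            zIm = 2f * zRe * zIm + CIm;
            zRe = re;
            i++;
        }
        return i; // escape count maps to a pixel color
    }

    static void Main()
    {
        var image = new int[Width * Height];
        for (int y = 0; y < Height; y++)       // on a GPU: one thread per pixel
            for (int x = 0; x < Width; x++)
            {
                float re = (x - Width / 2f) / (Width / 4f);
                float im = (y - Height / 2f) / (Height / 4f);
                image[y * Width + x] = JuliaIterations(re, im);
            }
        Console.WriteLine($"center pixel escape count: {image[(Height / 2) * Width + Width / 2]}");
    }
}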
GpuScript can make a 200,000-GPU cluster supercomputer a million times faster and
more powerful. It will also enable exponential growth of Augmented Reality (AR)
and Virtual Reality (VR) technologies, which are currently limited by computing
power. Below is a demonstration of ray tracing created with GpuScript, modeling
light reflecting off the surfaces and objects in the scene.
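The core operation a ray tracer repeats billions of times is an intersection test between a ray and a surface. As a minimal sketch (plain C#, not the demo's actual GpuScript source), here is the standard ray-sphere test:

using System;
using System.Numerics;

// Minimal ray-sphere intersection test, the step a ray tracer repeats
// for every ray and every surface in the scene.
class RaySphereDemo
{
    // Returns the distance along the ray to the nearest hit, or null on a miss.
    // The ray direction is assumed to be normalized.
    static float? Intersect(Vector3 origin, Vector3 dir, Vector3 center, float radius)
    {
        Vector3 oc = origin - center;
        float b = Vector3.Dot(oc, dir);
        float c = Vector3.Dot(oc, oc) - radius * radius;
        float disc = b * b - c;              // discriminant of the quadratic
        if (disc < 0f) return null;          // ray misses the sphere
        float t = -b - MathF.Sqrt(disc);     // nearest of the two roots
        return t >= 0f ? t : (float?)null;
    }

    static void Main()
    {
        Vector3 origin = Vector3.Zero;
        Vector3 dir = Vector3.Normalize(new Vector3(0f, 0f, 1f));
        float? hit = Intersect(origin, dir, new Vector3(0f, 0f, 5f), 1f);
        Console.WriteLine(hit.HasValue ? $"hit at t = {hit.Value}" : "miss");
    }
}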
Right now there are few if any GPU debugging tools, and no GPU language
supports OOP (object-oriented programming) or functional programming styles.
GpuScript is the first language to bring these advanced features to GPU
programming. It allows GPU programming and debugging to be done entirely in C#,
even by a beginner programmer. It generates HLSL and ShaderLab GPU code and
enables the development of much larger and more complex GPU programs. The code
generator emits up to 50 lines of GPU code for every line written, making
programmers up to 50 times more productive. The video below shows a particle
simulation of one million spheres written in GpuScript.
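What makes a million-sphere simulation feasible is that each particle's update is independent, so all of them can advance in parallel. Below is a rough sketch of that per-frame update in plain C#, with Parallel.For standing in for the GPU dispatch; the gravity constant, time step, and bounce rule are arbitrary choices, not the demo's actual code.

using System;
using System.Numerics;
using System.Threading.Tasks;

// Data-parallel particle update: every sphere integrates its own motion
// independently, which is what lets a GPU advance a million per frame.
class ParticleDemo
{
    const int Count = 1_000_000;
    static readonly Vector3 Gravity = new Vector3(0f, -9.81f, 0f);

    static void Main()
    {
        var pos = new Vector3[Count];
        var vel = new Vector3[Count];
        float dt = 1f / 60f;

        // One frame of the kernel: on the GPU, each index is one thread.
        Parallel.For(0, Count, i =>
        {
            vel[i] += Gravity * dt;          // integrate velocity
            pos[i] += vel[i] * dt;           // integrate position
            if (pos[i].Y < 0f)               // bounce off the ground plane
            {
                pos[i].Y = 0f;
                vel[i].Y = -0.9f * vel[i].Y; // lose some energy on impact
            }
        });

        Console.WriteLine($"particle 0 is at {pos[0]} after one step");
    }
}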
GEM (Geometric Empirical Modeling) Neural Network / AI
To illustrate how GEM AI works from the very basics, below is the smallest and
simplest GEM AI neural network: it has one input and one output, and it draws a
straight line through two points. It has 689 concurrent layers, 1,380 nodes,
and 2,760 links. It assembles and learns in 2 milliseconds (ms), thinks every
4 ms, and can perform a million evaluations in 7 ms. A single evaluation
requires 161 million floating-point operations. GEM size increases
logarithmically with complexity, so a GEM neural network with millions of
training examples and large numbers of inputs and outputs is only about 1,000
times larger.
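For reference, the function this smallest network reproduces is the standard straight line through two points (x_1, y_1) and (x_2, y_2):

\[ y(x) \;=\; y_1 \;+\; \frac{y_2 - y_1}{x_2 - x_1}\,(x - x_1) \]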
The video below on the left is a visual depiction of a GEM AI neural network
with 300 hidden layers. This GEM neural network has four inputs at the top
(age, height, gender, and smoking) and one output at the bottom (lung exhale
volume). The blue and green nodes show the left and right hemispheres. The
animation on the right shows neuron activation operating in training mode in
slow motion. Depending on the application, the number of hidden layers can grow
into the thousands. The network still learns, thinks, and runs instantly
because the layers are concurrent, not sequential or recurrent, as sketched
below. Each layer can have thousands of neurons, and each neuron can be
connected to thousands of other neurons.
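The scheduling difference is easy to picture in code. In a conventional deep network, layer i must wait for layer i-1, so evaluation time grows with depth; concurrent layers can all be evaluated in a single parallel pass. The toy C# below shows only that scheduling contrast; the per-layer math is a placeholder, since GEM's actual layer computation is not described here.

using System;
using System.Linq;
using System.Threading.Tasks;

// Toy contrast between sequential and concurrent layer evaluation.
// Each "layer" is a placeholder function; only the scheduling matters.
class LayerSchedulingDemo
{
    static void Main()
    {
        int layers = 300;
        double[] input = { 42.0, 1.75, 0.0, 1.0 }; // e.g. age, height, gender, smoking

        // Sequential: layer i consumes layer i-1's output, so 300 dependent steps.
        double sequential = input.Sum();
        for (int i = 0; i < layers; i++)
            sequential = Math.Tanh(sequential) + 0.01 * i;

        // Concurrent: every layer reads the same input, so with enough
        // parallel hardware all 300 layers finish in a single step.
        var partial = new double[layers];
        Parallel.For(0, layers, i =>
            partial[i] = Math.Tanh(input.Sum() + 0.01 * i));
        double concurrent = partial.Sum() / layers;

        Console.WriteLine($"sequential: {sequential:F4}, concurrent: {concurrent:F4}");
    }
}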
Most artificial neural networks model neurons as a simple linear summation of
the inputs with an offset threshold. Biological neurons, by contrast, can
perform complex linear and non-linear operations on input signals. The GEM
(Geometric Empirical Modeling) AI neural network enables each neuron to compute
non-linear operations on its inputs, a key component for instant construction,
training, and thinking. GEM relies heavily on non-linear mathematics, a field
that is sadly lacking in both academia and industry. Such things can only be
learned from the great mathematicians of the past, who used their biological
neural networks rather than computers and GPU super-clusters.
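To make the distinction concrete, here is the conventional neuron next to a hypothetical neuron that transforms each input non-linearly before combining them. The specific per-neuron operations GEM uses are not spelled out here, so the log-squared term below is purely a placeholder.

using System;
using System.Linq;

// Conventional neuron vs. a neuron with per-input non-linear operations.
// The first form is the standard perceptron; the second is illustrative only,
// since GEM's actual per-neuron operations are not published here.
class NeuronDemo
{
    // Classic artificial neuron: weighted sum plus offset, then an activation.
    static double LinearNeuron(double[] x, double[] w, double bias) =>
        Math.Tanh(x.Zip(w, (xi, wi) => xi * wi).Sum() + bias);

    // Hypothetical non-linear neuron: each input is transformed before the
    // combination step (the log-squared transform is a placeholder choice).
    static double NonLinearNeuron(double[] x, double[] w, double bias) =>
        Math.Tanh(x.Zip(w, (xi, wi) => wi * Math.Log(1.0 + xi * xi)).Sum() + bias);

    static void Main()
    {
        double[] x = { 0.5, 2.0, -1.0 };
        double[] w = { 1.0, -0.5, 0.25 };
        Console.WriteLine($"linear:     {LinearNeuron(x, w, 0.1):F4}");
        Console.WriteLine($"non-linear: {NonLinearNeuron(x, w, 0.1):F4}");
    }
}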
Here's a peek inside a GEM neural network trained on 100,000 examples for
predicting diabetes, with over 300 concurrent hidden layers, 18,000 neurons,
and 1 million connection links.