The blueprint for a new type of AI
Definitions:
Morality (any morality): a set of rules for behavior -- "Do this", "Don't do that."
Values: objects necessary for the survival of an organism (e.g., water, food, ...).
Virtues: behaviors necessary for obtaining values, according to the nature of the valuer and the nature of the value.
For every prime number, there is a unique ruler. Since such rulers can be constructed through paper folding, a general method for folding paper into unique rulers would also give a method for constructing primes.
For each color in the preceding rainbow (and there are 6 of them), the RGB components are "golden ratios" of each other. Take, for example, the first color from the left, "red": rgb(100%, 38.2%, 61.8%). Here r/b = b/g = golden ratio ≈ 1.61803398875. Going through all the
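The ratios quoted for the sample color can be checked directly. The sketch below is a minimal verification, assuming the component values are percentages as written; it confirms that r/b and b/g both approximate the golden ratio to about three decimal places.

```python
# Verify the golden-ratio relation for the sample color rgb(100%, 38.2%, 61.8%).
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.61803398875

r, g, b = 100.0, 38.2, 61.8  # component percentages from the text

# Both ratios should match PHI to within rounding of the quoted percentages.
assert abs(r / b - PHI) < 1e-3
assert abs(b / g - PHI) < 1e-3

print(round(r / b, 4), round(b / g, 4))
```

The small tolerance is needed because 38.2% and 61.8% are themselves three-digit roundings of 1/φ² and 1/φ.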
A class is a list of properties common to all the objects it subsumes. For example, the class "Tree" lists the properties common to all trees, such as color, height, or age. But having such a list is itself a property common to all classes, making the class of all classes look like this:
The hope, if not the goal, is to make an AI that is smarter than us; an AI that can teach us new things and show us our mistakes in math, physics, and elsewhere. If so, then such a machine cannot be limited by our own conclusions about what is true