#|
2021-02-26. NNI.lisp Neural Network program in ABCL, Dr. Clark Elliott, DePaul University.
Copyright 2021-10-09 by Dr. Clark Elliott, DePaul University. All rights reserved.

Refer to Jose Luis Bermudez, Cognitive Science (2nd Ed, Chapter 8 / 3rd Ed, Chapter 5) for
background on implementing logic gates with neurons.

Basic assignment:

0. Read all the way to the bottom of this file before you start.
1. Become comfortable running this program and training neurons for NOT, AND, and OR.
2. Save some of your play output to a file. (Copy / paste from the LISP console? Screenshots?)
3. Create the train4nor and train4nand functions. (NOR is Not-OR, the opposite of OR, where
   0s become 1s and 1s become 0s. NAND is Not-AND, the opposite of AND.)
4. Save output to a file to prove you have fully implemented the two new functions,
   tested with all 4 input pairs each: 0,0 / 0,1 / 1,0 / 1,1.
5. Change epsilon several times and run again each time.
6. Save some of your new output to a file. Comment on the effect of changing epsilon.
7. If there is a checklist, complete it.

See sample runs at the bottom of this file.

Use the LISP resources I provide to look up these functions: COND, IF, WHEN, DOTIMES.

We use LET to create temporary variables for the function being DEFUNed (DEFUN means Define
Function). After the function has run, the temporary variables go away.
|#

(defun help ()
  (format t "(new-rand) to get random values for your neuron~%")
  (format t "(notgate n) (gate n n) where n is 0/1 to see current logic.~%")
  (format t "(train4NOT) (train4AND) (train4OR) are the training functions.~%")
  (format t "(help) to replay this list.~%"))

(help)

#|
FOUR GLOBAL VARIABLES: [w1, w2, threshold], representing a neuron, and [epsilon], our
learning constant. Ordinarily we would put these in a closure variable and name them *w1*
etc., but for this basic assignment let's keep the typing simple.

CDE: For generating unique neurons, use defstruct with accessors.
|#

(setf threshold 2) (setf w1 0) (setf w2 0) ; Our three global variables that define the neuron
(setf epsilon 0.25) ; The learning constant. EXPERIMENT with this value.

(defun new-rand () ; Set random starting values for input weights [w1, w2] and the threshold:
  (setf threshold (random 3)) ; (random 3) returns an integer from 0 to 2
  (setf w1 (random 3))
  (setf w2 (random 3))
  (format t "T: ~s, W1: ~s W2: ~s~%" threshold w1 w2)) ; Show the values with formatted output

(defun notgate (val)
  (let ((summ 0))
    (setf summ (* val w1))
    (if (>= summ threshold) 1 0))) ; Greater than or equal to the threshold? Yes? return 1 / No? return 0

(defun train4NOT ()
  (dotimes (junk 20) ; We don't actually need the loop variable's value, hence the name junk.
    (train-notgate 0 1)
    (train-notgate 1 0)
    (format t "T: ~s, W1: ~s~%" threshold w1)))

(defun train-notgate (v1 target) ; For this input value, what is the target output?
  (let ((delta 0) (actual 0)) ; Make some local variables.
    (setf actual (notgate v1))
    (setf delta (- target actual))
    (cond ((equal delta 0) t) ; Output is correct. (CDE: setf assoc of input val?)
          (t (setf threshold (+ threshold (* -1 epsilon delta)))
             (setf w1 (+ w1 (* epsilon delta v1)))))))

(defun gate (v1 v2) ; Send two input values to the gate function. What is the output?
  (let ((i1 (* v1 w1)) (i2 (* v2 w2)) (summ 0)) ; Multiply inputs by weights / set sum to 0
    (setf summ (+ i1 i2)) ; Sum the weighted inputs
    (if (>= summ threshold) 1 0))) ; Greater than or equal to the threshold? Yes? return 1 / No? return 0
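#|
A worked step of the perceptron convergence rule used in TRAIN-NOTGATE above and TRAIN-GATE
below (the numbers come from the sample AND run at the bottom of this file):

With epsilon = 0.25, threshold = 1, w1 = 0, w2 = 1, the call (train-gate 0 1 0) -- the AND
case "inputs 0,1 should give 0" -- computes sum = 0*0 + 1*1 = 1 >= 1, so actual = 1 and
delta = 0 - 1 = -1. The three updates are then:

  threshold <- 1 + (-1 * 0.25 * -1) = 1.25  ; threshold rises
  w1        <- 0 + (0.25 * -1 * 0)  = 0     ; input 1 was 0, so w1 is unchanged
  w2        <- 1 + (0.25 * -1 * 1)  = 0.75  ; weight on the active input drops

A wrong 1 thus pushes the neuron toward firing less on that input pattern; a wrong 0 would
push it the other way.
|#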
(defun train-gate (v1 v2 target)
  (let ((delta 0) (actual 0))
    (setf actual (gate v1 v2)) ; Get the actual current output of the gate
    (setf delta (- target actual))
    (cond ((equal delta 0) t) ; Output is correct, so no change to threshold or weights; return t [True]
          (t (setf threshold (+ threshold (* -1 epsilon delta))) ; Apply the perceptron convergence rule 3x...
             (setf w1 (+ w1 (* epsilon delta v1)))
             (setf w2 (+ w2 (* epsilon delta v2)))))))

(defun train4and ()
  (let ((rval nil))
    (dotimes (junk 30) ; 30 is the maximum number of cycles we will train
      (let ((old-threshold threshold) (old-w1 w1) (old-w2 w2)) ; Record the values at the start of the cycle
        (train-gate 0 0 0) ; Train on every input combination, passing the correct output for each...
        (train-gate 0 1 0)
        (train-gate 1 0 0)
        (train-gate 1 1 1)
        (format t "T: ~s, ~cW1: ~s ~cW2: ~s~%" threshold #\tab w1 #\tab w2)
        (when (and (equal old-threshold threshold) (equal old-w1 w1) (equal old-w2 w2)) ; Nothing changed: training complete
          (setf rval t) ;;; True that training is complete
          (return)))) ;;; Return from dotimes
    rval))

(defun train4or ()
  (let ((rval nil)) ; Set our default return value to NIL.
    (dotimes (junk 30)
      (let ((old-threshold threshold) (old-w1 w1) (old-w2 w2)) ; Record the values at the start of the cycle
        (train-gate 0 0 0)
        (train-gate 0 1 1)
        (train-gate 1 0 1)
        (train-gate 1 1 1)
        (format t "T: ~s, ~cW1: ~s ~cW2: ~s~%" threshold #\tab w1 #\tab w2)
        (when (and (equal old-threshold threshold) (equal old-w1 w1) (equal old-w2 w2)) ; Nothing changed: training complete
          (setf rval t) ;;; True that training is complete
          (return)))) ;;; Return from dotimes
    rval)) ; If training never converged, return NIL; otherwise, on success, return T

(defun train4nor ()
  ;; Your NOR code goes here:
  (format t "My NOR code goes here~%"))

(defun train4nand ()
  ;; Your NAND code goes here:
  (format t "My NAND code goes here~%"))

(new-rand) ; Make a new random-valued neuron
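;; An optional helper sketch for assignment step 4 (TRUTH-TABLE is an illustrative name,
;; not part of the required assignment code): print the gate's output for all four input
;; pairs so a training run is easy to verify.
(defun truth-table ()
  (dotimes (v1 2) ; v1 takes the values 0 and 1
    (dotimes (v2 2) ; v2 takes the values 0 and 1
      (format t "(gate ~s ~s) => ~s~%" v1 v2 (gate v1 v2)))))
;; For example, after a successful (train4or), (truth-table) should print outputs 0, 1, 1, 1.

;; Likewise an optional sketch for assignment step 5 (TRY-EPSILON is an illustrative name):
;; set the learning constant, randomize the neuron, and retrain AND, so that runs with
;; different epsilon values can be compared side by side.
(defun try-epsilon (e)
  (setf epsilon e) ; epsilon is the global learning constant set above
  (new-rand) ; Start over from fresh random weights and threshold
  (train4and)) ; Returns T if training converged within 30 cycles, else NIL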
#|
SAMPLE OUTPUT:

c:\Users\E\2021\NN>java -jar "C:\e\abcl\abcl.jar"
Armed Bear Common Lisp 1.3.3
Java 1.8.0_181 Oracle Corporation
Java HotSpot(TM) 64-Bit Server VM
Low-level initialization completed in 0.217 seconds.
Startup completed in 3.369 seconds.
Type ":help" for a list of available commands.
CL-USER(1): (load "NNI.lisp")
(new-rand) to get random values for your neuron
(notgate n) (gate n n) where n is 0/1 to see current logic.
(train4NOT) (train4AND) (train4OR) are the training functions.
(help) to replay this list.
T: 1, W1: 2 W2: 2
T
CL-USER(2): (new-rand)
T: 1, W1: 0 W2: 1
NIL
CL-USER(3): (gate 0 0)
0
CL-USER(4): (gate 0 1)
1
CL-USER(5): (gate 1 0)
0
CL-USER(6): (gate 1 1)
1
CL-USER(7): (train4and)
T: 1.0,  W1: 0.25  W2: 1.0
T: 1.0,  W1: 0.5   W2: 1.0
T: 1.25, W1: 0.5   W2: 0.75
T: 1.25, W1: 0.5   W2: 0.75
T
CL-USER(8): (gate 0 0)
0
CL-USER(9): (gate 0 1)
0
CL-USER(10): (gate 1 0)
0
CL-USER(11): (gate 1 1)
1
CL-USER(12): (train4or)
T: 0.75, W1: 0.75  W2: 1.0
T: 0.75, W1: 0.75  W2: 1.0
T
CL-USER(13): (gate 0 0)
0
CL-USER(14): (gate 0 1)
1
CL-USER(15): (gate 1 0)
1
CL-USER(16): (gate 1 1)
1
CL-USER(17): (notgate 0)
0
CL-USER(18): (notgate 1)
1
CL-USER(19): (train4not)
T: 0.75, W1: 0.5
T: 0.75, W1: 0.25
T: 0.5, W1: 0.25
T: 0.5, W1: 0.0
T: 0.25, W1: 0.0
T: 0.25, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
T: 0.0, W1: -0.25
NIL
CL-USER(20): (notgate 0)
1
CL-USER(21): (notgate 1)
0
CL-USER(22): (help)
(new-rand) to get random values for your neuron
(notgate n) (gate n n) where n is 0/1 to see current logic.
(train4NOT) (train4AND) (train4OR) are the training functions.
(help) to replay this list.
NIL
CL-USER(26): (load "nn-solutions")
STYLE-WARNING: redefining COMMON-LISP-USER::TRAIN4NOR in NIL (previously defined in #P"C:/Users/E/Dropbox/dp/587/2021/NN/NNG.lisp")
STYLE-WARNING: redefining COMMON-LISP-USER::TRAIN4NAND in NIL (previously defined in #P"C:/Users/E/Dropbox/dp/587/2021/NN/NNG.lisp")
T
CL-USER(27): (train4nor)
T: 0.5,  W1: -0.5  W2: 0.5
T: 0.5,  W1: -0.5  W2: 0.25
T: 0.5,  W1: -0.5  W2: 0.0
T: 0.25, W1: -0.5  W2: 0.0
T: 0.25, W1: -0.5  W2: -0.25
T: 0.0,  W1: -0.5  W2: -0.25
T: 0.0,  W1: -0.5  W2: -0.25
T
CL-USER(28): (gate 0 0)
1
CL-USER(29): (gate 0 1)
0
CL-USER(30): (gate 1 0)
0
CL-USER(31): (gate 1 1)
0
CL-USER(32): (train4nand)
T: -0.25, W1: -0.5  W2: -0.25
T: -0.25, W1: -0.5  W2: -0.5
T: -0.5,  W1: -0.5  W2: -0.25
T: -0.5,  W1: -0.5  W2: -0.25
T
CL-USER(33): (gate 0 0)
1
CL-USER(34): (gate 0 1)
1
CL-USER(35): (gate 1 0)
1
CL-USER(36): (gate 1 1)
0
CL-USER(37):
|#
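#|
A minimal sketch of the "defstruct with accessors" idea from the CDE note in the header, for
generating unique neurons instead of sharing one set of globals. Hypothetical: NEURON and
NEURON-GATE are illustrative names, and nothing above uses them.
|#
(defstruct neuron ; DEFSTRUCT generates MAKE-NEURON plus the accessors
  (w1 0)          ; NEURON-W1, NEURON-W2, and NEURON-THRESHOLD
  (w2 0)
  (threshold 2))

(defun neuron-gate (n v1 v2) ; Same logic as GATE, but reads one NEURON struct
  (if (>= (+ (* v1 (neuron-w1 n)) (* v2 (neuron-w2 n)))
          (neuron-threshold n))
      1
      0))

;; Example, using the trained OR values from the sample run above:
;;   (setf my-or (make-neuron :w1 0.75 :w2 1.0 :threshold 0.75))
;;   (neuron-gate my-or 1 0) => 1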