nineties
this paper is an interesting case, as it’s the first to bring in the topic of neural networks. it mentions three previous related papers, but they’re paywalled. an earlier post mentioned the resonance between the fortran era and the cold war, roughly 1957 to 1977. this paper is part of the final stages of that era, twenty years later.
for those involved with astronomy and aerospace engineering in the nineties, around rlm and wrw at utaustin for example, the world changed. aerospace shrank dramatically, and so did neural networks. thirty years later, nearing 2027, both have revived in an equally surprising fashion. all this is an interesting aspect of the project starid story, to be explored gradually over time. for an overview of neural networks, the brain makers is recommended; it made for interesting reading around 2010 while relaxing ‘at the office’ in the mcc building, since mcc plays a prominent role in the book.
a notable thing about cold-war-era neural networks is that they generally involved alternative ‘parallel’ hardware schemes, in direct opposition to classic, sequential, ‘von neumann’ hardware. simd (single instruction, multiple data) hardware was downplayed as a compromise, which makes it interesting to consider how simd in fact became completely dominant. this was possibly less of a surprise for numerical computing and fortran people from the ‘hard’ physical fields. this paper discusses using commercial ibm hardware and serves as a time capsule from the brief ascendancy of neural network chips and lisp machines.
one motivation was speed. parallel hardware neurons within the chip simultaneously process a star tracker image into a connectionist classification. in some sense starid becomes a single ‘wide operation’, as opposed to a long sequence of ‘narrow operations’ on classic hardware. another motivation was robustness and ‘fuzzy’ resistance to noise. the processing performed by the neurons essentially checks the dot products of pairs of vectors, so a match can be ‘close enough’; there’s a built-in tolerance to variations.
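as a rough sketch of the idea (not the paper’s actual network), picture the catalog patterns as rows of a weight matrix; the whole bank of neurons then scores a measured pattern in one wide matrix-vector product, with a tolerance on the best score. the cosine-similarity formulation, the names, and the tolerance value below are illustrative assumptions.

```python
import numpy as np

def build_weights(catalog_patterns):
    """rows are unit-normalized catalog feature vectors, one per star pattern."""
    w = np.asarray(catalog_patterns, dtype=float)
    return w / np.linalg.norm(w, axis=1, keepdims=True)

def classify(weights, measured, tolerance=0.05):
    """one 'wide operation': dot products against every catalog pattern at once,
    then a fuzzy 'close enough' test on the best score."""
    m = np.asarray(measured, dtype=float)
    m = m / np.linalg.norm(m)
    scores = weights @ m              # cosine similarity per catalog pattern
    best = int(np.argmax(scores))
    return best if scores[best] >= 1.0 - tolerance else None

# toy usage: three catalog patterns, one noisy measurement of the first
catalog = [[0.9, 0.1, 0.2], [0.2, 0.8, 0.3], [0.1, 0.3, 0.9]]
w = build_weights(catalog)
print(classify(w, [0.88, 0.12, 0.22]))   # prints 0, the 'close enough' match
```

on classic hardware the same search would be a loop over catalog entries; here it collapses into one matrix-vector product, which is exactly what the parallel neuron hardware, and later simd hardware, is built to do.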