15–18 Oct 2024
Purdue University
America/Indiana/Indianapolis timezone

ANNA (Analog Neural Net Architecture)

Not scheduled
20m
Steward Center 306 (Third floor) (Purdue University)

128 Memorial Mall Dr, West Lafayette, IN 47907
Poster

Speaker

Cristian Barinaga (Purdue University)

Description

Modern AI model creation requires ample computational power during both the prediction and learning phases. Due to memory and processing constraints, edge and IoT devices that use such models are often forced to outsource optimization and training to the cloud or to pre-deployment development. This poses a problem when optimization and classification are required on the sensor and data-collection devices themselves, or when these devices operate in stand-alone environments and cannot receive such external support. In response, analog neural nets have been explored to achieve rapid net input-output speeds; however, the optimization of these nets is still managed by digital means, leaving microprocessors strained with the bulk of the computation. To overcome these constraints, the present poster explores using analog electronics to develop the framework of a self-optimizing neural network in an idealized SPICE environment. More specifically, the design of an analog ReLU neuron is presented and simulated. The design optimizes with respect to time, implemented with an array of time-differentiator and time-integrator op-amp circuits, along with mixed-signal derivative-sign identification using conventional XOR gates. The results of this inquiry suggest a future for real-time, stand-alone, self-optimizing neural nets built from miniaturized low-power analog electronics. The included parameter-optimization results also suggest the viability of training digital-domain AI models on embedded systems using built-in timers as a shortcut for optimization.
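The derivative-sign scheme described above can be paraphrased in software as sign-sign descent: a weight ramps continuously in one direction, and an XOR of the sign bits of dE/dt and dw/dt recovers the sign of the gradient dE/dw, which sets the next ramp direction. The following is a minimal Python sketch of that idea, not the poster's circuit; the single-weight toy problem, function names, and step sizes are illustrative assumptions.

```python
def relu(x):
    """Idealized ReLU transfer function, mirroring the analog neuron."""
    return max(0.0, x)

def sign_bit(x):
    """1-bit sign, as a comparator feeding an XOR gate would produce."""
    return 1 if x > 0 else 0

def train_time_based(x, target, w=0.1, step=1e-3, n_steps=2000):
    """Sign-sign descent on one weight, driven by time derivatives.

    The weight ramps at a fixed rate in some direction (playing the
    role of an integrator); an XOR of sign(dE/dt) and sign(dw/dt)
    recovers the sign of dE/dw (the differentiator + XOR stage in the
    analog design), which sets the ramp direction for the next step.
    """
    direction = 1                           # sign of dw/dt
    prev_err = (relu(w * x) - target) ** 2
    for _ in range(n_steps):
        w += direction * step               # integrator: w ramps steadily
        err = (relu(w * x) - target) ** 2
        d_err = err - prev_err              # differentiator: ~ dE/dt * dt
        # XOR of the two derivative sign bits: 1 means dE/dw < 0
        # (keep ramping up); 0 means dE/dw > 0 (ramp down).
        direction = 1 if sign_bit(d_err) ^ sign_bit(direction) else -1
        prev_err = err
    return w

w = train_time_based(x=2.0, target=1.0)     # error is minimized at w = 0.5
```

Because only the sign of the error derivative is used, the weight oscillates around the optimum with an amplitude set by the ramp step, which is the digital analogue of the ripple one would expect from the continuous-time circuit.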

Author

Cristian Barinaga (Purdue University)

Presentation materials