
Abstract
This paper, the second in a three-paper sequence, presents the nesC model of a Hopfield neural network configured for a static optimization problem, the maximum independent set, operating in fully parallel and distributed mode on TinyOS-based wireless sensor networks. Actual nesC code implementing the required neural computing functionality is presented. The graph representation of the maximum independent set problem serves as the basis for the topology of both the Hopfield network and the wireless sensor network, since each mote houses one neuron in order to facilitate fully parallel and distributed computation. The nesC implementation of the phases of computation is detailed, including initialization of the neural network, relaxation, convergence detection, and solution detection, all performed directly on the wireless sensor network. Simulation of the presented nesC-TinyOS model is deferred to the third paper in the sequence.
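As a rough illustration of the computation the abstract outlines, the following is a minimal, centralized C sketch of a discrete Hopfield-style update rule for the maximum independent set: a neuron stays on only if its excitatory bias outweighs the inhibition received from active graph neighbors. The example graph, the bias and inhibition constants, the sequential sweep order (used here so the sketch provably terminates), and all identifiers are assumptions made for illustration; the paper itself implements the neurons in nesC, one per mote, updating in fully parallel and distributed fashion.

```c
/*
 * Illustrative, centralized sketch of a discrete Hopfield-style relaxation
 * for the maximum independent set problem.  In the paper each neuron runs
 * on its own mote in nesC; here all neurons are iterated in one process,
 * and updated sequentially within each sweep so the loop terminates.
 * The graph, bias, and inhibitory weight are assumptions for illustration.
 */
#include <stdio.h>

#define N 5            /* number of graph vertices = neurons = motes        */
#define BIAS 1         /* excitatory bias toward joining the set            */
#define INHIBITION 2   /* penalty per active neighbor (must exceed BIAS)    */

/* Adjacency matrix of a small example graph (a 5-cycle). */
static const int adj[N][N] = {
    {0, 1, 0, 0, 1},
    {1, 0, 1, 0, 0},
    {0, 1, 0, 1, 0},
    {0, 0, 1, 0, 1},
    {1, 0, 0, 1, 0},
};

int main(void) {
    int state[N], changed, sweeps = 0;

    /* Initialization phase: all neurons start "off". */
    for (int i = 0; i < N; i++) state[i] = 0;

    /* Relaxation phase: repeat sweeps until no neuron changes state. */
    do {
        changed = 0;
        for (int i = 0; i < N; i++) {
            int net = BIAS;
            for (int j = 0; j < N; j++)
                net -= INHIBITION * adj[i][j] * state[j];
            int next = (net > 0) ? 1 : 0;   /* hard-threshold activation */
            if (next != state[i]) {
                state[i] = next;
                changed = 1;
            }
        }
        sweeps++;
    } while (changed);                       /* convergence detection */

    /* Solution detection: neurons left "on" form an independent set. */
    printf("Converged after %d sweeps; independent set:", sweeps);
    for (int i = 0; i < N; i++)
        if (state[i]) printf(" %d", i);
    printf("\n");
    return 0;
}
```

On the example 5-cycle this sketch converges in two sweeps and reports the independent set {0, 2}; in the distributed setting of the paper, each mote would instead compute only its own neuron's update from neighbor states received over the radio.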
Keywords: static optimization, wireless sensor network, TinyOS, nesC, parallel and distributed computation, maximum independent set problem, Hopfield neural network
