
Wednesday, January 11, 2012

PAPER IN ARTIFICIAL

ARTIFICIAL NEURAL NETWORKS


In this paper, we describe the artificial evolution of adaptive neural controllers for an outdoor mobile robot equipped with a mobile camera. The robot can dynamically select its gaze direction by moving the body and/or the camera. The neural control system, which maps visual information to motor commands, is evolved online by means of a genetic algorithm, but the synaptic connections (receptive fields) from visual photoreceptors to internal neurons can also be modified by Hebbian plasticity while the robot moves in the environment. We show that robots evolved in physics-based simulations with Hebbian visual plasticity display more robust adaptive behavior when transferred to real outdoor environments, as compared to robots evolved without visual plasticity. We also show that the formation of visual receptive fields is significantly and consistently affected by active vision, as compared to receptive fields formed from grid-sampled images of the robot's environment. Finally, we show that the interplay between active vision and receptive field formation amounts to the selection and exploitation of a small and constant subset of the visual features available to the robot.
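The scheme described above combines two timescales of adaptation: a genetic algorithm evolves controller genomes across generations, while Hebbian plasticity reshapes the visual receptive fields during each individual's lifetime. The following is a minimal sketch of such a loop, not the authors' implementation: the network sizes, the plain normalized Hebb rule, the random stand-in images, and the dummy fitness term are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PHOTO, N_HIDDEN, ETA = 25, 4, 0.01  # assumed layer sizes and learning rate

def hebbian_step(W, pre, post, eta=ETA):
    """Plain Hebbian update with row-wise weight normalization (illustrative)."""
    W = W + eta * np.outer(post, pre)
    # normalize each receptive field to keep weights bounded
    return W / np.linalg.norm(W, axis=1, keepdims=True)

def lifetime_fitness(genome, steps=50):
    """Evaluate one controller; visual weights adapt by Hebb while it behaves.
    The camera input and fitness score are stand-ins (random images, dummy term)."""
    W = genome.reshape(N_HIDDEN, N_PHOTO).copy()
    score = 0.0
    for _ in range(steps):
        image = rng.random(N_PHOTO)          # stand-in for photoreceptor input
        hidden = np.tanh(W @ image)          # internal neuron activations
        W = hebbian_step(W, image, hidden)   # receptive fields adapt online
        score += hidden.max()                # dummy fitness contribution
    return score

# Minimal generational GA over the visual weight genome (illustrative)
pop = rng.standard_normal((20, N_HIDDEN * N_PHOTO))
for gen in range(5):
    fit = np.array([lifetime_fitness(g) for g in pop])
    parents = pop[np.argsort(fit)[-10:]]                           # truncation selection
    children = parents + 0.1 * rng.standard_normal(parents.shape)  # mutation
    pop = np.vstack([parents, children])
```

The key design point is that the genome only sets the initial weights; what is selected for is how well those weights adapt during the individual's lifetime.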

Keywords

Active vision, visual receptive fields, artificial evolution, learning, neural networks, mobile robots

1. INTRODUCTION

Biological vision systems filter, compress, and organize the large amount of optical stimulation as electrical signals proceed from the retina towards deeper structures of the brain. This data reduction is achieved by a layered, distributed, and topologically organized set of neurons that individually respond to specific aspects of the optical stimulus. In mammals, for example, neurons in the early stage of the visual cortex selectively respond to particular features of the environment, such as oriented edges [9], which are linear combinations of the pattern of retinal activations. Neurons in later stages of the visual cortex respond to more complex patterns that also take into account the direction of movement of the stimulus and cannot easily be reduced to a linear combination of lower-level features [1].
The features that trigger the response of a neuron represent the receptive field of that neuron. The receptive fields of cortical visual neurons are not entirely genetically determined, but develop during the first weeks of life, and there is evidence that this process may already start before birth. Studies of newborn kittens raised in boxes with only vertical texture show that these animals do not develop as many receptive fields for horizontal features as kittens raised in normal environments [8], and therefore see the world in a different way. The development of visual receptive fields occurs through Hebbian synaptic plasticity, an adaptive process based on the degree of correlated activity of pre- and post-synaptic neurons. This amounts to a bottom-up, data-driven, and self-organizing process that captures the statistics of the environment where the animal lives. Simple computational models, in the form of feed-forward neural networks with Hebbian learning, develop receptive fields that resemble those found in the early stages of the mammalian visual cortex when exposed to input signals drawn from uniform contrast distributions or from large sets of natural images [6].
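The statistics-capturing process described above can be sketched with the simplest such model: a single linear neuron trained by Oja's rule (a normalized Hebb rule), whose weight vector converges toward the dominant correlation axis of its inputs. The synthetic two-dimensional "patches" below are an illustrative stand-in for natural-image statistics, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "image patches": inputs strongly correlated along one fixed
# orientation, standing in for the statistics of a visual environment.
direction = np.array([3.0, 1.0]) / np.sqrt(10.0)   # unit vector
patches = (rng.standard_normal((5000, 1)) * direction
           + 0.1 * rng.standard_normal((5000, 2)))  # signal + small noise

w = rng.standard_normal(2)   # initial random receptive field
eta = 0.01                   # learning rate
for x in patches:
    y = w @ x                    # post-synaptic activity (linear neuron)
    w += eta * y * (x - y * w)   # Oja's rule: Hebb term minus decay

# After training, w aligns with the dominant correlation axis of the
# inputs: the receptive field reflects the environment's statistics.
alignment = abs(w @ direction) / np.linalg.norm(w)
```

Because Oja's rule subtracts a decay proportional to the squared output, the weight norm stays bounded and the learned receptive field converges to the first principal component of the input distribution.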


TEKONOLOGY Copyright © 2011 -- Template created by TEKONOLOGY -- Powered by Blogger