ABSTRACT: Two-layer and three-layer feed-forward artificial neural networks were trained to predict behavioural performance from single-trial EEG in autistic and normal subjects in a task involving response to rare stimuli and shifting of attention between vision and audition. Eyeblink artefacts were removed from the data using a frequency-domain filter. Performance of the networks on separate test sets varied across subjects but was usually at least 80%. The networks usually converged faster and attained a somewhat greater level of performance when input was presented in the frequency domain instead of in the time domain. Analysis of the networks' failures in classifying the autistic auditory data turned up a variable response in which N270 and P700 were only occasionally present.
DESCRIPTORS: neural networks, EEG, ERP, P3, autism, attention, artefact correction, eyeblink
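The pipeline described above (frequency-domain features from single-trial EEG fed to a small feed-forward network) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, frequency cutoff, hidden-layer size, and random weights are all hypothetical stand-ins.

```python
import numpy as np

def spectral_features(trial, fs=250.0, fmax=40.0):
    # Magnitude spectrum of one EEG trial, kept up to fmax Hz.
    # fs and fmax are illustrative values, not taken from the paper.
    spectrum = np.abs(np.fft.rfft(trial))
    freqs = np.fft.rfftfreq(trial.size, d=1.0 / fs)
    return spectrum[freqs <= fmax]

def two_layer_forward(x, w_hidden, w_out):
    # Two-layer feed-forward pass with logistic units; the scalar
    # output in (0, 1) could be thresholded to label a trial's
    # behavioural outcome (e.g. correct vs. error response).
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    hidden = sigmoid(x @ w_hidden)
    return sigmoid(hidden @ w_out)

rng = np.random.default_rng(0)
trial = rng.standard_normal(256)              # one simulated 256-sample trial
x = spectral_features(trial)                  # frequency-domain feature vector
w_hidden = rng.standard_normal((x.size, 8)) * 0.1   # untrained toy weights
w_out = rng.standard_normal(8) * 0.1
y = two_layer_forward(x, w_hidden, w_out)     # prediction in (0, 1)
```

In practice the weights would be fitted by back-propagation on labelled trials; the sketch only shows how a frequency-domain input vector replaces the raw time series at the network's input layer.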
Address for correspondence: firstname.lastname@example.org