Graduate Thesis Or Dissertation
 

The suppression of learning at the hidden units of neural networks

https://ir.library.oregonstate.edu/concern/graduate_thesis_or_dissertations/6h440w94v

Descriptions

Abstract
  • Under certain conditions, a neural network may be trained to perform a specific task by altering the weights of only a portion of its synapses. In particular, it has been observed that certain three-layer feed-forward networks can be trained to perform certain tasks by adjusting only the synapses leading to the output unit. This paper investigates the conditions under which this restriction of the training process is possible. The investigation assumes that the existence of a set of weights which performs a task with low error implies that the task is learnable. Accordingly, an algorithm is developed which attempts to find such a set while taking the weights at the hidden units as fixed. Success of the method is equated with the ability to suppress learning at the hidden units. The result is a classification of tasks for which suppression is possible: in general, classification problems require learning at the hidden units, but approximation of simple continuous functions does not.
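
The abstract describes fixing the weights at the hidden units and adjusting only the synapses to the output unit. The following is a minimal sketch of that setup, not the thesis's own algorithm: it assumes a sigmoid hidden layer frozen at random weights, fits the output weights by least squares, and measures the error on a simple continuous-function task (sin x). All function names and parameter choices are illustrative assumptions.

```python
# Illustrative sketch: train only the hidden-to-output weights of a
# three-layer feed-forward network while the hidden weights stay fixed.
import numpy as np

rng = np.random.default_rng(0)

def hidden_activations(X, W_hidden, b_hidden):
    """Fixed hidden layer: random weights, sigmoid activations."""
    return 1.0 / (1.0 + np.exp(-(X @ W_hidden + b_hidden)))

def fit_output_weights(H, y):
    """Adjust only the synapses to the output unit (least-squares fit)."""
    H1 = np.hstack([H, np.ones((H.shape[0], 1))])  # append a bias column
    w, *_ = np.linalg.lstsq(H1, y, rcond=None)
    return w

def predict(X, W_hidden, b_hidden, w_out):
    H1 = np.hstack([hidden_activations(X, W_hidden, b_hidden),
                    np.ones((X.shape[0], 1))])
    return H1 @ w_out

# Task: approximate a simple continuous function, y = sin(x).
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

n_hidden = 20
W_hidden = rng.normal(size=(1, n_hidden))  # frozen at random values
b_hidden = rng.normal(size=n_hidden)

H = hidden_activations(X, W_hidden, b_hidden)
w_out = fit_output_weights(H, y)
err = np.mean((predict(X, W_hidden, b_hidden, w_out) - y) ** 2)
print(f"Mean squared error with frozen hidden units: {err:.4f}")
```

If the fit achieves low error with the hidden weights left at their random initial values, learning at the hidden units has, in the abstract's sense, been suppressed for that task; a task that cannot be fit this way would require hidden-unit learning.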
Digitization Specifications
  • File scanned at 300 ppi (monochrome) using Capture Perfect 3.0.82 on a Canon DR-9080C in PDF format. CVista PdfCompressor 4.0 was used for PDF compression and textual OCR.
