Graduate Thesis Or Dissertation
 

A case for application-based linear transformers: applied towards text-to-spectrogram

https://ir.library.oregonstate.edu/concern/graduate_thesis_or_dissertations/3x816v73n

Descriptions

Abstract
Many natural language processing (NLP) tasks require deep models that are fast, efficient, and small, depending on where they are ultimately deployed, at the edge or elsewhere. Although substantial research has improved the efficiency and reduced the size of such models, lowering their downstream latency without significant trade-offs remains difficult. This thesis proposes the modular use of state-of-the-art attention linearization techniques, yielding inference run-time that is fully linear in sample size for autoregressive decoding tasks, including neural machine translation (NMT) and text-to-spectrogram (TTS), while minimizing the associated cost to downstream model performance.
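The abstract's central claim, that attention linearization makes autoregressive decoding linear in sequence length, can be sketched with kernelized linear attention (in the style of Katharopoulos et al., 2020). This is an illustrative example, not the thesis's actual implementation: the feature map `phi` and all dimensions below are assumptions. The key point is that each decoding step updates a fixed-size running state instead of re-attending over the whole prefix, so per-step cost is constant and total cost is linear.

```python
import numpy as np

def phi(x):
    # Positive feature map; elu(x) + 1 is a common choice in the
    # linear-attention literature (an assumption here, not from the thesis).
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention_decode(queries, keys, values):
    """Causal (autoregressive) linear attention via a running state."""
    d_k, d_v = keys.shape[1], values.shape[1]
    S = np.zeros((d_k, d_v))   # running sum of phi(k_i) v_i^T
    z = np.zeros(d_k)          # running sum of phi(k_i)
    outputs = []
    for q, k, v in zip(queries, keys, values):
        fk = phi(k)
        S += np.outer(fk, v)   # O(d_k * d_v) per step, independent of t
        z += fk
        fq = phi(q)
        # Normalized attention output for step t, using only the state.
        outputs.append(fq @ S / (fq @ z + 1e-6))
    return np.array(outputs)
```

Because the state `(S, z)` has fixed size, decoding a sequence of length `T` costs `O(T)` rather than the `O(T^2)` of softmax attention, which is the run-time behavior the abstract describes.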
