A case for application-based linear transformers: applied towards text-to-spectrogram

https://ir.library.oregonstate.edu/concern/graduate_thesis_or_dissertations/3x816v73n

Abstract
Many natural language processing (NLP) tasks require deep models that are fast, efficient, and small, depending on where they are ultimately deployed, at the edge or elsewhere. While significant research has improved the efficiency and reduced the size of these models, lowering their inference latency without significant trade-offs remains difficult. This thesis proposes the modular application of state-of-the-art attention linearization techniques, yielding inference run-time that is fully linear in sample length for autoregressive decoding tasks such as neural machine translation (NMT) and text-to-spectrogram (TTS), while minimizing the associated cost to downstream model performance.
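
To make the run-time claim concrete: standard softmax attention must revisit every previous token at each decoding step, so generating n tokens costs O(n^2) overall, whereas linearized attention replaces the softmax with a kernel feature map so the past can be summarized in a fixed-size running state, making each step O(1) in sequence length and the whole decode O(n). The sketch below illustrates this recurrence using the elu(x)+1 feature map of Katharopoulos et al. (2020); it is a minimal illustrative assumption standing in for the specific linearization techniques the thesis evaluates, and the names phi and linear_attention_decode_step are hypothetical.

    import numpy as np

    def phi(x):
        # Positive feature map phi(x) = elu(x) + 1 (Katharopoulos et al., 2020).
        # np.minimum guards the unused exp branch against overflow.
        return np.where(x > 0, x + 1.0, np.exp(np.minimum(x, 0.0)))

    def linear_attention_decode_step(q, k, v, S, z):
        # One autoregressive decoding step of linear attention.
        # Instead of attending over all past tokens (O(n) per step),
        # keep a running state:
        #   S: sum over past steps of outer(phi(k_i), v_i)  -- shape (d, d)
        #   z: sum over past steps of phi(k_i)              -- shape (d,)
        # so each step costs O(d^2), independent of sequence length.
        qf, kf = phi(q), phi(k)
        S = S + np.outer(kf, v)            # accumulate key-value state
        z = z + kf                         # accumulate normalizer
        out = (qf @ S) / (qf @ z + 1e-6)   # normalized attention output
        return out, S, z

    # Usage: decode n steps in O(n) total time with constant per-step cost.
    d, n = 64, 100
    rng = np.random.default_rng(0)
    S, z = np.zeros((d, d)), np.zeros(d)
    for _ in range(n):
        q, k, v = rng.normal(size=(3, d))
        out, S, z = linear_attention_decode_step(q, k, v, S, z)

Because the state (S, z) has fixed size, the same recurrence applies whether the decoder is producing NMT tokens or TTS spectrogram frames, which is what makes the technique modular across applications.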
