Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp014b29b896t
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Narasimhan, Karthik | - |
dc.contributor.author | Bechara, Jad | - |
dc.date.accessioned | 2020-08-12T12:57:45Z | - |
dc.date.available | 2020-08-12T12:57:45Z | - |
dc.date.created | 2020-05 | - |
dc.date.issued | 2020-08-12 | - |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp014b29b896t | - |
dc.description.abstract | We present MeNTAL, a Transformer-based model for neural signal processing. The model is trained to translate ECoG time series from two subjects into the English sentences corresponding to their speech, by minimizing the perplexity of the next token. We compare a classifier restriction of the model to current benchmarks on the same dataset, and show that it performs similarly to the best-known model. We then observe that our full model provides an improved framework for neural signal research through its relaxation of the problem setting. | en_US |
dc.format.mimetype | application/pdf | - |
dc.language.iso | en | en_US |
dc.title | MeNTAL: Models for Neural Transduction using Attention-based Learning | en_US |
dc.type | Princeton University Senior Theses | - |
pu.date.classyear | 2020 | en_US |
pu.department | Computer Science | en_US |
pu.pdf.coverpage | SeniorThesisCoverPage | - |
pu.contributor.authorid | 961164243 | - |
Appears in Collections: | Computer Science, 1988-2020 |
Files in This Item:
File | Description | Size | Format | Access
---|---|---|---|---
BECHARA-JAD-THESIS.pdf | | 1.38 MB | Adobe PDF | Request a copy
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.