NLP:
+ text summarisation
  + unsupervised
    * extractive text summarisation (see the TextRank sketch after this list)
      * TextRank algorithm
      * LexRank
    * abstractive text summarisation
+ predictive text
+ derived variables for text
+ author detection
+ domain-specific compression
+ translation
+ before generative LLMs: bidirectional encoder representations from transformers (BERT)
+ LLMs: retrieval augmented generation (RAG; see the sketch below)
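A minimal sketch of TextRank-style extractive summarisation, assuming plain-English input and using only numpy; the naive sentence splitter, the word-overlap similarity, and the function names (`split_sentences`, `textrank_summary`) are illustrative choices, not a library API.

```python
# TextRank-style extractive summarisation: rank sentences by PageRank over a
# sentence-similarity graph, then return the top-scoring ones in document order.
import re
import numpy as np

def split_sentences(text):
    """Naive sentence splitter on ., ! or ? followed by whitespace."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]

def sentence_similarity(a, b):
    """Word-overlap similarity, normalised by the log of the sentence lengths."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if len(wa) < 2 or len(wb) < 2:
        return 0.0
    return len(wa & wb) / (np.log(len(wa)) + np.log(len(wb)))

def textrank_summary(text, n_sentences=2, damping=0.85, iters=50):
    sents = split_sentences(text)
    n = len(sents)
    if n <= n_sentences:
        return sents
    # Build the sentence-similarity graph as a row-normalised transition matrix.
    sim = np.array([[sentence_similarity(a, b) if i != j else 0.0
                     for j, b in enumerate(sents)] for i, a in enumerate(sents)])
    row_sums = sim.sum(axis=1, keepdims=True)
    trans = np.divide(sim, row_sums, out=np.full_like(sim, 1.0 / n), where=row_sums != 0)
    # Power iteration for the PageRank scores.
    scores = np.full(n, 1.0 / n)
    for _ in range(iters):
        scores = (1 - damping) / n + damping * trans.T @ scores
    # Return the top-ranked sentences in their original order.
    top = sorted(np.argsort(scores)[-n_sentences:])
    return [sents[i] for i in top]
```

Calling `textrank_summary(article_text, n_sentences=3)` would return the three highest-scoring sentences in their original order.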
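And a minimal sketch of the RAG loop (retrieve, augment the prompt, generate), assuming a toy bag-of-words retriever in place of a real embedding model or vector store; `generate` is a hypothetical placeholder for whatever LLM client is in use.

```python
# RAG in three steps: score the document store against the query, stuff the
# top passages into the prompt, then call the (externally supplied) LLM.
from collections import Counter
import math

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=2):
    """Rank the document store against the query and keep the top k passages."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def rag_answer(query, documents, generate):
    """Augment the prompt with retrieved context before calling the LLM."""
    context = "\n".join(retrieve(query, documents))
    prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"
    return generate(prompt)
```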
h3 on time series (are these supervised, or self-supervised by the nature of the data, e.g. text?)
+ LSTM
+ RNN
+ transformers/BERT
+ using all of these for generative purposes? (see the next-step prediction sketch after this list)
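A sketch of the "supervised by the nature of the data" point: for next-step prediction the targets are just the same series shifted by one step, so no external labels are needed. Assumes PyTorch; the sine-wave data, window size, and `NextStepLSTM` model are illustrative stand-ins for any sequence (for text, token IDs would replace the floats).

```python
import torch
import torch.nn as nn

def make_windows(series, window=20):
    """Slice a 1-D series into (input window, next value) training pairs."""
    xs, ys = [], []
    for i in range(len(series) - window):
        xs.append(series[i:i + window])
        ys.append(series[i + window])
    x = torch.tensor(xs, dtype=torch.float32).unsqueeze(-1)  # (N, window, 1)
    y = torch.tensor(ys, dtype=torch.float32).unsqueeze(-1)  # (N, 1)
    return x, y

class NextStepLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)          # (batch, window, hidden)
        return self.head(out[:, -1])   # predict the value right after the window

# The labels come straight from the series itself: no annotation step.
series = torch.sin(torch.linspace(0, 20, 500)).tolist()
x, y = make_windows(series)
model = NextStepLSTM()
optim = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for epoch in range(50):
    optim.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optim.step()
```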
section on transformer architecture (related to NLP?; see the sketch after this list)
+ encoders
+ decoders
+ encoder-decoder models
+ decoder-only models
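A sketch of the three configurations using PyTorch's built-in transformer layers; the shapes, layer counts, and the trick of reusing one encoder stack with a causal mask to stand in for a decoder-only model are illustrative assumptions, not a production setup.

```python
import torch
import torch.nn as nn

d_model, nhead, seq_len, batch = 64, 4, 10, 2
x = torch.rand(batch, seq_len, d_model)   # already-embedded source tokens
y = torch.rand(batch, seq_len, d_model)   # already-embedded target tokens

# Encoder-only (BERT-style): every position attends to every other position,
# giving bidirectional contextual representations for classification/tagging.
enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
memory = encoder(x)

# Decoder-only (GPT-style): here the same stack but with a causal mask, so
# position t only attends to positions <= t, as autoregressive generation needs.
causal = nn.Transformer.generate_square_subsequent_mask(seq_len)
decoder_only_out = encoder(x, mask=causal)

# Encoder-decoder (translation-style): the decoder attends to its own masked
# prefix and to the encoder output, the classic sequence-to-sequence setup.
dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
seq2seq_out = decoder(y, memory, tgt_mask=causal)
```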