Saturday, September 24, 2022
Thursday, September 22, 2022
NPU, TPU
https://m.thaiware.com/tips/2138.html
AWS's counterpart to the TPU/NPU is AWS Trainium, a custom accelerator for ML training: https://aws.amazon.com/machine-learning/trainium/
Monday, September 5, 2022
Latent variable, Latent Dirichlet Allocation (LDA)
https://towardsdatascience.com/latent-dirichlet-allocation-lda-9d1cd064ffa2
A latent variable in machine learning is a variable that is not directly observed or measured but is inferred from observable data. Latent variables represent hidden factors that influence the observed data and help explain patterns or relationships within it.
### Examples and Applications:
1. **Principal Component Analysis (PCA):**
- In PCA, the principal components are latent variables that capture the directions of maximum variance in the data. These components summarize the data by reducing its dimensionality while preserving as much information as possible (see the PCA sketch after this list).
2. **Hidden Markov Models (HMM):**
- In HMMs, the hidden states are latent variables that represent the underlying process generating the observed sequence of data, such as the true emotional state of a person inferred from their speech or behavior (see the HMM sketch after this list).
3. **Latent Dirichlet Allocation (LDA):**
- In LDA, a topic model, the latent variables are the topics that explain the observed words in a collection of documents. Each document is assumed to be a mixture of these topics (see the LDA sketch after this list).
4. **Autoencoders:**
- In autoencoders, the encoded representation (bottleneck layer) is a latent variable that captures the most essential features of the input data, which is then used to reconstruct the original input (see the autoencoder sketch after this list).
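
A minimal PCA sketch using scikit-learn; the random data, the 5-feature shape, and the choice of 2 components are illustrative assumptions, not from the original post.

```python
# PCA: the principal components act as latent variables summarizing the data.
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(100, 5)             # 100 observed samples, 5 measured features
pca = PCA(n_components=2)              # keep 2 latent directions of maximum variance
Z = pca.fit_transform(X)               # Z holds the latent (principal-component) scores
print(pca.explained_variance_ratio_)   # variance explained by each latent direction
```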
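For HMMs, a minimal sketch using the third-party hmmlearn package (an assumed dependency; the 2-state Gaussian setup and random observations are illustrative only):

```python
# HMM: the hidden states are latent variables behind the observed sequence.
import numpy as np
from hmmlearn import hmm

obs = np.random.rand(200, 1)                         # observed 1-D sequence
model = hmm.GaussianHMM(n_components=2, n_iter=50)   # 2 hidden (latent) states
model.fit(obs)                                       # EM estimates transition/emission parameters
states = model.predict(obs)                          # Viterbi-decoded latent state path
```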
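For LDA, a minimal sketch with scikit-learn's LatentDirichletAllocation; the toy documents and the 2-topic setting are assumptions for illustration:

```python
# LDA: topics are latent variables that explain the observed words in documents.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = ["cats and dogs are pets", "dogs chase cats",
        "stocks and bonds are investments", "bond markets fell today"]
counts = CountVectorizer().fit_transform(docs)                   # bag-of-words matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0)  # 2 latent topics
doc_topics = lda.fit_transform(counts)                           # per-document topic mixtures
print(doc_topics)                                                # each row sums to ~1 across topics
```

Each row of doc_topics is the inferred mixture of latent topics for one document, matching the description in the list above.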
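For autoencoders, a minimal Keras sketch (an assumed framework; the layer sizes, epoch count, and toy data are illustrative):

```python
# Autoencoder: the bottleneck layer is the latent representation of the input.
import numpy as np
from tensorflow import keras

x = np.random.rand(256, 20).astype("float32")                    # toy input data
inputs = keras.Input(shape=(20,))
latent = keras.layers.Dense(3, activation="relu")(inputs)        # bottleneck: latent code
outputs = keras.layers.Dense(20, activation="sigmoid")(latent)   # reconstruction of the input
autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x, x, epochs=5, verbose=0)                       # learn to reconstruct the input
encoder = keras.Model(inputs, latent)                            # extracts the latent representation
codes = encoder.predict(x, verbose=0)                            # latent codes for each sample
```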