1 A few words about Generative Models; 1.1 Introduction; 1.2 Latent variables; 1.3 Examples of generative models; 1.3.1 Generative Adversarial Networks (GANs); 1.3.2 Variational autoencoders; …
I recently made a presentation at the regular Hong Kong Machine Learning meetup organised by Gautier Marti.
The presentation was an introduction to Julia and used as an example a SEIR model of COVID-19 that I had written.
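As a rough illustration of what such a model involves (this is not the presentation's code; the function name `seir` and the parameter values β, σ, γ below are made-up placeholders), a minimal SEIR sketch in plain Julia with a forward-Euler step might look like this:

```julia
# Minimal SEIR sketch (illustrative parameters, forward-Euler integration).
# States S, E, I, R are fractions of the population: susceptible, exposed,
# infectious and recovered.
function seir(; β=0.3, σ=1/5.2, γ=1/10, days=160, dt=0.1, E0=1e-4)
    S, E, I, R = 1.0 - E0, E0, 0.0, 0.0
    steps = round(Int, days / dt)
    trajectory = Vector{NTuple{4,Float64}}(undef, steps)
    for k in 1:steps
        dS = -β * S * I                 # new exposures
        dE =  β * S * I - σ * E         # exposed become infectious at rate σ
        dI =  σ * E - γ * I             # infectious recover at rate γ
        dR =  γ * I
        S += dt * dS; E += dt * dE; I += dt * dI; R += dt * dR
        trajectory[k] = (S, E, I, R)
    end
    return trajectory
end

# Peak fraction of the population that is infectious at any one time.
peak_I = maximum(t -> t[3], seir())
println("Peak infectious fraction ≈ ", round(peak_I, digits=3))
```

A real implementation would more likely integrate the system with DifferentialEquations.jl than with a hand-rolled Euler loop; the sketch above only shows the structure of the equations.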
Recurrent Neural Networks (RNN); From simple RNNs to LSTMs; Long/Short Term Memory RNNs; Attention; Beyond LSTM: Transformers; Transformer-XL; Compressive Transformers; Introduction; Compression scheme; Compression training; Summary. This is the first post of a series dedicated to the Compressive Memory of Recurrent Neural Networks.
Both capstones for the HarvardX certificates are now available. Just click on the Projects link!
If Gitbooks are not your thing, there is a download link to a PDF version at the top of their main page.
After 3 months of work, the final report for the HarvardX Data Science course was submitted.
It is based on the LendingClub dataset. LendingClub is a peer-to-peer lender that matches private borrowers with investors.
DRAFT: Background; Singular matrix decomposition; Where next?; Back to SVD; Regularisation; Vector coordinates; Eigenvalues; Threshold; 2-by-2 decision matrix [TODO]; Other Principal Components methods; Limitations and further questions; Limitations; Further questions; Literature. We all have laptops.
Review; Exercises and grading; Summary. I recently completed the Stanford online version of the Machine Learning CS229 course taught by Andrew Ng. There is no need to introduce this course, which has reached stardom.