univariate linear regression and gradient descent

this is a short summary of the first week of the machine learning course by andrew ng. supervised ml: giving the computer a data set with sample answers of interest and telling it “find the correlation between the dataset and the answers of interest”. can be: regression (used for continuous answers) or classification (used for discrete answers, typically called classes). regression: can be linear, logistic or more, idk yet. ...

January 20, 2025 · 5 min
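
the univariate linear regression + gradient descent idea from the post above can be sketched in a few lines of python. this is a toy sketch, not the course's actual code; the learning rate, iteration count, and toy data are my own choices:

```python
# minimal univariate linear regression via batch gradient descent:
# model f(x) = w*x + b, fit by minimizing the mean squared error

def gradient_descent(xs, ys, alpha=0.01, iters=10_000):
    """Fit w, b on paired data by repeatedly stepping down the cost gradient."""
    w, b = 0.0, 0.0
    m = len(xs)
    for _ in range(iters):
        # gradients of the cost J(w, b) = (1/2m) * sum((w*x + b - y)^2)
        dw = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / m
        db = sum((w * x + b - y) for x, y in zip(xs, ys)) / m
        w -= alpha * dw
        b -= alpha * db
    return w, b

# toy data generated from y = 2x + 1; the fit should recover w ≈ 2, b ≈ 1
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = gradient_descent(xs, ys)
```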

this week i started learning ml

i want to continue posting online regularly, hence this first post. i started learning ml with andrew ng’s ml specialization. notes: last week, i took an hpc course from uc boulder and took some notes. i am going to make a post about it. i learned qchem prints S0 -> SX transition energies at the SX optimized geometry, so this is technically not the adiabatic excitation energy. at the back of my mind, i am going “why would they do that instead of SX energy at the SX optimized geometry - S0 energy at the S0 optimized geometry?”, which is indeed what i typically need when doing calculations. apparently, S2 energy at the S2 minimum - S0 energy at the S2 minimum is called the vertical emission energy, while the adiabatic excitation energy (AEE) is instead S2 energy at the S2 minimum - S0 energy at the S0 minimum, and i didn’t know the difference. i learned a trick to color the files and folders differently in the terminal: ls --color. added this as an ls alias in my .bashrc and i like the view.

January 9, 2025 · 1 min
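
the vertical emission vs AEE distinction above boils down to which geometry each state's energy is evaluated at. a toy sketch with made-up energies (in eV, purely illustrative, not real calculation output):

```python
# hypothetical total energies (eV) at the two optimized geometries -
# the numbers are made up, only the bookkeeping matters

E_S2_at_S2min = 3.10  # S2 energy at the S2 minimum geometry
E_S0_at_S2min = 0.40  # S0 energy at the S2 minimum geometry
E_S0_at_S0min = 0.00  # S0 energy at the S0 minimum geometry

# vertical emission: both states evaluated at the same (S2) geometry
vertical_emission = E_S2_at_S2min - E_S0_at_S2min

# adiabatic excitation energy: each state at its own minimum
aee = E_S2_at_S2min - E_S0_at_S0min
```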