The time of the end is coming. The end in general, but, more immediately, the end of our travels in Machine Learning.
This past week the content tilted more towards Python than R as we covered ground in the former that we had already covered in the latter. This delayed discussion covered topics such as Logistic Regression, K Nearest Neighbors, Decision Trees, Random Forest, Cross Validation, Feature Selection, and Hyperparameter Tuning. The one piece of common ground was Support Vector Machines, which we saw for the first time in both Python and R during the past seven days. Then there were the topics where Python served as the theater for our first viewing: Linear Discriminant Analysis and Naive Bayes, which we'll see in R next week as well. Seeing this host of Machine Learning algorithms together helped to put them in context, especially with regard to their various tradeoffs. As per usual, the homework proved to be a tremendous help as well.
Outside of classwork, the cohort continued to juggle other projects, either personal or established with companies. It's a time when one wishes there were more hours in the day. My main sights are on Kaggle's Rossmann Store Sales competition and the expansion of my Scraping Kickstarter project. The latter isn't going well, as my web-scraping script keeps failing to run to completion on the AWS instance I created. It's a bit disheartening, and it seems that my only way forward is to rewrite the script yet again.
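One pattern that can spare a long-running scraper from restarting after every crash is checkpointing: persist progress after each item so a rerun resumes where the last attempt died, and retry transient failures with backoff. The sketch below is a minimal, hypothetical illustration of that idea, not the actual Kickstarter script; `fetch_page` is a stand-in for whatever request call the real scraper makes.

```python
import json
import os
import time

CHECKPOINT = "scrape_progress.json"


def load_done(path=CHECKPOINT):
    """Return the set of URLs already scraped in earlier runs."""
    if os.path.exists(path):
        with open(path) as f:
            return set(json.load(f))
    return set()


def save_done(done, path=CHECKPOINT):
    """Persist completed URLs so a crashed run can resume."""
    with open(path, "w") as f:
        json.dump(sorted(done), f)


def fetch_page(url):
    # Placeholder for the real request; an actual script would call
    # requests/urllib here with a timeout.
    return f"<html>{url}</html>"


def scrape(urls, retries=3, path=CHECKPOINT):
    done = load_done(path)
    results = {}
    for url in urls:
        if url in done:
            continue  # already scraped in a previous run; skip it
        for attempt in range(retries):
            try:
                results[url] = fetch_page(url)
                done.add(url)
                save_done(done, path)  # checkpoint after every success
                break
            except Exception:
                time.sleep(2 ** attempt)  # exponential backoff before retry
    return results
```

With this shape, rerunning the script on the same AWS instance only touches the URLs that haven't been checkpointed yet, so an interrupted run costs minutes rather than the whole job.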
We also had our first onsite interviews, as Spotify visited as part of its drive to expand its analytics operations.
Next week is extremely short due to Thanksgiving, but it is long enough for Machine Learning to be left in the distance as the cohort leaves for pastures new. The time of arrival is set for the Monday after Thanksgiving. The destination is the Hadoop ecosystem, and we will remain there until the end of the bootcamp on December 18.