Sorry, everybody, for being incommunicado for a while. I was taken violently ill Thursday evening and am just now emerging from the dead. My inbox has only 287 unread messages, so I have hope that I can mostly dig out today. If you’ve emailed me since Thursday, be patient, and apologies for the delay!

Paige has just informed me that homework #7 is now due 11:59pm on April 30th (or, if you’d like 2 extra minutes, 12:01am on May 1st), a.k.a. the night before the final exam.

If you happen to get an error that says “JSON must be str not bytes” when you process your API query results, try calling .decode('utf-8') on the content you get back. (So, if resp is the name of the variable you stored the result of requests.get() in, do json.loads(resp.content.decode('utf-8')) instead of just json.loads(resp.content).)
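To make the fix concrete, here is a minimal sketch of the workaround; the payload here is a made-up stand-in for whatever your API call actually returns:

```python
import json

# Simulated raw bytes, like what requests.get(...).content gives you.
# (Hypothetical payload, just for illustration.)
raw = b'{"name": "Luke Skywalker", "height": "172"}'

# On older Pythons, json.loads() rejects bytes with the
# "must be str, not bytes" error -- decoding first avoids it:
data = json.loads(raw.decode('utf-8'))
print(data['name'])  # → Luke Skywalker
```

(With the requests library you can also sidestep the whole issue by calling resp.json(), which decodes for you.)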

Quiz #8, the final Canvas adventure of the year, has been posted and is due the last day of class. Warning: don’t try to take this quiz until after Wednesday’s class, or until after you have read lesson 16 in its entirety!

From Friday’s class, the association analysis sample code (using mlxtend), and the feature selection analysis we ran on the Jedi stuff.
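(If you want to see the core idea without installing anything, here's a tiny stdlib-only illustration of the support/confidence arithmetic that mlxtend's apriori and association_rules automate for you — the transactions below are hypothetical, not the data from class:)

```python
# Toy transactions (made-up data, standing in for the Jedi example).
transactions = [
    {'lightsaber', 'robe', 'force'},
    {'lightsaber', 'force'},
    {'robe', 'force'},
    {'lightsaber', 'robe', 'force'},
    {'robe'},
]

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Confidence of the rule {lightsaber} -> {force}:
# support of antecedent-and-consequent together,
# divided by support of the antecedent alone.
conf = support({'lightsaber', 'force'}) / support({'lightsaber'})
print(conf)  # → 1.0  (every lightsaber transaction also has force)
```

mlxtend does the same math, but efficiently mines *all* frequent itemsets and rules above your chosen thresholds instead of checking one rule by hand.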

Quiz #7, the penultimate quiz of the semester, has been posted to Canvas and is due on April 22nd at midnight. Note that unlike other quizzes, this one is open Python, so be sure to have Spyder up and ready when you take it.

Good luck!

Sorry, sudden family emergency arose today, and I won’t be at my afternoon office hours (Thurs, 4/18). Send email instead: I should be on for much of the rest of the day.

Also, I’ll be available tomorrow (Friday, 4/19) from noon-2pm for office hours in case you want to drop by then.

From Monday’s class, the code snippets that pull data from the Star Wars API, and the API itself.

Homework #7, the last one of the year, has been posted and is due next Wednesday. Good luck, and have fun!

From Wednesday’s class, the Naive Bayes classifier that used a numeric variable (height) instead of all categoricals, and the revised .csv file that contains the training data.
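(If you missed class: the trick for a numeric feature is to score it with a Gaussian density instead of a category count. This is *not* the class code — just a small stdlib sketch of that idea, on made-up heights rather than the real training .csv:)

```python
import math

# Hypothetical (height_cm, label) training pairs, standing in
# for the revised .csv from class.
train = [(150, 'short'), (155, 'short'),
         (180, 'tall'), (185, 'tall'), (178, 'tall')]

def gaussian_pdf(x, mu, sigma):
    """Normal density -- how Naive Bayes scores a numeric feature."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def class_stats(label):
    """Per-class mean, std dev, and prior estimated from the training data."""
    xs = [h for h, y in train if y == label]
    mu = sum(xs) / len(xs)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in xs) / len(xs))
    return mu, sigma, len(xs) / len(train)

def predict(height):
    """Pick the class with the largest prior * Gaussian likelihood."""
    best = None
    for label in {y for _, y in train}:
        mu, sigma, prior = class_stats(label)
        score = prior * gaussian_pdf(height, mu, sigma)
        if best is None or score > best[0]:
            best = (score, label)
    return best[1]

print(predict(152))  # → short
```

With all-categorical features you'd multiply conditional probabilities from counts instead; the Gaussian density is just the drop-in replacement for a numeric column.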