Various Class Projects in Python, September - December 2021
Objective: Practice coding and implementing algorithms from scratch in Python. Numerically compute derivatives using first- and second-order finite difference approximations. Numerically approximate integrals using the midpoint method, the trapezoidal method, and Simpson’s method. Solve smooth unconstrained optimization problems using gradient descent, stochastic gradient descent, and Newton’s method. This includes learning what the gradient is and how to compute it using techniques from vector calculus.
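To illustrate the derivative part of this objective, here is a minimal sketch of first- and second-order finite difference approximations; the test function (sin) and the step sizes are illustrative choices, not taken from the coursework.

```python
# Minimal sketch of finite difference derivative approximations.
# The test function and step sizes below are illustrative only.
import numpy as np

def forward_diff(f, x, h=1e-5):
    """First-order (forward difference) approximation of f'(x)."""
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h=1e-5):
    """Second-order (central difference) approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def second_derivative(f, x, h=1e-4):
    """Second-order central approximation of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

if __name__ == "__main__":
    x = 1.0
    print(forward_diff(np.sin, x), central_diff(np.sin, x), np.cos(x))   # f'(1) = cos(1)
    print(second_derivative(np.sin, x), -np.sin(x))                      # f''(1) = -sin(1)
```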
What I Did: Coded Python functions that implement the midpoint method, the trapezoidal method, and Simpson’s method for numerical integration.
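A minimal sketch of how these three quadrature rules can be implemented on a uniform grid; the test integrand and number of subintervals are illustrative, not the assignment's.

```python
# Minimal sketch of midpoint, trapezoidal, and Simpson's rules on [a, b]
# with n uniform subintervals. The test integrand below is illustrative.
import numpy as np

def midpoint(f, a, b, n):
    h = (b - a) / n
    mids = a + h * (np.arange(n) + 0.5)           # midpoint of each subinterval
    return h * np.sum(f(mids))

def trapezoidal(f, a, b, n):
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h * (0.5 * y[0] + np.sum(y[1:-1]) + 0.5 * y[-1])

def simpson(f, a, b, n):
    if n % 2:                                     # Simpson's rule needs an even n
        n += 1
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h / 3 * (y[0] + 4 * np.sum(y[1:-1:2]) + 2 * np.sum(y[2:-1:2]) + y[-1])

if __name__ == "__main__":
    # The integral of sin on [0, pi] is exactly 2.
    for rule in (midpoint, trapezoidal, simpson):
        print(rule.__name__, rule(np.sin, 0.0, np.pi, 100))
```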
Solved the root-finding problem f(λ) = 0 arising from a population growth differential equation using Newton’s method, the secant method, and the bisection method. Coded in Python, supported by some handwritten work.
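The population-growth equation itself is not reproduced here, so the sketch below applies the three root-finding methods to a stand-in f(λ); the starting points and tolerance are likewise illustrative.

```python
# Minimal sketch of Newton's, secant, and bisection methods for f(lambda) = 0.
# The function f below is a stand-in example, not the population-growth equation.
import math

def newton(f, df, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def secant(f, x0, x1, tol=1e-10, max_iter=100):
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:                      # flat secant line; cannot proceed
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x1 - x0) < tol:
            break
    return x1

def bisection(f, a, b, tol=1e-10):
    assert f(a) * f(b) < 0, "root must be bracketed"
    while b - a > tol:
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

if __name__ == "__main__":
    f = lambda lam: lam * math.exp(lam) - 5.0       # illustrative f(lambda)
    df = lambda lam: (lam + 1.0) * math.exp(lam)
    print(newton(f, df, 1.0), secant(f, 1.0, 2.0), bisection(f, 0.0, 2.0))
```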
Implemented linear regression from scratch in Python, using the California Housing dataset from scikit-learn. Plotted the objective function value versus the iteration number.
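The write-up does not state which optimizer was used for this project, so the sketch below assumes full-batch gradient descent on the least-squares objective; the standardization, step size, and iteration count are illustrative choices.

```python
# Minimal sketch of linear regression trained by full-batch gradient descent
# on the California Housing data. Step size and iteration count are illustrative.
import numpy as np
from sklearn.datasets import fetch_california_housing

X, y = fetch_california_housing(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize features
X = np.hstack([np.ones((X.shape[0], 1)), X])      # prepend a bias column
n, d = X.shape

def objective(w):
    r = X @ w - y
    return 0.5 / n * (r @ r)                      # half mean squared error

def gradient(w):
    return X.T @ (X @ w - y) / n

w = np.zeros(d)
lr, history = 0.1, []
for it in range(500):
    history.append(objective(w))
    w -= lr * gradient(w)

print("final objective:", objective(w))
# `history` is what gets plotted against the iteration index.
```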
Implemented logistic regression from scratch using the breast cancer and MNIST datasets from scikit-learn. Plotted the objective function value versus the iteration number.
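A minimal sketch of the binary case on the breast cancer data, assuming full-batch gradient descent on the average negative log-likelihood; all hyperparameters are illustrative, and the MNIST (multiclass) case would need a softmax or one-vs-rest extension not shown here.

```python
# Minimal sketch of binary logistic regression trained by gradient descent
# on the breast cancer data. Hyperparameters are illustrative choices.
import numpy as np
from scipy.special import expit                   # numerically stable sigmoid
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)        # labels y in {0, 1}
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize features
X = np.hstack([np.ones((X.shape[0], 1)), X])      # prepend a bias column
n, d = X.shape

def objective(w):
    z = X @ w
    # average negative log-likelihood, with log(1 + e^z) written stably
    return np.mean(np.maximum(z, 0) + np.log1p(np.exp(-np.abs(z))) - y * z)

def gradient(w):
    return X.T @ (expit(X @ w) - y) / n

w = np.zeros(d)
lr, history = 0.1, []
for it in range(500):
    history.append(objective(w))
    w -= lr * gradient(w)

print("final objective:", objective(w))
```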
Wrote Python code to minimize the logistic regression objective function using Newton’s method and stochastic gradient descent with a batch size of 1. Plotted the objective function value versus the iteration or epoch. Used the breast cancer and MNIST datasets from scikit-learn to compare the two methods.
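A minimal sketch of the two optimizers compared here, again restricted to the binary breast cancer case; the Hessian damping term, step size, and epoch count are illustrative choices rather than the original settings.

```python
# Minimal sketch of Newton's method and batch-size-1 SGD for logistic
# regression on the breast cancer data. All settings are illustrative.
import numpy as np
from scipy.special import expit
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)
X = np.hstack([np.ones((X.shape[0], 1)), X])
n, d = X.shape

def nll(w):
    z = X @ w
    return np.mean(np.maximum(z, 0) + np.log1p(np.exp(-np.abs(z))) - y * z)

def newton(iters=10, damping=1e-6):
    w, history = np.zeros(d), []
    for _ in range(iters):
        history.append(nll(w))
        p = expit(X @ w)
        grad = X.T @ (p - y) / n
        H = (X.T * (p * (1 - p))) @ X / n + damping * np.eye(d)  # damped Hessian
        w -= np.linalg.solve(H, grad)                            # full Newton step
    return w, history

def sgd(epochs=5, lr=0.01, seed=0):
    rng = np.random.default_rng(seed)
    w, history = np.zeros(d), []
    for _ in range(epochs):
        history.append(nll(w))                    # record the objective once per epoch
        for i in rng.permutation(n):              # batch size 1
            w -= lr * (expit(X[i] @ w) - y[i]) * X[i]
    return w, history

w_newton, hist_newton = newton()
w_sgd, hist_sgd = sgd()
print("Newton:", hist_newton[-1], "| SGD:", hist_sgd[-1])
```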
Outcomes: Demonstrated skill in coding algorithms from scratch. Learned the foundations of numerical analysis.
Download Files to Run Code: