My First Steps toward Machine Learning
I'm just getting my feet wet, but here are a few thoughts anyway.
Over the last month or so, I've been working through Andrew Ng's famous introductory class on Machine Learning. I've completed all the programming assignments in the class and just have to finish the last two weeks of lectures and quizzes. I've learned a lot through the process, and I thought I'd share from a developer's perspective what this class brings to the table.
SO MUCH MATH
As a Cognitive Sciences major in undergrad and an experienced programmer, I thought I would have a leg up on your average person taking this class. And I did! It definitely helped to already have the concepts of neural networks down, and my programming experience let me focus on the material in the assignments instead of wrestling with the code itself.
However, I underestimated the amount of math involved. Multi-variable Calculus and especially Linear Algebra are used repeatedly throughout the course, and without an understanding of those concepts it will be difficult to understand what the algorithms are doing. If you have taken these classes in school before, then you should be able to get up to speed fairly quickly. On the other hand, if you are a self-taught programmer, especially if you are coming from the web side, you may want to brush up on the basic concepts involved before attempting this class.
Thankfully, I had taken these classes before, so although it was more than a decade ago (!), I was able to dust off the old math engine in my brain and get it working again.
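To give one concrete example of where both subjects show up (this is my own summary, not a formula quoted from the course materials): the gradient descent update for linear regression, one of the first algorithms covered, can be written as

$$\theta := \theta - \frac{\alpha}{m} X^{\top} (X\theta - y)$$

The gradient hiding inside that update comes from multivariable calculus, and evaluating it means treating the entire training set as a matrix X and the parameters as a vector θ, which is linear algebra through and through.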
Theory, Theory, Theory
If you're taking this class hoping to immediately implement something in your current application, you are looking in the wrong place. This class is a standard introductory computational mathematics class, which means it focuses heavily on the theory and the algorithms involved. The programming exercises are not really about programming but about turning mathematical functions into code. As an experienced programmer, I found the difficulty was not in writing the code but in figuring out which part of the formula I was actually implementing.
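As a rough illustration of what that translation feels like (the actual assignments are in Octave/MATLAB; this NumPy version is just my own sketch of the same idea), here is a vectorized cost function and a single gradient descent step for linear regression:

```python
import numpy as np

def compute_cost(X, y, theta):
    """Mean squared error cost for linear regression: J(theta) = (1/(2m)) * sum((X@theta - y)^2)."""
    m = len(y)                      # number of training examples
    errors = X @ theta - y          # prediction error for each example
    return (errors @ errors) / (2 * m)

def gradient_step(X, y, theta, alpha):
    """One vectorized gradient descent update: theta := theta - (alpha/m) * X^T (X@theta - y)."""
    m = len(y)
    gradient = X.T @ (X @ theta - y) / m
    return theta - alpha * gradient

# Toy data for y = 1 + 2x, with a leading column of ones for the intercept term
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
theta = np.zeros(2)
for _ in range(1000):
    theta = gradient_step(X, y, theta, alpha=0.1)
print(theta)                        # approaches [1.0, 2.0]
print(compute_cost(X, y, theta))    # approaches 0
```

The hard part was never the syntax; it was being sure that the one or two vectorized lines in the middle actually matched the formula on the assignment sheet.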
In the real world, most of these algorithms will already be implemented for you in a library, so you might wonder, "Why even bother taking this class?" I would agree that most developers could use pre-existing libraries to get something working without taking this class. Perhaps it's just a quirk of mine, but I always feel the need to understand what is going on behind the scenes. Hopefully, understanding the underlying concepts and theory will allow me to better choose which tools to use in the future.
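For contrast, here is roughly what the "just use a library" route looks like. I'm using scikit-learn purely as an illustration; it isn't something the course itself covers.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Same toy data as above, without the manually added intercept column
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([1.0, 3.0, 5.0])

model = LinearRegression()              # cost function, gradients, and solver all handled internally
model.fit(X, y)
print(model.intercept_, model.coef_)    # roughly 1.0 and [2.0]
```

It works without me knowing anything about what's inside, which is exactly why I still want to understand what's inside.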
What Next?
At this point in the class, I've realized a couple of things.
It's a Big World Out There
Again, this was naive of me, but I thought that with my background in Cognitive Sciences, I would be able to quickly absorb the information in one class and then turn to implementation. Unfortunately for me, my degree is now ancient, and I need to catch up on more than a decade of advancements while competing with people who have spent that entire decade doing nothing but machine learning.
I am planning on continuing my learning by working through some additional classes. I haven't fully decided which one to start with yet, but I am leaning towards Google's Machine Learning Crash Course to give myself some more practical implementation experience with TensorFlow. After that, I'll probably still need to take at least a couple more classes while working on some test projects. So it's still a long road ahead. That said, there was one encouraging thing.
Advancements Don't Happen That Fast
One of my key takeaways from my undergrad Cognitive Sciences degree is that, frankly, we don't know that much about the human brain or how it works. Clearly, advancements in computing have led to the democratization of machine learning, where what was formerly available only to researchers with computing clusters can now be done by almost anyone with an internet connection. However, many of the concepts are fundamentally the same as they were more than a decade ago.
I'm betting that my learning speed is fast enough that I'll be able to catch up even though I am more than a decade behind. I'll keep writing as I go, so stay tuned for more.