What was it like to complete the Neural Networks and Deep Learning course?

Saurabh Karn
5 min read · Jan 26, 2021

--

The Deep Learning Specialisation is a series of five courses by Prof. Andrew Ng on Coursera that attempts to democratise neural nets. Prof. Ng is an Adjunct Professor at Stanford University and a co-founder of Coursera.

Towards the last quarter of 2020, I explored the idea of completing a more rigorous course on neural nets than the ones I had taken earlier. I wandered around the internet in search of courses and libraries to figure out what would stick. In this blog post I try to summarise the different phases of that learning journey.

Call to adventure: I have been trying to educate myself on the potential of AI to disrupt the law and justice space for some time now. Law tends to be status quoist and is generally resistant to change, so the thought that one could fundamentally challenge the way law and justice work seemed really cool. This is primarily what gave me the impetus to learn about neural nets and deep learning, and I jumped straight into it.

Crossing the threshold: I am well acquainted with Python and OOP, but because programming has not been my primary work for many years now, I felt my skills had gone a little rusty. To compensate, I tried my hand at writing some web-scraping scripts to gather data. In addition, I was torn between the available choices of framework, MOOC and infrastructure for building a deep learning project. Below are my choices and the rationale for each:

  1. Libraries: Pure Python first, then both PyTorch and TensorFlow (in that order). Pure Python gave me a lot of intuition into how a neural net works when modelling algorithms like logistic regression. It was not so much a choice in the DL course as a good constraint to work with. I also wanted to understand which would be the better choice once I start building real things: PyTorch or TensorFlow? Dr. Jon Krohn says in one of his talks, “In your real work you will likely face both PyTorch and TensorFlow environments, so you might as well learn both” (https://www.youtube.com/watch?v=Be5QwA-yDJE). A rough sketch of the from-scratch flavour is in the first snippet after this list.
  2. MOOC: Coursera Deep Learning specialisation, Deep Learning Illustrated and Fast.ai (in no specific order). I ended up paying for the Coursera DL specialisation, which costs around Rs. 3,500 per month until you finish all five courses. I really liked the way Prof. Andrew Ng combines the mathematics with the programming to give a wholesome flavour of the subject. I started with Fast.ai, and it does a great job of getting you started blazing fast (~6 lines of code for an image classifier; see the second snippet after this list), but I felt I did not understand a lot of the inner workings, which was mildly discomforting: I needed some grounding on the linear algebra side of things. Later I realised that Jeremy Howard covers this in the second part of the Fast.ai MOOC. (Note: (1) If you are thinking of doing the Fast.ai MOOC, I highly recommend watching the first videos of both part 1 and part 2 to get a broad sense of what’s where. (2) nbdev is mind-boggling, and even if you don’t continue with Fast.ai to learn DL, do look up Jeremy’s nbdev tutorial.)
  3. Infrastructure: Coursera gives everyone enrolled access to a Jupyter Lab environment as part of its labs. There are several other choices available, such as Google Colab, AWS, Paperspace and PythonAnywhere. I loved using Google Colab: it is a beautiful experience. AWS does a fantastic job of making raw infrastructure available, which is probably more useful when you are building a product or a service. I did not quite enjoy the Paperspace experience, which is the one Fast.ai recommends.
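
To make the “pure Python” point concrete, here is a minimal sketch of from-scratch logistic regression in the spirit of the course’s first programming exercises. It uses NumPy, and the toy dataset, learning rate and function names are my own illustrative choices, not the course’s:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, iters=1000):
    """Gradient descent on the binary cross-entropy loss.
    X: (n_features, m), one column per example; y: (1, m) of 0/1 labels."""
    n, m = X.shape
    w = np.zeros((n, 1))
    b = 0.0
    for _ in range(iters):
        a = sigmoid(w.T @ X + b)   # predictions, shape (1, m)
        dz = a - y                 # gradient of the loss w.r.t. z
        w -= lr * (X @ dz.T) / m   # dw has shape (n, 1)
        b -= lr * dz.sum() / m
    return w, b

# Tiny made-up dataset: label is 1 when the two features sum above 1.
X = np.array([[0.1, 0.9, 0.8, 0.2],
              [0.2, 0.8, 0.9, 0.1]])
y = np.array([[0, 1, 1, 0]])
w, b = train_logistic_regression(X, y)
print(sigmoid(w.T @ X + b))  # higher probabilities for the positive examples
```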
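And for comparison, the famous handful-of-lines Fast.ai experience. This follows the pet-classifier example from the fastai v2 quickstart; the API has shifted across versions (older releases use cnn_learner instead of vision_learner), so treat it as a sketch rather than gospel:

```python
from fastai.vision.all import *

path = untar_data(URLs.PETS) / 'images'

def is_cat(x):
    # In the Oxford-IIIT Pet dataset, cat-breed filenames are capitalised.
    return x[0].isupper()

dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224))

learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)
```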

The road of trials: Phew! That was a lot of stuff. I must have watched 10–15 different DL tutorials on YouTube, and at first I was super focused on Fast.ai. I think it is because of how I have traditionally been taught technical subjects that the Coursera Deep Learning specialisation turned out to be a very good choice for me. Prof. Andrew Ng says that the DL specialisation is one of the most efficient ways to learn the subject; you can get his advice on DL in this video, and watch it for motivation, inspiration or just to listen to a good AI podcast.

Abyss: I finally paid for the DL specialisation on Coursera and the journey started. I started really hard and finished two weeks’ worth of content in three days of very focused study. What really helped was keeping notes on what I was learning and trying to teach it back to myself. It also helped to reproduce some of the equations and do some mental math on the matrix sizes; I used this tool to visualise matrix multiplication and the like. But then I stopped!
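
As an example of the kind of mental math I mean, here is a small NumPy sketch of the shape bookkeeping for one layer, following the course’s convention of one example per column (the layer sizes here are made-up):

```python
import numpy as np

n_x, n_h, m = 4, 3, 5            # inputs, hidden units, examples (made-up sizes)

W1 = np.random.randn(n_h, n_x)   # weights: (n_h, n_x)
b1 = np.zeros((n_h, 1))          # bias: (n_h, 1), broadcast across the m columns
X = np.random.randn(n_x, m)      # inputs: (n_x, m), one column per example

Z1 = W1 @ X + b1                 # (n_h, n_x) @ (n_x, m) -> (n_h, m)
assert Z1.shape == (n_h, m)      # the shape check that catches most bugs
```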

It was tough to maintain the momentum, as the course has a strong mathematical bent from the very beginning, which tests your stamina. I felt I needed some inspiration, so I looked through the internet for something with a balance of both depth and breadth. For the next week or two I just did not have the strength to move on, and I looked for other resources instead, a.k.a. resource hunting.

Metamorphosis: One of the things I have started doing over the past few years is to read books on the topics I am trying to learn. While this may seem a rather elementary idea to many, I have been amazed at what can be learned, and how, once you bring it into practice. I was trying to set up my computer at the same time as I was trying to finish my Coursera course, and that is when I encountered Dr. Jon Krohn’s talk on PyTorch vs TensorFlow. I really liked the talk and ended up buying Deep Learning Illustrated, and it has been amazing. The book gives you the vocabulary and a familiarity with the DL space as a whole. From Coursera I got a good grasp of the math and vectorisation; the book gave me the language to express my understanding. Things got exciting after this, as I supplemented the rigour of the specialisation with such a fun read!

The ultimate boon: The last two weeks were very exciting, as multiple things linked up in my head and made sense. The idea of moving from explicit for-loops to a vectorised implementation was really cool, as was seeing how some common debugging issues can be traced back to mismatched matrix sizes (a small illustration follows below). I kind of breezed through the next couple of weeks and finished the course, though this time two weeks actually took two weeks and I did not squeeze them. Finally I completed the first course of my DL specialisation, and I was very happy :)
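
If you have not seen it before, the for-loop-to-vectorisation payoff is easy to demonstrate. A minimal sketch, assuming NumPy (the array size and timing approach are mine, not the course’s):

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Explicit for-loop: one Python-level multiply-add per element.
t0 = time.perf_counter()
total = 0.0
for i in range(n):
    total += a[i] * b[i]
loop_time = time.perf_counter() - t0

# Vectorised: a single call into NumPy's optimised C routines.
t0 = time.perf_counter()
dot = np.dot(a, b)
vec_time = time.perf_counter() - t0

assert np.isclose(total, dot)
print(f"for-loop: {loop_time:.3f}s  vectorised: {vec_time:.5f}s")
```

The gap, usually a couple of orders of magnitude, is exactly why the course pushes vectorisation so early.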

Returning the boon: Okay, so now what? Can I build a chatbot? — No. Can I write a DL program to distinguish between cats and dogs? — Yes. Do I have enough skills to get a job? — Probably not. I don’t think I am skilled enough to program all of these things from scratch, but then no one does that. I could use existing tools and hack together a program that works, and having the fundamentals clear gives me confidence that I will be able to identify and resolve issues I might encounter in the future. I am moving on to the next course in the series, and I hope it turns out to be just as exciting a journey as this one was.

--

Saurabh Karn

Teacher, Product Manager, Programmer, Policy Advocate, in no specific order