Andrew Ng updates his Machine Learning course

deeplearning.ai

328 points by carlosgg 4 years ago · 130 comments

criddell 4 years ago

Does anybody know if it is still free?

I took this course and Dan Boneh's cryptography course and both were truly excellent.

  • dwallin 4 years ago

    They say you can "audit" the course for free, but they employ a ton of grey patterns to get you to pay for it. I haven't been able to find out where to audit it yet.

    Update: You have to go into the individual courses within the specialization and the enroll popup will have an audit option.

    First Course is here: https://www.coursera.org/learn/neural-networks-deep-learning...

    • rg111 4 years ago

      All videos of all courses on Coursera are free. You can watch them in full without providing your credit card info.

      There are two types of courses on Coursera: free and paid.

      In the case of paid courses, you can go to the course, navigate to the "Buy Subscription" page, and click "audit the course". You can watch all the videos for free, but you don't get access to the quizzes and programming assignments (you never know what a web search will turn up ;)) ⊕. You do not get a certificate for completing a course or for completing all courses of a "Specialization".

      In the case of a free course, you get access to all the videos, quizzes, and assignments, but you don't get any kind of certificate. Instead of going to the subscription page, you can just click "Enroll" and choose the no-certificate option.

      There are some great courses in the free tier (videos + assignments, no certs) as well. Dan Boneh's Cryptography and Grossman's Programming Languages A, B, C come to mind. Also Model Thinking by Scott Page.

      There were some great discussions on HN in the past. [0][1][2]

      ⊕ There are courses where duplicates of the paid assignments and quizzes are provided as "Practice Assignments" rather than "Graded Assignments", like Martin Odersky's Functional Programming Principles in Scala MOOC.

      [0]: https://news.ycombinator.com/item?id=25245125

      [1]: https://news.ycombinator.com/item?id=16745042

      [2]: https://news.ycombinator.com/item?id=22826722

      • jklinger410 4 years ago

        > All videos of all courses in Coursera are free

        > There are some great courses in the full free tier as well.

        So the full free tier courses offer a free certification? Or else what would be the difference?

        • anvuong 4 years ago

          They give you access to the assignments. For Andrew's courses, that includes access to a Jupyter server to run your code.

    • rahimnathwani 4 years ago

      IIRC you need to pay if you want your assignments to be (auto-)graded.

      • cheriot 4 years ago

        ^ this has been the case for other Coursera classes I've done recently

    • redox99 4 years ago

      That link says "Enroll for Free" and no audit button. Maybe it's because I'm not logged in?

      • zerkten 4 years ago

        Select Enroll and then the dialog has the audit option at the bottom.

  • lijogdfljk 4 years ago

    I'll ask the opposite question: how much do these courses cost? Some quick googling led me to Coursera, but their pricing model seems a bit opaque. So if they're going to try to grey-pattern me into paying, I'd like to understand how much I would pay. I don't care about a degree from these places; I'd just like to learn.

    (specifically the crypto course sounds interesting)

    • no-reply 4 years ago

      Coursera charges a monthly $45 fee for the whole specialization, but specializations also include a 7-day trial that gives you access to all the course material and assignments across all the courses. Back in my college days (2-3 years ago), I would start a specialization and blaze through it in 6-7 days. Money saved and time well spent. Of course, now that I am working, it's not going to be that easy.

    • no-reply 4 years ago

      You can also purchase a yearly Coursera subscription for $299 or $399 and get access to all the specializations/projects on Coursera for one year.

  • synergy20 4 years ago

    To get a certificate you pay a flat rate of $49 a month; there are 5 courses in total, and all can be done in 5 months at a regular pace.

xwdv 4 years ago

Although this is the best course on ML, is it really practical for anything? Has anyone built products with things they’ve learned from this course?

  • NelsonMinar 4 years ago

    I thought it was useful but awfully low level. For example I hope to never, ever implement backpropagation again; I'm going to use whatever code is in TensorFlow or PyTorch or whatever. But as a student I'm glad I did implement it myself, once, so I understand what is going on. More broadly it demystifies the black box of machine learning methods and you can see it for the giant pile of statistical categorizing functions that it is.
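
    For anyone curious what implementing it once looks like, here is a minimal sketch of the forward and backward passes for a one-hidden-layer network in NumPy (toy code, not the course's assignment; names and shapes are just illustrative):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 3))            # 100 samples, 3 features
        y = (X[:, :1] > 0).astype(float)         # toy binary target, shape (100, 1)

        W1, b1 = 0.1 * rng.normal(size=(3, 4)), np.zeros(4)   # 3 -> 4 hidden units
        W2, b2 = 0.1 * rng.normal(size=(4, 1)), np.zeros(1)   # 4 -> 1 output
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

        lr = 0.5
        for _ in range(1000):
            h = np.tanh(X @ W1 + b1)                 # forward pass
            p = sigmoid(h @ W2 + b2)

            dz2 = (p - y) / len(X)                   # backprop: cross-entropy + sigmoid
            dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
            dz1 = (dz2 @ W2.T) * (1 - h ** 2)        # chain rule through tanh
            dW1, db1 = X.T @ dz1, dz1.sum(axis=0)

            W1 -= lr * dW1; b1 -= lr * db1           # gradient-descent update
            W2 -= lr * dW2; b2 -= lr * db2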

    The most practical takeaway I got from Ng's course was the dangers of under- and overfitting your data, and techniques for detecting when you've made that mistake.
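
    The detection itself is simple enough to sketch: compare training error with error on held-out data as model complexity grows (toy data, purely illustrative):

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(0, 1, 40)
        y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.shape)
        x_tr, y_tr, x_va, y_va = x[::2], y[::2], x[1::2], y[1::2]   # train / validation split

        for degree in (1, 3, 9):
            coeffs = np.polyfit(x_tr, y_tr, degree)
            tr_err = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
            va_err = np.mean((np.polyval(coeffs, x_va) - y_va) ** 2)
            # Underfitting: both errors high. Overfitting: training error keeps
            # falling while validation error climbs back up.
            print(f"degree {degree}  train {tr_err:.3f}  validation {va_err:.3f}")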

    • ghaff 4 years ago

      I still remember a talk by a woman from Google at an O'Reilly conference (R.I.P.) a fairly long time ago now. Part of what she discussed was Research AI vs. Applied AI. The gist was that a lot of what's in university courses, graduate programs, etc. is tilted towards Research AI, and for practical machine learning applications you can get away without much of that by using pre-built tooling.

      Of course, you want to have some understanding of what's going on under the covers but, for a lot of people, starting from first principles is quite hard and isn't really necessary.

    • vasili111 4 years ago

      Is knowing only Algebra I enough for this course?

      • NelsonMinar 4 years ago

        Not for the course I took. It relies on basic linear algebra like matrix multiplication. You could probably get through it with just coding and without understanding the math, but it wouldn't be much fun.

        Not sure about the new course.

  • MafellUser 4 years ago

    I took this course as a defensive mechanism against BS at work, especially when the consulting Data Scientists were around. In that sense it's super practical.

    ML is dominated by gigantic datasets and massive computing power, something individuals will not have a lot of.

  • screye 4 years ago

    That's like asking if a CS101 course is useful.

    It is unlikely that you could build a major product with it, but it could teach you neat tricks to speed up some parts of your work. Also, similar to CS101, it is a necessary first step towards a career in ML, so you might as well do it.

    I know a bunch of business analysts and data analysts who have gotten a job based on what they learnt in this course. Of course, they also had a STEM degree alongside it, but this course made a difference.

  • aaaaaaaaaaab 4 years ago

    In 2012 I did Andrew's original machine learning course and implemented a bespoke OCR engine for iOS, which was released in a banking app for scanning utility bills. Back then deep learning was just taking off, so I did my own backprop training in Matlab, based on Andrew's code as well. It was a pretty fun end-to-end experience, much better than just throwing stuff at TensorFlow like we do nowadays.

    • vasili111 4 years ago

      Do you think that nowadays deep learning does not require much math? If so, how much math is enough to be a truly good deep learning specialist? By deep learning specialist I mean someone who builds commercial software that uses deep learning, not someone who builds tools for deep learning.

  • rg111 4 years ago

    Oh yes.

    The things I learned here helped me gain a solid foundation, which, in turn helped me learn Deep Learning.

    And Deep Learning feeds me now.

    The good thing about this course is that it is not math-shy. It is not rigorous in terms of math (there are no proofs and so on), but math is omnipresent here.

    Andrew Ng's MOOC is among the best games in town. Ng is among the best teachers I have ever seen.

  • UmbertoNoEco 4 years ago

    No. The ugly truth is that these courses will be useless to 99% of people. Machine learning is dominated by big corporations with gigantic amounts of data and processing power. If you want to work at one of them or create a competing ML company, you need pedigree (a PhD from a well-known university), and those people aren't taking courses with fake credentials.

    You could use ML in your job/company, but then you don't need this course; you just use an ML product.

    See this course as a hobby thing, or as preparation for college if you are in HS; otherwise there are better uses of your time.

    • Tenoke 4 years ago

      There's a lot of ML happening outside of big corporations, which you can confirm by searching 'machine learning' on any job site. While it's true that you can often use ready-made ML solutions, you will often benefit from the additional knowledge when improving or adjusting them for your company's specific problem, and in interviews you will often be asked the kind of questions these courses cover.

    • woah 4 years ago

      You can get almost unlimited GPU time on Google Colab for $50 a month. I don't know why or how they pay for this, but it does bring "real research" within the reach of individuals.

      • UmbertoNoEco 4 years ago

        You can get more processing power and truly unlimited time with any semi-competent graphics card (probably costing less than a year of Colab Pro+). Pro+ is a scam: you aren't told what kind of instances you will be running on, and you don't have any guaranteed continuous running time. And even with full 24/7 access to a top-of-the-line card, that would be something like 0.001% of the power used to train any big modern ML model.

        Users complain all the time: https://www.reddit.com/r/GoogleColab/comments/sq0lia/colab_p...

    • mupuff1234 4 years ago

      > You could use ML in your job/company but then you dont need this course, you just use a ML product.

      ML product?

      • benrow 4 years ago

        For example, the Google Vision API can do some out-of-the-box classification on arbitrary images with no training needed. It covers super common cases such as explicit-content detection and object detection.

        There are more customisable products within Google where you can provide training examples and labels using a UI (AutoML I think it's called). The result is an endpoint you can use to do inference, based on the model created behind the scenes.

        I just mention these examples because I've spent a little time researching them at a high level.
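
        To make that concrete, here's roughly what a call looks like, assuming the google-cloud-vision Python client and a project already set up for it (an untested sketch; details may differ):

            from google.cloud import vision   # pip install google-cloud-vision

            client = vision.ImageAnnotatorClient()
            with open("photo.jpg", "rb") as f:
                image = vision.Image(content=f.read())

            # Pre-trained label detection: no training on your side
            for label in client.label_detection(image=image).label_annotations:
                print(label.description, label.score)

            # The explicit-content case mentioned above
            safe = client.safe_search_detection(image=image).safe_search_annotation
            print(safe.adult, safe.violence)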

    • queuebert 4 years ago

      Or maybe people want to understand what's going on under the hood of the ML products they use?

      • asey 4 years ago

        Of course, though in fairness the parent was answering the grandparent's specific question (and accurately, in my experience).

    • sydthrowaway 4 years ago

      How about joining a FAANG as a SWE and then transferring internally?

  • Choco31415 4 years ago

    I used it to help me learn ML before I could start taking the classes at my university, and it was enough to land me a research position with the Air Force.

    Admittedly, I also bought textbooks and worked through tutorials.

melling 4 years ago

Announcing that he updated his course certainly gained more attention than saying it will be available in June

https://news.ycombinator.com/item?id=31204055

I certainly was excited when I saw this headline. Thought maybe it had come out early.

laurex 4 years ago

Though I had almost zero ways to actually use what I learned from this course (and indeed never did any ML afterwards and have probably forgotten it all), it was still a really fun brain exercise to revisit some math and then see how ML thinking works. I have recommended it quite a few times.

farzatv 4 years ago

This is one of the best courses on ML.

  • smnrchrds 4 years ago

    What are the others? Any recommendations?

    • nicd 4 years ago

      I highly recommend https://course.fast.ai/. It's much more top-down: in the first lesson or two, you train a NN image classifier rather than starting with first principles and linear algebra. I found this structure to be more motivating and effective.
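
      For a sense of how compressed that first lesson is, the whole classifier is roughly this (a sketch from memory of the fastai high-level API, around the pets example, so details may be off):

          from fastai.vision.all import *

          path = untar_data(URLs.PETS) / "images"          # small dataset fastai can download
          dls = ImageDataLoaders.from_name_func(
              path, get_image_files(path), valid_pct=0.2, seed=42,
              label_func=lambda f: Path(f).name[0].isupper(),   # cats have capitalised filenames
              item_tfms=Resize(224))

          learn = vision_learner(dls, resnet34, metrics=error_rate)
          learn.fine_tune(1)                               # one pass of transfer learning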

      • woah 4 years ago

        Fast AI teaches you a bit more about being a practitioner: dealing with datasets, pointing the right algorithms at the right data, and checking whether you get good results.

        Andrew Ng, for me, did a lot more to demystify how the stuff actually works.

      • saynay 4 years ago

        I went through both, but I definitely think fastai is the better starting point.

        • bmitc 4 years ago

          Can you say more?

          • jpgvm 4 years ago

            I didn't do these particular courses, but I found it a lot easier to stay motivated with the top-down approach: first demonstrate usefulness, then deepen the fundamentals.

            When I was younger and didn't work full time or have other commitments, the bottom-up approach appealed to me more, I think partially because I had bigger time blocks to allocate; i.e., I could spend a whole weekend just learning the fundamentals of some particular thing I was interested in and reach the first levels of usefulness in that one "session".

            These days, smaller time blocks mean that I need to walk away with something that keeps the spark going for most curiosities.

            • rg111 4 years ago

              Jeremy Howard came off as anti-intellectual to me. He is always like "oh math is nothing... you do not need math... math is not needed" and stuff like that.

              Other than that, fast.ai is a great resource, and Jeremy Howard is a great instructor.

              You will learn very practical tools and tricks, and a lot of recent research is demystified, but don't expect to achieve deep, general insights.

              Also, fast.ai is a very very limited and poor library compared to PyTorch, JAX, TF, etc.

              Programming, design, and architecture decisions are outright terrible.

              I got paid to write fast.ai code at one job. I still have nightmares. I never did it again.

              But it is a nice learning resource.

              • saynay 4 years ago

                I wouldn't say Jeremy is anti-intellectual, but he does know that a lot of people get turned off from the AI field because they are afraid of the math, and a lot of other courses (used to?) start with the math. So he makes sure early and often to tell people that you don't need to understand the math that is happening deep under the hood in order to do productive, even state-of-the-art, research with AI.

                • vasili111 4 years ago

                  What is the required math for starting Ng course?

                  • saynay 4 years ago

                    An understanding of linear algebra and calculus would be useful, but I don't remember it being required.

                    I was more referring to other resources/classes I had looked at, besides Ng's course or FastAI.

            • Simon_O_Rourke 4 years ago

              > When I was younger and didn't work full time + have other commitments

              I second this - while both are great courses, I found I could only dedicate very short amounts of time recently to any kind of study, and going from the ground up more thoroughly made it seem like I was making no progress. The fast.ai top-down approach worked a bit better for me for those reasons; otherwise it would have been interesting to start with the deep dive.

              • saynay 4 years ago

                Yeah, this is probably the reason it worked better for me as well. I had bounced off of some other courses because I didn't have the time to dedicate 16+ hours of lectures before I would get to the fruit of all that foundational knowledge. Starting with some high-level abstractions, then digging down into how each of those abstractions ticked, kept the number of concepts I had to remember at once to something more manageable while dealing with distractions like a full-time job.

          • jdminhbg 4 years ago

            I did (old versions of) both of these and liked both. What I liked about the top-down approach of fast.ai is that it worked the way I approach working with other programming systems. You have a thing you want to do and APIs that promise to do that thing for you, and you plug them together. Then you decide you want to change it from the default behavior, so you tweak the parameters, then you need to learn why they're set up the way they are, and how they work, etc.

            Similarly, when I learned web development with Rails over a decade ago, I didn't start by building an HTTP stack. I started by doing the build-a-blog-in-fifteen-minutes tutorial. Now I had a working project. Eventually I needed to learn all of the underlying technologies, but it's much easier and more rewarding to have something running first.

          • saynay 4 years ago

            I found that starting with the big picture and a tangible result made it easier to stay engaged. At the end of the fastai course, however, I felt there were some gaps in my understanding especially at the low-level side. Andrew Ng's course helped fill in those gaps.

    • rripken 4 years ago

      I took these courses from Georgia Tech via OMSCS, but they are also on Udacity.

      https://omscs.gatech.edu/cs-7641-machine-learning

      https://omscs.gatech.edu/cs-7642-reinforcement-learning (I took this before ML, but it's supposed to come after. There is some overlap. Probably my favorite graduate course.)

      https://omscs.gatech.edu/cs-7646-machine-learning-trading (IMO not amazing)

      Much more basic (took this before OMSCS):

      https://www.udacity.com/course/intro-to-machine-learning--ud...

      I'm sure there are many more.

    • _odey 4 years ago

      Not a full course, I'd say, but I've used this one to learn the math behind deep neural networks and to code my own from scratch in Elixir and C:

      http://neuralnetworksanddeeplearning.com/

      • vasili111 4 years ago

        What is required math for starting Ng course?

        • _odey 4 years ago

          Can't tell you about Andrew Ng's course as I haven't done it, but for Michael Nielsen's course it was matrices and partial derivatives. I'm assuming it's quite similar.

      • lagrange77 4 years ago

        Yes, this is really good! Andrew Ng, too.

    • ForHackernews 4 years ago

      "Learning from Data" is outstanding: https://work.caltech.edu/telecourse.html

      It's a recorded version of a real Caltech undergrad course, and it's focused on understanding the math behind these algorithms, not just applying black-box ML libraries.

      It's much less practical, but I feel like it teaches you more.

    • UmbertoNoEco 4 years ago

      Depends: how much linear algebra, probability, and Python do you know?

beckingz 4 years ago

It's amazing how hard it is to stay up to date in the data space, so it will be interesting to see how this course has been updated.

pm2222 4 years ago

I finished machine-learning [1] a long time ago and it's so good. Looking forward to this [2].

[1] https://www.coursera.org/learn/machine-learning/ [2] https://www.coursera.org/learn/neural-networks-deep-learning...

  • rg111 4 years ago

    The only downside of [2] is that it is taught in Keras + TensorFlow rather than PyTorch.

xtracto 4 years ago

I took this when it was ml-class, along with the ai-class by Peter Norvig. I was in research at the time. They were entertaining, but mainly an intellectual curiosity for both academics and practitioners. Nowadays practitioners would most likely use an ML library.

kache_ 4 years ago

Really great course, highly recommend it. It demystifies so much :)

  • colordrops 4 years ago

    If one is a seasoned software engineer, but has little experience in ML or deep learning, is this course still suitable?

    • kache_ 4 years ago

      If you remember high school AP math, you're good. Otherwise, check out ISLR for a faster intro, capped by your ability to read.

      • vasili111 4 years ago

        Do you think ISLR might be outdated? I just looked at the publication date and was wondering if it is still relevant.

    • vitorbaptistaa 4 years ago

      Definitely. It starts from first principles, linear algebra, and goes from there. It's an amazing course.

      • vasili111 4 years ago

        So you need to know linear algebra to understand it? Is knowing Algebra I enough to understand it?

karaterobot 4 years ago

I'll say that the waitlist registration form is very sketchy. Agreeing to receive marketing updates is required to join a waitlist for a course? Classy move.

octagons 4 years ago

Is the registration broken? I am getting "Please complete this required field" errors on two fields that I cannot see (or fish out of the div soup that is this signup page).

  • rg111 4 years ago

    It is a dark pattern, but you need to agree to receive promo emails.

    No thanks.

    Come June, I will just check manually.

  • carlosggOP 4 years ago

    I checked the checkbox and it worked.

AiFoGhost 4 years ago

Very very excited to check this out.

bitL 4 years ago

So what, concretely, has changed?

cypress66 4 years ago

Checked if he moved it away from Matlab, and yes, he did! That's what had steered me away from his course.

  • rg111 4 years ago

    I did everything in Octave. And it was a great learning experience.

    Octave is very easy to learn if you have previous programming experience.

    You won't _write_ a lot of programs. There will be cookie-cutter code, and you will fill in some blanks: a line here, a line there.

    Trust me, Octave wasn't a deal-breaker if you gave it a try. And a lot of the code was just the formulae written out.

  • wcoenen 4 years ago

    I had no problems completing the original course in Octave. No Matlab license required.

  • czbond 4 years ago

    Question: why was the original version in Matlab? I am familiar with Python, R, and others... I get that until recently those languages might not have been much better than the ancient predecessors (LISP, etc.) for ML-related work.

    But I've never seen anything actually running in production in Matlab. Did Matlab provide something at the time that others did not? If so, how did they get Matlab models running in production? Or did they create a model with basic outcomes and then code a representation of it in C++, etc.?

    • telotortium 4 years ago

      Before Python (Numpy/Scipy) really came into its own (which didn't quite happen until early 2010s), Matlab was among the easiest-to-use scripting languages for writing scientific computing programs. I was in university (Bachelor's + Masters) from 2007-2012 and learned Matlab extensively in my numerical computing classes (I was a Physics major, for what that's worth). When you're ready to run your scientific computing codes on a supercomputer cluster, you'd usually rewrite it in C++ or Fortran (my research group used the latter), but to develop and debug at a small scale, you'd usually use Matlab, although the younger people like me might use Numpy, and there were one or two people who used Mathematica even for numerical computing (as opposed to symbolic algebra, which everybody used Mathematica for).

      It was around the time I was in university that Python really matured for numerical computing, but professors (as opposed to grad students) were likely to be already familiar with Matlab, so there wasn't much reason for them to learn Python. Andrew Ng was already a mid-career researcher when he made his course, which was probably based on older materials (I also learned basic neural networks in my numerical computing class in 2008), so it made sense for him to continue to use Matlab, especially because Octave exists as an open-source reimplementation of the basic functionality.

      These days, you wouldn't use anything but Python for ML, at least until you really productionize the implementation at large scale, at which point you might rewrite it in C++ or Rust (I don't know if they even bother rewriting these days when most of the computation happens on GPUs or TPUs). And it's my understanding, although I'm not really too familiar these days, that Matlab has mostly pivoted into providing a toolbox of all sorts of esoteric numerical methods for engineering-related tasks like finite element analysis, as well as hardware simulation (using Simulink).

      • atleta 4 years ago

        Well, Matlab didn't pivot into that esoteric toolbox of numerical methods for engineering tasks; it has always been that. That's probably the very reason ML researchers picked it up in the first place: everyone was already using it anyway for scientific (especially linear algebra) calculations.

        I did my degree a bit before you (in Electrical Engineering, and also learned all I could about NNs and other AI methods back then), and most people would use MATLAB for whatever scientific algorithms/calculations they needed to do. We had free student licences at the university so that we could use it for lab work and for our theses. I remember it had all kinds of numerical optimization algorithms/packages, control theory algorithms, etc.

      • czbond 4 years ago

        @telotortium - hey, thank you so much for adding that context; it is incredibly helpful. Now I can internalize why it was Matlab. The update to Python is going to open it up to a whole new set of learners!

  • arberx 4 years ago

    You can do them in python and submit them! https://github.com/dibgerge/ml-coursera-python-assignments

  • jcadam 4 years ago

    I took it when it was still in Matlab :) Haven't used Matlab since.

  • quux 4 years ago

    What's the class using now?

    • oogetyboogety 4 years ago

      10 bucks says it's python

      • rashkov 4 years ago

        “Graded assignments and lectures have been rebuilt to teach in Python instead of Octave“

        • shankr 4 years ago

          I almost forgot it was in Octave. I did it when it was offered for the first time by Stanford.

          • rashkov 4 years ago

            I really like the commitment to open source (and student accessibility) that led the course author to choose octave.

  • lern_too_spel 4 years ago

    The programming assignments were one or two lines in Octave. They'll turn into 10 lines of Python with indentation errors. Python is a worse pedagogical language for any course in applied linear algebra.
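
    For reference, the kind of line those assignments asked for (a vectorized gradient-descent update for linear regression) looks roughly like this in each language; an illustrative sketch, not the actual course code:

        import numpy as np

        # The Octave assignment line was essentially:
        #   theta = theta - alpha / m * X' * (X * theta - y);
        def gradient_step(theta, X, y, alpha):
            m = len(y)
            return theta - alpha / m * X.T @ (X @ theta - y)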

    • bonniemuffin 4 years ago

      OTOH, the time I spent learning Octave/Matlab for Andrew Ng's course was 100% wasted, because I've never used it again in the 10+ years since I took the class, whereas time spent learning Python would have been useful to me in myriad other ways.

      • rg111 4 years ago

        > the time I spent learning Octave/Matlab for Andrew Ng's course was 100% wasted time

        Really?

        If you have programming experience, you don't really need to learn Octave.

        Some of the code was just the formulae written out.

        In the case of the others, the whole program was written, with one or two missing lines that you had to implement.

        I spent zero time learning Octave, because there was nothing to learn.

      • nytesky 4 years ago

        So sad to hear Matlab disparaged. I'm a Gen X engineer and Matlab was one of our first languages. I still use it today in aerospace, but I would imagine Python suits software shops much better. Does Python handle matrix math as well?

        • sampo 4 years ago

          > Does Python handle matrix math as well?

          If you're playing around interactively, it's a bit easier to write (in Matlab)

              m = [1 0 0 ; 0 0 -1 ; 0 1 0]
          
          than (in Python)

              m = np.array([[1, 0, 0], [0, 0, -1], [0, 1, 0]])
          
          Also a bit longer example:

              m = rand(3,4)
              a = [0.1 0.2 0.3]
              m \ a'
          
          versus

              m = np.random.rand(3,4)
              a = np.array([0.1, 0.2, 0.3])
              np.linalg.lstsq(m, a.T)
              # wtf?
              # google...
              # fine!
              a = np.array([[0.1, 0.2, 0.3]])
              np.linalg.lstsq(m, a.T)
          
          But if you're developing software, you can't easily and reliably deploy Matlab or Octave to run in the cloud in your production systems, whereas with Python you can.

        • DoubleFree 4 years ago

          Matrix math in Python is a bit clunkier because matrices are not native to the language. That said, numpy, the standard for matrix math in Python, is quite nice. Its documentation is, IMO, miles ahead of Matlab's, and the APIs are a bit more sane.

          • jcadam 4 years ago

            This brings up a good point. If the goal is to understand the underlying concepts, it's quite possible Octave is a better tool. Matrix math is fairly clunky in any mainstream programming language.

        • bowsamic 4 years ago

          MATLAB is still in heavy use in physics, mainly for experiments, because of Simulink and the control systems toolbox.

      • londons_explore 4 years ago

        It's fairly common that I have a little dataset I need to run some FFTs on and plot, or some other similar one-off task. MATLAB still wins for getting the job done.

        I wish someone would make "MATLAB with all its toolboxes, but with python syntax, in a colab-like IDE".

        • AlotOfReading 4 years ago

          Is that not essentially what numpy/scipy with Jupyter (as well as the various distributions that package them) provide? If you just want an all-in-one with a supported Matlab FFI, there's Julia.
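
          A rough sketch of that FFT-and-a-plot task with numpy/matplotlib (made-up signal, purely illustrative):

              import numpy as np
              import matplotlib.pyplot as plt

              t = np.linspace(0, 1, 500, endpoint=False)          # 1 second at 500 Hz
              signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

              freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
              spectrum = np.abs(np.fft.rfft(signal))
              plt.plot(freqs, spectrum)                           # peak shows up at 50 Hz
              plt.show()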

      • melling 4 years ago

        The amount of Octave needed for the class was manageable.

        Learning Octave made me wish all languages supported matrices, vectors, and the necessary operations.

    • BeetleB 4 years ago

      The programming assignments in the original course were mostly useless. They provided you with a template with 90% of the problem solved, and you just had to enter an equation to solve the problem.

      • londons_explore 4 years ago

        But you also had to understand the rest of the code in the template.

        And reading code tends to be a quicker way to learn roughly how something works than writing it from scratch.

        • thfuran 4 years ago

          Writing code tends to be a much better way of making sure you actually know how something works though.

        • BeetleB 4 years ago

          > But you also had to understand the rest of the code in the template.

          Not really. They had lots of comments that explained what the code did. You didn't need to read most of the code.

          My point is that compared to real university courses, the homework in this course would be labeled "trivial". Writing those few lines of code was no more instructive than an in-class paper test. It's more comparable to answering simple questions than to building anything.

          I don't think I had to debug even once in that course. It was that easy.

    • Vaslo 4 years ago

      Yes - learn yet another language in the Tower of Babel of languages that will be useless after this course. There's a reason he switched to Python.

    • DaedPsyker 4 years ago

      I mean, Python has issues (indentation errors isn't one I would list), but that's beside the point, isn't it? The lingua franca for ML is currently Python. Teaching Octave, when most of what people search for will be in Python, just seems unnecessarily stubborn. Some day it might be Julia, but we aren't there yet.

      • lern_too_spel 4 years ago

        Indentation errors are a big problem for pedagogy. Imports are a big problem for pedagogy. When you are teaching how algorithms work, you want to implement them in a language as close as possible to the language of the domain. Hence, Python is a terrible pedagogical language for linear algebra, and Octave is a reasonable one.

        • cypress66 4 years ago

          In a vacuum, maybe

          But in the real world

          1) Python is the lingua franca for ML. You WILL need to learn python. All other resources are in python. Matlab you'll likely never use again, so it's kind of a waste.

          2) Probably more people have existing python knowledge than Matlab knowledge. And if you already know python, and you know python is the lingua franca, it's annoying having to learn Matlab knowing that in the real world you'd be better off with python.

          • lern_too_spel 4 years ago

            1. You'll use Matlab in any other linear algebra or numerical methods course as well as in any digital signals course.

            2. Optimization algorithms, of which gradient descent is a subset, are deployed in production in many languages, very often not Python.

            3. There is almost nothing to learn. For the programming assignments in the course, Octave is used as a succinct DSL for matrix math. The assignments were to simply write the math in a computer and watch what happens when you run the computations.

            4. You wouldn't learn Python by completing the programming assignments because you're just calling numerical routines, not dealing with anything else. Writing the code in Python simply adds more opportunity for error with no pedagogical benefit.

    • DiogenesKynikos 4 years ago

      Python is an easy language for beginners, and with numpy, it has excellent support for linear algebra.

      • lern_too_spel 4 years ago

          Octave is an easy language for beginners and has excellent support for linear algebra out of the box (less ceremony than numpy), without your having to learn any libraries. The point of the class isn't to teach you how to use libraries but to teach you, at a high level, how to use gradient descent to optimize parameterized models. Once you understand how it works, it is easy to translate what you know to run well on different systems or to use existing frameworks already implemented on different systems.
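
          That core (gradient descent on a parameterized model) fits in a few lines either way; a numpy sketch of the idea, not the course's code:

              import numpy as np

              def fit_linear(X, y, alpha=0.1, iters=500):
                  theta = np.zeros(X.shape[1])
                  for _ in range(iters):
                      residual = X @ theta - y
                      cost = residual @ residual / (2 * len(y))   # should fall each iteration (print it to watch)
                      theta -= alpha / len(y) * X.T @ residual    # gradient-descent step
                  return theta, cost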

        • LosWochosWeek 4 years ago

            What's the difference between learning a library that provides the functionality and learning built-in functionality?

          • klyrs 4 years ago

              Numpy is kind of a funky library with some weird (but good!) syntactic sugar that doesn't translate to the rest of Python. Scipy is a different beast. And pandas. I could go on. Making and using matrices feels weird in Python, and interoperability/efficiency doesn't come for free.

              Compare that to Matlab, where matrices are first-class and the syntactic sugar is consistent and rather lovely. But then the rest of the language is detestable.

          • DiogenesKynikos 4 years ago

                import numpy

    • adamsmith143 4 years ago

      But it's the language du jour for ML, and in particular deep learning, so there's no point doing it in any other language.
