

CS221
August 6, 2011 10:41 AM   Subscribe

Stanford's 'Introduction to Artificial Intelligence' course will be offered free to anyone online this fall. The course will be taught by Sebastian Thrun (Stanford) and Peter Norvig (Google, Director of Research), who expect to deal with the historically large course size using tools like Google Moderator.

There will be two 75-minute lectures per week, plus weekly graded homework assignments and quizzes; the course is expected to require roughly 10 hours per week. Over 10,000 students have already signed up.

Btw, Sebastian Thrun & Google's driverless car recently crashed
posted by jeffburdges (38 comments total) 85 users marked this as a favorite

 
This rocks! Of course, a solid understanding of probability and linear algebra will be required. I wonder if I can Khan Academy my way into having a solid grasp of linear algebra before the class starts.
posted by jopreacher at 11:00 AM on August 6, 2011 [4 favorites]


I would listen to Norvig lecture again
posted by infini at 11:02 AM on August 6, 2011


Btw, Sebastian Thrun & Google's driverless car recently crashed

Those who can't do ... teach.
posted by geoff. at 11:10 AM on August 6, 2011 [2 favorites]


This rocks! Of course, A solid understanding of probability and linear algebra will be required.

Excellent linear algebra material.
posted by atrazine at 11:13 AM on August 6, 2011 [3 favorites]


Seconding atrazine. But you'll have to work hard to get through it all in time :)
posted by jewzilla at 11:15 AM on August 6, 2011


I've always liked Hoffman & Kunze's Linear Algebra, but it's aimed towards mathematics majors.
posted by jeffburdges at 11:17 AM on August 6, 2011


Btw, Sebastian Thrun & Google's driverless car recently crashed

While under manual control. The robotic cars aren't trying to bump us off.

Yet.
posted by ddbeck at 11:19 AM on August 6, 2011 [1 favorite]


Those who can't do ... teach.

Alternatively, those who can't teach... do.
posted by Brian B. at 11:22 AM on August 6, 2011 [2 favorites]


Who's going to do the grading? (This is a serious question.)
posted by madcaptenor at 12:11 PM on August 6, 2011 [2 favorites]


Will they be using captchas to filter out bots, or is that extra credit?
posted by Nomyte at 12:12 PM on August 6, 2011 [2 favorites]


I'd imagine that quizzes will be multiple choice webforms and homework assignments will be programs graded by output evaluation, or maybe runtime tracing, as well as plagiarism detectors, i.e. all grades will be assigned by software. And the questions themselves will prohibit bots way better than any captchas.
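A minimal sketch of the kind of output-evaluation grading described above (all names and structure are hypothetical, purely illustrative of the idea): run a submission against fixed inputs and score it by comparing its output lines to a reference answer.

```python
# Hypothetical sketch of an output-comparison autograder.
# The real course's grader is unknown; this only illustrates the idea
# of assigning grades by diffing program output against a reference.

def normalize(output):
    """Strip trailing whitespace so cosmetic differences don't fail a run."""
    return [line.rstrip() for line in output.strip().splitlines()]

def grade(student_output, reference_output):
    """Return a score in [0, 1]: the fraction of matching output lines."""
    student = normalize(student_output)
    reference = normalize(reference_output)
    if not reference:
        return 1.0
    matches = sum(s == r for s, r in zip(student, reference))
    return matches / len(reference)

print(grade("42\n7 \n", "42\n7\n"))  # 1.0 (trailing space forgiven)
print(grade("42\n8\n", "42\n7\n"))   # 0.5
```

The ambiguity disputes mentioned later in the thread are exactly the weakness of this approach: an exact-match comparison can't tell a wrong answer from a differently formatted right one.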
posted by jeffburdges at 12:18 PM on August 6, 2011


I took this class last winter quarter, from these professors. At that point, the class was a survey course with fantastic, wide-ranging lectures, a couple of programming assignments, and a sort of evil midterm, and at the end we did a big final project. I honestly think the most beneficial part of the course was doing the project - we could pick pretty much any topic we wanted, which was really fun, and we had to do a lot of legwork finding prior work in our area and just generally acting like grad students. So this course will lose a lot for the local students, compared to what it was like before. I'm wondering whether there will be Stanford-only course content - maybe the people who are paying the big bucks to take the course in person will have a graded project to do that other people don't have?

madcaptenor, I think all the grading is automated. What this means for the depth of the exam and homework questions, I think will probably be clear.

I take all my Stanford courses online, and I'm hoping this means maybe they'll actually invest in their online course infrastructure to the point where they can start delivering lecture videos in formats other than Windows Media and Silverlight. I'm not holding my breath, though.
posted by troublesome at 12:20 PM on August 6, 2011 [3 favorites]


madcaptenor, I think all the grading is automated. What this means for the depth of the exam and homework questions, I think will probably be clear.

That's what I was thinking. Given the scale of this course they'd need a whole army of TAs to do anything else.
posted by madcaptenor at 12:21 PM on August 6, 2011


I should add that in winter, the programming assignments were almost entirely autograded already, so the TAs were only on the hook to answer questions and grade written answers. However, I seem to remember there were some autograder answers that were disputed, with students claiming ambiguity in the problem statement. I have to imagine that'll be a bit harder to deal with in a course with potentially 3-4 orders of magnitude more students. I bet the entire AI research group is betaing the homeworks before the quarter starts.
posted by troublesome at 12:24 PM on August 6, 2011


I thought the name and face were familiar. For anyone who didn't click Sebastian Thrun's bio (or didn't already know), his team won the DARPA Grand Challenge in 2005.

The NOVA documentary "The Great Robot Race", which covered the entire Grand Challenge, is highly recommended.
posted by I Havent Killed Anybody Since 1984 at 2:21 PM on August 6, 2011 [1 favorite]


@troublesome: How well did you need to know Statistics and Linear Algebra when you took the course? Would things like eigenvalues come up regularly?

I took an undergrad course in each around 2004, but haven't really used either since graduating in late '06. I would love to take this AI course, but I would really like to get a handle on how much work I would need to invest in boning up on the pre-reqs.
posted by I Havent Killed Anybody Since 1984 at 2:39 PM on August 6, 2011 [1 favorite]



Who's going to do the grading? (This is a serious question.)


From the link:


Grading will be automated. But we are recording video specifically to help students who got the answers wrong. We will use the exact same questions for everyone, including the Stanford students. In this way we can actually compare how well everyone is doing.

We will use something akin to Google Moderator to make sure Peter and I answer the most pressing questions. Our hypothesis is that even in a class of 10,000, there will only be a fixed number of really interesting questions (like 15 per week). There exist tools to find them.

posted by TheShadowKnows at 3:12 PM on August 6, 2011


This too is a serious question: I wonder if there is a way (well, of course there's a way) to have a Metafilter study group. That might add to the enjoyment.
posted by TheShadowKnows at 3:13 PM on August 6, 2011 [8 favorites]


I studied some AI in university, but have been wanting to revisit it since graduating and gaining a better understanding of how the real world works. So in!
posted by mantecol at 3:19 PM on August 6, 2011


Little do you all know that asking 10,000 people to "take the course" and submit questions and comments is just a way of generating their new dataset for AI research in mining coherent/good/interesting questions.
posted by olinerd at 3:26 PM on August 6, 2011 [4 favorites]


It looks like you need to buy Norvig's AI book. Maybe it's actually a study of torrent effects on textbook sales.
posted by underflow at 4:43 PM on August 6, 2011


It looks like you need to buy Norvig's AI book.

I own his book, which is a good investment either way, but you don't need to buy it:

Access to a copy of Artificial Intelligence: A Modern Approach is also suggested.

I really doubt you'll need the text to follow along, though it would probably help some problem sets out.

+ 10 on a metafilter study group. I'm in.
posted by geoff. at 5:12 PM on August 6, 2011 [2 favorites]


Re Study Group: I'll make a google wave! (umm only kinda-kidding).
posted by stratastar at 5:20 PM on August 6, 2011


I don't think having extensive knowledge of statistics and linear algebra is really required. I mean, I'd say the more you know the more you'll get out of the class, but thinking back to the assignments, I think the course felt reasonably self-contained. Certainly plenty of people take it having taken a single quarter of undergrad linear algebra and a single quarter of statistics for engineers. The course had loads of undergrads in it when I took it.

That said, I have done a reasonable amount of graduate coursework in statistics, and I don't know what it would be like to approach the course not having that coursework under my belt. I may be overestimating the extent to which things are common sense. Also, the format has changed some; we didn't have any homeworks after the midterm because we were working on our term project, and I can see some of the more advanced topics in the course requiring a bit more mathematical sophistication.

If you're thinking about it, why not sign up? It's free! Worst case, you discover you're well out of your depth. I expect a lot of the lectures will be intros to various areas of modern AI and at least you'll learn a fair bit about where the field is right now. The intent of this class for Stanford students is to get a broad view of what's up in the department and what the problems are like so you can pick your more advanced AI classes and potentially your research group, so it's not really intended to go too deeply into much.
posted by troublesome at 6:40 PM on August 6, 2011


I've found some material from a previous offering of the CS 221 course if anybody wants to see how much math is needed. It looks like they provide a basic summary during the lecture of any required statistics or linear algebra concepts. I looked at the bayes networks and unsupervised clustering lectures and they mention Bayes Theorem and eigenvectors/eigenvalues respectively, but there doesn't seem to be too much in terms of rigorous proofs of the mathematical results. The old problem sets are similarly application-focused.
posted by formless at 7:36 PM on August 6, 2011


On a related note, I enjoyed Peter Norvig's Amazon reviews of CS texts, such as Abelson's Structure and Interpretation of Computer Programs, Schütze's Foundations of Statistical Natural Language Processing, or Bishop's Neural Networks for Pattern Recognition.
posted by needled at 7:51 PM on August 6, 2011 [1 favorite]


nb. If you are a statistics grad student, you owe it to yourself to teach an intro class every now and then. "Why is there a pi sign on the board? Does that mean we add everything?" (which of course leads to, "yes, that's exactly what you should do, after taking logarithms -- add it up, and then exponentiate the result if necessary"... most students are smarter than they realize.) If you're not a statistics grad student, poke around on Wikipedia until you find something interesting. Bayesian models, particularly hierarchical models for language, are a good start.
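The "take logarithms, add it up, exponentiate if necessary" trick above is worth a five-line illustration: multiplying many small probabilities underflows floating point, while summing their logs stays finite. (The numbers here are arbitrary.)

```python
import math

# Multiplying 100 probabilities of 1e-5 each gives 1e-500, which is
# below the smallest double (~1e-308) and underflows to exactly 0.0.
# Summing the logs instead keeps the computation well-behaved.
probs = [1e-5] * 100

naive_product = 1.0
for p in probs:
    naive_product *= p                 # underflows to 0.0

log_likelihood = sum(math.log(p) for p in probs)   # finite: ~ -1151.3

print(naive_product)     # 0.0
print(log_likelihood)
```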

Anyone who doesn't appreciate why eigenvalues are important in statistics, and why they come up over and over and over again, might want to acquaint themselves with the multivariate normal distribution. It effectively underpins all interesting statistical methods. (There, I said it.) By rotating data matrices into orthogonalized, lower-dimensional subspaces, you can simultaneously reveal the complexity (or lack thereof) in your data, and create a situation where certain combinations of variables can be modeled independently of the others. This is immensely powerful, and directly relates fundamental mathematics to useful bits of the physical (or at least actual) world. Principal components and canonical correlation (spectral decompositions of one or more matrices) are quite useful and have many applications. (Basically, they save you from having to work with tensors all the time, which is not trivial.)
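The eigenvalue story above can be made concrete in a few lines of NumPy (assumed available; the data here is synthetic and purely illustrative): principal components analysis is just the eigendecomposition of the sample covariance matrix, and a near-zero eigenvalue flags a nearly redundant direction in the data.

```python
import numpy as np

# Synthetic data: 200 points in 3 dimensions, where the third column
# is (almost) a copy of the first -- so the data is effectively 2-D.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=200)

Xc = X - X.mean(axis=0)                  # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues, ascending

# Project onto the two highest-variance directions (the top
# eigenvectors): this is the dimensionality reduction in the comment.
top2 = eigvecs[:, -2:]
Z = Xc @ top2                            # 200 x 2 projected data

print(eigvals)  # smallest eigenvalue is tiny: the data is nearly 2-D
```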

Anyways, I signed up for the course to see whether I can keep up with the Stanford undergrads... probably in for a rude surprise, but why not give it a whack. Cheers, all.
posted by apathy at 10:55 PM on August 7, 2011 [3 favorites]


Shit and onions. I used the word "rotating" where I should have written "projecting", as in "you are projecting your data into a lower-dimensional subspace". The latter is what you want to do nearly all the time, and it's this task which singular value decomposition (the impatient man's PCA) helps to accomplish. A truncated SVD reveals interesting relationships in matrices of data.
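The SVD/PCA connection in the correction above can be checked numerically (NumPy assumed, random data for illustration): the right singular vectors of the centered data matrix are the principal components, because the covariance eigenvalues equal the squared singular values divided by n - 1, and truncating the SVD gives the low-rank projection.

```python
import numpy as np

# Check that SVD of the centered data matrix is "the impatient man's
# PCA": eigenvalues of the covariance == singular_values**2 / (n - 1).
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
Xc = X - X.mean(axis=0)
n = len(Xc)

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)     # s is descending
eigvals = np.linalg.eigvalsh(Xc.T @ Xc / (n - 1))[::-1]  # descending

print(np.allclose(s**2 / (n - 1), eigvals))  # True

# Truncated SVD: the best rank-2 approximation of the centered data,
# i.e. the data projected into a 2-dimensional subspace.
k = 2
X_rank2 = U[:, :k] * s[:k] @ Vt[:k]
```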
posted by apathy at 10:58 PM on August 7, 2011


I'm definitely in for a metafilter study group. I have a feeling my math skills are not up to snuff for this, but I've been contemplating forsaking the code monkey world for a more academic life anyways.

As a side note, if I were a Stanford undergraduate paying $13,350 per quarter and 10,000 internet noobs got signed up for my class, I would be royally pissed.
posted by whir at 10:20 AM on August 8, 2011


As a side note, if I were a Stanford undergraduate paying $13,350 per quarter and 10,000 internet noobs got signed up for my class, I would be royally pissed.

You aren't paying $13,350 for the class, you're paying $13,350 to be first in line when Google hires a graduating class. My school cost a quarter of that, but I didn't have a 25% chance of getting into Google.
posted by geoff. at 5:57 PM on August 9, 2011 [1 favorite]


58,000 sign-ups!

I am one, and also might be interested in a metafilter stanford ai course study group. No idea what the best way to organize it would be.
posted by bukvich at 10:55 AM on August 16, 2011


One more vote for a MeFi study group.
Anticipate eponysterical posts from me.
Signups now up to 76,000.
I have the prior edition of the textbook, FWIW.
posted by neuron at 9:16 PM on August 16, 2011


They're also offering Machine Learning and Introduction to Databases! Not cool Stanford, not cool.
posted by stratastar at 5:49 PM on August 17, 2011 [1 favorite]


Hopefully others are still paying attention: to make some sort of move on this study group, I'm thinking about making a metafilter project? Anyone have thoughts about something that would aid group work / questions: a wiki? Google wave? etc?
posted by stratastar at 5:56 PM on August 17, 2011 [1 favorite]


Should be interesting. I've taken AI classes before (and own Norvig's book). I actually didn't have that much of a background in linear algebra when I took those courses (I do now), and I did OK.
posted by delmoi at 1:41 PM on August 29, 2011


Official registration is now open. Wasn't there another course or two being done this way this fall?
posted by jeffburdges at 4:53 AM on September 3, 2011


See also Machine Learning and Intro. to Databases
posted by jeffburdges at 5:03 AM on September 3, 2011


Are we still interested in making a study group?
here was the email from today:

Due to its popularity, we will be offering this class in two tracks:

    the basic track - in which you watch lectures and answer basic quizzes. This is not the full course, but you will still learn all the basics of AI.
    the advanced track - the full class, which aspires to be of Stanford difficulty. You do homework assignments and take exams.
posted by stratastar at 11:34 AM on September 3, 2011



