When two Stanford professors decided to offer their artificial intelligence (AI) course online, for free, to anyone interested in taking it, they'd expected a few thousand students, tops. But by the start of the class, they had 160,000 enrolled [source: Morrison].
That was in 2011. The professors were experimenting with a fairly new course format called the massive open online course, or MOOC (pronounced "mook"). Since then, dozens of universities, including some of the most prestigious and expensive in the world, have launched their own MOOCs. Anyone with Internet access can take Fundamentals of Neuroscience from Harvard, Intro to English Composition from Duke, Circuits and Electronics from MIT and Constitutional Struggle in the Muslim World from the University of Copenhagen, all for the low-low cost of nothing [source: Coursera]!
And all for no college credit, although various for-credit models are on the drawing board [source: Masterson].
The classes typically run from about four to 12 weeks. Most MOOCs are introductory classes you'd take early in a college career, though some higher-level subjects are available, catering to students who already have some college background or degrees and are looking to continue their education or add a new entry to their resumes [sources: Marques, Pozniak].
It's easy to dismiss the new format as a remake of what already exists. OpenYale and iTunesU already offer free online lectures, and some colleges offer free e-textbooks and coursework through those types of "open learning" platforms [source: Marques]. Really, if you set the zero cost aside, it seems like we've had similar opportunities for years – people have long earned actual college credits and completed degrees online.
MOOCs are legitimately different, though, at least in theory. You take a traditional online course, make it free, nix a lot of the preconceived structure, and oh, right -- have your classmates grade your work. The end result is a collaborative, popularized approach to learning based on a theory called connectivism.