"As recently as a year ago, there were many publishers, librarians, and scholars who thought that electronic publishing was just a passing fad." In 1996, the number theorist Andrew Odlyzko, a pioneer in the development of "experimental mathematics" via large-scale computation, wrote an article, prescient in many respects, about the effect the Internet would have on the economics of scholarly publication, and on commerce more generally.
Math Overflow is the first attempt to use the Stack Exchange platform, already popular with programmers, as a scientific research tool. Founded this month by a group of young mathematicians, including Scott Morrison and Ben Webster of the Secret Blogging Seminar, the site is already wrestling with hundreds of questions, ranging from the technical ("When is a map given by a word surjective?") to the historical ("Most interesting mathematics mistake?").
Stephen Wolfram discusses Wolfram|Alpha: Computational Knowledge Engine - at the same time Google Adds Search to Public Data, viz: "Nobody really paid attention to the two hour snorecast" -- like a cross between designing for big data and a glossary of game theory terms -- on Wolfram|Alpha (previously), yet the veil is being lifted nonetheless: "[on] a platonic search engine, unearthing eternal truths that may never have been written down before," cf. hunch & cyc (and in other startup news...) [via] [more inside]
"the scale-free network modeling paradigm is largely inconsistent with the engineered nature of the Internet..." For a decade it's been conventional wisdom that the Internet has a scale-free topology, in which the number of links emanating from a site obeys a power law. In other words, the Internet has a long tail; compared with a completely random network, its structure is dominated by a few very highly connected nodes, while the rest of the web consists of a gigantic list of sites attached to hardly anything. Among its other effects, this makes the web highly vulnerable to epidemics. The power law on the Internet has inspired a vast array of research by computer scientists, mathematicians, and engineers. According to an article in this month's Notices of the American Mathematical Society, it's all wrong. How could so many scientists make this kind of mistake? Statistician Cosma Shalizi explains how people see power laws when they aren't there: "Abusing linear regression makes the baby Gauss cry."
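Shalizi's complaint is easy to reproduce yourself. Here's a small simulation (my own sketch, not from the article): draw samples from a lognormal distribution, which has a heavy tail but is emphatically not a power law, then run the usual recipe of log-binning the tail and fitting a straight line on log-log axes. The regression cheerfully reports an "exponent" and a high R² anyway.

```python
import numpy as np

rng = np.random.default_rng(0)
# Degrees drawn from a lognormal: heavy-tailed, but NOT a power law.
data = rng.lognormal(mean=0.0, sigma=2.0, size=50_000)

# The recipe Shalizi criticises: log-bin the upper tail, take logs of
# the bin counts, fit a line, and report the slope as an exponent.
tail = data[data >= np.median(data)]
bins = np.logspace(np.log10(tail.min()), np.log10(tail.max()), 25)
counts, edges = np.histogram(tail, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centers
mask = counts > 0                           # can't take log of empty bins
x = np.log10(centers[mask])
y = np.log10(counts[mask])
slope, intercept = np.polyfit(x, y, 1)
r2 = np.corrcoef(x, y)[0, 1] ** 2
print(f"apparent power-law exponent: {slope:.2f}, R^2 = {r2:.2f}")
```

The fit looks convincing even though no power law is present; on a log-log plot, a lognormal's tail is a gentle parabola that a straight line approximates all too well over a few decades. That's why the article's authors insist on proper goodness-of-fit tests rather than eyeballing a regression line.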
On May 13, security advisories published by Debian and Ubuntu revealed that, for over a year, their OpenSSL libraries had a major flaw in their CSPRNG, the source of randomness for key generation in many widely-used applications, causing the "random" numbers produced to be extremely predictable. [lolcat summary] [more inside]
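To see why this is so bad: the patched code stopped mixing real entropy into the pool, leaving the process ID as essentially the only input, and Linux PIDs run from 1 to 32,767. A toy sketch (deliberately simplified, not OpenSSL's actual code) shows the consequence, that an attacker can simply enumerate every possible key:

```python
import random

def broken_keygen(pid: int, nbytes: int = 16) -> bytes:
    """Toy model of the flaw: the generator's state depends only on the PID."""
    rng = random.Random(pid)  # seed determined entirely by the process ID
    return bytes(rng.randrange(256) for _ in range(nbytes))

# With PIDs as the sole entropy source, there are at most 32,767
# possible "random" keys, and an attacker can precompute them all.
all_keys = {broken_keygen(pid) for pid in range(1, 32768)}
print(len(all_keys))
```

That enumerability is exactly what made the real vulnerability exploitable: blacklists of every weak SSH and SSL key could be, and were, generated ahead of time.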
No, I'm sorry, it does. There are some arguments that never end. John or Paul? "Another thing coming" or "Another think coming?" But none has the staying power of "Is 0.999999..., with the 9s repeating forever, equal to 1?" A high school math teacher takes on all doubters. Round 2. Round 3. Refutations of some popular "They're not equal" arguments. Refutations, round 2. You don't have to be a mathematician to get in on the fun: .99999=1 discussed on a conspiracy theory website, an Ayn Rand website (where it is accused of violating the "law of identity"), and a World of Warcraft forum. But never, as far as I can tell, on MetaFilter.
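For the record, the teacher is right, and the standard argument fits in two lines: a repeating decimal is, by definition, a geometric series, and summing it settles the matter:

```latex
0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
           \;=\; 9 \cdot \frac{1/10}{1 - 1/10} \;=\; 1
```

The popular algebraic version says the same thing: if $x = 0.999\ldots$, then $10x = 9.999\ldots$, so $10x - x = 9$ and $x = 1$.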