The Megaprocessor is a 16-bit computer made almost entirely from discrete electronic components (individual transistors, diodes, resistors, capacitors and LEDs). When finished it will measure 14m wide x 2m tall. [more inside]
In electrical engineering class, I was told to think of electric circuits in terms of a hydraulic analogy. But could you extend this to entire computers? The Rube Goldberg Machine That Mastered Keynesian Economics, built by Bill Phillips [PDF] from a urinal flush mechanism. [more inside]
Lo and Behold: Reveries of the Connected World - "With interviewees ranging from Elon Musk to a gaming addict, Werner Herzog presents the web in all its wildness and utopian potential in this dizzying documentary." (via)
Thinking about learning a new programming language? How about a functional language with support for test-driven development and a snazzy visual interface, already deployed on millions of computers around the world? I'm speaking, of course, about Excel. In a 2014 Strange Loop talk, Felienne discusses the virtues of the Excel programming language (which is Turing complete, if you were wondering).
TensorFlow. Google has open-sourced their numerical computation library for machine learning applications. (Especially "deep" learning.) [more inside]
By definition, any computing platform invented in the first half of the 1980s that has survived until 2015—and is an enormous business—has accomplished something remarkable. There's the Windows PC, which traces its heritage back to the original IBM PC announced in August 1981. There's the Mac, which famously debuted in January 1984.
And then there's the Bloomberg Terminal, which hit the market in December 1982. [more inside]
To commemorate the last decade’s worth of failures, we organized and analyzed the data we’ve collected. We cannot claim—nor can anyone, really—to have a definitive, comprehensive database of debacles. Instead, from the incidents we have chronicled, we handpicked the most interesting and illustrative examples of big IT systems and projects gone awry and created the five interactives featured here. Each reveals different emerging patterns and lessons. Dive in to see what we’ve found. One big takeaway: While it’s impossible to say whether IT failures are more frequent now than in the past, it does seem that the aggregate consequences are worse. [more inside]
There are at least three emoji-based programming languages: 🍀 (aka 4Lang; bubblesort example), Emojinal, and HeartForth (stack-based, for extra obscurity; factorial example). [more inside]
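For readers who don't speak emoji, the bubblesort that the 🍀 example expresses is the familiar algorithm below, rendered as plain Python (a sketch for comparison; variable names are mine, not the emoji source's):

```python
def bubble_sort(items):
    """Classic bubble sort: repeatedly swap adjacent out-of-order pairs."""
    items = list(items)  # work on a copy; leave the input untouched
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:  # no swaps on this pass means we're already sorted
            break
    return items
```

Writing that in emoji glyphs, one operation per symbol, is exactly the kind of obfuscation these languages celebrate.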
It's 1983, time to watch Computer Show. There are only a couple of episodes uploaded to YouTube, but the first one features custom artwork site Lumi, and the second explores Reddit. [more inside]
The iBookGuy explains how graphics worked within the memory constraints of the Commodore 64 and NES, and of the Apple II and Atari 2600
"I found this collection of outtakes in my archive. I shot these interviews on the streets of New York in the late 70s when I was doing a documentary on the coming of the information age." - Man on the street interviews with New Yorkers in 1979 about science, technology, corporate influence, computers, and paperwork. (SLYT 5:45)
Here are David Manning's YouTube videos illustrating how to exploit ghost AI quirks on the fly while playing Ms. Pac-Man: Ghost Behavior and On Grouping. They're excellent for building an intuitive sense of how to play the game, which, because of its random elements, cannot be reliably beaten with patterns the way Pac-Man can. [more inside]
20 years ago: August 24, 1995 was the release date of Microsoft Windows 95. Its legacy was vast.... [more inside]
Fenlason dubbed his clone Hack for two reasons: "One definition was 'a quick [computer] hack because I don't have access to Rogue'. The other was 'hack-n-slash', a reference to one of the styles of playing Dungeons and Dragons." - A chapter long excerpt from David Craddock's Dungeon Hacks, a new book on the history of the Roguelike RPG.
A computational approach for obstruction-free photography takes out the chain link fence obscuring the target of your photo, removes reflections, and--this is the crazy TV show part--can even build a separate image from the reflection. It uses multiple frames and magic math to build up the two "clean" images. [more inside]
The RISKS Digest Turns 30: In February 1985 Adele Goldberg, the President of the Association for Computing Machinery (ACM), published a letter in the Communications of the ACM expressing concern with humanity’s “increasingly critical dependence on the use of computers” and the risks associated with complex computer and software systems. On August 1st 1985 Stanford Research Institute's Peter G. Neumann responded by creating RISKS@SRI-CRL. [more inside]
PICO-8 is a fantasy console for making, sharing and playing tiny games and other computer programs. When you turn it on, the machine greets you with a shell for typing in Lua commands and provides simple built-in tools for creating your own cartridges. What does that mean? PICO-8 is like an emulator for a lo-fi game console that never actually existed. With 16 colors, a 128x128 display, 4 channels of sound, and tight data limits, PICO-8 "cartridges" can be played -- and created -- in a web browser, or on just about any home computer, and even inside maker Lexaloffle's other, more full-featured fantasy console, Voxatron. [more inside]
Watch a Large Scale Deep neural net hallucinate while onlookers supply topics in a chat room. Almost magically, after a few seconds the psychedelic representations of those suggestions begin creeping out of the woodwork into which you infinitely zoom. Jonas Degrave writes about how the thing came to be on his blog. Previously.
Do Androids Dream Of Cooking? The following recipes are sampled from a trained neural net. Happy cooking!
In the late '60s and early '70s, the technology and market were emerging to set the stage for production of monolithic, single-chip CPUs. In 1969, a terminal equipment manufacturer met with Intel to design a processor that was smaller and would generate less heat than the dozens of TTL chips they were using. The resulting design was the 8008, well known as the predecessor to the x86 line of processors that are ubiquitous in desktop PCs today. Less well known, though, is that Texas Instruments came up with a competing design and, due to development delays at Intel, beat them to production by about nine months. [more inside]
Aurion looks to be a standard and mechanically unremarkable retro action RPG with heavy Japanese design influences. But its design and feel are unmistakably fresh, offering a bold color palette and interesting unit designs. Its fiction is rooted in stories of exploitation and division, and in a desire for harmony. This review of Cameroon's Kiro’o Games' latest release is just one of the increasingly visible ways Africa's game developers are beginning to gain traction in their domestic and international markets. Last fall, Lagos hosted the inaugural West African Gaming Expo, bringing together startups, gamers, developers and investors for the first time. Games range from the mobile-only and extremely local - smash the mosquito or drive your matatu like a maniac - to the educational, to full-fledged RPGs like Kiro'o's Aurion. Women are as much a part of this nascent industry, breaking barriers and encouraging others to join. Watch this space.
Kernelmag's Jeff Keacher documents connecting his old Macintosh Plus to the World Wide Web, courtesy of a Raspberry Pi and a bunch of software to remove all those pesky <div>s and such. [more inside]
Erowid Recruiter A Markov-powered mashup of Erowid trip reports and tech recruiter emails. "front end engineer would literally make or break the next hour, I walked through this area for a while and then my face ended, and the rats."
"The main reason I got so involved with the Internet is because it was safety and sanctuary in a hostile world. I was heavily bullied in school due to racial tension — most of the teachers were hostile instigators or at least uncaring. I didn't really have a lot of space to express myself, because I was constantly told that my existence was wrong. I didn't really learn a lot from the Malaysian education system: most of it was already decades old." [more inside]
X-Presion, a cutting-edge (no pun intended) hair salon in Madrid, has pioneered an interesting new pixelated hair coloring technique that has the internet abuzz. Pixelated Hair Is The Newest Cutting-Edge Trend (Bored Panda) [more inside]
The Deep Mind of Demis Hassabis - "The big thing is what we call transfer learning. You've mastered one domain of things, how do you abstract that into something that's almost like a library of knowledge that you can now usefully apply in a new domain? That's the key to general knowledge. At the moment, we are good at processing perceptual information and then picking an action based on that. But when it goes to the next level, the concept level, nobody has been able to do that." (previously: 1,2) [more inside]
We Know How You Feel Computers are learning to read emotion, and the business world can’t wait.
Deep Visual-Semantic Alignments for Generating Image Descriptions. A model that generates free-form natural language descriptions of image regions. Holy crap.
WNYC's Manoush Zomorodi investigates the gender gap in tech and computer science, and finds a number of people working towards bridging that gap, from childhood to university: completely restructuring a required computer science course to make it more welcoming to female university students, celebrating women in computing history (and recognizing that computer science wasn't always so male-dominated), and making children's books and toys (even dollhouses!) for kids to explore programming concepts on their own. She also noticed that the majority of female computer science students in the US had grown up overseas - possibly because computer science isn't a common subject in American high schools. This is slated to change: a new AP Computer Science subject is in the works, with efforts to get 10,000 highly-trained computer science teachers in 10,000 high schools across the US. If you want to join Mindy Kaling in supporting young girls entering computer science, tech, and coding, there's a lot [more inside]
In Search at San Jose, the R&D minds at IBM describe how they designed & built the world's first hard drive, the IBM 305 RAMAC (previously). First sold in 1956, it stored a whopping 5 million characters of information, all ready for immediate access by the user.
Welcome to Al Zimmermann's Programming Contests. You've entered an arena where demented computer programmers compete for glory and for some cool prizes. The current challenge is just about to come to an end, but you can peruse the previous contests and prepare for the new one starting next month.
The Mystery of Go, the Ancient Game That Computers Still Can’t Win
The challenge is daunting. In 1994, machines took the checkers crown, when a program called Chinook beat the top human. Then, three years later, they topped the chess world, IBM’s Deep Blue supercomputer besting world champion Garry Kasparov. Now, computers match or surpass top humans in a wide variety of games: Othello, Scrabble, backgammon, poker, even Jeopardy. But not Go. It’s the one classic game where wetware still dominates hardware. [more inside]
How the typewriter is/isn't better/worse for your writing. A little bit about ye olde handwriting in there as well.
The July 23, 1966 issue of Norman Cousins' The Saturday Review used 30 pages to focus on The New Computerized Age (Link to chapter PDFs), digitized and licensed for your enjoyment by Unz.org. [more inside]
In 1994, Douglas Davis [personal blog] created The World's First Collaborative Sentence. Last summer, The Whitney Museum faced a new challenge: what happens to digital art when the technology becomes obsolete? [more inside]
For your Sunday reading, a couple of stories of ye olden computing days: Why MacPaint's Original Canvas was 416 Pixels Wide and A Great Old Timey Game Programming Hack.
For years we've been told that our laptop cameras and webcams are "hardwired" to an LED such that the camera can't be turned on without triggering the light. Yeah, you can see where this is going (the original paper). The exploit works on pre-2008 Macs, though other laptops and webcams could be vulnerable to a similar exploit. The researchers have a kernel extension to prevent this on 2007 / 2008 MacBooks. My preferred solution for the rest of us.
"After two decades online, I'm perplexed. It's not that I haven't had a gas of a good time on the Internet. I've met great people and even caught a hacker or two. But today, I'm uneasy about this most trendy and oversold community. Visionaries see a future of telecommuting workers, interactive libraries and multimedia classrooms. They speak of electronic town meetings and virtual communities. Commerce and business will shift from offices and malls to networks and modems. And the freedom of digital networks will make government more democratic. Baloney. Do our computer pundits lack all common sense? The truth [is] no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works." A view of the Internet's future from February 26, 1995 at 7:00 PM
Imgur began as a photo sharing site to be used by Redditors. It now outpaces Reddit in total traffic. What's next for the site?
"The Dutch social network Hyves didn't chase its users in white clean profiles (like Facebook), and it also didn't allow them to build their own web sites (like Geocities, Tumblr). Instead, for almost 10 years, it was going with the Pimp My Profile model (Myspace): users were allowed to change the avatar, colors of texts and other elements, background image and its position. Not a lot, but the users of Hyves developed the mastery of talking to the world through the choice and combination of userpic and wallpaper." (From Contemporary Home Computing, Olia Lialina. Via @cory_archangel)
What could be more impressive than learning to program, and then writing a complete new music notation program? Doing it with your feet.
"It looks like the state of the art in intrusion stuff is a lot more advanced than we assumed it was."
Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
A few months ago there was a list of links to classic video game emulators posted. Very recently, I'm pleased to report, those links all came true. The Internet Archive bestowed upon us the aforementioned consoles, computers, and mileposts on our way to the tech utopia of today (seriously, where's my flying car?), and they asked us to do something: Imagine every computer that ever existed, literally, in your browser. And it was so. I have absolutely no affiliation with jscott, btw. Thought I should disclose that.
Many of the Macintosh team members gathered Wednesday, September 11, 2013 to play with one of the original “Twiggy Mac” prototypes still in running condition. Quick, Hide In This Closet!
The Physics of Light and Rendering is a talk given at QuakeCon 2013 by John Carmack, co-creator of Doom, Quake, and many other games at id Software and beyond. It provides a detailed but surprisingly understandable history of 3D rendering techniques, their advantages and tradeoffs, and how they have been used in games and movies. (SLYT, 1:32:01, via)