Shell help
September 16, 2013 2:41 AM

Explain Shell is a nifty little website created by Idan Kamara that takes the often horrid Linux command line man pages and makes them that much easier to understand, by breaking down a command like ssh(1) -i keyfile -f -N -L 1234:www.google.com:80 host into its component steps.
posted by MartinWisse (64 comments total) 105 users marked this as a favorite
 
brill. thanks!
posted by RTQP at 2:49 AM on September 16, 2013


This is nifty, though it might be useful for it to be able to recognize pipes and back-ticks. An experienced user would have no trouble seeing what's going on with those, but if you're a beginner, those confuse the hell out of you. This kind of tool could be extremely helpful for learning that stuff.
posted by gkhan at 2:54 AM on September 16, 2013


Oh this is very useful! Thanks, MartinWisse!
posted by Foci for Analysis at 2:56 AM on September 16, 2013


What I always hated about the man pages is that they listed all the options of a command without highlighting the most used ones. And there were never enough examples.

So you want to create a TAR archive from a directory? If you can figure out how to do it by reading this, you are some kind of genius.

I know that explainshell has a tar example. But the example already assumes you figured out how to use it, which is a bit of a catch-22.
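For reference, the usual incantation looks something like this (directory and file names here are invented for the sketch):

```shell
# set up a throwaway directory to archive
mkdir -p mydir out
echo hello > mydir/file.txt

# c = create, z = gzip-compress, f = archive file name
tar -czf mydir.tar.gz mydir

# x = extract; -C picks the destination directory
tar -xzf mydir.tar.gz -C out
```

Which is exactly the kind of thing the man page buries under forty other options.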
posted by vacapinta at 3:19 AM on September 16, 2013 [6 favorites]


I love this, and really wish it'd been available when I first started noodling around at the command line.

...it might be useful for it to be able to recognize pipes and back-ticks

Looks like that's coming soon - "support for pipes, redirections and other shell syntax will be added later on".

(As an aside, I always end up in a tizzy when using backticks - $(blah blah blah) is worth the extra typing, I think, especially if you're nesting stuff.)
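The nesting difference in a nutshell (using /usr/local/bin purely as a convenient example path):

```shell
# $() nests without any special escaping:
echo $(basename $(dirname /usr/local/bin))

# backticks need the inner pair escaped, which gets hairy fast:
echo `basename \`dirname /usr/local/bin\``
```

Both print "local", but only one of them is readable.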
posted by jack_mo at 3:21 AM on September 16, 2013


This is pretty great. I'll use this one all the time.
posted by empath at 3:56 AM on September 16, 2013


Looks like that's coming soon - "support for pipes, redirections and other shell syntax will be added later on".

Great! This will be a wonderful resource!

(As an aside, I always end up in a tizzy when using backticks - $(blah blah blah) is worth the extra typing, I think, especially if you're nesting stuff.)

Me too, man, backticks unnerve me. I try to use xargs whenever I can.
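A sketch of why (the scratch directory and file names are made up): command substitution word-splits its output, so feeding find's results to rm via backticks breaks on file names with spaces, while NUL-delimited xargs does not.

```shell
# a file name with a space in it -- the classic trap
mkdir -p scratch
touch 'scratch/a file.tmp' scratch/b.tmp

# -print0 / -0 pass names NUL-delimited, so spaces are harmless:
find scratch -name '*.tmp' -print0 | xargs -0 rm
```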
posted by gkhan at 4:04 AM on September 16, 2013 [1 favorite]


Just as a heads up, I get a Mal/HTMLGen-A warning when I try to go to that site from Sophos
posted by skrymir at 4:17 AM on September 16, 2013 [4 favorites]


Yeah, I can confirm that malware warning. Same thing here.
posted by Just this guy, y'know at 4:23 AM on September 16, 2013


Now if I can find the same thing for Python...
posted by zardoz at 4:44 AM on September 16, 2013


Hmm. No malware warning from McAfee. Maybe it's the URL structure?
posted by KGMoney at 5:37 AM on September 16, 2013


Just as a heads up, I get a Mal/HTMLGen-A warning when I try to go to that site from Sophos

The main link or the sample link?

Not that I'd put much stock in Sophos. The supremely generic threat analysis doesn't help either.
posted by kmz at 6:26 AM on September 16, 2013


As an aside, I always end up in a tizzy when using backticks - $(blah blah blah) is worth the extra typing, I think

And throws an error in the only shells you can safely assume to be on the box, namely /bin/sh and /bin/csh. The Korn shell introduced treating command substitution as an oddball form of variable substitution with the $(foo --bar baz) syntax, which was adopted by bash 2 and so became commonly available.

But if you need something truly portable, you can't rely even on POSIX syntax: you assume you have /bin/sh and you use backticks. If you don't need that, you can start with #!/usr/bin/ksh or whatnot and use the better syntax.
posted by eriko at 6:41 AM on September 16, 2013 [1 favorite]


I get the warning from Sophos as well (main link).
posted by dukes909 at 6:41 AM on September 16, 2013


> man pages … listed all the options of a command without highlighting the most used ones. And there were never enough examples.

Good grief, you think they were written for users? No, a man page describes what the command can do, not what you might want it to do. They also have to be short because they'd take forever to print on a daisywheel.

My shell skills are almost 20 years out of date, but they still work. I'll always be learning: f'rinstance, I recently found out about parallel, which runs commands like xargs, but in parallel. Handy for multi-core modern machines, if there's not too much I/O.
posted by scruss at 7:00 AM on September 16, 2013 [2 favorites]


I just want something that tells which one is the input file and which one is the output file. Half the time they go in one order, half the time they go in the other order, and many people seem to go out of their way to bury the info on which is which.
posted by alms at 7:39 AM on September 16, 2013 [1 favorite]


> Good grief, you think they were written for users? No, a man page describes what the command can do, not what you might
> want it to do. They also have to be short because they'd take forever to print on a daisywheel.

Great heavens, print a man page? Users, maybe.

Unless... was there ever actually a Unix-based system that only took input from punch cards, only wrote output to greenbar paper, and lacked even so much as a green-screen console monitor? Inconceivable!
posted by jfuller at 7:52 AM on September 16, 2013


Unless... was there ever actually a Unix-based system that only took input from punch cards, only wrote output to greenbar paper, and lacked even so much as a green-screen console monitor? Inconceivable!

My sarcasm daemon just crashed.
posted by Celsius1414 at 8:28 AM on September 16, 2013 [4 favorites]


This is really neat.
posted by brennen at 8:30 AM on September 16, 2013


God this is why I love metafilter. I would have never found explainshell otherwise.
As a Windows admin who works with Linux peeps who toss out things like ssh(1) -i keyfile -f -N -L 1234:www.google.com:80 host in email correspondence, I love this site. Thanks for posting, MartinWisse!
posted by 8dot3 at 8:41 AM on September 16, 2013


parallel, which runs commands like xargs, but in parallel.

A separate utility is not needed, as 'xargs -P' does exactly what you describe. (And xargs will be installed, whereas 'parallel' probably won't be). If you read the man page for 'xargs' you'd see this.
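For instance (file names invented; note that -P is in GNU and BSD xargs but isn't strictly POSIX):

```shell
# a couple of sample files to compress
printf 'one\n' > a.log
printf 'two\n' > b.log

# run up to 4 gzip jobs at once -- nothing extra to install:
printf '%s\0' a.log b.log | xargs -0 -n1 -P4 gzip
```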

often horrid Linux command line man pages

Please don't disparage the documentation, it is not horrid, and it is there to help you. The man pages are locally installed (or should be) so you can read them even if you don't have net access. Also they are more likely to match the versions of software that are actually installed on the local system. Oftentimes I've seen people look at an online manpage which doesn't quite match the version of software that is installed, leading to confusion.
posted by crazy_yeti at 8:42 AM on September 16, 2013 [7 favorites]


Not that I'd put much stock in Sophos. The supremely generic threat analysis doesn't help either.

Sophos "mis"-identifies google analytics as malware? I think maybe they're just the only ones with the guts to tell the truth! Get that spyware crap off my computer.

In conclusion, NoScript (or NotScripts).
posted by Galaxor Nebulon at 9:01 AM on September 16, 2013 [1 favorite]


eriko: "As an aside, I always end up in a tizzy when using backticks - $(blah blah blah) is worth the extra typing, I think

And throws an error in the only shells you can definitely assumed to be on the box, namely /bin/sh and /bin/csh. The Korn shell introduced treating command substitution as an oddball form of variable substitution with the $(foo --bar baz) syntax, which was adopted by bash 2, leading it to be commonly available.
"

Does any bash user who isn't a sysadmin really need to worry about this, though? Otherwise, I feel like your likelihood of having to deal with a commercial *NIX or BSD variant where bash isn't the default shell is pretty low, barring some niche project.
posted by invitapriore at 9:19 AM on September 16, 2013 [1 favorite]


In addition to pipes and backticks, it could use more support for shell built-ins. I tried something of the form

for i in 1 2 3 4 ; do echo $i ; done

and it complained it didn't have a manpage for "for". "while" didn't work either.
posted by fings at 9:37 AM on September 16, 2013


um... hate to be nit-picky, but the command in the title is not, in fact, a legal command in Unix or anywhere else.

There is no "ssh(1)" command. Yes, there is an "ssh" command - and in the names of man pages, the (1) means it lives in the first section of the man pages. This is needed because, for example, sometimes you have the same command appearing as a command line program and as a C library function - as an example time(1) and time(2).

In fact, parentheses are special characters, so this could never be a valid command. If you try ssh(1), with or without the remaining line, you will get an error similar to this:

bash: syntax error near unexpected token `1'

(note the different quote marks! no idea where that came from...)

They should fix that - this is going to confuse the heck out of people who paste command lines and get a weird error.
posted by lupus_yonderboy at 9:58 AM on September 16, 2013 [2 favorites]


Please don't disparage the documentation, it is not horrid, and it is there to help you.

Are you.... serious?

I have been using unix-derived systems for two decades now and I still can't remember which order to put the arguments for 'ln'. The man page says this:
SYNOPSIS
ln [-Ffhinsv] source_file [target_file]
ln [-Ffhinsv] source_file ... target_dir
link source_file target_file

DESCRIPTION
The ln utility creates a new directory entry (linked file) which has the
same modes as the original file.
Beautiful. Because "source file" could mean either "linked file" or "original file", and "target file" could also mean either "linked file" or "original file"...

If this isn't horrid documentation then I don't know what you'd use the term for.
posted by Mars Saxman at 10:29 AM on September 16, 2013 [5 favorites]


Yes, very cool. Should go nicely right next to my DeTeXify button.
posted by pjenks at 10:57 AM on September 16, 2013


I still can't remember which order to put the arguments for 'ln'.

The easy way to remember is it is the same as 'cp' and 'mv': OldFile NewFile.
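A quick sketch of the mnemonic in action (file names invented):

```shell
# existing file first, new name second -- same order as cp and mv
echo original > old.txt
ln -s old.txt new-link

# reading through the link reaches the original file
cat new-link
```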
posted by fings at 11:04 AM on September 16, 2013


What a clever tool. Really nice to break out useful bits of man pages in this manner. I know Unix man pages are not always the easiest things to read.. But they were such an innovation, the idea that the manual for your operating system was built right into the operating system itself. And that you could browse it on screen or print it out or buy a book. I've often thought man pages were as important to the growth of Unix as "view source" was to the web.

Also a shout-out to GNU, whose tools popularized the longopts that make remembering obscure flags more humane. curl --show-error is way better than remembering curl -S (and not the totally opposite curl -s). Also much love for bash-completion which makes the tab key complete Unix commands in a much smarter way.
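The curl flags above need a network to demo, but the same long/short pairing shows up everywhere in GNU-flavored tools; a trivial stand-in with grep:

```shell
# the long spelling says what the short one abbreviates:
printf 'Foo\n' | grep --ignore-case foo

# identical behavior, one opaque letter:
printf 'Foo\n' | grep -i foo
```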
posted by Nelson at 11:16 AM on September 16, 2013


One person's 'horror' is another person's 'power'...
posted by DesbaratsDays at 11:16 AM on September 16, 2013


If this isn't horrid documentation then I don't know what you'd use the term for.

It's great documentation if you assume familiarity with the format. It is not for beginners.

(One might make the argument that the command-line isn't for beginners in general, though I've seen it argued the other way for users who haven't been "corrupted" by GUIs first. ;)

However, there are numerous resources available to teach the basics, including how to human-parse manpages.
posted by Celsius1414 at 11:42 AM on September 16, 2013


At the same time, it is more than fair to say that not all manpages are created equal, and it is something to which most programs should pay better attention.
posted by Celsius1414 at 11:44 AM on September 16, 2013


In conclusion, Unix is a land of contrasts.
posted by Celsius1414 at 11:45 AM on September 16, 2013 [6 favorites]


Celsius1414: "(One might make the argument that the command-line isn't for beginners in general, though I've seen it argued the other way for users who haven't been "corrupted" by GUIs first. ;)"

I think those people are normalizing their own experience excessively. GUIs really are more discoverable, both by virtue of the fact that they visually enumerate all of the available options and that those options are made members of a hierarchy of functionality rather than a network of functionality, which means that you can easily learn about subbranches in that hierarchy in isolation. I think it's fair to say that the network approach is the more powerful one, but there's a reason that novice programmers using OO languages so often defy the maxim of composition over inheritance.
posted by invitapriore at 11:53 AM on September 16, 2013


Speaking of discoverable: I got my first shell account in 1990. I sat down, logged in ("Say, I know my username! And my password, too!"), and then... A mocking, blinking nothingness.

I typed in all the things I could think of that I'd learned using DOS and old Apple machines, and I got nothin'. The `help` command actually frightened me:
HELP(1)

NAME

help - ask for help about SCCS error messages and commands

SYNOPSIS

help [args]

DESCRIPTION

Help finds information to explain a message from an SCCS command or
explain the use of an SCCS command. Zero or more arguments may be
supplied. If no arguments are given, help will prompt for one.

The arguments may be either message numbers (which normally appear in parentheses following messages) or command names, of one of the following types:
I logged off and didn't touch a computer without a GUI for a couple of years, until my unhappiness and loneliness drove me to a Vax terminal in the first few weeks of what I later learned was The September That Never Ended. Since then I've made a decent career as a Unix sysadmin, but my first explorations were pretty fraught.
posted by wenestvedt at 12:10 PM on September 16, 2013


Yeah, man pages are definitely a reference, not an introductory doc. This Explain Shell site does a great job remixing that reference material in a way that's relevant to understanding specific commands.

The lack of introductions to Unix is what originally created O'Reilly & Associates, Inc, the book publisher. I don't have a link to the story at hand but Tim O'Reilly early on wrote a book on how to use Unix that was more accessible than man pages or other books. Again, no link handy, but he's spoken in the past about how important the reference docs and source code culture of Unix was to him and how he built a business around it.
posted by Nelson at 12:11 PM on September 16, 2013


I was preparing to weigh in on needing to know the difference between backticks and $() syntax for users vs. sysadmins, but did some testing in dash. And then some research after I was surprised by the results.

It seems that $() is POSIX, so you should be able to rely on it pretty much anywhere that cares about POSIX. I expected dash (the default /bin/sh symlink on Ubuntu) to barf on $() but it handles it exactly as bash does.

I guess it's time for me to update some of my old shell scripts.
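Easy to confirm from any prompt; plain sh (whatever your system links it to) takes $() in stride, nesting and all:

```shell
# POSIX sh handles $() fine, including nesting:
sh -c 'echo $(echo nested $(echo substitution))'
```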
posted by fader at 12:15 PM on September 16, 2013


It's great documentation if you assume familiarity with the format. It is not for beginners.

That's just another way of saying that the documentation is terrible. The beginner-expert axis is multidimensional and non-monotonic; we are all beginners in those parts of the system we have never used before. This is why you can't build an "easy mode" for people to graduate from; there is no one path from easy to hard.
posted by Mars Saxman at 12:21 PM on September 16, 2013


Because "source file" could mean either "linked file" or "original file", and "target file" could also mean either "linked file" or "original file"...
Under what circumstances would a symlink that's being created be a "source file" as far as ln is concerned?
posted by idiopath at 12:22 PM on September 16, 2013


On the one hand I applaud an effort to make the POSIX environment easier to use and document.

On the other I want it keep it close to the vest as I get money by knowing things most people don't.
posted by wcfields at 12:24 PM on September 16, 2013


That's just another way of saying that the documentation is terrible.

Apart from it saying the exact opposite of what you're saying, it's completely the same thing.
posted by Celsius1414 at 12:30 PM on September 16, 2013 [2 favorites]


The beginner-expert axis is multidimensional and non-monotonic; we are all beginners in those parts of the system we have never used before. This is why you can't build an "easy mode" for people to graduate from; there is no one path from easy to hard.

Except we're talking about how to get help, not every other topic under the Unix sun.
posted by Celsius1414 at 12:32 PM on September 16, 2013


FWIW, Sophos is no longer detecting the site as malicious.
posted by maxim0512 at 12:40 PM on September 16, 2013


GUIs really are more discoverable, both by virtue of the fact that they visually enumerate all of the available options

But, as the Dutch prime minister who shall remain nameless proved, the mouse + windows GUI environment is something you must learn as well, or you have no clue how a mouse works and might Spockwise try to use it as a microphone...

There's a lot of power in a command line environment once you learn a few basic commands, and those really aren't more difficult than learning how to navigate a GUI. It's just that it's easier (and has been made easier) to do complicated things quickly in a GUI by just knowing how to use your mouse and menus rather than having to remember complicated command structures, even if those are more powerful.
posted by MartinWisse at 12:43 PM on September 16, 2013


Beautiful. Because "source file" could mean either "linked file" or "original file", and "target file" could also mean either "linked file" or "original file"...

I'll admit not all man pages are super great, but this seems like a pretty bad example. There is no ambiguity there. Source file == original file, target file == linked file. I'm honestly confused by your confusion.
posted by kmz at 12:48 PM on September 16, 2013


Except that a symlink points to the... source file? from the... target file? Huh? There's even a little arrow in some ls implementations...
In other words, I can totally empathize with the original complaint.
posted by tigrrrlily at 1:05 PM on September 16, 2013


Seems to me that most man pages are excellent for their target audience. That audience is not beginner-level.
posted by five fresh fish at 1:20 PM on September 16, 2013


I don't know how anybody who's used a real programming language can truly enjoy or recommend the command line, especially since there are now a multitude of modern scripting languages with good syntax and useful documentation. I mean, I like many of the ideas behind the command line, but as soon as I start considering piping anything into grep or awk I know it's time to do it in Python instead.
posted by Pyry at 1:33 PM on September 16, 2013


> Great heavens, print a man page? Users, maybe.

You mean your department didn't maintain a wall of binders of all the manual pages arranged in alphabetical order, lovingly maintained with locally-printed updates as the pages changed?

> 'xargs -P' does exactly what you describe. … If you read the man page for 'xargs' you'd see this.

Oh, not exactly; parallel is line-based; xargs is token-based. Also, it seems that -P came in around 2001, which was somewhat after I last needed to read the man page.
posted by scruss at 1:43 PM on September 16, 2013


> as soon as I start considering piping anything into grep or awk I know it's time to do it in Python instead.

I'm so terribly sorry to hear about your loss. A life without awk would not be worth living.
posted by scruss at 1:48 PM on September 16, 2013 [4 favorites]


Pyry: "I don't know how anybody who's used a real programming language can truly enjoy or recommend the command line, especially since there are now a multitude of modern scripting languages with good syntax and useful documentation. I mean, I like many of the ideas behind the command line, but as soon as I start considering piping anything into grep or awk I know it's time to do it in Python instead."

I don't know, I can't imagine how much time I'd have wasted at this point regurgitating all the boilerplate necessary to start futzing around with fields in a CSV file when I could spend just a few seconds writing some AWK. And god forbid you need to coordinate output and input between subprocesses, because holy shit is the subprocess library just the most painfully verbose thing in the entire world. Plus, for a file of any reasonable size, Python (CPython at least, which I'm constrained to using in my day-to-day for various reasons) is slow.

That's not to say that Bash isn't a hideous and malformed insult of a language, but I feel like I got disabused pretty quickly of the notion that I could be equally efficient if I automated tasks using Python instead.
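A sketch of the kind of thing I mean (sample data inlined; this is for simple CSVs without quoted commas, which is exactly the case Python's csv module exists for):

```shell
# sum the third comma-separated field -- a one-liner in awk:
printf '1,2,3\n4,5,6\n' | awk -F, '{ s += $3 } END { print s }'
```

Try writing that in fewer keystrokes with the subprocess library.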
posted by invitapriore at 1:48 PM on September 16, 2013 [1 favorite]


I have been using unix-derived systems for two decades now and I still can't remember which order to put the arguments for 'ln'. The man page says this:
posted by Mars Saxman


This is made worse by the fact that the GNU coreutils man page and the OS X man page use 'Target' to mean the opposite of each other, so I don't think you are the only one getting confused.

But here I made you an ln man page reference with some (hopefully) unambiguous language (self link)
posted by Lanark at 2:30 PM on September 16, 2013


I don't know, I can't imagine how much time I'd have wasted at this point regurgitating all the boilerplate necessary to start futzing around with fields in a CSV file when I could spend just a few seconds writing some AWK.

Well, Python has a standard library just for handling csv files, which will take care of most of the boilerplate and also handle the complicated cases (commas in quoted fields, for example) that you will probably miss if you try to hack up a regular expression yourself.

But, I never said awk was bad: awk's great, it's a Real Programming Language, and if you're using it to reformat multi-gigabyte CSV files, then that's an appropriate* use. On the other hand, if you're piping ls into awk, then you're using awk as a band-aid for bash's limitations and should just do the whole thing in your choice of sane scripting language.

*But, you know, databases were invented to efficiently deal with exactly this type of thing.
posted by Pyry at 2:49 PM on September 16, 2013


Does any bash user who isn't a sysadmin really need to worry about this, though? Otherwise, I feel like your likelihood of having to deal with a commercial *NIX or BSD variant where bash isn't the default shell are pretty low, barring some niche project.

Uhh, like FreeBSD? I'm almost certain that not only is bash not the default shell, you still have to install it. For reals. Also pretty sure it only comes with vi by default, which is more than enough for any real sysadmin. *ducks*

On the plus side, the man pages are better than linux. Considering the ln example above, it's explained more thoroughly and uses the same wording in both sections:
The ln utility creates a new directory entry (linked file) for the file name specified by target_file. The target_file will be created with the same file modes as the source_file.
Sweet utility, by the way, much easier than digging through some of the really long man pages.
posted by nTeleKy at 2:59 PM on September 16, 2013


Pyry: "Well, Python has a standard library just for handling csv files, which will take care of most of the boilerplate and also handle the complicated cases (commas in quoted fields, for example) that you will probably miss if you try to hack up a regular expression yourself.

But, I never said awk was bad: awk's great, it's a Real Programming Language, and if you're using it to reformat multi-gigabyte CSV files, then that's an appropriate* use. On the other hand, if you're piping ls into awk, then you're using awk as a band-aid for bash's limitations and should just do the whole thing in your choice of sane scripting language.

*But, you know, databases were invented to efficiently deal with exactly this type of thing.
"

The csv module is exactly the boilerplate I was talking about, databases don't cover every use case for tabular data, and I don't understand what the output of ls has to do with Bash, but okay.
posted by invitapriore at 3:02 PM on September 16, 2013


nTeleKy: "Uhh, like FreeBSD? I'm almost certain that not only is bash not the default shell, you still have to install it. For reals. Also pretty sure it only comes with vi by default, which is more than enough for any real sysadmin. *ducks* "

Yeah, I know, but I was assuming that if you're already a programmer who typically uses bash then it's unlikely that you're suddenly going to be forced to use something like FreeBSD.
posted by invitapriore at 3:03 PM on September 16, 2013


About once a week for 15 years I've done some variant of

zcat data | awk '{ print $1 }' | sort | uniq -c | sort -nr | head

Where the zcat and awk expression are replaced with whatever I need to get at the data I'm trying to characterize. There's more efficient ways to do this data crunching, hell I've written my own Python script+library to do it, but there's something about the flexibility of the shell/awk combo that's hard to beat.
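With some toy inline data (so you can see the shape of the output without needing a log file handy):

```shell
# frequency count of the first field, most common first:
printf 'a x\nb y\na z\na q\nb w\n' \
  | awk '{ print $1 }' | sort | uniq -c | sort -nr | head
```

sort groups the duplicates so uniq -c can count them, then sort -nr ranks by count.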
posted by Nelson at 3:05 PM on September 16, 2013 [1 favorite]


fings:
for i in 1 2 3 4 ; do echo $i ; done
and it complained it didn't have a manpage for "for". "while" didn't work either.

Try "help for". Shell builtins are documented internally in the shell, "help" brings up the doc for all shell builtins.
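E.g. (the -c wrapper is just so this works from any shell):

```shell
# bash documents its builtins itself, no man page needed:
bash -c 'help for' | head -n 2
```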
posted by crazy_yeti at 3:40 PM on September 16, 2013


mars saxman: Are you.... serious?
Absolutely.

SYNOPSIS
ln [-Ffhinsv] source_file [target_file]
ln [-Ffhinsv] source_file ... target_dir
link source_file target_file

DESCRIPTION
The ln utility creates a new directory entry (linked file) which has the
same modes as the original file.

Beautiful. Because "source file" could mean either "linked file" or "original file", and "target file" could also mean either "linked file" or "original file"...

Interesting, what system is that from? On my Linux box it's quite a bit clearer.

SYNOPSIS
ln [OPTION]... [-T] TARGET LINK_NAME (1st form)

DESCRIPTION
In the 1st form, create a link to TARGET with the name LINK_NAME.

I think that eliminates the ambiguity you are complaining about.

But ... even in the version you posted, I think it's pretty clear what "source file" means ... that's the original file, not the new directory entry. Source file always comes first, as you go from a source to a destination ... like the way a river flows, from the source ... or think about the way a compiler works, it produces a target from a source file, not the other way round. The terminology is meant to be suggestive.
posted by crazy_yeti at 3:49 PM on September 16, 2013


scruss: Oh, not exactly; parallel is line-based; xargs is token-based.

You can control the xargs behavior with '-0' or '-d' flags.

Also, it seems that -P came in around 2001, which was somewhat after I last needed to read the man page.

So, your knowledge is 12 years out of date.
posted by crazy_yeti at 3:58 PM on September 16, 2013


Try "help for". Shell builtins are documented internally in the shell, "help" brings up the doc for all shell builtins.

Oh, I know that. I was just suggesting that the Explain Shell page could do a better job for those who don't.
posted by fings at 4:23 PM on September 16, 2013


> it only comes with vi by default, which is more than enough for any real sysadmin. *ducks*

I am Not A Real Sysadmin. I am actually a PACS admin. Half of what I know is about workflow in a department of Radiology and how that typically fucks up ("My patient had an Ass sequence and an Elbow sequence ordered and I just sent my ass images to my elbow folder, can you fix that quick before some doctor sees it?") while (since our current PACS is Windows-based) the other half is about Win 2008 and 2012 servers and TCP/IP networking, and how those typically fuck up.

But there is yet a third half which I think of as the sausage half. Don't look at it too closely and certainly don't watch it being made. It contains lots of increasingly fetid and decayed Unix knowledge (our previous PACS, replaced in 2004, was Solaris-based), lots of leftover hobbyist shite from ages past (how to do hex dumps with opcodes on an Apple II), lots of site-specific information on doing end runs around our ever-vigilant IT staff (who really REALLY want to manage my own WS remotely and I'm sorry, no.) There's also quite a bit of superannuated Linux sausage-packing (my first distro was slackware circa 1992-93, kernel version 1.0.13.)

And somewhere among the *nix bits is THE POWER of being vi-capable. That stuff is NOT superannuated and I review it several times a year because vi is still there, and always there (I found FreeBSD running on a printer not too long ago) and it's often--for me, usually--the only text editor there. You say you must for some reason edit the hosts file on a GE MRI machine? Dig into it a little bit and you see it claims to be GEMS (GE Medical Systems) Linux. Dig just a little deeper and hey, it's Redhat 8. With GE logos.

I can also start a campfire with flint and steel. Or anyway I did that once, and RMS was no damn help at all.
posted by jfuller at 5:39 PM on September 16, 2013 [3 favorites]


Also, I love parens.
posted by jfuller at 5:41 PM on September 16, 2013


Better is to run:

ssh -D1080 host

...and then tell your browser to use 127.0.0.1 port 1080 as a SOCKS5 proxy. Then everything routes through host, not just Google.
posted by effugas at 8:21 PM on September 16, 2013

