Programming with trits and trytes!
June 17, 2017 12:40 PM

 
Mods, can we have a facility to neither favourite nor not favourite this post?
posted by Segundus at 1:41 PM on June 17 [73 favorites]


Maybe
posted by STFUDonnie at 2:24 PM on June 17 [32 favorites]


I like to think that they also ran 33% faster than their contemporary computers.
Look, it makes sense in my head, ok?
posted by slater at 2:26 PM on June 17 [6 favorites]


There are only 11 types of people in the world: Those who understand binary, those who understand trinary, those who understand both, and those who understand neither.

Did I get it right? Math is confusing.
posted by Balna Watya at 3:00 PM on June 17 [12 favorites]


There are +0 types of people in the world: those who understand trinary, those who don't understand trinary, and those who go so far beyond mere ignorance of trinary that they actually understand it a negative amount.
posted by nebulawindphone at 3:10 PM on June 17 [8 favorites]


MetaFilter: The artist is actually not happy with the title of his work.
posted by GenjiandProust at 3:27 PM on June 17 [2 favorites]


There are only 11 types of people in this world, those who think this is a binary joke, those who think this is a trinary joke, and the remaining (numberbase-1) who don't joke about such things.
posted by otherchaz at 3:29 PM on June 17 [7 favorites]


I wish there was a diagram of how the adder logic actually worked. Saying it was ferrite cores is fine, but the adder was just diodes? The article could go a lot more in depth.
posted by GuyZero at 3:41 PM on June 17 [4 favorites]


There are only 1110 types of people in this world: Those that belong to the emperor; Embalmed ones; Those that are trained; Suckling pigs; Mermaids; Fabulous ones; Stray dogs; Those that are included in this classification; Those that tremble as if they were mad; Innumerable ones; Those drawn with a very fine camel hair brush; Et cetera; Those that have just broken the flower vase; Those that, at a distance, resemble flies.
posted by ardgedee at 3:47 PM on June 17 [57 favorites]


I love this! It's like peering across the curtain into a different timeline, where FORTH can only really be appreciated in the original ternary Klingon.
posted by bigbigdog at 4:00 PM on June 17 [8 favorites]


This is pretty tangential, but this just reminded me that I had a dream last night that there was a stack-based programming language called Firth (after the eponymous actor's many roles as posh characters with posh accents) that required the use of different (lexical) registers for any given function invocation depending on how high the current runtime parameter stack was regardless of arity, e.g. 5 2 + would only be acceptable if nothing was on the parameter stack at the time 5 was pushed onto it, otherwise you'd have to say 5 2 +_registerN, where +_registerN is a placeholder for the version of + that you defined for invocations where there are N items on the stack below your two operands. Needless to say, this requires that you define the function name for each such register/stack size, and it's a runtime error if your invocation register doesn't match the current size of the stack. Also, defining an insufficiently elegant name for a high-register function is a compile time error, as is defining an inappropriately elegant name for a low-register function, as is any failure to have function-name_registerN be strictly more elegant than function-name_registerM for any N > M. An example for a higher-register version of + might be whereby-the-two-preceding-quantities-are-combined-to-yield-a-sum (kebab case is syntactically valid in this language, a la most Lisps).

I'm pretty sure that even without the benefit of a maximum stack size this language is not Turing complete, but of course flexibility is contrary to order, and order is paramount, so this tradeoff is to be weathered by the user with stoicism at the very least, and ideally welcomed with solemn satisfaction at the small but essential part one has played in the maintenance of The Way of Things.
posted by invitapriore at 4:11 PM on June 17 [14 favorites]


This is rather quite lovely.
posted by Artw at 4:12 PM on June 17 [1 favorite]


I frequently have a computer snark observation that is obscure enough that it's not obviously sarcasm to non-practitioners, but just so totally out of my league here. Do want an expansion of De Morgan's law in ternary.
posted by sammyo at 4:19 PM on June 17 [1 favorite]


> Ternary logic was implemented by combining two of these ferrite elements and wiring them in such a way that they could represent three stable states. This approach was successful but did nothing to reduce the number of elements required as, in reality, those two ferrite cores could potentially represent two binary bits, which represents more information (2^2) than one ternary "trit" (3^1). Alas, at least their power consumption was reduced! Setun operated on numbers of up to 18 trits, meaning one could represent anything between -387,420,489 and 387,420,489. A binary computer, on the other hand, would need at least 29 bits to reach this capacity.

Wouldn't this be one of the critical obstacles to the acceptance of implementing ternary logic? The trit is effectively a software value being abstracted in hardware. Even though 387,420,489 can be represented in 18 trits vs. 29 bits, it requires 36 bits of storage compared to 29 bits of storage. So the elegance and expressive efficiency provided by ternary math (at least in the Soviet implementation) ends up outboxed by the user demand for optimized volume and throughput.
posted by ardgedee at 4:22 PM on June 17 [2 favorites]
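The storage arithmetic in the quoted passage checks out with a few lines of Python (a quick editorial sketch of the numbers, not anything from the article):

```python
import math

# One trit carries log2(3) ~ 1.585 bits of information; two ferrite
# cores used as plain binary cells would carry 2 bits.
bits_per_trit = math.log2(3)

# 18 trits distinguish 3**18 states...
states = 3 ** 18
print(states)  # 387420489

# ...which takes ceil(18 * log2(3)) = 29 bits to match in binary.
print(math.ceil(18 * bits_per_trit))  # 29

# But the Setun scheme spends two cores per trit (36 cores per 18-trit
# word) versus 29 binary cells, so binary storage is ~24% denser here.
print(36 / 29)  # ~1.241
```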


You may also enjoy reading Pioneers of Soviet Computing (that's a direct link to a 200-page PDF book). I didn't see it linked in any of the pages above, although I might have overlooked it.
posted by Wolfdog at 4:50 PM on June 17 [5 favorites]


Fascinating, I knew nothing about this! My only contribution is to note that the Russian word Сетунь (originally the name of a river) is stressed on the first syllable, SEH-toon (and if you really want to get fancy, both the s- and the -n are palatalized, which means they sound as if they have an almost imperceptible -y- sound attached: S[y]EH-toon[y]).
posted by languagehat at 5:55 PM on June 17 [6 favorites]


Little-known fact: As half a byte is known as a nybble, so half a tryte is known as a "clyché".
posted by Wolfdog at 6:10 PM on June 17 [20 favorites]


ardgedee but why assume your storage is bits? Actual hardware level storage doesn't use two state physical elements anyway. I'd assume storage developed for trinary systems would be abstracted as if it were trinary in the same way our current storage is abstracted as if it were binary.
posted by idiopath at 6:17 PM on June 17


My assumption is based on the section I quoted. In the Setun, two units of magnetic core can represent either one trit or two bits, which would imply that binary storage is potentially about 25% more efficient than trinary storage (two bits carry roughly 1.26 times the information of one trit). You're right, though, that how the Soviets engineered trinary storage in the pre-solid state era would not necessarily imply how trinary data could be handled with subsequent technologies.
posted by ardgedee at 6:33 PM on June 17


The existence of such a system is something I've wondered about for ages, but in my ignorance (particularly in electronics) I assumed any such system would be based around the three states an electronic component can have - positive, negative, or no signal. Similarly with storage; north, south or unmagnetized. That's why I was surprised at the whole using two elements to store a trit thing. But I assume there are good reasons my idea is impossible.
posted by Jimbob at 6:49 PM on June 17 [1 favorite]


Similarly with storage; north, south or unmagnetized.
Actually, all such systems are magnetized, if you look close enough. It's the direction of magnetization that can vary.
posted by MikeWarot at 8:02 PM on June 17 [2 favorites]


I like the idea presented that base 3 is more efficient at some tasks because it's the closest integer to e.
posted by MtDewd at 9:11 PM on June 17 [1 favorite]
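That remark is about radix economy: storing N distinct values in base b takes roughly b · log_b N digit-positions' worth of hardware, and the cost factor b / ln(b) is minimized at b = e. A quick sketch of the comparison (my illustration, not from the article):

```python
import math

def radix_economy(b: float) -> float:
    # Hardware "cost" per unit of information stored: b / ln(b).
    return b / math.log(b)

# b / ln(b) is minimized at b = e ~ 2.718, and 3 is the closest integer:
assert radix_economy(3) < radix_economy(2)
# Curiously, base 2 and base 4 tie exactly, since 4/ln 4 = 2/ln 2.
assert math.isclose(radix_economy(2), radix_economy(4))
print(radix_economy(2), radix_economy(3), radix_economy(math.e))
```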


Do want an expansion of De Morgan's law in ternary.

I've forgotten a lot of this stuff, but there's actually multiple different ways to do trivalent boolean logic. People use different ones depending on why they're using a trivalent system.

For instance, in linguistics (and probably other fields, but linguistics is the one I know best) people sometimes want the three values to correspond to "true," "false," and "shit, something's gone wrong." If you're doing that, then maybe you'll decide that anything ANDed or ORed together with SHIT should yield SHIT, like so.
A B A&B AvB
T T  T   T
T F  F   T
T S  S   S
F T  F   T
F F  F   F
F S  S   S
S T  S   S
S F  S   S
S S  S   S
(If you're into Haskell, this looks a lot like a Maybe Bool, with T as Just True, F as Just False, and S as Nothing.)

Or you could decide that you want your values to be TRUE, FALSE, and IUNNO, which would give you a different truth table. Or you could decide that you want to preserve the property of bivalent logic that OR is like addition and AND is like multiplication, which would give you another different truth table. Or you could do it in yet other ways that I've forgotten about because not being in grad school is wonderful and this stuff isn't my problem anymore. Etcetera.
posted by nebulawindphone at 9:11 PM on June 17 [17 favorites]
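That first table, where S swallows everything it touches, is usually called weak Kleene or Bochvar logic. It maps onto an optional Boolean much as described; a sketch in Python rather than the Haskell mentioned, with None playing S:

```python
# Three-valued AND/OR matching the table above, with None as S.
# This is the "anything touching S yields S" (weak Kleene / Bochvar)
# scheme; strong Kleene would instead give F & S = F and T v S = T.
def and3(a, b):
    if a is None or b is None:
        return None
    return a and b

def or3(a, b):
    if a is None or b is None:
        return None
    return a or b

assert and3(True, False) is False
assert or3(False, False) is False
assert and3(False, None) is None   # F & S = S, per the table
assert or3(True, None) is None     # T v S = S, per the table
```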


There are infinitely many types of people in this world: Those who don't divide people into types, those who divide people into two types, those who divide people into three types, those who divide people into four types, those who divide people into five types, those who divide people into six types, those who divide people into seven types, those who divide people into eight types, those who divide people into nine types, those who divide people into ten types, and you get the idea.
posted by erniepan at 9:23 PM on June 17 [6 favorites]


For those wondering why hardware seems to favor binary logic:

Recall that power is voltage drop times current. A transistor has two low-power states: "on" (aka the "saturation region"), where current is high and voltage drop is near-zero, and "off" (aka the "cut-off region") where voltage drop is high and current is near-zero. In between these two states (in what is known as the "active region") there is both a voltage drop and current, so the transistor dissipates much more power. This is undesirable in computing applications. (There is a fourth, "reverse-active," region which also dissipates power and has additional undesirable characteristics.)

Certain storage technologies, notably MLC flash, aren't inherently binary, but tend to have a power-of-two number of levels per cell for compatibility with inherently-binary CPUs.

Other common storage technologies are inherently-binary: CPU cache is typically SRAM built from flip-flops, which are inherently bistable. And magnetic storage is based on spin-1/2 electrons, which fundamentally have two quantum states.
posted by lozierj at 11:56 PM on June 17 [3 favorites]


Okay, to put this "there are X type of people" thing to rest... there are 17 types of people in the world. Those who will tell you their Myers-Briggs type and those who won't.
posted by DreamerFi at 12:36 AM on June 18 [4 favorites]


And by "love this," I do mean I will be yoinking the shit out of this for at least one of my little art projects. I think the one where Xanadu gold shipped on time will do nicely.
posted by bigbigdog at 1:02 AM on June 18


Ah, and here is reason number 1001201102100 why I love MeFi. {That's 565,281 for all you non-trinary, base-ten squares who are keeping count.} So much super math-y fun!
posted by sic friat crustulum at 4:26 AM on June 18 [2 favorites]
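For anyone keeping count along at home, the conversion is a one-liner to verify, since Python's int() parses arbitrary bases (plain unbalanced ternary here):

```python
# 1001201102100 interpreted as an ordinary base-3 numeral:
n = int("1001201102100", 3)
print(n)  # 565281

# And converting back, to close the loop:
def to_base3(m: int) -> str:
    digits = []
    while m:
        m, r = divmod(m, 3)
        digits.append(str(r))
    return "".join(reversed(digits)) or "0"

print(to_base3(565281))  # 1001201102100
```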


A few years ago I worked up a scheme for a ternary clock such as Arthur C. Clarke's Ramans (who did everything in threes) might have used. The display would have been made of old-school red-green-both-as-yellow LEDs with green=0, yellow=1, red=2. A single trit gives you a third of a day, which is surprisingly useful. Two more give you divisions of about an hour, while a fourth gives you divisions of about 15 minutes which would be perfect for most scheduling uses. A 3x3 grid would give you a clock where the least significant trit changes every four seconds or so. Blinking that one three times per interval gives you a passable version of seconds. I even wrote a simulator for it on a PC.

Unfortunately, it's absolutely useless for anything that has to mesh with ordinary timekeeping because other than the midnight-8-4-midnight trit, it really is a pain to relate to duodecimal clock time.
posted by Bringer Tom at 5:33 AM on June 18 [1 favorite]
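The scheme is easy to sketch: nine trits split the day into 3^9 = 19683 slices of about 4.39 seconds each. A minimal simulator, assuming the comment's color scheme (green=0, yellow=1, red=2); everything else here is my own guess, not the original program:

```python
# Map seconds-since-midnight onto a 9-trit "Raman clock" display.
COLORS = {0: "green", 1: "yellow", 2: "red"}

def raman_clock(seconds_since_midnight: int, trits: int = 9):
    # Which of the 3**trits daily intervals are we in?
    n = seconds_since_midnight * 3 ** trits // 86400
    digits = []
    for _ in range(trits):
        n, d = divmod(n, 3)
        digits.append(d)
    return [COLORS[d] for d in reversed(digits)]

# 08:00 is exactly one third of the day, so only the top trit is set:
print(raman_clock(8 * 3600))  # 'yellow' followed by eight 'green's
```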


For those wondering why hardware seems to favor binary logic:

Recall that power is voltage drop times current. A transistor has two low-power states: "on" (aka the "saturation region"), where current is high and voltage drop is near-zero, and "off" (aka the "cut-off region") where voltage drop is high and current is near-zero. In between these two states (in what is known as the "active region") there is both a voltage drop and current, so the transistor dissipates much more power. This is undesirable in computing applications.


That's all very well, but look at the structure of a standard CMOS logic output: there's one transistor connected between the positive supply rail (+V) and the output, and another between ground (0V) and the output, and they're driven in such a way that exactly one of them is on at any given time, which means that the output is either connected solidly to +V or to 0V.

I can think of no good reason why a balanced ternary output stage couldn't be built with two balanced supply rails and three transistors, exactly one of which is on at any given time, connecting the output solidly to +V or -V or 0V.

Two such balanced ternary outputs would require six transistors, enough to build three standard CMOS binary outputs.

Three bits can represent one of eight states; two trits, one of nine. So balanced ternary would give you more possible output states for any given amount of chip area, not less.
posted by flabdablet at 5:42 AM on June 18 [2 favorites]


Also, while it's true that a flip-flop is inherently binary, I can see no reason why it ought to be impossible to construct storage devices by aggregating flip-flap-flops.
posted by flabdablet at 5:46 AM on June 18 [3 favorites]


Unusually for one of my instinctive handwaves, there actually appears to have been a certain amount of serious research work done on ternary CMOS circuitry.
posted by flabdablet at 6:21 AM on June 18 [1 favorite]


Tristate logic is, of course, a very common thing, albeit with the third state being high impedance representing 'nothing to say' and thus the absence of a useful output.

As well as power issues (even just having a third Vcc/2 bus will cost something) and complexity, the greater the number of valid voltage levels on a bus the lower its ability to reject noise - binary states notionally clamp to one of the rails, but in practice can easily be engineered to work over quite wide windows of valid levels, so a lot of noise energy can be coped with before a signal becomes invalid. And it's trivial to clean things up.

If you go all the way to a true analogue signal capable of representing an arbitrary number, you are limited by noise, stability and drift - which is one of the reasons analogue computers are not widely used, and why even specialist analogue computers such as radios and music synthesisers are commonly replaced by digital systems these days.

Trinary (or n-ary) logic systems are interesting and worth knowing about, but there's nothing they can do that can't be done with binary, where there is so much expertise, so many tools and such a huge amount of extant engineering that you would have to have a very serious inherent advantage to do things differently - high-density flash storage is one case, as mentioned above. I can intuit that there may be such advantages in certain forms of signal analysis and pattern recognition, but I sure as hell can't justify that.
posted by Devonian at 11:04 AM on June 18 [2 favorites]


There are ℵ1 kinds of people in the world: ℵ0 who can count, and ℵ1 who cannot.
posted by grobstein at 11:27 AM on June 18 [4 favorites]


there's nothing they can do that can't be done with binary

This is demonstrably true in a logical sense, given that any ternary signal can be emulated by a pair of binary signals in any of several ways. However, ternary logic might offer enough of a reduction in power consumption and/or interconnect complexity to offer practical implementation advantages in very high density circuitry.
posted by flabdablet at 11:37 AM on June 18 [1 favorite]
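One concrete way a pair of binary signals can emulate a balanced trit, as a minimal sketch (the particular mapping here is an arbitrary choice of mine, with the fourth bit pattern simply declared invalid):

```python
# Balanced trit {-1, 0, +1} <-> two binary lines; (1, 1) goes unused.
ENCODE = {0: (0, 0), +1: (0, 1), -1: (1, 0)}
DECODE = {bits: trit for trit, bits in ENCODE.items()}

def encode(trit: int) -> tuple:
    return ENCODE[trit]

def decode(bits: tuple) -> int:
    if bits == (1, 1):
        raise ValueError("(1, 1) is not a valid trit in this encoding")
    return DECODE[bits]

# Round-trips cleanly for all three values:
assert all(decode(encode(t)) == t for t in (-1, 0, +1))
```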


This post has sent me down a very interesting rabbit hole indeed - thanks, O twisty one.

The Ternary Manifesto is another good entrance to the warren.
posted by flabdablet at 11:42 AM on June 18


ternary logic might offer enough of a reduction in power consumption and/or interconnect complexity to offer practical implementation advantages in very high density circuitry.

It might indeed. It would also be interesting to see if such parameters could be part of a generalised calculus of different-base logic systems, so that potentially fruitful areas could be identified for further work. I'm reminded of Feynman's work for Thinking Machines, where he analysed router behaviour as a set of partial differential equations - to the bemusement of all... (he also did QCD simulation on the Connection Machine, but as the only language he knew was BASIC, it was in a parallelised form of that. Because Feynman.)
posted by Devonian at 12:06 PM on June 18


Checking in late as usual after a trip to the National Museum of Computing at Bletchley, where a volunteer graciously spent a good twenty minutes or so giving us a crash course on the principles behind the electronics in the Elliott 803 computer. This machine also used ferrite cores as a critical component of its logic elements. I snapped a photo of one of the sheets in the schematic, the one that shows the individual logic elements themselves. (Click to open a light box to zoom in).

Observe the "general circuit" at top left. The logic element is sort-of "clocked" by a pulse sent along the trigger line, but as the machine is wholly based on current pulses and not logic levels, the term "clock" isn't really accurate. Meanwhile, pulses arrive (or don't) on the AB/CD/EF coils, and depending on which direction the current flows they can interfere constructively or destructively... this induces a current in the coil on the right, which then gets amplified by the transistors to generate outputs.

The whole machine is made of lots of these things hooked up together in various ways. Of note is that there are two trigger pulses, "alpha" and "beta", and that the logic pulses from "alpha"-triggered elements only go to "beta"-triggered ones, and vice-versa. So the computer shuffles along one leg at a time.

All of this apparently winds up being a thrifty way to get one or two (expensive) germanium transistors to do all kinds of logic---each one of these elements is effectively a logic gate, and most modern logic gates wind up using several transistors of much higher quality than the ones available to Elliott back then.

I suspect the Setun machine must have had a similar setup to this, with at least some tubes/valves/transistors in their logic elements somewhere. At the end of the day you need some kind of electrically-controlled switch, and I don't think you can do that with (what's ultimately) just a transformer on its own...
posted by tss at 2:47 PM on June 18 [3 favorites]


Correction: on recollection (assisted by the discussion at the Wikipedia page), the input pulses on the AB/CD/EF lines either magnetise (or don't) the core first, and only after that does the trigger pulse come along to induce a current in the read line (or not) depending on the magnetisation (or lack thereof).
posted by tss at 2:57 PM on June 18


tss - did the volunteer demoing the Elliott have a substantial, snow-white beard?
posted by Devonian at 5:32 PM on June 18 [1 favorite]


I can think of no good reason why a balanced ternary output stage couldn't be built with two balanced supply rails and three transistors, exactly one of which is on at any given time, connecting the output solidly to +V or -V or 0V.

Is the transistor connecting the output to 0V PMOS or NMOS?
posted by lozierj at 5:37 PM on June 18


First I've heard of this.

I should give it a tri.
posted by ZenMasterThis at 5:52 PM on June 18 [3 favorites]


Is the transistor connecting the output to 0V PMOS or NMOS?

Dunno. Probably depends which way it turns out convenient for the gate to behave. At this point you leave the realm of my handwaving competence and would need to ask somebody who actually knows what they're talking about.

In fact most of the design options I've looked at since posting that comment seem to involve gate outputs driven by only two transistors, and rely on tricks involving floating gates and capacitance to make the third logic level work anything like sanely.
posted by flabdablet at 9:49 PM on June 18


There are ℵ1 kinds of people in the world: ℵ0 who can count, and ℵ1 who cannot.

*at least!!
posted by grobstein at 7:43 AM on June 19


That's actually demonstrably untrue.

The maximum possible number of kinds of people would have each kind consisting of exactly one person, and is therefore equal to the current human population. This is a finite integer and therefore < ℵ0.
posted by flabdablet at 8:58 AM on June 19


The maximum possible number of kinds of people would have each kind consisting of exactly one person, and is therefore equal to the current human population. This is a finite integer and therefore < ℵ0.

Even if every kind of person had to include at least one person, the number of kinds is far larger than the number of people.

Think of the kind of person who's exactly like me in every way (membership: 1), the kind of person who's exactly like me or my brother in every way (membership: 2), etc. So the number of kinds of people who are currently instantiated would seem to be of the order of the powerset of the current human population, a much larger number -- although still a finite one. As long as kinds are defined extensionally, and each kind must include at least one living person, then the number of kinds is finite. If kinds are defined extensionally, then for there to be ℵ1 kinds there need to be ℵ0 people to include in those kinds.

However, there are kinds of people who don't exist right now. Think of the kind of person who is a current ruler of the Roman Empire. No one today is that kind of person, but it's still a kind of person! (There used to be some kinds of person, now there's different kinds of person.) Most importantly, consider the kind of person who thinks unicorns have exactly 441,000,000 hairs. There might not be any of this kind of person alive today or ever, but it's still a kind of person.

So the proposed finite bound on the number of kinds of people won't work.
posted by grobstein at 9:30 AM on June 19 [3 favorites]


I think it's reasonable to take the view that kinds of nonexistent people are not kinds of people for the purposes of the statement "There are N kinds of people in the world: <distinctions>".

If you're going to say "There are, were, will be or could be N kinds of people in any possible world" then sure, but nobody says that.

I agree with the revised figure of two to the power of seven-and-change billion for the number of possible groupings of existent people.
posted by flabdablet at 10:25 AM on June 19 [1 favorite]


Well, however many kinds of people there are, there are more sets of kinds of people than that.
posted by Wolfdog at 11:12 AM on June 19 [3 favorites]


I think it's reasonable to take the view that kinds of nonexistent people are not kinds of people for the purposes of the statement "There are N kinds of people in the world: ".

That's fair. I should probably drop "in the world," which I think does more for the rhythm of the joke than the meaning anyway.
posted by grobstein at 12:50 PM on June 19


Come now, and let us over-egg the pudding together :-)
posted by flabdablet at 7:35 PM on June 19 [1 favorite]

