The dark forest is full of life.
May 27, 2019 3:01 AM   Subscribe

This is also what the internet is becoming: a dark forest. In response to the ads, the tracking, the trolling, the hype, and other predatory behaviors, we’re retreating to our dark forests of the internet, and away from the mainstream.

Dark forests like newsletters and podcasts are growing areas of activity. As are other dark forests, like Slack channels, private Instagrams, invite-only message boards, text groups, Snapchat, WeChat, and on and on. This is where Facebook is pivoting with Groups (and trying to redefine what the word “privacy” means in the process).

From Yancey Strickler who writes the newsletter, Ideaspace.
posted by Telf (41 comments total) 39 users marked this as a favorite
 
Facebook banks on digital currency - "Facebook continues to pivot towards the model of Tencent's WeChat — more private social channels where buyers and sellers can be matched and communicate with one another, paying for goods as they go. Close to 60 per cent of WeChat revenues come from payments. Less than a third is from advertising."
posted by kliuless at 3:56 AM on May 27, 2019 [1 favorite]


Interesting article, but the dark forest metaphor is all over the place (also, if you haven't read the Three Body Problem, do so, it's awesome). Podcasts and the like aren't the dark forest; we know exactly where they are, how to access them, etc. Snapchat, WeChat, etc. are private clubs, but it's not like nobody knows that Snapchat or WeChat exist. The existence of the darknet is the closest we get to anything dark foresty.

Then it starts talking about "dark forests" (plural), which doesn't make any sense in the metaphor.

It is an interesting article, though. Maybe if I weren't such a big fan of the Three Body Problem the metaphor wouldn't bug me.
posted by Bugbread at 4:01 AM on May 27, 2019 [18 favorites]


"dark forest" sounds better to me than "rounded filter bubbles of happiness" but please don't ask me to explain the difference between the two because I can't.

And I don't know about newsletters, but I find it a lot harder to escape ads in podcasts than on the internet at large, what with forwarding through the icky parts or unsubscribing as the only means of protection.
posted by Ashenmote at 4:02 AM on May 27, 2019 [3 favorites]


The web 2.0 utopia — where we all lived in rounded filter bubbles of happiness 

The piece is reaching for something, but it's not clear why we should think of a "dark forest" as anything other than another filter bubble. Strickler stresses the lack of monetization and tracking, but it's not like newsletter ads and podcast ads don't exist, or like those media aren't being supported by advertising. Or, for that matter, that that advertising won't get honed to a finer edge.

This is an appealing buzzword to attach to a sentiment that I've seen elsewhere (Facebook is dead, long live newsletters) but the idea that this is killing the filter bubble seems off.

Rather, the critical thing with newsletters, podcasts, private Slack channels, and private subreddits is that the tone is much more explicitly set by the moderator / person sending out the newsletter. There's a very direct kind of accountability there, someone much more immediate with whom the buck should stop. These are moderated filter bubbles.
posted by Going To Maine at 4:03 AM on May 27, 2019 [18 favorites]


Most of the benefits he describes can be had in public, simply through not using the platforms of the companies who sell ads, track you and happily house trolls and predators. I mourn the loss of the early internet, I'm not sure the retreat into balkanised private spaces is the answer though (places that rarely turn out to be as private as thought when push comes to shove).
posted by deadwax at 4:06 AM on May 27, 2019 [5 favorites]


Private utopian web communities just decay into stories about shitting your pants, anyway.
posted by thelonius at 4:07 AM on May 27, 2019 [21 favorites]


(Which isn't to say Facebook isn't moderated. It's just that these spaces want to own their degree of moderation, while Facebook seeks to deny having any.)
posted by Going To Maine at 4:07 AM on May 27, 2019 [1 favorite]


Wait, so is Metafilter the dark forest? Because when I tried to RTFA, this happened and I turned around and came back here.
posted by jeremias at 4:09 AM on May 27, 2019 [16 favorites]


Private utopian web communities just decay into stories about shitting your pants, anyway

It's going to be odd to watch Instagram go from pictures of babies to pictures of bedpans over the next so many years.
posted by Going To Maine at 4:10 AM on May 27, 2019 [2 favorites]


The metaphor is clumsy at best. The idea seems to be that people are withdrawing from the wider web the way prey animals do to avoid predators, which sorta connects to the dark forest idea, but absent predators those aren't dark forests, and withdrawing to communicate with like-minded others isn't silence, so the idea doesn't hang together all that well. The piece questions the trade-offs, but if the "dark forests" are meant to provide safer reserves for people, the trade-offs become less meaningful to discuss in the fashion one would when everything is wide open. I dunno, doesn't seem like much new here other than the term.
posted by gusottertrout at 4:16 AM on May 27, 2019


I think the "predator" could be hyper-targeted advertising?
posted by Going To Maine at 4:22 AM on May 27, 2019 [2 favorites]


I agree that the metaphor isn't as tight as it could be. The article touched on a point raised in the conversation that Ezra Klein had with Jenny Odell. They talked about the difficulty of decontextualized conversations on the internet. For example, certain jokes only make sense when embedded in your Twitter circles or group of friends. Taken out of context, they could be misconstrued as horrible/racist/regressive/dumb/misinformed, etc.

Actually, come to think of it, it's a very similar point raised by Shane Parrish in his conversation with Sam Harris. He used the term "Sandbox".

The cut and paste nature of web 2.0 makes it too easy to strip conversations of their meaning and abuse their representation.

I think our best conversations aren't happening on big social media. Twitter is too performative even at the best of times.

Maybe dark forest isn't the best term. We need more alcoves? Carrels?

It all comes back to the point that we need to slow down social media and add some friction to the amplification process: an anti-avalanche mechanism that retards the spread of viral memes, especially pile-ons. Every retweet should become slightly more 'expensive' based on how many people have already shared an idea.

This refers back to my idea that we should have a social network with a finite amount of regenerating action points. You could accrue more by producing quality content, and any action would cost points. (Members would 'pay' producers with their own points.) It would add a modicum of cost to the millions of behaviors people exhibit; it's too convenient to be horrible on the internet. Obviously, the whole thing would be run like a worker-owned co-op, and the points you accrue would be a fungible cryptocurrency, thus rewarding participation and good stewardship.
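A toy sketch of how those point mechanics might work. Every class name, function name, and number here is a placeholder I made up to illustrate the idea, not a worked-out design:

```python
# Regenerating action points: each member has a balance that recovers
# over time, and sharing an item costs more the more people have
# already shared it, which damps pile-ons and viral avalanches.

class Member:
    def __init__(self, points=10.0, cap=10.0, regen=1.0):
        self.points = points  # current balance
        self.cap = cap        # maximum balance
        self.regen = regen    # points recovered per tick

    def tick(self):
        """Regenerate points, never exceeding the cap."""
        self.points = min(self.cap, self.points + self.regen)

def share_cost(prior_shares, base=1.0, growth=0.1):
    """Cost rises linearly with how many people already shared the item."""
    return base * (1 + growth * prior_shares)

def share(member, prior_shares, author=None):
    """Spend points to share an item; pay the original poster if affordable."""
    cost = share_cost(prior_shares)
    if member.points < cost:
        return False  # too 'expensive' -- amplification is damped
    member.points -= cost
    if author is not None:
        # Members 'pay' producers with their own points.
        author.points = min(author.cap, author.points + cost * 0.5)
    return True
```

Under these made-up numbers, sharing a fresh item costs 1 point, while piling onto something 200 people have already shared costs 21 points, more than a full balance, so the pile-on stalls until the member's points regenerate.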
posted by Telf at 4:29 AM on May 27, 2019 [4 favorites]


gusottertrout,

A lot of this conversation is new to me. Aside from the same anti-social media messages we've been hearing more of since 2016, what are some enlightening takes you've read on this? I'm interested in the growth of newsletters and the general retreat from big internet platforms. The joke I've made with my email friends is that I'm LARPing late 90s internet. (No social media, no real web 2.0, still use message boards etc.)

Not so interested in the current conversations about algorithmic radicalization or left/right filter bubbles in this case.
posted by Telf at 4:46 AM on May 27, 2019 [2 favorites]


I tried going into the dark forest but I was eaten by a grue.
posted by overeducated_alligator at 4:52 AM on May 27, 2019 [16 favorites]


The forest thing fails as a metaphor for me because I don't really think people are hunkering down and turning into monks. I'd think there was a problem if we each had one single other community we migrated to, but, like, Metafilter aside, I have nine Discord servers and four Slacks that I check regularly right now. I'm the most sociable I've ever been.

Twitter and Facebook aren't getting much in the way of contributions from me--nothing, in Facebook's case--but it seems downright delusional for me to think that I was having any positive impact on Facebook by being there. It was genuinely bad for my mental health to keep insisting on that. The whole problem with Facebook was that we weren't doing any good by hanging out there. The algorithm made sure of it! The main problem I see is not this migration; it's how many people aren't changing their internet usage or where they get their information at all, even in the face of years' worth of horrifying stories about abuses and misinformation.
posted by Sequence at 4:54 AM on May 27, 2019 [3 favorites]


I'm just getting to the end of Liu Cixin's trilogy (it's a lot of fun, do give it a go) and I agree that the way this article uses the term "dark forest" is confusing.

The internet is the dark forest. Newsletters, podcasts, Slack channels, private Instagrams, invite-only message boards, text groups, Snapchat, WeChat, etc. are not dark forests.

There's a comment by Yann Eves on TFA that clarifies things a bit (comments in square brackets by me):
there’s a means to safeguard an area in the dark forest by broadcasting a message that your existence does not pose a threat. It becomes commonly understood that this is achieved by lowering the speed of light in an area, to create a dark tomb, where anything that enters can never leave.

If we take this analogy to the author’s [Yancey Strickler's] writing, the internet as a whole is the dark forest (there’s no plural), where he refers to dark forests [Newsletters, podcasts, Slack channels etc.], we can actually interpret these as dark tombs.
It's a bit more complicated in the books, and in my translation "dark tombs" are called "black domains", but basically there are three possible strategies for self-defence once you've revealed your existence in the dark forest. Enough spoilers for now. Read the books!
posted by ZipRibbons at 5:04 AM on May 27, 2019


Paging Mr. Virgil.
posted by BWA at 5:07 AM on May 27, 2019 [3 favorites]


I think our best conversations aren't happening on big social media.

This is a much more straightforward claim than anything in the article, which tries to be so much bolder. "blogs were good, let's do blogs again" seems like a very defensible posture for newsletters et al, in the same way that "radio is good, let's do radio but at any time of day" is a good argument for podcasts.
posted by Going To Maine at 5:08 AM on May 27, 2019 [5 favorites]


The problem with the analogy is that there is something fundamentally untrue and potentially harmful in thinking about it in the terms given. People don't want to not be heard, they just don't want to deal with those they deeply disagree with. We've found that removing the gatekeepers and letting everyone have their say allows all sorts of evil, stupid and wrong people to find an audience. Unfortunately we don't agree on who those people are as a society, so trying to isolate the problem in technology won't work because it isn't a technological problem, it's a people problem.

Returning to an era of segregated conversation, where the "right people" are heard and the rest get shunted off to their own smaller ignored communities, isn't an answer, since that's where we came from and it wasn't great. Trying to make laws to account for hate speech and harassment has some support, but it would require those in power to agree with the definitions of who is the harasser and what is hate, so there is some real concern there as well given how power is distributed in some nations like the US. We want to be heard, or at least to have those we agree with find an audience, and we want those we profoundly disagree with to be denied the same. The internet makes finding audiences and sharing easy, but at the cost of making it difficult to prevent those who have sufficient power but horrible ideas from also finding an audience that will work against our interests. I don't know how you fix that.
posted by gusottertrout at 5:22 AM on May 27, 2019 [2 favorites]


Called it.
posted by DarkForest at 6:01 AM on May 27, 2019 [40 favorites]


Good preemptive "Eponysterical!" blocking tactic. Might as well preempt the Metafilter: "Insert previous comment here" thing.

Metafilter: Good preemptive "Eponysterical!" blocking tactic. Might as well preempt the Metafilter: "Insert previous comment here" thing.
posted by Telf at 6:09 AM on May 27, 2019 [1 favorite]


It's not about disagreement. When I want opposing opinions I can open up practically any newspaper or magazine and read a piece about how horrible I am. I run Google News, Twitter, and Tumblr searches on a regular basis to confirm that lots of people think I'm pretty horrible as a queer person.

The dominant platforms for having conversations online have minimal barriers regarding harassment. Shouting "shut up {slur}" in a comment box is the shallowest form of disagreement; sharing that sentiment with 100 "friends" to do the same is a harassment campaign. I don't need dozens of comments on my timeline to remind me that people disagree with me, including the person who jumped into a conversation about chicken sliders to tell me I'm a horrible person.

The brevity, the shallowness, the ability to cold-call strangers, and treating comments as a cognitive "push" rather than a "pull" were all design decisions built into the software by engineers who ignored the 20 years of prior art that said "don't do that."
posted by GenderNullPointerException at 6:28 AM on May 27, 2019 [12 favorites]


Facebook banks on digital currency - "Facebook continues to pivot towards the model of Tencent's WeChat — more private social channels where buyers and sellers can be matched and communicate with one another, paying for goods as they go. Close to 60 per cent of WeChat revenues come from payments. Less than a third is from advertising."

This is seen as a competitive threat to my current work in Africa. There are systemic differences that preclude scaling the way FB would want it (e.g., one solution designed for all users). WeChat is focused on one billion customers in China, while FB's one billion are all over the world. Plus, governments have a tendency to shut down FB at critical times due to its lovely approach to society and governance, and the informal trade ecosystem that's thriving on digital can't afford to put all their eggs in that basket.
posted by hugbucket at 6:29 AM on May 27, 2019 [1 favorite]


Now, the dark forest. Mine is mltshp. (and metafilter once y'all decide if it's dark and foresty or not)
posted by hugbucket at 6:30 AM on May 27, 2019 [1 favorite]


Telf: your proposed system sounds a bit like one of YayHooray's old moderation experiments. It was a spectacular failure; easily gamed, inherently clique heavy, and an enormous amount of work to make anything happen inside. It's enough work having any kind of conversation as it is, and raising the level of friction isn't likely to cut out bad actors so much as it is to cut out any voice that isn't highly motivated, regardless of what that motivation is. And I guarantee you the bad actors are more highly motivated to harass than I am to share a cat video. I think any system that eschews direct human intervention in favour of automation or market-simulating methods will simply mutate the problem. The problem is trust, and the more we mediate trust with automatic systems instead of direct human intervention, the less trust we have.

The dark forest metaphor definitely falls apart a bit when it's unnecessarily pluralized, but I've been a member of one of those little dark pockets since 1999, and we've tried every moderation tactic under the sun. The one thing, the only thing, that's worked, is building trust over time. Named individuals (even if it's just usernames) who are held to account by other named individuals in plain language, with safe, judgement-free spaces to discuss community standards. Sometimes there are hard words and hard feelings, but once we switched from various systems of points or what have you to one that prioritizes knowing and trusting the people in your community, our troll/disinformation/whatever problems mostly disappeared. And when serious problems did show up (which I won't get into for privacy reasons), we were equipped to talk about it and address it in ways that were responsible and healthy, and that allowed for people to make mistakes and change their minds and to do so without being set upon, ostracized, or in some way "punished" by our system. We just trusted each other. But I think this only works if you deal with people qua people instead of people as nodes in a network or actors within a system.

It's not something that scales, and it's certainly far from perfect, but it does a few things: 1) It allows us a space that we can think of as "home" on the Internet; 2) it (mostly) makes us better citizens of the other spaces we occupy online; 3) it creates bonds of community that go beyond the site of the community itself (we've broken up and regrouped several times; we've had weddings and funerals and businesses and helped pay each other's bills when disaster strikes).

The problem with the mainstream internet/social media is a social one disguised as a technological one, and the solution will also have to be social. Facebook, Twitter, etc. are bad, but we talk too much about them and how they work as inevitabilities; they are the result of choices made by people. My biggest worry is that within the surveillance capitalist ecosystem the social fix that needs to happen is not (primarily) at the level of the user, but at the level of the people who build these things. Facebook's fundamental understanding of its purpose is at odds with its users' understanding of that purpose--deliberately so. In order for a trust-based system like the one I described above to work, it has to be built to work that way, and the choice to do that instead of to build it for wealth extraction from surveillance has to be made by people who are held accountable. How do we help people prioritize those decisions instead of exploitive, extractive ones? That, too, will ultimately have to come from direct human intervention (by legislators and regulators, most likely).

Until we figure that out, put up some blackout curtains, I guess.
posted by Fish Sauce at 6:50 AM on May 27, 2019 [28 favorites]


Fish Sauce,
That was excellent. Thanks. Have you written about this topic elsewhere? Any histories about moderation would be appreciated.
posted by Telf at 6:58 AM on May 27, 2019


Thanks! I have not written about this before, except my two cents in discussions about various moderation systems we wanted to try within our community. If there are any histories of moderation I'd be interested in reading them as well.

I've been reading about different kinds of systems lately, stuff like Shoshana Zuboff and Adam Greenfield's books, but I've also been working at my day job on a new kind of legal education program, so I've been learning from practicing lawyers a lot about what contracts are actually supposed to be for, and a lot of stuff has come together for me. From the perspective of the lawyers I've been working with, contracts are only formal agreements at the surface level. What they are more fundamentally is a mechanism for building trust between people; a good lawyer doesn't want to write a contract that's iron-clad, inflexible, and punitive, because that's missing the point of contracts entirely, according to the folks I work with. That kind of blew my mind, and has very heavily informed my thinking as I've gone back and looked at what the communities I'm involved with online have done over the years, and what has and hasn't worked. A bunch of things I used to think were unrelated suddenly look like different takes on the same problem to me.
posted by Fish Sauce at 7:20 AM on May 27, 2019 [8 favorites]


Not a mention of the federated internet and services like PixelFed and Mastodon in this thread. That is where I am retreating to. Reminds me more of the internet of the late 90s/early 00s.

This is why I prefer Mastodon and PixelFed (and MetaFilter) to Twitter/FB and Instagram:

tone is much more explicitly set by the moderator
posted by terrapin at 7:41 AM on May 27, 2019


Telf: it wouldn't hurt to start with A Group Is Its Own Worst Enemy. It's not complete: Stack Overflow notably took its observations into account and they still ran into problems eventually, but then Stack Overflow became the de facto community for IT, so they're in a different situation. But everything it talks about there tracks with Fish Sauce's lived experience. You can't solve a social problem with a technical solution, and expecting free and full connections like Facebook and Twitter do means that the edges of social circles blur in ways that humans are simply not equipped to deal with.

Edit: Mastodon doesn't really solve the problem, I find; microblogging doesn't really allow you to build up that level of accountability for your words and opinions that you need for civilised conversation, so you end up getting incidents like what happened when Wil Wheaton joined, and was driven off, Mastodon.
posted by Merus at 7:43 AM on May 27, 2019 [2 favorites]


The problem is trust, and the more we mediate trust with automatic systems instead of direct human intervention, the less trust we have.

This, speaking as a design researcher/concept designer for digital services in markets which are heavily informal and cash based.
posted by hugbucket at 8:00 AM on May 27, 2019 [1 favorite]


What they are more fundamentally is a mechanism for building trust between people; a good lawyer doesn't want to write a contract that's iron-clad, inflexible, and punitive, because that's missing the point of contracts entirely, according to the folks I work with.

As someone who has been immersing myself in informal economies, rural or urban, for a decade now, observing the underpinning frameworks of what makes these work (not to mention resilient and cooperative), I can say that in environments with weaker institutions and rule of law, such as huge chunks of Asia and Africa, it's the p2p that drives trust and vice versa. Trusted referrals, word of mouth, long-established networks and communities: all of these create a system that works in the conditions and contexts it must.

Faceless institutions and contracts seeking to minimize perceived fraud end up raising higher barriers to compliance than necessary.
posted by hugbucket at 8:05 AM on May 27, 2019 [7 favorites]


The cool thing about the internet is that anyone can share their thoughts with millions of other people.

That’s also the awful thing about the internet.

Human conversation doesn’t scale well. We do really well when our audience is small - a few dozen at most, people we know and can learn to either trust or at least anticipate response. Think about a big holiday with family: you know darn well cousin Ann will agree with your politics, uncle Bob is best kept to discussions about sports because politics is a minefield with him, aunt Clara isn’t interested in either but will talk recipes and baby photos all day. And everyone stopped inviting cousin Del because he is just not fit for pleasant company. And so on. Same applies at work, same rules, you know who can be counted on to react well to a given topic and who will not.

When the audience becomes larger, you can’t anticipate response. You have no idea who will listen and argue against you. You don’t have any history to determine whether the argument is good-faith or an organized attack against an ideology, rather than you specifically. You can’t avoid allowing angry strangers to invade your space with no motive other than to break things and cause havoc.

Communities are good. Crowds are dangerous. We need more communities, but we’ve fallen into the trap of thinking that communities can be built by profit-driven algorithms rather than accrued personal experience.

(It is amazing however, what a few dedicated mods and a $5 entry fee can do to encourage the latter.)
posted by caution live frogs at 8:24 AM on May 27, 2019 [16 favorites]


Metafilter: It is amazing however, what a few dedicated mods and a $5 entry fee can do....

I wish the author had called them something long and awkward like Thought Enhanced Liminal Forests.
posted by otherchaz at 9:04 AM on May 27, 2019


And I don't know about newsletters, but I find it a lot harder to escape ads in podcasts than on the internet at large, what with forwarding through the icky parts or unsubscribing as the only means of protection.

I don't mind the ads in podcasts.

Support for pracowity's comment comes from Squarespace, the all-in-one platform that lets you create a beautiful website or online store. A wide range of people and businesses use Squarespace -- musicians, designers, artists, restaurants, and more, all taking advantage of Squarespace's award-winning templates and award-winning 24/7 customer support. There's nothing to install, patch, or upgrade ever. To start your free trial and receive a special offer on your first purchase, go to squarespace.com/pracowity. Squarespace, make your next move.

And from Rocket Mortgage by Quicken Loans. When it comes to the big decision of choosing a mortgage lender, work with one that aims to protect your best interests. Rocket Mortgage provides a transparent online process that helps you understand all the details of your home loan. You can even adjust the rate and length of your loan in real time to make sure you get the right mortgage solution for you. Skip the bank, skip the waiting, and go completely online at quickenloans.com/pracowity. Equal housing lender licensed in all 50 states.

You just have to practice skipping past them.

Thanks, as always, to my account's co-founder, Mr. Torey Malatia...
posted by pracowity at 9:09 AM on May 27, 2019 [12 favorites]


When the audience becomes larger, you can’t anticipate response.
A larger audience is also going to contain more disruptive people. It's a number problem, made worse by the fact that disruption scales up faster and stronger than good behaviour, and the damage doesn't scale up linearly. There was a recent Askme about a Facebook group that had started as a parents group: 350 members later, it had become so unusable that the founder was planning to close it down. Problems that used to be dealt with once or twice a year were no longer manageable because too many parents were participating in the disruption. Is the "dark tomb" model - ie a smallish gated community with strong moderation + filters for membership - a solution to that? In some cases yes, but I believe that for many communities having large numbers is an important asset, if only to ensure renewal and change.
posted by elgilito at 9:19 AM on May 27, 2019


A larger audience is also going to contain more disruptive people. It's a number problem, made worse by the fact that disruption scales up faster and stronger than good behaviour, and the damage doesn't scale up linearly.

Newsletters and podcasts do not have this problem, or can at least disguise that they are suffering from it better, because everything passes through the host and whoever they rope in to help moderate contacts. I think of Andrew Sullivan's old blog as an archetype here (say what you will about Andrew Sullivan), because it lasted for a very long time, and I believe it had maybe two interns. But then, Sullivan didn't have a Facebook page or otherwise provide a means for public commenting. Control is paramount in this model.
posted by Going To Maine at 10:55 AM on May 27, 2019


People don't want to not be heard, they just don't want to deal with those they deeply disagree with.

This sort of statement is part of the problem. We have redefined hate and bigotry culturally to a form of disagreement, and in doing so have created justification for them remaining in our dialogue. It's at the point that a forum choosing to actually call hate and bigotry what it is is actual news.

Is the "dark tomb" model - ie a smallish gated community with strong moderation + filters for membership - a solution to that? In some cases yes, but I believe that for many communities having large numbers is an important asset, if only to ensure renewal and change.

I don't think that disruption and large numbers go hand in hand. Instead, the problem is that many moderators are unwilling to grasp the harder aspects of moderation, specifically dealing with disruptive elements. There's a few reasons for this - fear of being perceived as tyrannical, the inherent difficulties of sanctioning, an unwillingness to push away active members, etc. - but the result is that too often, moderators let toxicity continue until it's too late.
posted by NoxAeternum at 10:55 AM on May 27, 2019 [2 favorites]


I mean nobody likes putting up with their racist uncle at Thanksgiving and the big social media services make you put up with everyone’s racist uncle all the time then give you condescending lectures about free speech if you imply it’s unsatisfactory in some way.
posted by Ghostride The Whip at 11:07 AM on May 27, 2019 [10 favorites]


I mean nobody likes putting up with their racist uncle at Thanksgiving and the big social media services make you put up with everyone’s racist uncle all the time then give you condescending lectures about free speech if you imply it’s unsatisfactory in some way.

Oh yes. The way that the complete erasure of boundaries and meaningful audience distinctions by massive social media has become internalized as a free speech value is deeply troubling.
posted by GenderNullPointerException at 11:19 AM on May 27, 2019 [2 favorites]


If I dropped in on an ophthalmology conference tomorrow, to what degree am I entitled to make demands that participants argue the fundamentals of their field for a complete novice such as myself?
posted by GenderNullPointerException at 11:23 AM on May 27, 2019 [5 favorites]


Returning to an era of segregated conversation, where the "right people" are heard and the rest get shunted off to their own smaller ignored communities isn't an answer since that's where we came from and it wasn't great.

I think a lot about this stuff, too much, in fact, but the above quote is exactly how it is going to look no matter what we do to get there. It is always going to head that way, presupposing there's any hope for change now anyway. I'm not looking for "great" anymore, just "better than this crap we got around us". I'll regress into 60's rhetorical style: "What Do We Want?" "Shunt!" "When Do We Want It?" "Now!"
posted by Chitownfats at 11:50 PM on May 28, 2019




This thread has been archived and is closed to new comments