Our faulty brains
January 22, 2023 9:06 AM   Subscribe

A Guide to Your Most Common Thinking Errors. 21 biases explained in an animated video. Part 2.
posted by storybored (52 comments total) 31 users marked this as a favorite
 
Kahneman and Tversky's work is one of the ur-texts of the rationalist movement, popularized by LessWrong and Eliezer Yudkowsky. It fed into the Machine Intelligence Research Institute, which fed into the Future of Humanity Institute and Effective Altruism.

It's been one of the leading movements against expertise, and the basis upon which many wealthy and educated liberals abandoned the social sciences and humanities as analytic tools for understanding themselves.

I can't emphasize enough how many people, often people I liked, were profoundly harmed by this ideology.

Some notable luminaries who were mixed up in this stuff 10 years ago include Sam Bankman-Fried, and Elon Musk.

Content like this tends to be the leading wedge of getting recruited to a much larger and more harmful ideology.
posted by constraint at 9:14 AM on January 22 [36 favorites]


...I disagree. Bounded rationality is a hugely important concept.
posted by Gadarene at 9:46 AM on January 22 [6 favorites]


There's nothing wrong with rationality. There is a problem with interpreting rationality as an end in itself. Rationality is a tool, and like any tool can be deployed for good or ill purposes. When a party declares their belief in rationality, it's like declaring their belief in a shovel. That's a nice shovel, but what are you using it for?
posted by phooky at 10:12 AM on January 22 [35 favorites]


Are Kahneman and Tversky rationalist bros themselves? Is it wrong to have a conversation about their cognitive science work that doesn't center rationalist broism?

The treatments of their work I've read are not against expertise, they're against relying on untrained intuition about probabilities.
posted by away for regrooving at 10:29 AM on January 22 [12 favorites]


I don't recognize the use of the word "rationalist" in this context, can someone explain? I am a cognitive scientist, and I am a rationalist in the sense of Galileo; rationalism here is an epistemological principle that is often opposed to strict empiricism in philosophy of science. What is a rationalist bro and how did this word come to mean what it does for all of you?
posted by os tuberoes at 10:40 AM on January 22 [15 favorites]


"Rationalist bro" means a follower of the ideas of Eliezer Yudkowsky, as promoted on his site LessWrong (here's an article from a wiki specializing in crackpot debunking).
posted by ver at 10:55 AM on January 22 [7 favorites]


Started the first one; less than five minutes in, I was being demeaned rather than informed. Came here and saw it's got a lineage going back to "LessWrong" and... yep.

Don't bother with this. If you want to understand logical fallacies, there are better ways to learn about them that won't make you feel stupid _or_ superior for doing so.
posted by diracshard at 11:11 AM on January 22 [12 favorites]


As someone who (eponysterically) got into Thinking Fast and Slow and Less Wrong a number of years back, I have mixed feelings on this sort of thing. Interpreting data to establish and revise beliefs, and accounting for confounding factors and bias, is a valuable skill. However, the self-styling of one's own group as "rational" is pretty conceited and can easily lead to unhelpful beliefs about the world and others. As I got more into philosophy broadly and learned about the debates within that field regarding rationality, I found it far more useful for understanding myself and others, and for accessing empathy and self-compassion. I still think lots of Kahneman's work is useful, though I find the parts about the tension between enjoying our lives in the moment and being proud of the life we've lived retrospectively far more valuable than the cognitive biases.

To me, the worst part of the so-called Rationalist movement is the notion that with perfect knowledge and logic, there would be only one correct answer to anything. That's a distillation of the "reasons externalism" school and it completely ignores the existence of the internalist perspective, which acknowledges that people are different and want different things. (Not to mention the fact that emotional and qualitative experience are just as important and valid for understanding humans as the logical and quantitative are for understanding the universe.) Learning about the difference doesn't take away from my ability to recognize the effects of bias, but it takes away my justification for dismissing other people's surprising behavior as "irrational": a self-serving conclusion that defines people who disagree with me as somehow defective. Taking a primarily internalist view motivates me to understand how someone's values and beliefs may differ from mine and focus on curiosity about how each was formed and may change in accordance with new information.

For anyone who's interested in an exploration into what it means to be rational from a philosophical perspective, I found PhilosophyTube's "Are You Rational?" series to be an excellent introduction.
posted by Cogito at 11:21 AM on January 22 [35 favorites]


Well, thanks for that, ver. I see from poking around there that they do not adhere to rationalism, but rather to rationality, which they just seem to assume applies to whatever they believe. Thanks for the link Cogito.
posted by os tuberoes at 11:24 AM on January 22 [1 favorite]


as promoted on his site LessWrong (here's an article from a wiki specializing in crackpot debunking).

yeah, if the rationality on offer is one that leads to transhumanism, artificial intelligence, the Singularity, and cryonics, fuck it, I'm going with sky fairies ...
posted by philip-random at 11:38 AM on January 22 [5 favorites]


(he said realizing full well he'd likely just fumbled into an obvious bias)
posted by philip-random at 11:39 AM on January 22 [1 favorite]


The video is based on Kahneman and Tversky’s work on cognitive biases. Arguing that Kahneman and Tversky’s work influenced (for example) Lex Luthor and the Joker, and therefore that we should renounce and fear Kahneman and Tversky, is really silly, at least until we have a great deal more specification of the allegedly dangerous ideas that K&T passed on. This kind of argument is (ironically) a classic example of the genetic fallacy! One might as well argue that K&T enjoyed eating tomatoes, and that Sam Bankman-Fried later enjoyed eating tomatoes because he learned about their culinary use from K&T: I promise, the enjoyment of tomatoes is not evidence of the malign influence of K&T on anyone else.

I watched the video & didn’t find it demeaning at all. YMMV. I think it’s extremely healthy to keep the possibility of limited human knowledge/cognitive bias in mind. I also think the kneejerk application (and, indeed, misapplication) of the genetic fallacy in the comments is troubling.
posted by PaulVario at 11:47 AM on January 22 [10 favorites]


Many cognitive biases are biases only insofar as they disagree with predictions of particular frameworks for understanding behavior. In Rational Choice Theory, the framework in which K&T worked, preferences should be consistent: a person who prefers $10 today vs. $100 in a year should also prefer $10 in 10 years vs. $100 in 11 years. People generally are not consistent in this way. Labeling these inconsistencies as 'biases' implicitly suggests that the framework with which they are inconsistent is the ideal toward which all humans should strive. That attitude is pretty much front and center in that video - or at least the 2 minutes I could manage to watch.

What is frequently left unsaid is why we should uncritically adopt Rational Choice Theory, or Bayesian inference and updating, or what have you, as ideal frameworks. The lesson people should be taking from K&T is that rational choice is a shitty theory for describing how people behave, and maybe we should work on finding a better framework instead of negging people about their biases.
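To make the consistency point concrete, here's a minimal sketch comparing exponential discounting (the "consistent" ideal) with hyperbolic discounting (closer to what people actually do). The discount parameters are illustrative only, not fitted to any data:

```python
# Exponential vs. hyperbolic discounting of a future payoff.
# Under exponential discounting, a preference between two dated payoffs
# never flips when both are pushed equally far into the future;
# under hyperbolic discounting, it can.

def exp_value(amount, t, delta=0.09):
    """Exponentially discounted value: amount * delta**t."""
    return amount * delta**t

def hyp_value(amount, t, k=10.0):
    """Hyperbolically discounted value: amount / (1 + k*t)."""
    return amount / (1 + k * t)

for value in (exp_value, hyp_value):
    soon = value(10, 0) > value(100, 1)     # $10 now vs. $100 in a year
    later = value(10, 10) > value(100, 11)  # same choice, shifted 10 years out
    print(value.__name__, "- prefers $10 in the near choice:", soon,
          "| in the far choice:", later)
```

With these (made-up) parameters, the exponential discounter prefers the $10 in both choices, while the hyperbolic discounter takes the $10 now but the $100 when both payoffs are a decade away - the preference reversal that RCT labels inconsistent.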
posted by logicpunk at 12:34 PM on January 22 [24 favorites]


The lesson people should be taking from K&T is that rational choice is a shitty theory for describing how people behave, and maybe we should work on finding a better framework instead of negging people about their biases.

Abso-freaking-lutely.
posted by Gadarene at 12:41 PM on January 22 [8 favorites]


Hmmm, I may be confusing things here, but K&T *are* saying that rational choice is a shitty theory for describing how people behave.
posted by storybored at 12:48 PM on January 22 [6 favorites]


@logicpunk:

In Rational Choice Theory, the framework in which K&T worked, preferences should be consistent: a person who prefers $10 today vs. $100 in a year should also prefer $10 in 10 years vs. $100 in 11 years. People generally are not consistent in this way. Labeling these inconsistencies as 'biases' implicitly suggests that the framework with which they are inconsistent is the ideal toward which all humans should strive. That attitude is pretty much front and center in that video - or at least the 2 minutes I could manage to watch.

The above is a significant misunderstanding, or less politely an unintentional parody, of Tversky and Kahneman’s work; I’d be very surprised if you could cite or find anything that supports this very odd paraphrase. They certainly did not argue anything like your dollar example. Their work in behavioral economics is an attack on neoclassical economics; neoclassical economics in its most extreme form understands/understood it to be axiomatic that preferences are stable, well-defined, well-informed and invariant over time. A central area of T&K’s focus is/was the cognitive mistakes people make, but I don’t think their criticism about cognitive errors extends to the idea that (e.g.) people ideally should have meta-preferences about the time value of money that are invariant over a decade or so; that’s a serious misreading.
posted by PaulVario at 12:57 PM on January 22 [12 favorites]


Work on bounded rationality or behavioral economics is really important in countering the model of homo economicus that once prevailed, and still prevails, in our understanding of behavior in markets. People do not, never have, and never will act as rational utility maximizers, always choosing the action with the “best price” when faced with multiple options. Instead, all kinds of environmental factors and biases collude to push individuals toward irrational behaviors. All that is well and good.

But knowing this is the case doesn’t make you one of the lucky few who can now finally access pure unadulterated rationality, raw dogging the mind cube. That’s the thing that is insidious about this world of LessWrong: the idea that now that you’ve learned about some cognitive biases, you are one of the elite mental warriors, finally able to wield logic and vanquish the plebs who don’t even know a fallacy from a phallus. There’s no magic bullet.
posted by dis_integration at 1:00 PM on January 22 [21 favorites]


Content like this tends to be the leading wedge of getting recruited to a much larger and more harmful ideology.

Classic guilt-by-association fallacy right here.
posted by officer_fred at 1:05 PM on January 22 [7 favorites]


It's hard to develop confidence without becoming arrogant. Knowing about cognitive biases increases our self-knowledge and yes, also our confidence, but when it goes too far, we end up falling off the precipice.

I am curious about how some are rejecting learning about cognitive biases out of hand though.

People lost billions of dollars in the recent crypto bubble. If they had known about confirmation bias, recency bias, anchoring, and social proof, the damage would have been far less.

On a more mundane note, isn't it just neat how the Economist's subscription pricing worked? That one is very cool, and I think I'd want to know it to keep from being gulled in the future.
posted by storybored at 1:12 PM on January 22 [2 favorites]


The dollar example is an instance of hyperbolic discounting, which violates preference consistency; it's not something Kahneman and Tversky specifically argued. Interestingly, though, prospect theory can be modeled as hyperbolic discounting over probability (in the form of odds), leading to the classic over-estimation of rare events and under-estimation of merely infrequent events.


A central area of T&K’s focus is/was the cognitive mistakes people make...


This statement is exactly what I was pointing out about K&T operating in the RCT framework: these are cognitive 'mistakes' only if you accept the premises of RCT.
posted by logicpunk at 1:13 PM on January 22


I didn't like the title of the YouTube video. What do these people know about me or my thinking? How do they know that my thinking has errors, and how do they know what 21 errors, if I did have any at all, are the most common? Why do they assume that their thinking is correct and that mine has errors? At the very least I know not to insult my audience in the title of my video.

Anyway, not clicking on the video. As mentioned above, it's negging me from the very first sentence.
posted by Balna Watya at 1:17 PM on January 22 [5 favorites]


This has nothing to do with the content of the videos, but the narrator's accent is a little odd and distracting - primarily standard British, but I hear a slight southeastern-US twang on occasional words.
posted by Greg_Ace at 1:44 PM on January 22


guilt-by-association fallacy

This seems like a good example of how one person's fallacy is another person's valuable shortcut.

If for some reason I wanted to determine from first principles whether these people (whoever they are) have good points about cognition (i.e. if I was starting from the position that their opinions, right or wrong, were of great importance and worthy of careful attention), then obviously their political associations would be of very limited relevance. I would want to take the time to understand their arguments in detail on their own terms, weigh them against various countervailing considerations, etc.

But if I'm just trying to figure out if it's worth my time to devote a substantial amount of undivided attention to watching a video in the first place, then understanding the political associations of the creators and their broader community is pretty useful.

Hmmm. Is there a term for the fallacy of calling something a fallacy due to misconstruing the purpose for which it is being used?
posted by Not A Thing at 1:45 PM on January 22 [12 favorites]


This is some kind of Australian accent, not sure where. The creator seems to have made a niche for himself by reading self-help/personal improvement books and then making a video out of them, and also selling the illustrations on gumroad. They all seem to be very normative self-improvement books, like '7 habits of effective people' - basically books you would expect your manager to give you, very masculine stuff. Do any of these books talk about processing emotions, or how your perception is coloured by your emotional state? Or am I supposed to just push emotions inside myself and be a being of extremely rational logic?

I suppose it's harder to make a cool diagram for emotions, though.
posted by The River Ivel at 2:07 PM on January 22 [6 favorites]


this statement is exactly what I was pointing out about K&T operating in the RCT framework. these are cognitive 'mistakes' only if you accept the premises of RCT

...no. The whole point is that they are showing that the premises of RCT are, empirically, garbage.
posted by Gadarene at 2:37 PM on January 22 [3 favorites]


The enemy here is: extrapolation from an oversimplified communication of a complex reality.

Truly, that's the enemy. Not any one person's actions based on that misunderstanding, nor anyone who we _think_ might act in an extreme way because we think _they_ have an oversimplified understanding. Maybe they will act harmfully, or, really, maybe not.

The whole cause of most tension around ideas like this is a) someone does some thinking about a phenomenon; b) they or someone else tries to communicate that thinking, which necessarily means abstracting and simplifying it unless the receiver of communication takes the time to understand _everything_ that goes into that; c) someone else takes that simplified idea and applies it much more widely, strongly, uniformly, or basically extrapolates too much from it and either does harm or gets angry.
posted by amtho at 2:57 PM on January 22 [5 favorites]


The lesson people should be taking from K&T is that rational choice is a shitty theory for describing how people behave, and maybe we should work on finding a better framework instead of negging people about their biases.

Sure, but the tweaks that K&T are talking about applying aren't very far removed from what an educated layman would call rational choice theory. Stuff like having asymmetric utility functions that punish losses harder than they reward gains instead of a standard trope like negative quadratic utility. If you do the things they want, you're not going to end up at some holistic vision of human existence but just a slightly different kind of Vulcan.
posted by GCU Sweet and Full of Grace at 3:34 PM on January 22 [2 favorites]


The whole cause of most tension around ideas like this is...

philosophical telephone game?

We are a bunch of weird, slightly less hairy apes, aren't we?
posted by djseafood at 3:55 PM on January 22 [2 favorites]


Did you all just skim past the phrase "raw dogging the mind cube" in the comments above? I don't think I'm going to get anything better out of this thread than that gem.

I did read the entirety of Yudkowsky's Harry Potter thing back when it was new. It did me no more harm than any other fanfic.
posted by Acari at 6:15 PM on January 22 [12 favorites]


Metafilter: you are one of the elite mental warriors raw dogging the mind cube.
posted by euphorb at 6:45 PM on January 22 [9 favorites]


Metafilter: I'm offended by the content of this video I won't even watch about cognitive bias. The researchers are within 6 degrees of Elon Musk.

I mean, T&K could have skipped all the surveys, experiments, and philosophizing and just Godwin's-lawed the whole comments section.
posted by anecdotal_grand_theory at 7:11 PM on January 22 [3 favorites]


Wait, are all the "it's bad to even consider" K&T comments just some clever reverse-psychology trick to make us even more likely to watch the video?

Man, I can't even tell which side of the mind cube I'm supposed to be raw-dogging. Damn you, S4xC2 symmetry!
posted by anecdotal_grand_theory at 7:16 PM on January 22 [2 favorites]


MetaFilter: raw dogging the mind cube
posted by slogger at 7:48 PM on January 22 [5 favorites]


I consider myself rational, because my parents were both integers
posted by Merus at 8:33 PM on January 22 [12 favorites]


My most visceral first reaction was to the chapters being named Mind Trap 1, 2, 3, 4, etc. instead of giving them meaningful names.

That choice feels very motivated by a sort of click-bait logic: they want me to watch the whole video and not skip a chapter because I already know it. Which makes me constantly wonder where else they're prioritizing getting me to perform certain actions over providing me information.



Anyway, do with this what you will:

1) Cognitive Dissonance
2) Spotlight Effect
3) Anchoring Effect
4) Halo Effect
5) Gambler's Fallacy
6) Contrast Effect
7) Confirmation Bias
8) Baader-Meinhof Phenomenon
9) Zeigarnik Effect
10) Paradox of Choice
posted by RobotHero at 8:43 PM on January 22 [4 favorites]


Kahneman and Tversky are one of the ur texts of the rationalist movement....

Oh no, I really liked Thinking Fast and Slow! But I could never take this Effective Altruism stuff seriously. I must have taken away something else from the book than these guys, or just come to different conclusions as to how it relates to my life, or how any of its conclusions are actionable.

For me, the book just confirmed my (admittedly pre-existing) notion that rationality is a bit of a delusion, because we mostly make up our reasons after the fact, so I'm right to side-eye everyone who thinks they are especially rational compared to other people. But it would never have led me to conclude that all expertise is useless; firstly, expertise is about experience as much as rationality, and secondly, the authors describe quite clearly the circumstances under which it indeed leads to better outcomes - there are absolutely cases described in the book where expertise/experience/implicit knowledge leads to improved performance/judgement.
posted by sohalt at 11:38 PM on January 22 [3 favorites]


Shortly after Thinking Fast and Slow came out, I was asked to do one guest lesson on "science" for 15-year-olds at an inner city school near my university. With some formality, I gave each of the 30 youngsters a random number between 1 and 99 on a slip of paper and then asked them to estimate the population of Spain. There was, in that cohort, no evidence of the anchoring effect. Which was a surprise, but allowed me to riff on "don't accept facts on authority".
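For anyone who wants to try this at home, the analysis is just a correlation between each student's random anchor and their estimate. A sketch with simulated (not my actual classroom) data, so the numbers here are illustrative only:

```python
# Check for an anchoring effect: if anchors influence estimates, the
# anchor/estimate correlation should be clearly positive; if estimates
# are independent of the anchors, it should sit near zero.
import random

random.seed(0)
anchors = [random.randint(1, 99) for _ in range(30)]  # one slip per student

# Simulated estimates drawn independently of the anchors (no anchoring),
# in millions, roughly centered on Spain's actual population.
estimates = [random.gauss(45, 20) for _ in anchors]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(anchors, estimates)
print(f"anchor/estimate correlation: r = {r:.2f}")  # near 0 => no anchoring
```

With only 30 students, a real-but-modest anchoring effect could easily hide inside the sampling noise, which is one mundane explanation for a null result like mine.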
posted by BobTheScientist at 12:01 AM on January 23 [5 favorites]


Hmmm.... I'm tempted to try your population estimation experiment with a slightly different setup. Like, maybe having them read a fanciful story about an ant colony, with three different versions with each version giving a different number of ants in the colony. Then you could compare the three different groups of readers to see if there were differences between the groups.

I'm not saying it would show an anchoring effect -- honestly, I kind of think maybe not? -- but it might be interesting to try.
posted by amtho at 12:33 AM on January 23


> To me, the worst part of the so-called Rationalist movement is the notion that with perfect knowledge and logic, there would be only one correct answer to anything.

fwiw...
Doing EA Better[1] - "While the current crises have made some of our movement's problems more visible and acute, many EAs have become increasingly worried about the direction of EA over the last few years. We are some of them."
  • The Effective Altruism movement has rapidly grown in size and power, and we have a responsibility to ensure that it lives up to its goals
  • EA is too homogenous, hierarchical, and intellectually insular, with a hard core of “orthodox” thought and powerful barriers to “deep” critiques
  • Many beliefs accepted in EA are surprisingly poorly supported, and we ignore entire disciplines with extremely relevant and valuable insights
  • Some EA beliefs and practices align suspiciously well with the interests of our donors, and some of our practices render us susceptible to conflicts of interest
  • EA decision-making is highly centralised, opaque, and unaccountable, but there are several evidence-based methods for improving the situation
posted by kliuless at 12:34 AM on January 23 [3 favorites]


...Or telling them that "Countries with populations over X are exciting places to live," then seeing how that number affects their estimates.
posted by amtho at 12:35 AM on January 23


Metafilter: there are several evidence-based methods for improving the situation.
posted by away for regrooving at 12:44 AM on January 23


I did kind of a wiki dive on this because it was a slow day at work, and found out a couple of things.

Firstly, while Kahneman and Tversky are definitely highly important to rationalists, it seems like it might be more accurate to describe Yudkowsky's restatement of their arguments as being the ur-text of the movement, as part of a long series of blog posts where Yudkowsky outlines his philosophy. This is a critical distinction, because not only is Yudkowsky just wrong about a bunch of stuff, I can absolutely see how the rationalist movement got to the state it's in with only a shared reference pool of a series of blog posts of varying validity. I'd argue you probably can find value in Kahneman and Tversky without making the same mistakes as rationalists do.

Secondly, it turns out LessWrong was the originator of "Pascal's mugging", identifying an argument you see sometimes where something's justified by arguing there's an extremely huge payoff from an extremely unlikely event. I thought that was interesting, given that longtermism is one of these and ultimately comes from LessWrong.
posted by Merus at 3:46 AM on January 23 [7 favorites]


This thread has made me realize something about my own bias, namely, that the comments I tend to favorite are made by other academics. I'll impose a veil of ignorance and go ponder that.
posted by Morpeth at 4:41 AM on January 23 [2 favorites]


I will just add that I think it's a terrible mistake to casually accept the label that the Yudkowskians have appropriated for themselves of "rationalism" or "rationalists." Rationality was around before the Yudkowskians, and it will be around long after they're gone. I suppose there are many contexts in which I think of myself as a rational person, but it certainly doesn't commit me to the many esoteric views that are being gleefully lampooned in the comments above.

This is a bit reminiscent of the doomsday cult on an episode of the TV series Parks and Recreation, which calls itself the "Reasonabilists." That is a more appealing name than "Zorpians," I suppose, but arguably more misleading.
posted by PaulVario at 5:58 AM on January 23 [2 favorites]


Those who join rationalist/effective altruism groups are the same ones who, fifty years ago, would've joined Mensa. I.e., wankers.
posted by mono blanco at 6:15 AM on January 23 [3 favorites]


In the interests of experimenting with telling a little more truth, I find a lot to like in the rationalist community. In fact, I think of it and metafilter as being of roughly the same sort-- smart people interested in the world. Collectors of strange facts. Somewhat reasonable.

When I was in college, I couldn't understand why my friends didn't get along with each other better, and then I realized that most of them were smart, talkative people with somewhat odd ideas - and that sort of similarity isn't the same as compatibility.

It's like the way I see science fiction fandom-- you aren't going to like everybody, but it can be a good place to find friends.

I bounce off of _Thinking Fast and Slow_ because it gives a lot of respect to priming, the idea, once popular in experimental psychology, that tiny cues could have noticeable effects on behavior. It didn't replicate.
posted by Nancy Lebovitz at 8:07 AM on January 23 [5 favorites]


Anchoring effect: Hi I'm loquacious and I like walks on the beach.
posted by The_Vegetables at 9:26 AM on January 23 [2 favorites]


so in conclusion --

Ultimate Guide to your most common Thinking errors

Maybe if they called it Our Most Common Thinking Errors (or just The Most Common), it would land as a little less annoying. But then what should we expect from a crowd who take one of the best tactics we humans have (rationality) and make an ism of it?

Or as phooky put it:

When a party declares their belief in rationality, it's like declaring their belief in a shovel. That's a nice shovel, but what are you using it for?

That's not wisdom.
posted by philip-random at 9:38 AM on January 23 [1 favorite]




There are a lot of people in the world whom I'd otherwise really enjoy, who participate in particular activities that harm mostly themselves but that cause some risk to others. I'm not going to specify which activities to avoid a) offending them or making them defensive, and b) distracting from the central point here, which is about not judging people.

For most of my life, I couldn't get past this activity's risk to others - to me. Honestly, I still can't, and it keeps me from friendships I'd probably find valuable -- but I just can't get past it. This has, in the past, affected my ability to see these people fully.

I've never been judgy about it, but I didn't really reflect on it either. I didn't want to think or talk about it at all, because it made me uncomfortable and brought up a strong emotional response in me. Some people _do_ judge others based on this, though.

Recently, I tried to think through why so many people whom I otherwise really liked, or would like, participated in this unpleasant, risky behavior. Part of it is a groupthink effect, and part of it, for a good number of them, is that when they were at a formative age, they enjoyed others perceiving them a certain way - they wanted to seem cool. Both of these are traits that a lot of people, people with whom I have more in common, find a bit off-putting at a distance, and really upsetting up close.

However, I've trained myself to find the flip side of traits that seem "bad". Obesity? Protection from starvation. Allergies? More active immune systems protect from disease (and, in my case, bad food).

So I reflected on what the benefits of "followership", "not being critical of new things", and "caring what other people think" are. They're pretty substantial.

That doesn't mean I'm signing up. It just means that I'm thinking about why those people, who I find hard to understand -- who I find hard to put up with, and who are, arguably, messing up the world in some important ways a lot of the time -- are actually not bad in essence, and are actually valuable in the right circumstances.

So, since I'm the one thinking about this (and now, maybe you are too), it's up to me to figure out how to change the circumstances.
posted by amtho at 9:51 PM on January 24 [1 favorite]


I think a lot of the "thinking errors" are things that help on average under the right circumstances.

Like, with the "anchoring" example: if someone asks the question "is the tallest redwood taller or shorter than X?", you might expect them to use a value for X that's close enough to the real value to present a challenge.

You have to fully grasp that this is a situation where they don't actually care at all how difficult the question is.

Or the Zeigarnik effect is that an activity that was interrupted is more readily recalled. It seems that could come in handy if you need to finish that activity later.
posted by RobotHero at 1:11 PM on January 25 [2 favorites]


When I read Yudkowsky's Harry Potter fanfic I wasn't familiar with lesswrong or Effective Altruism and I just assumed the freak-out over mortality was some quirk specific to Yudkowsky.
posted by mscibing at 4:05 PM on January 26

