- cross-posted to:
- technews@radiation.party
My worry in 2021 was simply that the TESCREAL bundle of ideologies itself contains all the ingredients needed to “justify,” in the eyes of true believers, extreme measures to “protect” and “preserve” what Bostrom’s colleague, Toby Ord, describes as our “vast and glorious” future among the heavens.
Golly gee, those sure are all the ingredients for white supremacy these folk are playing around with what, good job there are no signs of racism… right, right!!!
In other news, I find it wild that big Yud has gone on an arc from “I will build an AI to save everyone” to “let’s do a domestic terrorism against AI researchers.” He should be careful, someone might think this is displaced rage at his own failure to make any kind of intellectual progress while academic AI researchers have passed him by.
(Idk if anyone remembers how salty he was when AlphaGo showed up and crapped all over his “symbolic AI is the only way” mantra, but it’s pretty funny to me that the very group of people he used to say were incompetent are a “threat” to him now they’re successful. Schoolyard bully stuff and wotnot.)
> In other news, I find it wild that big Yud has gone on an arc from “I will build an AI to save everyone” to “let’s do a domestic terrorism against AI researchers.” He should be careful, someone might think this is displaced rage at his own failure to make any kind of intellectual progress while academic AI researchers have passed him by.
disclaimer/framing: the 'ole yudster only came to my attention fairly recently, so the following is observation/speculation (and I’ll need some more evidence/visibility to see if the guess pans out)
a few years ago I happened to deal with someone who is a hell of a grifter - in intensity, scope, impact. it was primarily through that experience that I gained a handle on a number of things that’ve served me well in spotting it elsewhere. some things I’ve been observing under that light:
1. he’s clearly talking out of his ass almost all the time
2. shell game applies
3. I think 'ole yuddy is aware that he’s not as clever as he claims he is, and is very salty about that[0]
(1) and (2) mean he has to continuously keep ahead of the marks ^W rats. the guy is fairly clearly some kind of widely read/informed, and can manage to deal with some kind of complexity[1] in concepts. but because of (3) - he can never be as right as he wants to be, so he has to keep pivoting the grift to a new base before he gets egg on his face. his method for doing this is “abandon all hope” but practically it’s an attempt to retcon history, and likely if anyone tried to really engage him on it he’d get ragey and accuse them of working from “outdated information” or some other shit (because lol who needs to acknowledge their own past actions amirite)[2]
[0] - this is a guess from my side, but all his “imagine a world in which einstein wasn’t exceptional, because there’s many of them” shit comes through to me in this way. anyone else?
[1] - not very well, of course, this is why the multi-million word vomits exist, but “some”.
[2] - this is something I’ve seen with narcissists a lot - they can never be wrong, and “making them” be wrong (i.e. simply providing proof of past actions/statements) gets them going nuclear
My perspective is a little different (from having met him), I think he genuinely believed a lot of what he said at one point at least … but you’re pretty much spot on in all the ways that matter, he’s a really bad person, of the “should probably be in jail for crimes” kind.
The line between “actually believes $x” and “appears to actually believe $x” can be made heeeeeella fuzzy (and people in that space take advantage of that)
Curious about the latter half of your remarks. Is that opinion, or something grounded in other knowledge that isn’t widely known yet?
Good point with the line! Some of the best liars are good at pretending to themselves they believe something.
I don’t think it’s widely known, but it is known (old sneerclub posts about it somewhere) that he used to feed the people he was dating LSD and try to convince them they “depended” on him.
First time I met him, in a professional setting, he had his (at the time) wife kneeling at his feet wearing a collar.
Do I have hard proof he’s a criminal? Probably not, at least not without digging. Do I think he is? Almost certainly.
> First time I met him, in a professional setting, he had his (at the time) wife kneeling at his feet wearing a collar.
hold on, you can’t just write this paragraph and then continue on as if it’s not a whole damn thing
ah yes the first time I met yud he non-consensually involved me in his bondage play with his wife (which he somehow incorporated into a business meeting)
😅 honestly I don’t know what else to say, the memory haunts me to this day. I think it was the point when I went from “huh, the rats make weirdly dumb mistakes considering they’ve made posts exactly about these kinds of errors” to “wait, there’s something really sinister going on here”
Can you say where and when this happened without doxxing yourself? Was anyone else around while he and his wife were doing this?
@self Oh, that’s like the time I met Young Moldbug at his student house and his first words were, “let me show you the lizard room!”
He was so proud of his room full of giant lizards (and the odd snake).
So proud.
that’s literally the most endearing and human thing I’ve ever heard about Yarvin
Personally I imagine him as a cult leader of a flying saucer cult where suddenly an alien vehicle is actually arriving. He’s running around panicking, tearing his hair out, because this wasn’t actually what he planned; he just wanted money and bitches as a cult leader. And because it’s one thing to say the aliens will beam every cult member up and take them to paradise, but if you see a multi-kilometer alien vehicle getting closer to earth, whatever its intentions are, no one is going to be taken to paradise…
> academic AI researchers have passed him by.
Just to be pedantic, it wasn’t academic AI researchers. The current era of AI began here: https://www.npr.org/2012/06/26/155792609/a-massive-google-network-learns-to-identify
Since 2012, academic AI researchers have mostly lacked the compute hardware needed to contribute to frontier AI research, except for some who worked at corporate giants (mostly DeepMind) and then went back into academia.
They are getting more hardware now, but the hardware required to be relevant and to develop a capability that commercial models don’t already have keeps increasing. Table stakes are now something like 10,000 H100s, or about 250-500 million in hardware.
https://www.semianalysis.com/p/google-gemini-eats-the-world-gemini
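The “10,000 H100s ≈ $250-500 million” figure above is easy to sanity-check; a minimal sketch, assuming an illustrative per-GPU price of $25k-$50k (actual H100 pricing varies and isn’t quoted in the thread):

```python
# Rough sanity check of the "table stakes" hardware figure.
# The $25k-$50k per-GPU range is an assumption for illustration.
n_gpus = 10_000
price_low, price_high = 25_000, 50_000

low_total = n_gpus * price_low    # 250_000_000
high_total = n_gpus * price_high  # 500_000_000

print(f"${low_total / 1e6:.0f}M - ${high_total / 1e6:.0f}M")
# prints "$250M - $500M"
```

Which lines up with the stated $250-500 million range.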
I am not sure MIRI tried any meaningful computational experiments. They came up with unrunnable algorithms that theoretically might work but would need nearly infinite compute.
As you were being pedantic, allow me to be pedantic in return.
Admittedly, you might know something I don’t, but I would describe Andrew Ng as an academic. These kinds of industry partnerships, like the one in that article you referred to, are really, really common in academia. In fact, it’s how a lot of our research gets done. We can’t do research if we don’t have funding, and so a big part of being an academic is persuading companies to work with you.
Sometimes companies really, really want to work with you, and sometimes you’ve got to provide them with a decent value proposition. This isn’t just AI research either, but very common in statistics, as well as biological sciences, physics, chemistry, well, you get the idea. Not quite the same situation in humanities, but eh, I’m in STEM.
Now, in terms of universities having the hardware, certainly these days there is no way a university will have even close to the same compute power that a large company like Google has access to. Though even back in 2012 (and well before), universities had supercomputers. It was pretty common to have a resident supercomputer that you’d use. For me, and my background’s originally in physics, back then we had a supercomputer in our department, the only one at the university, and people from other departments would occasionally ask to run stuff on it. A simpler time.
It’s less that universities don’t have access to that compute power. It’s more that they just don’t run server farms. So we pay for it from Google or Amazon and so on, like everyone in the corporate world—except of course the companies that run those servers (they still have to pay costs and lost revenue). Sometimes that’s subsidized by working with a big tech company, but it isn’t always.
I’m not even going to get into the history of AI/ML algorithms and the role of academic contributions there, and I don’t claim that the industry played no role; but the narrative that all these advancements are corporate just ain’t true, compute power or no. We just don’t shout so loud or build as many “products.”
Yeah, you’re absolutely right that MIRI didn’t try any meaningful computational experiments that I’ve seen. As far as I can tell, their research record is… well, staring at ceilings and thinking up vacuous problems. I actually once (when I flirted with the cult) went to a seminar that the big Yud himself delivered, and he spent the whole time talking about qualia, and then when someone asked him if he could describe a research project he was actively working on, he refused to, on the basis that it was “too important to share.”
“Too important to share”! I’ve honestly never met an academic who doesn’t want to talk about their work. Big Yud is a big let down.
A joke I heard in the last century: Give a professor a nickel and they’ll talk for an hour. Give 'em a quarter and you’ll be in real trouble.
It’s true, I’m terrible for it myself 😅
The description of how utopians see critics (“profoundly immoral people who block the path to utopia, threatening to impede the march toward paradise, arguably the greatest moral crime one could commit”) is extremely similar to the way scientologists see their critics and ex-members. I suppose at least TESCREALists have a slightly higher measure of independence than scientologists and are thus less likely to be convinced to poison a critic’s dog or send them threatening letters.
Not just Scientology; Peoples Temple, Aum, or Shining Path all apply too.
Edit: they also have more money and political influence at this point than the above. This isn’t good imo.
deleted by creator
…no?!? ETA: you could get the “learning small skills” benefit this LessWrongie claims to have gotten by, I dunno, learning how to play chess. Without having to enter a Scientology org. And they wanted to do more courses with the Scientology public speaking course???
It’s not a cult folks, it’s definitely not a cult…
I know journos are looking again, finally, but like Peoples Temple was taken down that way.
(also I really recommend reading Raven or the PBS documentary on Jonestown).
In secondary school we were doing a unit on religion and one day when the teacher was out we watched a documentary about Jonestown, pretty horrible. Ended with the Jonestown tapes. It became clear at the end that most of the other girls in my class had never heard of Jonestown and were really shocked by how it all ended. I don’t know if I can go back and look at it all again. Interesting though if only for their relationship to SF politics.
Yeah, that part was very interesting.
oh gawd your turn to repost that LW classic
deleted by creator
@self perhaps we need a “how to post from mastodon” guide
it’s possible but you have to do particular things I think?
that’s a great idea! in the meantime before we write that post, this archived reddit thread has some details on how to do things here from mastodon. in brief, I think starting a new toot, tagging @SneerClub@awful.systems, following that with the post title, inserting two newlines (aka start a new paragraph), then writing the post body and posting that should work. keep in mind that markdown probably won’t make the trip over!
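Following those steps, a toot might look like this (the title and body here are placeholders, not a real post):

```
@SneerClub@awful.systems My example post title

Body of the post goes here, as plain text. Since markdown
probably won't survive the trip, avoid relying on it.
```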
deleted by creator
Yeah it is the classic cult characteristic. Synanon members putting a snake in someone’s letterbox is another example. Also Hare Krishnas, MOVE, etc etc.
The one issue I have is: what if some of their beliefs turn out to be real? How would it change things if Scientologists got a two-way communication device (say they found it buried in Hubbard’s backyard or whatever), it appears to be non-human technology, and they are able to talk to an entity who claims it is Xenu? That doesn’t mean their cult religion is right, but say the entity is obviously nonhuman, it rattles off methods to build devices current science has no way to build, other people build the devices and they work, and YOU can pay $480 a year and get FTL walkie-talkies or some shit sent to your door. How does that change your beliefs?
while it’s true that if my dick had wings, it would be a magical flying unicorn pony, so far this hasn’t been shown to be the case at all, so i’m not putting effort into the hypothetical
deleted by creator
we can’t possibly know given that this is past the Singularity, but I’m sure Eliezer has 20,000 words on the subject just ready to roll
What? I was describing how cults/high-control groups react to criticism. I wasn’t trying to assess how accurate their beliefs are. Cults rely on having some beliefs which reasonable people might agree with. Those are the beliefs they present to the public. Cult literature often sounds plausible or benign even if it’s not factually accurate.
Before there was greater awareness of what cults are and how they work, it wasn’t uncommon for early press about cult groups to conclude that while some of the cult’s beliefs were strange, they had good values and were doing good things for their communities, so they were probably harmless. It was only later that stories began to emerge about the extreme levels of control that cults were exercising over their members, how that control led to the exploitation and abuse of members, and how limited and transactional their “good works” were.
If a group with that model of control and exploitation claimed to have access to a source of genuinely new and scientifically significant knowledge, they are the worst people to be in control of it, because:

a) Cults keep back the larger part of their beliefs from the public in order to extract as much in money, volunteer time and other resources from their members. If a cult did have a direct line to Xenu, it would be directly in their interests to strictly limit how much other people can know about Xenu without paying exorbitant fees and submitting to cult authority.

b) Cults are run by people whose ethics are compromised. Cult leaders believe above all else in their right to power and/or wealth, and everything else, including the health and safety of others, comes second. They bully and indoctrinate their subordinates until cult members believe that there is no good and bad so much as there are things that are good for the cult and things that are bad for the cult. If people with such compromised ethics gain access to Xenu’s special information (why are we assuming Xenu will be wise and helpful anyway? In Scientology mythos, Xenu is evil. And also dead.) they will use it to improve the position of the cult and impose their beliefs on as many people as possible.

c) Due to the above, it will be extremely difficult for non-members to assess the accuracy of information provided by the cult.
This is a good article. I just recently heard this TESCREAL term from the Crazy Town podcast, and it’s scary stuff. I don’t know what some of these words are on your community sidebar and I’m not at all sure what this community is about. I’m guessing it’s a hate group, otherwise you would use words everyone understands. And now I’m wondering what’s even the use of this comment. And I’m probably about to get attacked by members of an online hate group
Having lurked for a long time, sneerclub is aimed at people who already have a good idea of the horror of TESCREAL groups—the point isn’t to attract new members, but catharsis for those of us that have had to deal with the TechBros/Fascists etc.
and for sneering, the sneering is important.
Getting real for a moment: for me, I used to be in deep with these people, and then my friends in the community committed suicide due to the rampant sexual abuse, and I got the hell out. Sneer club was the only place the reports of assault were taken seriously, while the TESCREALs all closed ranks.
It’s all a way back for me now, but I love this place. That there is a tiny part of the Internet out there that calls these people on their shit and sneers gives me so much peace.
(For sneerclubbers reading this; thanks folks, you’re the best! ✨️)
yeah, sneerclub is a sub for those who know too much about these bozos (I’ve been following them since 2010 good lord) to post negativity. The linked article is the easy introduction to their foolishness.
TESCREAL is becoming the accepted academic acronym - Transhumanism, Singularity, Cosmism, Rationalism, Effective Altruism, Longtermism - even though TREACLES was right there
> even though TREACLES was right there
I still wish there were a
U
in there somewhere, because ARSECULT
Too bad E and A should be next to each other too
Hmm branding opportunity, “unwanted altruism”
SEARCULT it is then
Maybe this place needs a sister community SNEARCULT.
“SEARCULT is totally innocent, okay? We’re just really into good steak and Enlightened Conversations…”
i just wanna grill for god’s sake
That’s all that keeps TESLAREC from being a contender.
T-CEREALS
Part of a balanced breakfast!
Brilliant. Can we just shoehorn in Utilitarianism too then?
> Utilitarianism
We’re a point-and-laugh-at-TESCREAL-people group.
…what?
not an unreasonable fear when the linked article details the cultist death threats Torres has been getting
ya idk maybe i was a bit hasty in tossing around ‘hate group’ but I would still like to just Homer Simpson right back into the bush.
get a load of these DWEEBS tryna eugenics their way into utopia! (am i doing this right?)
I realise that a reddit-clone forum with a load of jargon probably does look a bit, um…hate group-y, but I wouldn’t describe sneerclub as a hate group. More like Scientology watchers. The reason sneer club is no longer on Reddit is because it chose to leave (during the recent changes), not because it was banned (like, say, r/incel).
they are cuddly little smol bean nerds, very endearing!!! except you know the pervasive race science
> ya idk maybe i was a bit hasty in tossing around ‘hate group’ but I would still like to just Homer Simpson right back into the bush.
speaking for myself (and likely others would agree): I understand this impulse, but choose to throw shade because these utterly moronic ideas deserve public shaming and counterpush. the fact that it’s so easy to take their shit apart while they ostensibly have all those highly-paid bigbrains is, well, rather indicative of just how thoroughly put together their fuckwittery is
> get a load of these DWEEBS tryna eugenics their way into utopia! (am i doing this right?)
yep
These people are too disconnected from reality, too prideful of their “intelligence” and “success” from their tech startups, and have read too much Isaac Asimov. smh.
Ah I finally understand what this sub is sneering at- I just call them tech bros. Anyway, I have a great contribution to this.
it ain’t just tech bros (see !techtakes for those), it’s a specific strain of them associated with LessWrong
Interesting read. I’ve read a few articles about the negatives of longtermism before this, but this was the first one that actually made sense