- 9 Posts
- 79 Comments
diz@awful.systems to TechTakes@awful.systems • Tokenmaxxing: “How much did you spend in tokens?” — CEO of tokens (English)
3 · 12 days ago
How much does he think an engineer spends on CAD tools anyway? Altium is like, what, $2500/year? Very “how much can a banana cost”.
Tools are pretty much all capital costs anyway; maybe CAD should start charging per net lmao.
A guy I know (I know him because he invented a mechanism, and I built an alternative Lego implementation of the same behavior and put it on YouTube) cited my video in his doctoral thesis. He’ll be defending it on Monday, and I can watch that online. Can’t wait.
diz@awful.systems to TechTakes@awful.systems • Claude Code rate limits: Anthropic AI squeezes the customers (English)
3 · 20 days ago
Oh, they are going to charge per token for GitHub Copilot? That thing is a money waste for everyone, I’m pretty sure. I get a mix of inane mildly good suggestions, irrelevant stuff, and the occasional suggestion of super evil sabotage. Due to mild OCD about issues, I tend to fix said mildly good suggestions, but objectively that nitpickery is not worth it; everything was fine without it, since we had compiler warnings, Coverity, etc.
edit: the difference being that the old stuff was deterministic; you just ran it on the whole codebase and had it pass, unlike GH Copilot, which will just make up new shit. And as for the times it caught some bad bug you made: add more tests instead.
diz@awful.systems to Buttcoin@awful.systems • Why do many rationalists like cryptocurrency? (English)
71 · 21 days ago
And 100% of them are just trying to suck up to the rich the hardest to get some cash thrown their way for posting. Their whole community has been built 100% around that from day 1.
diz@awful.systems to TechTakes@awful.systems • GitHub Copilot puts ads into pull requests (English)
4 · 1 month ago
I wouldn’t be too surprised if they really don’t; they’re just advertising the advertising lol.
edit: Basically, what if you spent a trillion dollars so that you could beam ads onto people’s bathroom mirrors. Better yet, ads reflected off the water down in their toilets. Then, in the interest of expediency, you just take random ads and put them there for free, and your actual product (shares) sells better.
diz@awful.systems to TechTakes@awful.systems • If AI coding is so good … where are the performance numbers? (English)
3 · 4 months ago
It makes every bad programmer into a 10x bad programmer (equivalent to 10 bad programmers).
diz@awful.systems to SneerClub@awful.systems • A Post-Mortem for Geeks, Mops, and Sociopaths (English)
2 · 5 months ago
It’s kind of ridiculous on its face. Yudkowsky was never some guy making money off writing code or any other “nerdy” activity (even though people doing that can be as sociopathic as anyone else). The pre-HPMoR part of his career is just “does sociopathy for a living”. After, too, but with a bit of branching out into book writing.
diz@awful.systems to SneerClub@awful.systems • Your favorite science YouTubers are misleading you about AI — how to spot lies (English)
3 · 5 months ago
Interesting that he mentions that Kyle Hill guy; the only reason I know about him is his shilling on behalf of nuclear grifters (example of nuclear grift: Oklo, a startup with no reactor designs, publicly traded with a peak market cap of $25 billion, and an origin story in a Ponzi scheme).
diz@awful.systems to TechTakes@awful.systems • Vibe nuclear — let’s use AI shortcuts on reactor safety! (English)
1 · 6 months ago
I’m afraid they already had that exact idea when they named the startup “Oklo”.
diz@awful.systems to TechTakes@awful.systems • Vibe nuclear — let’s use AI shortcuts on reactor safety! (English)
2 · 6 months ago
I think it’s not very difficult to construct a really shitty small reactor that is horrendously expensive per watt. It can probably be built in a year if you get rid of the NRC and just half-ass it completely.
I mean, the Demon Core was a small reactor. You pretty much have to do a lot of work to ensure you won’t create a small reactor when a truckload of fresh fuel falls into a river.
What’s difficult is making a safe reactor that actually makes electricity at a somewhat reasonable price per watt.
diz@awful.systems to TechTakes@awful.systems • Vibe nuclear — let’s use AI shortcuts on reactor safety! (English)
3 · 6 months ago
Nuclear already makes 9% of the world’s electricity.
diz@awful.systems to SneerClub@awful.systems • Moldbug has a sad (2), plans to flee USA (English)
6 · 7 months ago
I don’t think getting rid of elections would work. Dictatorships do not rely on election rigging alone; that’s just interventionist propaganda (barge in, set up elections, presto, democracy).
Competent dictators don’t act anything like Trump. Once in power, they try to obtain the support of as wide a section of the population as possible. There’s no freedom of speech in a dictatorship; the dictator gives prepared speeches designed to bolster his support, unify the nation, etc., not just having fun gloating at half the nation’s expense.
If he actually tries to maintain power despite his relative unpopularity, the consequences will be utterly disastrous.
diz@awful.systems (OP) to TechTakes@awful.systems • Cory Doctorow: The real (economic) AI apocalypse is nigh (English)
1 · 7 months ago
Shorting the market requires precise timing. Being early is just as bad as being wrong.
Exactly. It is not enough to know that a company’s stock will go down. You need to know that it will never go higher than a certain point above the current value (not even momentarily) before it goes down. If you have a fuckload of other people’s money, you can just keep double-or-nothing-ing it; that’s what the hedge funds were doing to GameStop. Except this can sometimes push the stock even higher (a short squeeze), which would make you (who doesn’t actually have a fuckload of other people’s money) lose all of your money.
edit: also, the other concerning possibility is that stock prices can go up simply because the dollar goes down.
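The short-squeeze point above can be put in numbers. A minimal sketch (all prices, share counts, and margin figures are made up for illustration; `short_pnl` is a hypothetical helper, not anything from the thread):

```python
# Illustrative sketch: why a short seller can be right about the
# eventual crash and still lose everything during a squeeze first.

def short_pnl(entry_price: float, current_price: float, shares: int) -> float:
    """P&L on a short: you sold borrowed shares at entry_price
    and must eventually buy them back at current_price."""
    return (entry_price - current_price) * shares

# Short 100 shares at $20 with $1000 of your own money as margin.
entry, shares, equity = 20.0, 100, 1000.0

# The stock eventually crashes to $5... but first squeezes up to $35.
for price in [25.0, 30.0, 35.0, 5.0]:
    pnl = short_pnl(entry, price, shares)
    if equity + pnl <= 0:
        # Forced buy-in: you never get to see the crash to $5.
        print(f"margin wiped out at ${price:.0f}, pnl = {pnl:.0f}")
        break
```

With these numbers the position is wiped out at $35 (a $1500 loss against $1000 of equity), even though holding through to $5 would have been a $1500 profit — hence “being early is just as bad as being wrong”.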
diz@awful.systems to TechTakes@awful.systems • Microsoft Copilot AI tries faces instead of a pearly blob (English)
5 · 7 months ago
The only thing that is allowed to tell good art from slop is the AI, which needs to consume good art and not slop.
This is what peak altruism looks like: being a lazy fuck with a cult, and incidentally happening to help hype up investment into the very unfriendly AI you’re supposed to save the world from. All while being too lazy to learn anything about any actual AI technologies.
In all seriousness, all of his stuff is just extreme narcissism. Altruism is good, therefore he’s the most altruistic person in the world. Smart is good, therefore he’s the mostest smartest person. Their whole cult can be derived entirely from such self-serving axioms.
diz@awful.systems to TechTakes@awful.systems • Peter Thiel Antichrist lecture: We asked guests what the hell it is (English)
5 · 8 months ago
It’s spelled “masterdebating”.
Ironically, in a videogame someone like Musk would always be at most an NPC, and possibly not even that (just a set of old newspaper clippings, terminal entries in Fallout, etc). Yudkowsky would be just a background story explaining some fucked-up cult.
This is because they are, ultimately, uninteresting to simulate: their lives are well documented and devoid of any genuine challenge (they get things by selection bias rather than by any effort; simulating them is like simulating a lottery winner rather than the lottery). They exist to set up the scene for something interesting.
diz@awful.systems to SneerClub@awful.systems • New Scientist reviews the Yudkowsky/Soares book. They don't recommend it. (English)
3 · 8 months ago
I think the question of “general intelligence” is kind of a red herring. Evolution, for example, creates extremely complex organisms and behaviors, all without any “general intelligence” working towards some overarching goal.
The other issue with Yudkowsky is that he’s an unimaginative fool whose only source of insights on the topic is science fiction, which he doesn’t even understand. There is no fun in having Skynet start a nuclear war and then itself perish in the aftermath, as the power plants it depends on cease working.
Humanity itself doesn’t possess the kind of intelligence envisioned for “AGI”. When it comes to science and technology, we are an all-powerful hivemind. When it comes to deciding what to do with said science and technology, we are no more intelligent than an amoeba crawling along a gradient.
I don’t think the quantum hype has much to do with quantum mechanics. It is a people phenomenon.
In times past, people who understood that stuff would be comfortably living the American dream, not pursuing grifts. There would be a relatively sharp distinction between grifters and non-grifters.
With increased social stratification, that time is long gone; unless you’re part of the 0.01%, however qualified you are, you won’t feel financially secure enough not to go along with the flow set by the money guys. And they are a lot less interested in listening; they are important people, and the pay gap between them and a physicist is larger than the gap between the CEO and a part-time cleaner used to be.
The money guys, on the other hand, believe that they can just make things happen by pouring money into them, and do not believe that details are important. In a sense they are right, because a lot of them do profit off pouring money at things that can’t ultimately pan out but can be bought up by a large corporation using other people’s money (then the CEO of said large corporation goes on to run their own startup).
Then, too, the time of rapid growth for the software and electronics industry was obviously coming to a close, but nobody with money got any other ideas, so they will push it as far as they can. That drives the hype bubbles.


Oh, by far. There’s only 80 decimal places in that at most.
It got to be a quantum sweatshop: a quantum computer for AGI (a guy instead)