I used to think you could be a 9/11 truther and still hold reasonable beliefs otherwise. I thought a lot of these people genuinely looked at the official story and related evidence with an open mind, and came to a different conclusion than I did. But the more of them I've met, the more I've found they tend to be everything-else truthers as well: moon landing, vaccines, etc., and I tend to disregard their arguments outright.

Lately I've started acting similarly with doomers, almost inadvertently. If you believe climate change will make humanity go extinct in the next few decades, it's likely you feel similarly about A.I., COVID, etc. I'm starting to shut out such claims immediately, or at least engage with them less. But it's borne out of years of similar interactions, and it's saved me a lot of time and effort.

There seem to be certain personality types who gravitate towards certain types of beliefs. I'm not qualified to speculate as to why they do, but sometimes simply ignoring them can be… sensible.

I'll try summarizing Yudkowsky's main point as I understand it (though you should read him directly if you find me unconvincing). Obviously the current version of ChatGPT and today's LLMs are not world-threatening. The problem is that we are blindly pushing forward and _we don't know_ what to expect.