I just tried them to see if they worked, but I’m a casual.
This is how I found out. RIP to a real one.
I know it’s probably a typo, but it’s worth correcting in case people don’t know.
People still do this?
If you’re going to be using Black American culture, you should at least try to do it right.
And, no, it’s not just English. This was popularized by Black Southerners.
HyperCam 2 and WinRAR?
Typical Switch.
Virtual Machines, but I’m too dumb to figure it out.
Were you able to find any more information on this?
Coincidentally trying to do this as well. I’m not sure if it’s possible to play HL mods without Steam anymore.
I like running Linux on my Lenovo IdeaPad. It wasn’t expensive, it has everything I want, and it runs Linux without any fuss.
The only downside is that it’s not a popular laptop, so there aren’t many accessories for it, like cases or whatever.
Is that different from SimpleLogin?
I have Proton Premium, so I use SimpleLogin, but since I already have Bitwarden I skip Pass. Just curious if it’s worth checking out for that.
Just want to clarify that it’s “GMaps WV”. No space, in case you had trouble finding it on Droidify. But it’s exactly what you’re requesting.
GMaps WV might be what you’re looking for. I mostly use that.
It hurts when I pee.
You can do that?
The day NewPipe and all Invidious instances die is the day I never use YouTube again.
But, ironically, the Chinese Room Argument you’re bringing up supports what others are saying: that LLMs do not ‘understand’ anything.
It seems to me that you’re defining ‘understanding’ in functionalist terms, so that mapping inputs to outputs counts as understanding, which lets you say the measurable process itself shows ‘understanding’. But that’s not what Searle, and seemingly the others here, mean by ‘understanding’. As Searle argues, it is not purely the syntactic manipulation in question but the semantics. In other words, LLMs do not “know” the information they provide; they are just repeating patterns based on the input/output process on which they were trained. LLMs do not project or internalize any meaning onto that input/output process. If they had some reflexive consciousness and any ‘understanding’, they could critically assess the meaning of the information against the facts rather than just naïvely proclaiming that cockroaches got their name because they like to crawl into penises at night. Do you believe LLMs are conscious?