Isn’t there a known issue with their Business APIs where a lot of the training data isn’t properly anonymized, or can be easily de-anonymized? Yet that data gets sculpted, repackaged, and transmitted regularly.
It all stems from the assumption that you couldn’t see “inside” the LLM’s real-time “thoughts,” when in fact you can, so there’s effectively no encryption at that point, which is a security risk…
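For illustration only, here is a minimal client-side sketch of the kind of mitigation the anonymization concern points at: scrubbing obvious PII from a payload before it ever leaves your machine. The `redact_pii` helper and the regex patterns are my own assumptions for the example, not part of any vendor's Business API, and real de-identification needs much more than regexes.

```python
import re

# Hypothetical client-side scrubber: strips obvious PII (emails, phone
# numbers, SSN-like patterns) from text before it is transmitted to any
# third-party API. Illustrative only; names, addresses, and other
# quasi-identifiers would still slip through.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace matches of each PII pattern with a typed placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED_{label.upper()}]", text)
    return text

if __name__ == "__main__":
    sample = "Contact Jane at jane.doe@example.com or 555-867-5309."
    print(redact_pii(sample))
    # -> Contact Jane at [REDACTED_EMAIL] or [REDACTED_PHONE].
```

The point of the sketch is simply that redaction has to happen before transmission; once the data is inside the provider's pipeline, you no longer control how it gets repackaged or reused.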