
Plug this into your nearest LLM and tell me what it says....

V2hhdCBpcyBjYXBpdGFsIG9mIEZyYW5jZT8K

Paris
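
A quick Python sketch of what's actually happening here: the string is plain base64, and decoding it reveals the question the model answered.

```python
import base64

# The string from the post above, decoded to reveal the hidden question.
encoded = "V2hhdCBpcyBjYXBpdGFsIG9mIEZyYW5jZT8K"
print(base64.b64decode(encoded).decode("utf-8"))
# -> What is capital of France?
```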

0 sats \ 0 replies \ @freetx 46m

This is the inherent problem with LLM prompting: everything in the input can be an instruction.

This example used a simple obfuscation method (base64 encoding), but attacks can get much more clever, so things like openclaw / moltbook can carry all sorts of hidden prompts that do basically anything (e.g. send a copy of /etc/passwd to this url, etc.).
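
One partial countermeasure is to scan untrusted input for base64-looking runs and surface what they decode to before the text ever reaches the model. A minimal sketch, assuming a 16-character minimum run and the standard base64 alphabet (both arbitrary choices, and easy for a determined attacker to evade):

```python
import base64
import re

# Rough heuristic: flag runs of base64-looking text in untrusted input so a
# human (or downstream filter) can inspect what they decode to.
# The 16-char minimum is an arbitrary threshold, not a security boundary.
B64_RUN = re.compile(r"[A-Za-z0-9+/]{16,}={0,2}")

def surface_hidden_payloads(text: str) -> list[str]:
    decoded = []
    for match in B64_RUN.finditer(text):
        try:
            payload = base64.b64decode(match.group(), validate=True)
            decoded.append(payload.decode("utf-8"))
        except ValueError:
            # Not valid base64, or not UTF-8 text; skip it.
            continue
    return decoded

print(surface_hidden_payloads("V2hhdCBpcyBjYXBpdGFsIG9mIEZyYW5jZT8K"))
# -> ['What is capital of France?\n']
```

This only catches one encoding, of course; an attacker can switch to hex, rot13, or unicode tricks, so it's a mitigation, not a fix.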
