• 0 Posts
  • 7 Comments
Joined 4 months ago
Cake day: October 26th, 2025


  • We will learn that so much of their power comes from a docile society acting in predictable ways. They have the privilege of being able to concern themselves with how to influence us… not because that’s a privilege in its own right, but because they can afford to while relatively unhampered. If the sea and land were at war, it would be akin to the land fortifying its sand armory, sand castles, sand moats… a privilege the land can afford only because it convinced the sea to lower its tide. We will learn they’re a lot more vulnerable than they would like us to believe.

    Every second we can get them to spend thinking about their own defense is a second saved in our favor [one that would otherwise surely have been spent determining how to further influence our behavior].



  • I’ve had these interactions with the head of my IT department. I asked him to procure a license for JFrog Artifactory. He literally copy/pasted a ChatGPT response to me that began like this:

    Here’s a breakdown of how JFrog Artifactory compares to using GitHub, NPM, or other language-specific package managers (like PyPI)…

    1. Purpose and Functionality

    2. Workflow & Developer Experience

    3. Security and Compliance

    When to use JFrog

    It came with a bunch of theoretical risks that are completely resolved by simply not being a complete fucking moron.

    It was really frustrating: I tried to talk with my IT leader and instead found a proxy for ChatGPT.

    After that, he created a group chat with himself, me, and my colleagues in security. He proceeded to paste ChatGPT output outlining bullshit risks and theories, with the implicit expectation that I rhetorically address each of them in my own responses. I’d explain things like,

    “[well if you read the fucking request yourself, you’d know that] we aren’t planning to use the software that way, so the concern isn’t relevant. Even if we were though, those problems are easily addressable via …”

    In some cases, I even had to explain that the problems he was raising already exist in our current ecosystem, completely unrelated to the software I was asking about… ChatGPT was just straight up framing an architectural problem as a risk of the new software.

    I’d reply, and I swear to god he’d just feed my text to ChatGPT and paste its reply back to me.

    I lost a lot of respect for him. Why the fuck would you do that?