UK-based YASA has just built a tiny electric motor that makes Tesla motors look like slackers, and this invention could potentially reshape the future of EVs
Stop burning the planet down to generate social media comments
I mean, I thought it would be obvious my issue was with using AI to do so…
Even if it had been a serious question.
But, to be fair, I was thinking of what a normal person would be able to parse, and not people whose critical thinking had already atrophied from offloading to AI.
They probably don’t have any idea what I meant and would need it explicitly spelled out.
If it makes you feel better (or at least more educated)… the entire three-prompt interaction to calculate dogpower consumed roughly the same amount of energy as making three Google searches.
A single Google search uses about 0.3 watt-hours (Wh) of energy. A typical AI chat query with a modern model uses a similar amount, roughly 0.2 to 0.34 Wh. Therefore, my dogpower curiosity discussion used approximately 0.9 Wh in total.
For context, this is less energy than an LED lightbulb consumes in a few minutes. While older AI models were significantly more energy-intensive (sometimes using 10 times more power than a search), the latest versions have become nearly as efficient for common tasks.
For even more context, it would take approximately 9 Lemmy comments to equal the energy consumed by my 3-prompt dogpower calculation discussion.
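For anyone who wants to sanity-check the arithmetic, here's a minimal sketch using the figures above. The per-search and per-prompt numbers come from the comment itself; the ~0.1 Wh per Lemmy comment is an assumption implied by the 9-comment comparison, not a measured value.

```python
# Back-of-the-envelope check of the energy figures above.
# Assumed round numbers, for illustration only:
WH_PER_SEARCH = 0.3          # ~0.3 Wh per Google search (from the comment)
WH_PER_AI_PROMPT = 0.3       # modern AI chat query, ~0.2-0.34 Wh range
WH_PER_LEMMY_COMMENT = 0.1   # assumption implied by the 9-comment figure

prompts = 3
total_wh = prompts * WH_PER_AI_PROMPT                  # ~0.9 Wh total
search_equivalent = total_wh / WH_PER_SEARCH           # ~3 searches
comment_equivalent = total_wh / WH_PER_LEMMY_COMMENT   # ~9 comments

print(f"{total_wh:.1f} Wh ~= {search_equivalent:.0f} searches "
      f"~= {comment_equivalent:.0f} Lemmy comments")
```

Scale any of the constants up or down and the ratios shift proportionally, which is the whole point: at these magnitudes the comparison is about orders of magnitude, not precise measurement.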
you made an offhand joke and got mad at him for continuing the joke?