

Well, with how things are going with Windows and laptop OEMs, it’s still a better deal than those.
A lot of users just need an iPad with a touchpad and keyboard, which is precisely what this is.


> So from the inference / customer computation side of things, it isn’t a problem.
Not necessarily. There are inference schemes where spreading MoE models across 40+ GPUs with a fast interconnect yields better efficiency.
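A toy sketch of why that can work (all numbers invented, just to show the shape of the tradeoff): with expert parallelism, each GPU only holds a slice of the experts, so the weights stay resident in fast memory and there’s room left for bigger batches.

```python
# Toy numbers (invented) for expert-parallel MoE inference:
# each GPU holds only a slice of the experts.
NUM_EXPERTS = 128        # hypothetical MoE model
EXPERT_SIZE_GB = 3.5     # hypothetical weights per expert
total_gb = NUM_EXPERTS * EXPERT_SIZE_GB

for gpus in (8, 40, 64):
    print(f"{gpus:>2} GPUs -> {total_gb / gpus:6.1f} GB of experts per GPU")

# With 40+ GPUs, each card's slice fits in HBM with room left for KV cache
# and larger batches. The catch: every token's activations have to hop to
# whichever GPUs hold its routed experts, hence the fast interconnect.
```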
> Looks like Blackwells can run sustained at 88C
The coolant still needs to remain relatively cool to hold that silicon temperature, though. Practically, it can’t be something like 60C.


That’s interesting, but what’s the point? If it’s like 2 DGX boxes in each satellite, spaced out, the interconnect between them is going to be very slow, and the individual computational power of each satellite will not be that impressive.
And if you wire them all together into one connected mesh, well, you’ve made a 200MW datacenter! The economics remain the same.
If hardware gets more power efficient, well… Then why do you need to go to space anymore?


100kW? Nvidia DGX B200 servers are ~14kW each, not counting the interconnect or anything else. According to nuggets I’ve read online, we’re talking 200 megawatts for an Earth-based AI datacenter these days, without something exotic like underclocked Cerebras WSEs (which would be pretty neat, actually…)
Plugging 200 megawatts into this:
https://www.calctool.org/quantum-mechanics/stefan-boltzmann-law
I get about 0.46 square kilometers, depending on the coolant temperature and the ultimate efficiency of the system (how you orient the thing relative to the solar panels, how you circulate coolant…)
I have no clue what the construction of such a huge structure would look like, but if it was a simple 0.5 inch aluminum sheet, it would weigh like 15,000 metric tons. Even much thinner, that’s still on the order of “mass of a cargo ship”
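Here’s that back-of-envelope as a quick script, if anyone wants to poke at it. The emissivity and radiating temperature are my assumptions, and it treats the area as single-sided radiating surface:

```python
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W / (m^2 K^4)
P_WASTE = 200e6       # 200 MW of waste heat to reject
EMISSIVITY = 0.9      # assumed coating
T_RAD = 300.0         # K (~27C) effective radiating temperature, assumed

area_m2 = P_WASTE / (EMISSIVITY * SIGMA * T_RAD**4)   # P = e*sigma*A*T^4
print(f"radiating area ~ {area_m2 / 1e6:.2f} km^2")   # ~0.48 km^2

# Mass if that area were a 0.5 inch aluminum sheet:
mass_tons = area_m2 * (0.5 * 0.0254) * 2700 / 1000    # 2700 kg/m^3
print(f"mass ~ {mass_tons:,.0f} metric tons")         # ~16,600 t
```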
> Why is that, though?
Well, something like the ISS doesn’t generate much heat, and hypothetical rockets that need big radiators have very hot coolant to dissipate heat quickly. But space data centers are the sinister combination of “tons of waste heat” and “needs a low coolant temperature.”


On radiators, plugging it into this formula:
https://projectrho.com/public_html/rocket/heatrad.php
I get a circular radiator at least a kilometer wide, assuming the radiator is quite efficient, a rather modest datacenter, and very hot coolant (70C).
…Realistically, the coolant temperature would need to be much lower. See how temperature enters the formula as a fourth power? Radiated power per square meter scales as T⁴, so as the coolant gets cooler, the required radiator area blows up real quick.
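To make the fourth power concrete, here’s the same Stefan-Boltzmann arithmetic at a few coolant temperatures, reusing the hypothetical 200MW load and 0.9 emissivity from above:

```python
SIGMA = 5.670e-8    # W / (m^2 K^4)
P_WASTE = 200e6     # hypothetical 200 MW load again
EMISSIVITY = 0.9    # assumed

for t_c in (70, 40, 10):
    t_k = t_c + 273.15
    area_km2 = P_WASTE / (EMISSIVITY * SIGMA * t_k**4) / 1e6
    print(f"{t_c:>2}C coolant -> ~{area_km2:.2f} km^2")

# 70C -> ~0.28 km^2, 40C -> ~0.41 km^2, 10C -> ~0.61 km^2:
# dropping the coolant from 70C to 10C roughly doubles the radiator.
```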
I cannot emphasize how expensive a functional 1km+ radiator would be in space. It’s mind bogglingly expensive.
If a space datacenter is in LEO like Starlink, then it’s in Earth’s shadow a lot of the time, and would have to be “part” of the starlink network constantly zooming over the ground. If it’s geosynchronous, then laser communication (or any communication) gets real tricky, and latency is limited by the speed of light. I’m not saying it’s impossible, but reliable high data rates would be an expensive engineering challenge.
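For the geosynchronous case, the light-speed floor alone is easy to work out (satellite straight overhead, ignoring routing and processing):

```python
C = 299_792_458       # m/s, speed of light
GEO_ALT = 35_786e3    # m, geosynchronous altitude

one_way_ms = GEO_ALT / C * 1000
print(f"one-way ~{one_way_ms:.0f} ms, round trip ~{2 * one_way_ms:.0f} ms")
# ~119 ms up, ~239 ms round trip at best -- before the request even
# touches the datacenter, and worse when the satellite is off-zenith.
```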


> Heat is radiated into the vacuum for free
> When you combine that with a mesh network like Starlink, the need for laying fiber lines disappears entirely
Citation needed.
And on water usage, I will point out that gas generators and evaporative cooling are only used on Earth because other methods (geothermal, big radiators, heat pumps) are somewhat more expensive… not, like, orders of magnitude more expensive the way pure radiative cooling in space is.


That’s because science literacy is pretty low.
And to be fair, the average person doesn’t need to understand vacuum thermodynamics. The issue is when a few of those “average people” are now billionaires making unilateral decisions, surrounded by yes-men and feeds instead of experts informing them of reality.


So do all the Microsoft subscriptions they already buy, yet those are extremely popular anyway?


Businesses will adore this. I can guarantee a lot of us will be forced to use these at work, like Teams and Copilot, as part of a further mega deal with Microsoft.
…But honestly, I think “home” buyers who don’t really care about PC stuff, aka most people, would pick tablets over this.


But the point is that Valve could easily be Amazon some day. All these little companies taking their first anticompetitive steps could.
Of course everyone loves them when they’re small, and nice, and growing, until they get so big it’s way too late to do anything about it. But many will still feel loyalty, like they do to Amazon today.


Yeah…
That’s how Amazon worked. At first.
Back then, online shopping kind of sucked, and this little bookstore company made it so streamlined that I got invested.


I’ve pointed out Valve doing basically the same thing; games can’t be priced lower than Steam on competing game storefronts (not Steam key resellers), or Valve will threaten to delist your game, which would essentially kill it. And they obviously do this to protect their chunky store fee.
But personal loyalty goes a long way.
I’m trying to reframe the perspective here, not drag this into an argument about Valve. A whole lot of people feel good about finding “deals” on Amazon, about Amazon services that have helped them, and especially about the value and convenience the whole platform provides. It’s easy for Lemmy to hate on Amazon, but for the average person, I think this is a harder sell than most of us realize. They’ll dismiss it as the “market working” or California sensationalism or, more likely, just filter it out as noise in their feed, just like most PC gamers would when they read something bad about Valve.


To be blunt, I dunno if that’s coming? Apple’s designs are pretty conservative these days; I doubt they’d make a big folding iPhone.
iPhones do go on firesales from some carriers, sometimes even below cost.