That shit will overheat and burn down. You will have to settle for a picture of my dick… Which is very big… I have a big penis… Not even close to being small… Definitely not the size of a french fry
Plot twist: they’re 256MB drives from 2002 and total… 61.44GB. Still impressive, nvm. If they were the largest available currently (36TB) they’d total 8.64PB
that array is a POS. Changing failed drives in that would be a major pain in the ass… and the way it doesn’t dissipate heat, those drives probably failed pretty regularly.
JBODs like those are actually pretty common in data centers though and are popular with cold storage configs that don’t keep drives spun up unless needed.
For the cooling, they usually use the pressure gradient between what’re called cold and hot aisles to force air through the server racks. The pressure differential also tends to be strong enough that passive cooling can be used; any fans on the hardware are there more to direct the airflow than to drive it.
If you’re paying per U of rack space for colocation then maximizing the storage density is going to be a bigger priority than ease of maintenance, especially since there should be multiple layers of redundancy involved here.
you still have to replace failed drives; this design is poor.
I work in a datacenter that has many drive arrays; my main direct-attached storage array has 900TB with redundancy. I have been pulling old arrays out, and even some of the older ones are better than this if they have front-loading drive cages.
there are no airflow gaps in that thing… I bet the heat it generates is massive
They probably wait for like 20% of the drives in an array to fail before taking it offline and swapping them all out.
Also, this doesn’t sound like the architect’s problem, sounds like the tech’s problem 🤷
I work in a datacenter as the system admin, and waiting for a second drive to fail after the first one does is asking for disaster
The interface is SATA, not EIDE or SCSI, so I’m going to guess 2TB minimum, but more than likely they’re 8TB drives.
Probably SAS but yeah
looks like SAS to me
You’re right - I found the source. Turns out they’re 8TB SAS, for a total of 1.92PB.
I guessed they were SATA. If that is the case, here is one such 36TB SATA HDD. Apparently Seagate make these in SAS models as well
This is a helluva range, do any wizards have a best guess at how much total disk space we’re looking at here?
Can’t make out the size of the drives, but they’re HDDs and there are 240 of them.
Multiply that by what size HDDs were available in the year you think this video was taken. Money is on 1 or 2 TB? Those are bulky as fuck though, so they could be as little as 100-something GB, but then I think we’d be looking at a lot more piss-yellow plastic
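If you want to plug in your own guess, here’s a quick back-of-the-envelope sketch in Python; the candidate sizes are just the ones floated in this thread (the 128GB stand-in for “100-something GB” is my pick):

```python
# 240 drives times various candidate sizes floated in this thread.
# Decimal ("drivemaker's") units: 1 TB = 10^12 bytes.
DRIVE_COUNT = 240

candidates_tb = {
    "256 MB (2002 plot twist)": 0.000256,
    "100-something GB": 0.128,
    "1 TB": 1,
    "2 TB": 2,
    "8 TB (per the source video)": 8,
    "36 TB (largest currently)": 36,
}

for label, size_tb in candidates_tb.items():
    total_tb = DRIVE_COUNT * size_tb
    print(f"{label:>28}: {total_tb:10.2f} TB  ({total_tb / 1000:.4f} PB)")
```

The 8 TB row lands exactly on the 1.92 PB figure from the source video.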
That’s such an unfathomable amount of storage; I’m an old man with 0.2 TB total on my backup of everything. Also, redlib link for the privacy enjoyers: https://redlib.perennialte.ch/r/computers/comments/od03lz/ever_wondered_what_2_peta_bytes_looks_like/
Hey, that’s a perfectly normal amount of storage. The personal files backup for my partner and me amounts to about that much.
No shade in yo direction at all, I’m just amazed at how much memory we can cram into handhelds, much less these U-racks or whatever they’re called
Right? They have 2TB stuffed in MicroSDs now! That much storage in something the size of a fingertip
Two drivemaker’s petabytes
I found this video on reddit: https://redlib.tux.pizza/r/DataHoarder/comments/xzyhwz/ever_wondered_what_2_peta_bytes_looks_like/ OP says it is 2 PB.
Nice, good detective work. Most of my trash memes come from reddit 2 years ago
she’s so pretty one single drive couldn’t possibly hold all her beauty 🧡
The good ending
Hmm. The photo must have been an ultra-compressed jpeg then.
with 2PB of storage, what resolution could you store of a full-frontal pic of the average woman? what feature size could you get down to?
The average woman’s height is 1.588 m and the average woman’s shoulder width is 0.367 m.
Assuming that this average woman fits exactly in this photo, the photo’s “area” would be 1.588 m × 0.367 m = 0.583 m².
Assuming the pixel format is RGB with 8 bits per colour channel, each pixel in the photo consists of 3 bytes. 2 PB is equal to 2 × 10¹⁵ B, which divided by 3 B per pixel means there could be at least 6.67 × 10¹⁴ pixels in this photo. In reality images are usually compressed, so in practice you could fit even more pixels; how many more depends on the image and the desired quality.
To calculate the area of each pixel, divide the photo’s area by the number of pixels. This gives 0.583 m² / 6.67 × 10¹⁴ = 8.74 × 10⁻¹⁶ m² for each pixel. To get the side length of each pixel, take its square root to get 2.96 × 10⁻⁸ m = 29.6 nanometres!
Dividing the width and height in metres by the side length of each pixel gives (width, height) = (0.367, 1.588) m / 2.96 × 10⁻⁸ m = an image resolution of 12,412,583 × 53,708,941 pixels!
When it comes to feature size, the bottleneck isn’t actually the pixel size. Assuming the image is in visible light, the shortest wavelength visible to the human eye is 380 nm, so increasing the resolution beyond that point is useless.
In such a photo, features as small as 380 nm can be identified. To quantify the resolution at which you can see these features, define an “effective pixel” to be a pixel with a side length of 380 nm; the actual pixels of the image aren’t relevant at this point.
Individual skin cells, at about 30 μm across, can be identified at 30 μm / 380 nm = 79 effective pixels wide. With similar calculations, blood cells (about 7 μm) can be identified at 18 effective pixels wide, and you might even be able to identify individual bacteria: an E. coli bacterium is 2 μm long, which is 5 effective pixels.
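For anyone who wants to check the arithmetic, here’s the whole calculation above as a short Python sketch; the ~7 μm blood-cell size is back-solved from the 18-effective-pixel figure:

```python
import math

# Assumptions from the comment above.
STORAGE_BYTES = 2e15      # 2 PB, decimal
BYTES_PER_PIXEL = 3       # uncompressed 8-bit RGB
HEIGHT_M = 1.588          # average woman's height
WIDTH_M = 0.367           # average shoulder width

pixels = STORAGE_BYTES / BYTES_PER_PIXEL    # ~6.67e14 pixels
pixel_area = (HEIGHT_M * WIDTH_M) / pixels  # ~8.74e-16 m^2 each
pixel_side = math.sqrt(pixel_area)          # ~2.96e-8 m = 29.6 nm

print(f"pixel side: {pixel_side * 1e9:.1f} nm")
print(f"resolution: {WIDTH_M / pixel_side:,.0f} x {HEIGHT_M / pixel_side:,.0f} pixels")

# Feature sizes in 380 nm "effective pixels" (visible-light limit).
EFFECTIVE_PIXEL_M = 380e-9
for feature, size_m in [("skin cell", 30e-6),
                        ("blood cell", 7e-6),
                        ("E. coli", 2e-6)]:
    print(f"{feature}: {size_m / EFFECTIVE_PIXEL_M:.0f} effective pixels wide")
```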
A few comments, as yours is close and I’m too lazy to do the write-up myself, so I’m bumming off your work (rough numbers in the sketch below):
- The Nyquist frequency is half the sampling frequency, so your effective pixel for visible light will actually be 380 nm / 2 = 190 nm.
- An electron microscope can capture down to ~2 nm, so there 2 PB actually is the limiting factor! 29.6 nm would definitely be possible.
- The image compression ratio (assuming PNG) would probably be 2-4×; that shrinks the pixel area by the same factor, so the actual pixel side would be 29.6 / √2 ≈ 21 nm down to 29.6 / 2 = 14.8 nm.
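Rough numbers for those corrections, under the same assumptions (the 2-4× PNG ratio is just an estimate):

```python
import math

BASE_PIXEL_NM = 29.6  # uncompressed pixel side from the parent comment

# Nyquist: resolving a 380 nm wavelength needs a sample every half wavelength.
print(f"effective pixel with Nyquist: {380 / 2:.0f} nm")

# Lossless compression packs 2-4x more pixels into the same bytes;
# pixel *area* shrinks by that ratio, so side length shrinks by its sqrt.
for ratio in (2, 4):
    print(f"compression {ratio}x: pixel side ~{BASE_PIXEL_NM / math.sqrt(ratio):.1f} nm")
```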
If you took the image with an electron microscope you could easily get better than 30 nm resolution. It would be in black and white though. And you would need to coat your mom in carbon or gold. And expose her to a vacuum. For biological samples they typically freeze them so they don’t boil in there
thank you for your service, you have saved me from having to do this myself
OK, so assuming that each hard drive has a size of 16TB, we have 12 hard drives per layer and 20 layers, so in total we have
12 * 20 * 16TB = 3840 TB of storage.
This is the same as 3840 × 10¹² bytes.
In RGB a pixel has 3 values (red, green and blue), each ranging from 0 to 255, so 256 possible values in total. A single byte can store exactly 256 distinct values. This means that storing a single pixel takes 3 bytes.
3840 × 10¹² / 3 = 1280 × 10¹² pixels that we can store.
To get the maximum length of one side of the image we have to take the square root of this, so
√(1280 × 10¹²) ≈ 35,777,087
So if I didn’t miscalculate, this server could store a single image of approximately 35,777,087 × 35,777,087 pixels in RGB encoding.
This also assumes that no other space on the server gets used and that we can utilize the full 16TB of each hard drive. It would probably be impossible to view the image due to its size, but you could store it.
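Sanity-checking that in Python (same assumptions: 240 × 16 TB drives, 3 bytes per pixel, one square image):

```python
import math

# 12 drives per layer, 20 layers, assumed 16 TB (10^12 bytes) each.
total_bytes = 12 * 20 * 16 * 10**12   # 3840 TB = 3.84e15 bytes
pixels = total_bytes // 3             # 3 bytes per RGB pixel
side = math.isqrt(pixels)             # largest square image that fits

print(f"{total_bytes / 1e15:.2f} PB -> {side:,} x {side:,} pixels")
# prints: 3.84 PB -> 35,777,087 x 35,777,087 pixels
```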
Small miscalculation: 1 TB is only 10¹² bytes, not 10¹⁵ bytes, so you would only have 3840 × 10¹² bytes ⇒ 1280 × 10¹² pixels, and the width of a square RGB image with 3 bytes per pixel would only be around 35,777,088 pixels.
Damn, thanks for the correction, I will correct it.
I’m trying to see her taste buds
I, too, enjoy large resolution images of large women
aren’t we all just an ultra-high resolution detail of mother earth?
one comment from r/datahoarder became a legendary speech 😅
Yes my mom is an incomprehensible god; how did you know? :3
Must have been taken by the James Webb telescope.