[ home / bans / all ] [ qa / jp ] [ spg ] [ f / ec ] [ b / poll ] [ tv / bann ] [ toggle-new / tab ]

/qa/ - Questions and Answers

Questions and Answers about QA


File:658550e5142e1dd1969994e44e….jpg (166.09 KB,1133x1600)

 No.109986

How much data storage does /qa/ think we'd need to create a perfect replica of a human and store it onto a computer? My initial guess was about 10^27 bytes just going based off the estimated amount of atoms in the human body but with a bit more thought I realized that's probably an underestimate given you need to account for the neutrons/protons/electrons of each atom, and even then maybe accounting for the subatomic atoms matter too.

My curiosity over this comes from the thought that maybe in the future once we're able to store a human inside of a machine we'll be finally able to open up a wide range of possibilities for research into the body. Consider if we did this, maybe we could actually conduct human experimentation without any actual physical humans being harmed in the process. You could copy yourself into the net to have your presence never die. Potentially through this you'd be able to have the most immersive sims around too. Although maybe for some of these the actual requirements of what we'd need mapped in the human body is far less than something like every atom.

In any case though, I wonder if this could even be possible given the massive amounts of storage one would need to pull it off probably makes this infeasible. Do you think we'll keep rapidly growing in terms of tech evolution so that the amount of data we can hold will be enough to store a person on? Or are we at/nearing the limits of what we can do?
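OP's 10^27-byte guess can be sanity-checked with some back-of-envelope arithmetic. Both numbers here are assumptions picked purely for illustration: roughly 7×10^27 atoms in a 70 kg body, and a made-up 32-byte record per atom:

```python
# Naive sizing of a per-atom snapshot of a 70 kg human.
# Assumptions (illustrative only): ~7e27 atoms in the body, and a
# hypothetical 32-byte record per atom for species, position, and momentum.
atoms = 7e27
bytes_per_atom = 32
total_bytes = atoms * bytes_per_atom

print(f"{total_bytes:.1e} bytes")   # on the order of 10^29
```

So even a crude classical snapshot lands a couple of orders of magnitude above the 10^27 guess, before worrying about subatomic detail at all.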

 No.109987

>subatomic atoms
*subatomic particles
Also I realize that there's probably far more work that needs to be done than just putting bits of data in; it all needs to be connected so that it forms a person. I wonder, how would creating those connections work? Is that where accounting for stuff like bosons/quarks matters, since they represent forces/interactions in a sense?

 No.109988

With the magics of compression I bet you wouldn't need an infeasible amount of data storage. RAR and ZIP files aren't to be underestimated, and those are general-purpose compression methods. If you used something purpose-built then you could probably compress away half or more of the data required.
There's also the question of whether you should actually store a completely perfect replica. You don't notice the gaps that exist in digitally stored music or sound, even though it's stored in 1s and 0s despite being a very analogue type of data. You could probably get away with storing only the bits and pieces that are actually important, since the human body has a lot of redundant and inefficient parts.
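As a quick illustration of why patterned data compresses well while noise doesn't, here's Python's standard zlib (the same DEFLATE algorithm ZIP uses) on two equally sized inputs:

```python
import os
import zlib

repetitive = b"01" * 20_000       # 40,000 bytes of obvious pattern
noise = os.urandom(40_000)        # 40,000 bytes of randomness

print(len(zlib.compress(repetitive)))  # shrinks to a tiny fraction
print(len(zlib.compress(noise)))       # stays around 40,000: no real savings
```

Whether a human-body snapshot is closer to the patterned case or the noisy one is exactly the open question.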

 No.109990

¥look up entropy of human body; Martinás and Grandpierre give an estimate for a 70 kg human body in "Thermodynamic Measure for Nonequilibrium Processes"
¥convert to qubits: (202.4 kJ/Kelvin) / (Boltzmann constant) / ln(2)
about 2*10^28 qubits
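Plugging the numbers in (Boltzmann constant is the exact 2019 SI value):

```python
import math

S = 202.4e3              # J/K, the Martinás & Grandpierre entropy estimate
k_B = 1.380649e-23       # J/K, Boltzmann constant (exact by definition)
bits = S / (k_B * math.log(2))

print(f"{bits:.1e}")     # ~2.1e+28, matching the figure above
```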

>you need to account for the neutrons/protons/electrons of each atom, and even then maybe accounting for the subatomic atoms matter too.
A lot of that stuff is going to be in the ground state, so it wouldn't inflate the size as much as you might think.

For a practical human simulation you'd use a lot less than this, though. Knowing what every atom is doing is extreme overkill.

 No.109993

File:fninf-14-00016.pdf (1.87 MB)

Looking up stuff I came across pic, a paper about modelling the cerebellum on the Japanese K/京 supercomputer, and an article reporting on it:
https://theconversation.com/japanese-supercomputer-takes-big-byte-out-of-the-brain-16693
They list specs in the introduction (11 petaflops and 1 petabyte of DRAM for example, so 10^15) but don't mention storage specifically. Clearly nowhere close to being enough, since it was
>about 1% of the raw processing power of a human brain.
>the simulation still took 40 minutes to provide the computational power of one second of neuronal network activity of the brain in real, biological, time.
I also found a bunch of people talking about how simulating atom interactions would require quantum computing to be feasible, so yeah, it's probably going to be measured in qubits if and when it happens.
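Scaling the quoted K-computer numbers up to a whole brain running in real time, under the naive assumption that cost grows linearly with both brain fraction and speed:

```python
# Rough scaling of the K-computer cerebellum run quoted above.
# Assumption: compute cost is linear in brain fraction and in speedup.
k_flops = 11e15          # K computer: 11 petaflops
slowdown = 40 * 60       # 40 minutes of compute per 1 s of brain activity
fraction = 0.01          # ~1% of the brain's raw processing was covered

needed = k_flops * slowdown / fraction
print(f"{needed:.1e} FLOPS")   # ~2.6e+21, i.e. a few zettaFLOPS
```

That's thousands of times beyond current exascale machines, and that's just for neurons, before any atom-level detail.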
>>109988
The thing with redundancies and inefficiencies is that the body is nonetheless adapted to those kinds of things being present, and there's way too much stuff we still don't understand about it. Another quote from the K article above:
>Add to the mix the increasing evidence that glia cells (which don’t carry signals like neurons but make up around ten to 50 times more of the brain mass than neurons) aren’t just there to provide physical support for the neurons but are an integral part of the brain’s function, and the computational problem gets even bigger.

 No.109994

File:579px-Guanine_chemical_str….png (13.55 KB,350x290)

>>109988
One thing I don't really understand about compression is how it works to reduce the size of 1s and 0s; it doesn't get much smaller than that, does it? Or is there something to compression that I'm missing?

And if we wanted the perfect accuracy for simulating chemical reactions I would've thought we'd need to at least simulate things on the atomic level so that you could simulate them to the best of your ability. Please excuse my very shoddy chemistry memory, but aren't most chemical compounds just a bonding of atomic structures?

>>109990
Maybe I do need to go back and study my chemistry better, since I've always had the misguided thought that there was some sort of correlation between the atomic number and number of neutrons. What would you say is a good level to simulate a human then, the molecular level?

 No.109996

File:slide_1.jpg (113.01 KB,960x720)

>>109994
>some sort of correlation between the atomic number and number of neutrons
There is a direct 1-to-1 correlation between atomic number and number of protons. It's definitional: the atomic number is the number of protons in an atom.
Technically there is also a correlation between the number of neutrons in an atom and its atomic number, as there is a certain range of neutron-to-proton ratios an atom can have before its nucleus becomes unstable.

 No.109998

File:848f84bf0b5dbe4823720709dc….jpg (225.93 KB,1099x922)

>>109994
>One thing I don't really understand about compression is just how does it work to reduce the size of 1s and 0s, it doesn't get much smaller than that does it? Or is there something to compression that I'm missing.
In an uncompressed file all of the 0s and 1s are interpreted directly, but a compressed file's 0s and 1s are not. They're more like a set of instructions for reconstructing the original 0s and 1s, provided you know the specific algorithm that was used to compress the file in the first place. If you don't know the algorithm, you can't decompress it, like having a .zip file with a password you don't know: you can't do anything with it.
Compression algorithms rely on there being some sort of pattern in the data in order to make the file smaller and they're not guaranteed to make the file smaller if it's full of random noise or already compressed. A really simple compression algorithm to get a feel for the idea is run-length encoding. You can see how using it on data like "00000000000000000000011111111111111111111111111111111111" would compress it but using it on data like "011010111000111101110010001001010110101" would not.
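The run-length encoding idea above can be sketched in a few lines (the "NxC" output format is just an arbitrary choice for readability):

```python
from itertools import groupby

def rle_encode(s: str) -> str:
    """Run-length encode a string: '0000011' -> '5x0,2x1'."""
    return ",".join(f"{len(list(g))}x{ch}" for ch, g in groupby(s))

runs = "0" * 21 + "1" * 35
noise = "011010111000111101110010001001010110101"

print(rle_encode(runs))                      # '21x0,35x1': a big win
print(len(rle_encode(noise)) > len(noise))   # True: RLE inflates noisy data
```

The second print shows the flip side: on data with short runs, the encoded form is longer than the input.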

 No.110007

File:apple pie.png (957.13 KB,1280x720)

>ITT

 No.110008

>>109994
>What would you say is a good level to simulate a human then, the molecular level?
That obviously depends on what you're trying to find out. If you just wanted a reasonable approximation of human thought, you might be able to get away with just a good-enough model of the neurons and the connections between them. And you might run lower-level simulations to check that your neuron model really behaves closely enough to how an actual neuron would. Or compare it to neurons in the lab, whichever's cheaper. Or both.
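A "good-enough model" at that level could be as coarse as a leaky integrate-and-fire unit. A toy sketch, where all the constants are arbitrary placeholders rather than measured values:

```python
def simulate_lif(currents, dt=1e-3, tau=0.02, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r=10.0):
    """Leaky integrate-and-fire neuron; returns spike times in seconds."""
    v, spikes = v_rest, []
    for step, i_in in enumerate(currents):
        # membrane potential leaks toward rest while the input drives it up
        v += (dt / tau) * (-(v - v_rest) + r * i_in)
        if v >= v_thresh:        # threshold crossed: record a spike and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

spikes = simulate_lif([2.0] * 1000)   # 1 s of constant drive
print(len(spikes), "spikes in one simulated second")
```

A handful of floats per neuron like this is an enormous saving over tracking every molecule, which is the whole appeal of picking the right level of abstraction.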

 No.110080

Actually related

 No.110143

File:1517546979260.jpg (14.02 KB,184x184)

>>110008
Now that I think about it, do we even know how our neurons work? I wonder if you could reconstruct a brain just from knowing its molecular composition, even without understanding how neurons work.

 No.110144

File:[MoyaiSubs] Mewkledreamy -….jpg (346.67 KB,1920x1080)

Isn't an individual's set of chromosomes alone a few terabytes or something? Or maybe it's lower, I remember something like that. I'm too dumb for this conversation, but I'm sure something like 99% efficiency would take a lot less power than 100%, and that's what will happen.

 No.110151

>>110144
Billions of terabytes. Although the research is old as hell now and nothing has come of it, so maybe we'll never be able to store data on biological material. Which probably means that we'll be stuck with our hard limits.

https://arstechnica.com/information-technology/2016/04/microsoft-experiments-with-dna-storage-1000000000-tb-in-a-gram/
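Putting the article's headline density ("1,000,000,000 TB in a gram") against OP's 10^27-byte replica is simple unit juggling:

```python
# The DNA-storage headline figure vs. OP's 10^27-byte replica estimate.
replica_bytes = 1e27
dna_tb_per_gram = 1e9        # "1,000,000,000 TB in a gram"

grams = replica_bytes / 1e12 / dna_tb_per_gram   # bytes -> TB -> grams
print(f"{grams:.0e} grams")   # 1e+06 g, roughly a metric tonne of DNA
```

So even at that density, a full atom-level copy would need about a tonne of DNA per person.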

 No.110183

>>110080
In what way

 No.110676

>>110144
Chromosomes are so huge that even the human body uses multiple levels of compression on them.




