How much information can a computer hold relative to the human brain? by AugustusBommleledge.phtml - Wed, 13 Sep 2017 18:52:39 EST ID:sXMF5FbE No.121299
File: 1505343159570.jpg -(52324B / 51.10KB, 540x720)
I don't know if this is the right place to ask but I figure it's either you guys or /med/. I'm writing a story and this is supposed to play a role in it, but for it to work I need to know what the score is here.
>>
ShittingSunnerpodging.dun - Thu, 14 Sep 2017 01:55:38 EST ID:OtsFwZAr No.121300
The brain doesn't store data linearly, so there's no well-defined way to say how much data it holds.
>>
DorisSanningpog.cat - Thu, 14 Sep 2017 02:00:40 EST ID:tn7nbRb1 No.121301
Apples and oranges doesn't do the comparison justice, but if you wanted to use a brain as a hard drive somehow, the capacity might be something on the order of petabytes.
>>
AugustusGambledane.sea - Thu, 14 Sep 2017 11:55:19 EST ID:sXMF5FbE No.121303
1505404519338.jpg -(43467B / 42.45KB, 603x339)
>>121301
>>121300
That'll do, thanks a lot fellas!
>>
SimonPammlegold.pm - Sat, 16 Sep 2017 03:59:26 EST ID:QaLypIUP No.121310
The human brain has around 100 billion neurons in it and between 0.1 and 1.0 quadrillion synaptic connections between these neurons. If each synapse between neurons were used to store just one bit of data (somehow without also disrupting the brain's normal functioning), this would amount to around 0.1 to 1.0 petabits of data (100 to 1,000 terabits).
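
A quick sanity check of that range in Python; the one-bit-per-synapse figure is just the assumption above, not a claim about how memory actually works:
[pre]
# One-bit-per-synapse estimate, using the synapse counts quoted above.
SYNAPSES_LOW  = 0.1e15   # 0.1 quadrillion synapses
SYNAPSES_HIGH = 1.0e15   # 1.0 quadrillion synapses

for synapses in (SYNAPSES_LOW, SYNAPSES_HIGH):
    bits = synapses                      # one bit stored per synapse
    print(f"{synapses:.1e} synapses -> {bits / 1e12:,.0f} terabits "
          f"(~{bits / 8 / 1e12:,.1f} terabytes)")
# 1.0e+14 synapses -> 100 terabits (~12.5 terabytes)
# 1.0e+15 synapses -> 1,000 terabits (~125.0 terabytes)
[/pre]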
>>
DavidSidgeson.doc - Sun, 17 Sep 2017 10:25:06 EST ID:be8b6w6s No.121314
>>121299
  1. One human cell contains 75MB of genetic information.
  2. One sperm cell contains half of that, i.e. 37.5MB.
  3. One ml of semen contains 100 million sperm cells.
  4. On average, an ejaculation lasts 5 seconds and contains 2.25ml of semen.
  5. This means the throughput of a man's member is (37.5MB x 100,000,000 x 2.25)/5 = 1,687,500,000 MB/s = 1,769,472,000,000,000 bytes/second ≈ 1,609.3 Terabytes/sec (in data-line/network terms, a 12,874.5 Terabit line; see the sketch below for a sanity check).
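
A minimal Python version of that arithmetic, using binary MB/TB (1MB = 2^20 bytes), which is what makes the byte and terabyte figures line up; every input is just an assumption from the list above, not a measured value:
[pre]
MB = 1024 ** 2
TB = 1024 ** 4

data_per_sperm_mb = 37.5            # half of the assumed 75MB per cell
sperm_per_ml      = 100_000_000
volume_ml         = 2.25
duration_s        = 5

throughput_mb_s    = data_per_sperm_mb * sperm_per_ml * volume_ml / duration_s
throughput_bytes_s = throughput_mb_s * MB

print(f"{throughput_mb_s:,.0f} MB/s")          # 1,687,500,000 MB/s
print(f"{throughput_bytes_s:,.0f} bytes/s")    # 1,769,472,000,000,000 bytes/s
print(f"{throughput_bytes_s / TB:,.1f} TB/s")  # ~1,609.3 TB/s
print(f"{throughput_bytes_s * 8 / TB:,.1f} terabit/s")  # ~12,874.6, i.e. the ~12,874.5 Terabit line above
[/pre]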

This means that the female egg cell withstands this DDoS attack at roughly 1.6 petabytes per second and only lets through one(!) data package, thereby being the best freaking hardware firewall in the world!

The downside is that the one small data package it does let through hangs the system for a whole 9 months!

but wait...there's more

According to one estimate, the average genetic difference between two sperm cells from the same source is ~0.25% (this is not a massively accurate number because of long biology words that I don't know anything about, but for this purpose it should be close enough).

Using that number, about 99.75% of each cell's information is going to be a duplicate, bringing the total unique information down to ~3.97TB per second.
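
Spelling that step out (the 0.25% figure is just the estimate quoted above, nothing more precise):
[pre]
total_tb_per_s  = 1609.3     # throughput from the earlier calculation
unique_fraction = 0.0025     # ~0.25% difference between two sperm cells

print(f"~{total_tb_per_s * unique_fraction:.2f} TB/s unique")
# ~4.02 TB/s, the same ballpark as the ~3.97TB above; the small gap is
# presumably just rounding somewhere upstream.
[/pre]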

This would be a huge pain to compress, however, since typical compression algorithms don't explicitly look for that kind of large-scale duplication; they build up tables of repeated strings as they go. I'd imagine you'd still end up with a bit more than 4TB though.

Additionally, because of the tiny percentage of variation between cells, it's entirely likely that once you have "transferred" enough "data" you won't actually need to store more than a few bytes per cell, since you'll start seeing perfect duplicates of cells that you've already seen.

Some (very) rough calculations put the average number of cells you'd need before you expect most to be duplicates somewhere in the region of e^56,530,387 (it took a surprising amount of effort to calculate that; it turns out that doing factorials of 9-digit numbers is hard...), so you'd run out of storage space long before you could expect to have "seen them all".
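
For anyone curious how to do that kind of estimate without fighting 9-digit factorials: work in log space. This is only a sketch under toy assumptions of mine (variants differ at ~0.25% of ~3 billion positions, 3 alternative bases at each differing position), so it illustrates the method rather than reproducing the e^56,530,387 figure:
[pre]
import math

POSITIONS = 3_000_000_000             # approx. base pairs in a haploid genome
DIFFERING = int(POSITIONS * 0.0025)   # ~0.25% of positions vary between cells

def log_comb(n, k):
    """Natural log of C(n, k) via log-gamma, so the factorials never materialise."""
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

# log of the number of distinct possible variants K: choose which positions
# differ, then one of 3 alternative bases at each of them.
log_K = log_comb(POSITIONS, DIFFERING) + DIFFERING * math.log(3)

# Birthday bound: expect the first duplicate after roughly sqrt(pi * K / 2) draws.
log_draws = 0.5 * (math.log(math.pi / 2) + log_K)

print(f"K     ~ e^{log_K:,.0f}")
print(f"draws ~ e^{log_draws:,.0f} cells before duplicates start appearing")
[/pre]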
>>
EbenezerHoffingsod.lgo - Mon, 02 Oct 2017 02:56:57 EST ID:QaLypIUP No.121351
>>121314
How'd you come up with the calculation that one human cell contains 75MB of genetic information?
>>
IsabellaSnodforth.pif - Mon, 02 Oct 2017 09:32:56 EST ID:tn7nbRb1 No.121352
>>121351
That poster just went through a leddit thread and copied/rephrased multiple posts. The numbers at the top probably come from a "science" writer whose grasp of information theory is even looser than his grasp of genomics. DNA base pairs encoded in binary work out to around 750MB, and 90% of human DNA is useless junk if you went to a shitty school. That's 75MB of genetic information per cell. Super cool blog, bro. The second part is some ledditor's back-of-the-envelope calculation. I leave that as an exercise for the reader.
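
For reference, the arithmetic behind those numbers (assuming ~3 billion base pairs at 2 bits each; the 90% figure is just the "junk DNA" claim above, not something this sketch establishes):
[pre]
BASE_PAIRS        = 3_000_000_000   # approx. haploid human genome
BITS_PER_BASE     = 2               # A/C/G/T fits in 2 bits
NON_JUNK_FRACTION = 0.10            # keep 10% if "90% is junk"

total_mb = BASE_PAIRS * BITS_PER_BASE / 8 / 1e6   # raw size in (decimal) MB
kept_mb  = total_mb * NON_JUNK_FRACTION           # "useful" remainder
print(f"raw: {total_mb:.0f} MB, non-junk: {kept_mb:.0f} MB")  # raw: 750 MB, non-junk: 75 MB
[/pre]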
>>
kevinmitnick_og - Sun, 08 Oct 2017 14:58:41 EST ID:81pruxy+ No.121366
1507489121714.jpg -(1628978B / 1.55MB, 3264x2448)
"Computers are the most complex objects we human beings have ever created, but in a fundamental sense they are remarkably simple. Working with teams of only a few dozen people, I have designed and built computers containing billions of active parts. The wiring diagram of one of these machines, if it were ever to be drawn, would fill all the books in a good-sized public library, and nobody would have the patience to scan the whole of it. Fortunately, such a diagram is unnecessary, because of the regularity of a computer's design. Computers are built up in a hierarchy of parts, with each part repeated many times over. All you need to understand a computer is an understanding of this hierarchy."

"Another principle that makes computers easy to understand is the nature of the interactions among the parts. These interactions are simple and well-defined. They are also usually one-directional, so that the actions of the computer can be sorted neatly into causes and effects, making the inner workings of a computer more comprehensible than, say, the inner workings of an automobile engine or a radio. A computer has a lot more parts than a car or a radio does, but it's much simpler in the way the parts work together. A computer is not dependent so much on technology as on ideas.

Moreover, the ideas have almost nothing to do with the electronics out of which computers are built. Present-day computers are built of transistors and wires, but they could just as well be built, according to the same principles, from valves and water pipes, or from sticks and strings. The principles are the essence of what makes a computer compute. One of the most remarkable things about computers is that their essential nature transcends technology. That nature is what this book is about.

This is the book I wish I had read when I first started learning about the field of computing. Unlike most books on computers - which are either about how to use them or about the technology out of which they're built (ROM, RAM, disk drives, and so on) - this is a book about ideas. It explains, or at least introduces, most of the important ideas in the field of computer science, including Boolean logic, finite-state machines, programming languages, compilers and interpreters, Turing universality, information theory, algorithms and algorithmic complexity, heuristics, uncomputable functions, parallel computing, quantum computing, neural networks, machine learning, and self-organizing systems. Anyone interested enough in computers to be reading this book will probably have encountered many of these ideas before, but short of an education in computer science there are few opportunities to see how they all fit together. This book makes the connections - all the way from simple physical processes like the closing of a switch to the learning and adaptation exhibited by self-organizing parallel computers.

A few general themes underlie an exposition of the nature of computers: the first is the principle of functional abstraction, which leads to the aforementioned hierarchy of causes and effects. The structure of the computer is an example of the application of this principle - over and over again, at many levels. Computers are understandable because you can focus on what is happening at one level of the hierarchy without worrying about the details of what goes on at the lower levels. Functional abstraction is what decouples the ideas from the technology.

The second unifying theme is the principle of the universal computer - the idea that there is really only one kind of computer, or more precisely, that all kinds of computers are alike in what they can and cannot do. As near as we can tell, any computing device, whether it's built of transistors, sticks and strings, or neurons, can be simulated by a universal computer. This is a remarkable hypothesis: as I will explain, it suggests that making a computer think like a brain is just a matter of programming it correctly.

The third theme in this book, which won't be fully addressed until the last chapter, is in some sense the antithesis of the first. There may be an entirely new way of designing and programming computers - a way not based on the standard methods of engineering. This would be exciting, because the way we normally design systems begins to break down when the systems become too complicated. The very principles that enable us to design computers lead ultimately to a certain fragility and inefficiency. This weakness has nothing to do with any fundamental limitations of information-processing machines - it's a limitation of the hierarchical method of design. But what if instead we were to use a design process analogous to biological evolution - that is, a process in which the behaviors of the system emerge from the accumulation of many simple interactions, without any "top-down" control? A computing device designed by such an evolutionary process might exhibit some of the robustness and flexibility of a biological organism - at least, that's the hope. This approach is not yet well understood, and it may turn out to be impractical. It is the topic of my current research."
>>
GeorgeNonderstine.ncf - Mon, 09 Oct 2017 09:24:34 EST ID:b83Pe13c No.121368
>>121366
>Computers are understandable because you can focus on what is happening at one level of the hierarchy without worrying about the details of what goes on at the lower levels.
This is confusing me a bit. Don't you have to worry about every part and especially the details?
>>
GeorgeNonderstine.ncf - Mon, 09 Oct 2017 09:31:37 EST ID:b83Pe13c No.121369
>>121368
Actually, rereading the sentence, I think I understand more now; it's like a human hierarchy. For example, a president doesn't need to worry about what the average Joe on the street said about him; that's what the surveillance is for. But still, in order to be in total control, the user (the president in this case) has to know everything about the functioning of the system.
>>
Trans_wigger - Mon, 09 Oct 2017 09:48:34 EST ID:iGOOm3WD No.121370
>>121369
In 'real life' (as opposed to I.T. theory), every 'organic' event that takes place in God's creation, such as those in one's society, has its own effect within its own existential world.

I.E...."Mankind is made in the image of God." or "Man is God writ small."

Surveillance is not for any noble purpose. Perhaps 'in theory' but usually not in practice. Especially in ultra-capitalist Amerikkka.

The President and the establishment and the ruling class think they are 'high' or that they are 'higher' than others (this is human lust and passion) but in truth we all bleed the same in the presence of God.

So on the street level, events which take place have their due significance, no matter what air of arrogance a politician carries and no matter all the catty comments one can make about it, military or not.

It is "We the People" and not "We the Military-Industrial Complex". They are the usurpers and detractors from the 'source code', we are the people of the United States who defeated Germany and Japan, dropped atomic bombs of facists, feed and house the hungry and poor and afflicted of the world, stopped the mouths of tyrants, and landed a human being on the moon. All in the name of liberty and in the name of God and what is right.
>>
bernie_sanders_og_shalom - Mon, 09 Oct 2017 14:33:34 EST ID:81pruxy+ No.121371
1507574014570.png -(91283B / 89.14KB, 799x843)
>>121368

i.e.. copy & paste

you only have to copy a URL, and paste the URL for the URL to automatically display an entire webpage

it is like reading blueprints...if you roll out all of the blueprints of an OS, you can focus in on different parts of the OS to see what is going on at a specific part of the OS....and then you can, from there intuit what will take place in the higher or lower levels of the 'OS hierarchy'

API - integral control structure of windows operating system organization to determine how the windows OS functions...one of the 'internal components under the hood', such as a hood on a car contains internal components

I.E.....focusing in on windows API or
.....focusing in on an open source kernel
.....focusing in on GUI display and programming layouts for display

if you look at some part of a Windows OS, for instance, by viewing that part of the hierarchical structure of the (already-programmed) OS software to be run on a physical machine (hardware), you can see that it fits in with the entirety of the software hierarchy for the whole OS... and thus by looking at a single point in the hierarchy you can discern things about the rest of it, because there is already a 'standardized' conception of how everything is supposed to function to make the whole OS work appropriately

you could say it is 'standardized' or that it is 'centralized'
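
A toy sketch of that layering idea in Python; the function names and the 'disk' here are made up for illustration, nothing below is a real Windows API. Each level only talks to the one directly beneath it through a small interface and never looks at its internals:
[pre]
def read_sector(disk, n):
    """'Hardware' level: fetch one raw sector (stubbed as a dict lookup here)."""
    return disk[n]

def read_file(disk, index, name):
    """Filesystem level: assembles a file out of sectors, using only read_sector."""
    return b"".join(read_sector(disk, s) for s in index[name])

def show(disk, index, name):
    """Application level: displays a file, using only read_file."""
    print(read_file(disk, index, name).decode())

disk  = {0: b"hello ", 1: b"world"}   # fake storage
index = {"greeting.txt": [0, 1]}      # fake file table
show(disk, index, "greeting.txt")     # -> hello world
# The top level never needs to know how sectors are laid out; each layer can
# be swapped out as long as its interface stays the same.
[/pre]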
>>
bernie_sanders_og_shalom - Mon, 09 Oct 2017 14:34:51 EST ID:81pruxy+ No.121372
1507574091989.jpg -(98069B / 95.77KB, 960x720)
you can copy and paste this whole hierarchy 9,999,999 times and it will be exactly the same (mass production/automation)

