Science Forums

An Experiment Never Tried



Hello there, all you Hypographers! I have excellent news to deliver, as well as some requests, so let us begin. Here is where my idea stands and what is left to be done:

 

The basic concept is derived from "Unlimited Detail" technology. Here is the video if you are not familiar with the software -

If you want more detail on the software or company, click on Quipster's channel for videos, or go here for their website - http://www.euclideon.com/home.html

 

After you've watched that, or if you already understand the "infinite" data idea and have realized its complexity and the thousands of possible facets for its use, I want you to consider one use alone:

 

An entire recreation of an organic life-form on the cellular level. It would be the first entirely digitized organic being.

 

Now, as to what this actually entails and what the process of achieving it requires: a team of scientists from various backgrounds and many programmers of varying specialties would get hold of this graphics engine, adjust the physics properties to be as near to true as possible, and program data for about 118 objects (enough for a basic periodic table), using individual points as the objects, thus setting up a system of atoms.
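The atoms-as-preset-objects idea above could be sketched in code. Here is a minimal Python illustration; all names and property values are hypothetical placeholders, not any actual engine's API:

```python
from dataclasses import dataclass

# Hypothetical sketch: each of the ~118 elements becomes a lightweight
# template object carrying the preset stats the engine would need.
@dataclass(frozen=True)
class Element:
    symbol: str
    atomic_number: int
    mass: float    # atomic mass in u, used as the object's "weight"
    radius: float  # effective radius in pm, the invisible-barrier size

# A tiny fragment of the periodic table; a real table would cover all 118.
PERIODIC_TABLE = {
    "H": Element("H", 1, 1.008, 53.0),
    "O": Element("O", 8, 15.999, 48.0),
    "C": Element("C", 6, 12.011, 67.0),
}

def spawn_atom(symbol, position):
    """Place one point-object "atom" in the simulated world."""
    return {"element": PERIODIC_TABLE[symbol], "pos": position}
```

Each spawned point then stores only a reference to its element template plus a position, which keeps the per-atom data small.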

 

We can then model structures of atoms to represent chemicals. Chemicals can have bonding actions and other properties placed on them, like chain events in video games, and specifically programmed events can occur when they interact. From there we can input thousands of different types of chemicals into the program database. And from there, we can design one human embryo cell: setting up a chain of nucleotides linked by phosphate groups (I'm assuming you know I mean DNA), encasing it in a nuclear membrane made from the compounds and chemicals we've previously assembled, and building the rest of the cell in a similar fashion. This will be no short task, as it will be a scale model (in the sense that each point is an atom, and there are billions of atoms in a cell of this nature). Not to mention we will need an entire human genome to code for (a real-life individual will need to be chosen as the basis, for practical and ethical reasons).

 

After an entire human cell is synthesized in this program, we can set up essential proteins, vitamins, and other chemicals to spawn at given intervals, in specified quantities, to provide the proper nutrients for the cell to function normally. We can place the cell and essential nutrients in an H2O environment and let the program run just like a video game, with the dynamic environment unfolding based on the events and actions it is programmed to perform. In theory, with basic properties of atoms in play, as well as a more life-like physics engine (the gravity specs will be oriented toward real-life weight), the "cell" should act like a normal cell in development: taking in surrounding objects for processing and growth, as well as carrying out other basic and complex cellular tasks.
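The spawn-at-intervals idea reads like a standard game loop. A toy Python sketch, where the interval, quantity, and "glucose" nutrient are made-up examples:

```python
import random

def run_simulation(ticks, interval, quantity):
    """Spawn `quantity` nutrients every `interval` ticks, like a game loop."""
    environment = []  # nutrient objects floating in the H2O environment
    history = []
    for t in range(ticks):
        if t % interval == 0:
            for _ in range(quantity):
                pos = (random.random(), random.random(), random.random())
                environment.append({"kind": "glucose", "pos": pos})
        # ...cell update, diffusion, and collision handling would go here...
        history.append(len(environment))
    return history
```

Because the loop is ordinary code, pausing, stepping, or rewinding (by replaying from a saved state) falls out naturally.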

 

The process of synthesizing a cell will be facilitated by the program, in that it is designed around video-game mechanics, and particle physics and quantum mechanics will not necessarily be factors, due to the object-oriented nature of these "atoms". Each different object, or atom, will have a specific weight and will react with each other object on a preset scale of probability, differentiated for each object (each atom will have specific weight and reactivity stats relative to every other atom, as well as a limit on how close atoms can get based on their theoretical size; i.e., imaginary mass will be put into the game engine, creating an invisible-barrier effect which imitates the electron field of real atoms). This significantly reduces the amount of data needed for each atom and will allow a more realistic, almost natural environment to be simulated when the system is run.
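The preset pairwise reactivity and minimum-approach-distance stats could look something like this sketch; the probabilities and distances below are invented placeholders, not real chemistry:

```python
import random

# Hypothetical per-pair stats: reaction probability on collision, and a
# minimum approach distance imitating the electron-field "barrier".
REACTIVITY   = {("H", "O"): 0.8, ("H", "H"): 0.1, ("O", "O"): 0.05}
MIN_DISTANCE = {("H", "O"): 0.96, ("H", "H"): 0.74, ("O", "O"): 1.21}

def pair_key(a, b):
    """Look up a pair of element symbols in canonical order."""
    return tuple(sorted((a, b)))

def can_touch(a, b, distance):
    """The invisible-barrier effect: atoms cannot pass closer than this."""
    return distance >= MIN_DISTANCE[pair_key(a, b)]

def maybe_react(a, b, rng=random):
    """Roll against the preset probability when two atoms collide."""
    return rng.random() < REACTIVITY[pair_key(a, b)]
```

A table of preset stats per pair is exactly the kind of data reduction described above: no quantum mechanics, just lookups and a dice roll.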

 

NOTE: This will be done in computer software! Software much like a video game engine (or entirely like one, depending on perspective)! We can stop, rewind, fast-forward, edit, swap out, and copy and paste entire aspects of the environment, or the entire thing altogether! It will be like, or entirely, playing GOD. There are very serious ethical, moral, and political implications to carrying out a project like this. And as far as the general public is informed, no project or experiment of this kind has ever been performed, due both to the relative newness of the concept and to the implications listed above. However, the fact that we would have supreme and ultimate control over a situation like this means it would be a 100% controlled environment, down to the most minute details and factors. Not to mention that before we even ran the system initially, I would ensure checks were put in place, as well as limitations on the amount of human error that could initially go into it.

 

If a full human being developed in such a program, a copy could be made and run (with higher-level neural capabilities severed, to avoid cruelty) on which to test all kinds of different diseases and viruses, running millions of different chemical combinations to see if any of them cripple the ailment. Think of the possibilities: in such an environment, cancer, influenza, and AIDS could all be analyzed and fought a million times over in mere minutes to find a practical cure.

 

Henry Markram talks about a very similar project -

Something related could easily be obtained from my own idea, in that it would be very easy to watch, frame by frame, how a neural signal travels through the brain on a micro or macro scale. I'll talk more about this when I edit this later.

 

In an interview with Bruce Dell that CraigD led me onto, Mr. Dell states, "We are able to display pictures with no geometry limitations at 24-30 fps at 1024*600, on one core without any graphics hardware assistance, and we have only just begun to optimize, so we are hoping to double that without any hardware assistance." He says this while running trillions of points in the computing environment. This makes me feel much better about handling this information on a small scale than Markram's project does.

 

Since posting the original of this topic, IDMclean has informed me that, as the Henry Markram video suggests, it would take a lot of processing power to render such a computing environment. I agree with him, though Unlimited Detail is less processor-intensive than Henry Markram's system. Running my system would still require processing power beyond a normal home computer. Acquiring a supercomputer is not something I know how to go about, nor am I legally an adult yet, so I don't think I'm in a position to make any business deals with IBM. This is a major problem that will most likely keep me from being as involved with this project as I would like, assuming it actually goes anywhere. Any solutions or workarounds would be helpful. =P

 

A project of this size can be taken on with enough people, managed correctly. I am but one man, with a biology, chemistry, physics, and programming background I know to be lacking for sufficiently tackling this project, as well as lacking the time and current support to gather data and monitor a system of this size. I also do not have the necessary equipment at the current time, as several higher-end computers and most likely a server or two would be necessary in the long run.

 

The team personnel required for a project like this would be, at minimum:

 

One to Three Physicists - One specializing in particle physics, the others varied.

Several Biologists - Majoring or graduated in Cellular Biology, Neurobiology and Behavior, Human Physiology, Genetics, and possibly more.

Three to Five Chemists - The focus being Organic Biochemistry (I may be wrong, but I think that's all that's required from that field for this type of project).

Several Programmers (more than half a dozen) - With extensive knowledge of standard programming languages, as well as decent experience with game-design engines.

If there is anything else needed that I missed, please do inform me; it is quite late where I am. Aside from that, this is the minimum required for handling this project.

 

I will continue editing this post, as I am sure my erratic writing style has made me jump around and leave things out. If you feel this topic holds significant relevance to practical applications and would like to help, please tell me in a PM or in the comments, and I can start a sign-up sheet for people who can fill the fields on the base team.

 

I will be monitoring this every day. Also, I NEED YOUR HELP, Hypography community, to give me input on how to make this project actually happen. I am not opposed to the idea of having someone else run it; I just want to see the idea take off.

 

Please tell me if any of you are willing to help, and give any further feedback on this idea, as well as point out major flaws you may have found that I have not. Thank you; I'm editing this currently, so expect major changes. And please give feedback! =P


Welcome to hypography, Matthew! :) You’ve got some cool ideas. :thumbs_up

 

Here’re a few ideas of mine on the subject:

  • In essence, the idea of modeling individual atoms or even sub-atomic particles has been around since before electronic computers or modern particle physics were even thought of. It’s fairly clear that folk like Charles Babbage and Ada Lovelace were having these thoughts in the 1820s, and there’s some evidence (lots, if you credit Neal Stephenson’s Baroque Cycle speculations) that folk like Blaise Pascal and Gottfried Leibniz were in the mid-1600s, imagining that such computations might be done with complicated mechanical computers.
  • There’s a fundamental physical limit to the “resolution” that such an approach can achieve, due to quantum uncertainty. Thus, an exact numeric simulation of atoms or sub-atomic particles is not possible even in principle.
  • The serious idea that a human can be modeled in detail in a realistic way has a fairly long recent history (since about 1978), and is at the heart of controversies like Strong AI and New Mysterianism.
  • Making some simplifying assumptions to circumvent the preceding limitation – as I assume what folk like Bruce Dell and you, Matthew, are interested in is making better 3D graphics rendering programs (in Dell’s case) and a human in silico – we should estimate whether the computing resources necessary to do it currently exist, or if not, when they might.
  • Hans Moravec did some good work in the 1990s, leading to a pretty well-respected conclusion: based on his comparison of computer image recognition with small-animal (mostly insect) neurological image recognition, and an application of Moore’s law, he predicted that the computer hardware necessary to run a human-like intelligence in real time (using efficient programs, not simulations of individual cells or atoms) is likely to be available (and affordable) in the 2030s.

All this said, I think the project you have in mind would be a long one, working mainly on how to program the hardware that will be available about 20+ years from now. “Moravec’s law” only suggests that the needed hardware will be available in the near future, not that, without a lot of hard work between now and then, we’ll know how to program it!

 

I also suspect - just a guess, but one based on years of deep musing and hobby programming - that a “bottom up” approach – modeling individual cells or atoms – won’t be the approach to take. Rather, it’ll be necessary to “pick your scale” for the many subsystems required by the model, much as is necessary in present-day video games. This is both heartening and disheartening, as such programming is now, and is I think likely to remain for a decade or three, more intuitive art than reducible science, and while I love the art of (top down) programming, I feel most comfortable when it reduces to a small bag of widely applicable (bottom up) techniques.

 

On the subject of "Unlimited Detail" technology: it seems to me to be both a commercial venture and a philosophical embracing of point-based volume modeling with a rejection of polygonal modeling. I’m a big fan of using points for non-graphical simulations (most of the models of mine here at hypography are like this), and while I appreciate Dell’s argument that polygon modeling is more complicated and problem-prone (from this interview), I don’t think there’s much fundamental computational difference between the two for graphics rendering. To tie the processing requirements to the display resolution rather than to what’s being modeled, I assume the UD approach is a kind of eye-based (AKA reverse) ray tracing. With polygons, the basic equation solved for each ray is the intersection of a line with the plane of a polygon, followed by a check of whether the hit lies within the polygon’s edges. With points, it’s the intersection of a line and a sphere centered at a point. These calculations have about the same computational cost.
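For concreteness, here is what the two per-ray tests look like in a small Python sketch; both amount to a handful of multiplies and adds per ray:

```python
def ray_plane(origin, direction, normal, d):
    """Solve (origin + t*direction) . normal = d for t (the polygon case)."""
    denom = sum(dc * n for dc, n in zip(direction, normal))
    if abs(denom) < 1e-12:
        return None  # ray parallel to the plane
    t = (d - sum(o * n for o, n in zip(origin, normal))) / denom
    return t if t >= 0 else None

def ray_sphere(origin, direction, center, radius):
    """Solve |origin + t*direction - center|^2 = radius^2 (the point case)."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(dc * dc for dc in direction)
    b = 2 * sum(o * dc for o, dc in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - disc ** 0.5) / (2 * a)  # nearer of the two roots
    return t if t >= 0 else None
```

The polygon path still needs an inside-the-edges test after the plane hit, but per ray both are constant-time, which is why the rendering costs come out about the same.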

 

I don’t think there’s much connection between very fine scale – molecules, atoms, or finer – and Dell’s UD venture. One’s a physical modeling approach, the other an approach for rendering surfaces. Dell’s interested in rendering representations of physical things, not modeling the processes that can generate those representations.


I feel like my edits have sufficiently answered your retorts, but if you feel reading it again is tl;dr, I suggest you watch more of the software videos and re-read your article interviewing Dell, then compare it to Markram's project, and then tell me it can't be done for another decade.

 

I understand that you believe the virtual memory will max out at a given point, as the environment becomes larger and larger and eventually too hard to process, and I agree with you. But Markram needed a whole processor for each neuron (one per cell), and he's running a much more complex and processor-intensive system than I am, and he was able to amass thousands of processors on supercomputers. A home computer without a graphics card can run multiple cells with the Unlimited Detail software. I am fairly certain the human cell inside this game engine would develop into a full fetus before the number of cells and processor requirements exceeded that of fields of supercomputers. Even if the embryo only reached 20,000 cells, the amount of human physiological and biological data (and the detail of the information acquired) would be beneficial on a scale so far unreached in human society. Just at 20,000 cells, we'd be able to record in excruciating detail the whole process of fetal growth. The amount of data we'd get on organ and tissue development would be unmatched.

 

Even if it stayed at a single cell, we'd be able to monitor that cell at a level of detail unmatched so far. We'd be able to record things about cellular biology that are not yet understood. There are a plethora of minute details of DNA replication still unknown that would be solved.

 

I think ignoring this project until 'technology advances' is (pardon my French) a stupid and unimaginative idea. The fact that this can be done TODAY, not ten years from now, and the fact that other people haven't jumped on it, gives us a great deal of power if there's enough support to launch the project. Imagine being the first person to do this; even if it never goes beyond a single cell, it's still the first entirely virtual organic life form ever created. Not to mention it would technically be a human life form by genetic definition.


This experiment seems like it would take millions (more likely billions) of dollars to make a success. What would be the practical application needed to justify not only the huge expense, but the massive number of man-hours of work, research, development, and testing that will surely be needed to make it happen? Is there an actual benefit to this, or is it playing God for playing God's sake?


This experiment seems like it would take millions (more likely billions) of dollars to make a success.

 

It would only cost the initial payment for the purchase of the game engine itself, plus possibly a few servers (which could easily be obtained through sponsors). The only other fees involved would be employee pay. This project is entirely feasible, and I'm willing to pay for the software myself if there is enough interest to launch this project.

 

 

What would be the practical application needed to justify not only the huge expense, but the massive number of man-hours of work, research, development, and testing that will surely be needed to make it happen? Is there an actual benefit to this, or is it playing God for playing God's sake?

 

Man-hours of work, research, development, and testing? Re-read the OP and watch more tutorial videos on the software. If it's not apparent that the initial cell is all we need, and that the programming/development work for it would only take 3-5 months, then I'm not sure you understand this idea. Not to mention the practicality of it: the benefit of having a perfect research tool for all medicine and science to come, being able to test things on a human being thousands of times over without actually harming a real human being.

 

Another practical use is the advancement of the human race into a society where technology is used in a more virtual way (in terms of medicine and science). If the cell developed into a full man (as should happen when the basic mechanics of the engine kick in), the world would have a perfect cadaver, a perfect physiology database, a perfect anatomy reference. We'd be able to test millions of "medicines" and chemicals on isolated or large-scale areas in a day by running them through the engine. Cures would be found virtually before they were found in real life. Cancer, if a cure is possible (and I highly suspect it is, and has already been found), would be solved, along with many other diseases and potentially "incurable" ailments our current society deals with.

 

The amount of practical medicine produced from such a project, along with extensive research into the human body on things we have never been able to understand, would produce money greater than any current stand-alone pharmaceutical company. Do you realize how many millions, billions even, of dollars are poured into cancer research alone every year? Cancer is just one of a plethora of things that could be solved by an idea of this sort. The revenue from it would also greatly outweigh any initial costs.

 

It is in no way "playing" God; I used that metaphor as a creative illustration of the amount of power and control we would have over the testing environment. If an entire human being were created in such a system, with the exact same intellectual capabilities as any other standard human (as is expected if cellular growth and biological development occur naturally), then he would be treated like a human being; we could name him Joshua and allow him to explore his virtual world like any real place by setting up structures for him to live in. There is no ethical negativity arising from this. Any biological or medicinal testing where harm is involved would be done on a "copy-and-pasted" portion of his body run on a separate server. The original subject would never be tested on, only studied.

 

This is a once-in-a-millennium thing, with time quickly closing in on it before a major company steals it altogether. In fact, the way this has unfolded, it seems like more of a waste unused than a waste started. I want you to explain to me, Atom, how a concept like this can be justified in not being used at all?


I feel like my edits have sufficiently answered your retorts, but if you feel reading it again is tl;dr, I suggest you watch more of the software videos and re-read your article interviewing Dell, then compare it to Markram's project, and then tell me it can't be done for another decade.

I think you’re confusing a few very different kinds of software, Matthew: graphics rendering; neuron/brain modeling; and cell modeling.

 

“Unlimited Detail” is proposed graphics rendering software that allegedly uses points rather than polygons as its fundamental units. In the video you posted, Bruce Dell, CEO and founder of Euclideon, presents some sensible discussion of how graphics engines might reach “unlimited detail” – a bit of a misnomer, meaning not that there are no limits to the amount of detail stored, but that typical 3D video game graphics can be rendered without unrealistic artifacts detectable by a human being, much the way 32-bit and higher color, though it doesn’t store unlimited color data, appears to a human able to produce every possible color – along with a strong pitch that his company is close to having succeeded in writing this software.

 

Note that I say “allegedly” because no trusted 3rd party has actually seen Euclideon’s code or a demonstration of it, while many accomplished graphics programmers have accused Dell of engaging in a scam to get public grant and private investor money.

 

Point-based (AKA point cloud or voxel) graphics rendering is an old technology. What allegedly distinguishes the Unlimited Detail engine from previous ones is, to quote Dell, "We made a search algorithm, but it's a search algorithm that finds points, so it can quickly grab just one atom for every point on the screen." Note that Dell uses the term “atom”, in place of more common terms like voxel, to mean a stored graphical point or volume, not the simulation of an actual atom (from the periodic table, etc.).
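A toy illustration of the “one atom for every point on the screen” idea (this is just a brute-force point z-buffer, not Euclideon’s actual search algorithm):

```python
def render_points(points, width, height):
    """points: (x, y, depth, color) tuples already projected to screen.
    Keep only the nearest point per pixel, like a z-buffer of points."""
    depth = [[float("inf")] * width for _ in range(height)]
    image = [[None] * width for _ in range(height)]
    for x, y, z, color in points:
        if 0 <= x < width and 0 <= y < height and z < depth[y][x]:
            depth[y][x] = z
            image[y][x] = color
    return image
```

Dell’s claimed contribution is doing this selection without touching every stored point, presumably by descending a spatial index such as an octree.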

 

See today’s article Euclideon Creator Swears Infinite Detail is "Not a Hoax" for a pretty neutral article on Unlimited Detail, and this 2 Aug 2011 article by Markus Persson for a good argument that “it’s a scam”.

 

The Blue Brain project is a respected academic project headed by Henry Markram, part of a family of wide-reaching neurology research programs that started around 1990, with the goal of modeling parts, and eventually all, of a human brain with a computer program. The “blue” in its name comes from its use of an IBM Blue Gene computer.

 

In his TED lecture, Markram speculates that a computer simulation of a complete human brain – and thus, a human personality – may be possible in about 10 years. This is about 10 years earlier than Moravec predicted in 1999, but I believe it may be correct, as it’s based on more experience than Moravec had in ’99. However, I believe Markram would agree that his estimate is an optimistic one.

 

See the Wikipedia article neuroinformatics for more information on this field, and this 18 Apr 2011 report for a thorough non-technical summary of the Blue Brain and Human Brain projects. A summary of, and links to, websites about simulating the behavior of part or all of one or more biological cells, using either simulations of their component molecules or empirically tuned mathematical models, can be found at the Wikipedia article cellular model.

 

This experiment seems like it would take millions (more likely billions) of dollars to make a success. What would be the practical application needed to justify not only the huge expense, but the massive number of man-hours of work, research, development, and testing that will surely be needed to make it happen? Is there an actual benefit to this, or is it playing God for playing God's sake?

Being able to model cells has great potential benefits, as it would greatly increase our understanding of the cellular processes that result in disease, especially cancer. IMHO, bioinformatics is no more “playing God” than any other medical research has been. Medicine is on the order of a trillion-US-dollars-a-year industry, justifying research spending on the order of tens of billions of dollars a year.

 

Present-day bioinformatics computer modeling is limited and moderately expensive. A Blue Gene computer can cost from $800,000 to over $100,000,000, depending on the generation and number of racks of processors in it. A rack contains 32 CPUs (by comparison, a Sony PlayStation 3 contains 7). The Blue Brain Project’s 512-rack Blue Gene/P would, purchased new, cost about $650,000,000, but as it served as a kind of beta test for the Blue Gene/P, the BBP got theirs at a substantial discount.

 

Projecting the cost and computing power into the future, however, Moore’s law suggests that processing power per dollar will double about every 2 years. So a Blue Brain would cost only about $20,000,000 in 2022, $650,000 in 2032, and $20,000 in 2042.
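That projection is just repeated halving. A quick Python check (assuming a 2012 baseline for the ~$650,000,000 figure):

```python
def projected_cost(base_cost, base_year, target_year, doubling_years=2):
    """Halve the price of fixed computing power every `doubling_years`."""
    halvings = (target_year - base_year) / doubling_years
    return base_cost / (2 ** halvings)

for year in (2022, 2032, 2042):
    print(year, round(projected_cost(650_000_000, 2012, year)))
```

Ten years is five halvings, so 650,000,000 / 32 ≈ 20,000,000, and so on down the list.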


  • 3 weeks later...

Craig, it's just a voxel engine; it uses voxels instead of polygons, and there are a few in existence.

 

It's not, however, entirely unlimited detail; though the name sounds cool, it's limited to the data you have provided it. To build even a simple drop of water at the atomic level would require an immense amount of data, and of course, since it's me, I'll show you just how much data we're talking about:

 

Let's say a drop of water weighs 0.05 g. The molecular weight of water is 18.015 g/mol, so [math]\frac{0.05\,\mathrm{g}}{18.015\,\mathrm{g/mol}} \approx 0.0027755\,\mathrm{mol}[/math], and now, using Avogadro's constant, [math]6.02214179\times10^{23}\,\mathrm{\frac{molecules}{mol}} \times 0.0027755\,\mathrm{mol} \approx 1.6714\times10^{21}\,\mathrm{molecules}[/math]

 

So if we wanted to model one drop of water at the atomic level, we would need to model about 1.67×10^21 molecules of water, each consisting of 1 oxygen and 2 hydrogen atoms, and don't forget the 10 electrons, meaning that to keep track of things at the atomic level you will need to track 13 points per molecule, or about 2.17×10^22 points of data.

 

Now let's assume that we are keeping a charge for each electron (a boolean).

Let's also assume that we are keeping atomic weight (a double), electron configuration (a linked list of ints), and electrons per shell (a linked list of ints).

 

On top of this, we are storing it in an octree

 

so roughly:

 

An electron just has a bool.

An atom additionally has a double and 2 linked lists, each with 2 ints of overhead and 2 ints per node.

The octree will have, let's say, 2 ints as overhead plus 8 ints as pointers to its 8 branches, so 8 ints × n + 2.

 

electrons - 1 boolean each, 10 per molecule = 1.67×10^22 × 1 byte ≈ 1.67×10^22 bytes

hydrogen - 1 double, 1 list with 1 piece of data (4 ints), and another list with 1 piece of data (4 ints), 2 per molecule = 3.34×10^21 × (8 bytes + 16 bytes + 16 bytes) ≈ 1.34×10^23 bytes

oxygen - 1 double, 1 list with 2 pieces of data (6 ints), another list with 3 pieces of data (8 ints) = 1.67×10^21 × (8 bytes + 24 bytes + 32 bytes) ≈ 1.07×10^23 bytes

the octree is 8 ints × 4 bytes × 1.67×10^21 molecules + 2 bytes ≈ 5.35×10^22 bytes

so the total cheapest estimate is: 1.67×10^22 + 1.34×10^23 + 1.07×10^23 + 5.35×10^22 ≈ 3.11×10^23 bytes ≈ 311,000 EB (that's exabytes)

 

Let's keep one shared reference each to the electron, hydrogen, and oxygen data types to optimize the data a little, so the octree node for a molecule totals 9 ints of data plus a little overhead:

9 ints × 4 bytes × 1.67×10^21 molecules + a few dozen bytes of shared type data ≈ 6.0×10^22 bytes ≈ 60,000 EB
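As a sanity check, the whole estimate can be recomputed in a few lines of Python (using the same assumed per-item byte sizes; note the molecule count works out to ~1.67×10^21):

```python
AVOGADRO = 6.02214179e23
moles = 0.05 / 18.015          # one 0.05 g drop of water
molecules = AVOGADRO * moles   # ~1.67e21 molecules

ELECTRON = 1                   # 1 boolean
HYDROGEN = 8 + 16 + 16         # double + two 4-int linked lists
OXYGEN   = 8 + 24 + 32         # double + 6-int and 8-int linked lists
OCTREE   = 8 * 4               # 8 int pointers per molecule node

per_molecule = 10 * ELECTRON + 2 * HYDROGEN + OXYGEN + OCTREE
total = molecules * per_molecule

print(f"{molecules:.3g} molecules, {total / 1e18:,.0f} EB")
```

The per-molecule cost is only 186 bytes; it is the ~10^21 molecules that push the total into the hundreds of thousands of exabytes.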

 

Which is to say that billions of dollars' worth of equipment won't come close to covering the storage costs of modeling a simple drop of water at the atomic level, never mind subatomic, or anything bigger and more complex...


  • 7 months later...
If there is anything else needed that I missed, please do inform me; it is quite late where I am. Aside from that, this is the minimum required for handling this project.

 

 

You will need a facial-mapping expert to handle it at the graphics level, and a modeling expert. Your models, as in gaming, will require "rigging", which gives the animation routines handles to twist, warp, and otherwise auto-generate surface patches on the fly. In short, you need a computer animator above all the rest of those

 

UNLESS

 

You plan to address this from the genetic level. Then you need all the others AND, still above all else, the animator. If you think like a mathematician in creative efforts, you will get pixels. If you think like Walt Disney, you will get a character.

 

 

I would compare this to the movie "The Fly", with Jeff Goldblum as Seth Brundle. Brundle could transmit inanimate objects, but the flesh? He had to think "crazy", "like those old ladies pinching babies."

 

I saw in, I think, Alexander's signature: "The sun causes Global Warming." If I'm interpreting that meaning correctly, it comes from thinking not so much against the flow as outside the box. Outside-the-box thinking involves playing God. To teach a computer to create a lifeform, one needs to think like the creator of all lifeforms.

Edited by 7DSUSYstrings
