mkline55

2013-Apr-04, 07:11 PM

I have been a programmer for a number of years, and only recently delved into physics and astronomy. I was surprised to see this April 3 article (http://www.space.com/20484-how-spiral-galaxies-evolve.html) from space.com, which discusses some results from a supercomputer simulation where "scientists used powerful software to model the formation of stand-alone disk galaxies and follow up to 100 million hypothetical stellar particles being tugged at by gravity and other astrophysical forces."

Why the surprise? In my spare time over the previous week (just prior to the article release) I had written a PC-based simulation of my own to analyze galactic structure and formation, and this article sounded like a very similar model, though much further developed, and probably far better-funded. Granted, I have only scaled my model up to 8 million particle points, but the operation is entirely scalable, limited only by the memory of my PC and limits of the programming language (VBA), plus a degree of patience.

My initial model used only mass and location to analyze gravitational intensity/acceleration, calculating the effect of every particle in a sphere of uniform density on various locations inside and outside the sphere. I set up the analog sphere as a 3-D array representing the three spatial dimensions, with the content of each element being the 'mass' of the particle at that location.
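In rough sketch form, that first phase looks something like this. My actual code is VBA; this is a minimal Python equivalent, and the names (build_sphere, accel_at) and the tiny softening term are my own inventions for illustration, not the real code:

```python
import math

def build_sphere(d):
    """3-D array: element = 1.0 ('mass') if the cell lies inside the sphere, else 0.0."""
    c = (d - 1) / 2.0          # grid centre
    r2 = (d / 2.0) ** 2        # squared radius
    return [[[1.0 if (x - c)**2 + (y - c)**2 + (z - c)**2 <= r2 else 0.0
              for z in range(d)]
             for y in range(d)]
            for x in range(d)]

def accel_at(grid, px, py, pz, G=1.0, soft=1e-9):
    """Sum the gravitational acceleration at (px, py, pz) from every non-empty cell."""
    ax = ay = az = 0.0
    n = len(grid)
    for x in range(n):
        for y in range(n):
            for z in range(n):
                m = grid[x][y][z]
                if m == 0.0:
                    continue
                dx, dy, dz = x - px, y - py, z - pz
                r2 = dx*dx + dy*dy + dz*dz + soft  # softening avoids divide-by-zero
                inv_r3 = 1.0 / (r2 * math.sqrt(r2))
                ax += G * m * dx * inv_r3
                ay += G * m * dy * inv_r3
                az += G * m * dz * inv_r3
    return ax, ay, az
```

By symmetry, the acceleration at the exact centre of the sphere should come out to (nearly) zero, and a point outside on the +x axis should feel a pull back toward -x; those two checks catch most indexing bugs.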

The reason I used a sphere was so I could validate my math and logic: gravitational effects for a uniform sphere conform to some fairly simple formulas. Testing amounted to running my particle-by-particle calculations, adding the results, and comparing the total to the simple formula. After a couple of corrections, the results of my calculations matched the formulas.
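For anyone curious, the "simple formulas" are the shell-theorem results: outside a uniform sphere the field is G·M/r², and inside it is G·M·r/R³ (only the interior mass matters). Here is the kind of check I mean, as a self-contained Python sketch (mine is VBA; all names here are invented, and the interior test point is deliberately off-lattice so no particle sits exactly at the field point):

```python
import math

def lattice_sphere(R):
    """Unit point masses at every integer lattice site inside radius R."""
    return [(x, y, z)
            for x in range(-R, R + 1)
            for y in range(-R, R + 1)
            for z in range(-R, R + 1)
            if x*x + y*y + z*z <= R*R]

def g_brute(cells, d):
    """Magnitude of the x-acceleration at (d, 0, 0), G = 1, by particle-by-particle sum."""
    ax = 0.0
    for x, y, z in cells:
        dx, dy, dz = x - d, y, z
        r2 = dx*dx + dy*dy + dz*dz
        ax += dx / (r2 * math.sqrt(r2))
    return abs(ax)

R = 10
cells = lattice_sphere(R)
M = float(len(cells))

d_out = 30.0   # outside: formula predicts M / d^2
d_in = 5.5     # inside:  formula predicts M * d / R^3

print(g_brute(cells, d_out), M / d_out**2)
print(g_brute(cells, d_in), M * d_in / R**3)
```

The brute-force sums won't match the continuum formulas exactly (the lattice is discrete), but outside they agree to within a percent or two, and inside to within a few percent, which is good enough to catch sign and indexing errors.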

The second phase of the model squashed the sphere into a flat disk shape, so that density was highest at the center, and dropped off to zero at the edges. I could then calculate the net acceleration due to gravity at any location, but I chose to work just in the plane of the disk. I am still puzzling over the results. They are quite different from the sphere model.

I tested the model using a sphere diameter of 40, requiring a matrix of 64,000 points (40 X 40 X 40). I later expanded the model to a diameter of 200, or 8,000,000 points and a sampling of 500 distances, making for over 4 billion calculations. It's amazing what a PC can accomplish while I sleep.

The newest version is an attempt to add a directional (velocity) vector to each particle, so the model can generate meaningful time-lapse pictures of the motion, or simply show what things look like after a longer period of time.

I'll go back to the spherical model as a starting point. Instead of simply calculating the acceleration at various locations, I want to calculate the effect of every particle on every other particle. A million particles means a million times a million (10^12) calculations for each time step - maybe too large for a PC to handle in a reasonable time frame. The 64,000-particle model will do for starters; 4 billion (64K * 64K) calculations per step is somewhat more manageable.
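That every-particle-on-every-particle loop is the classic O(N²) direct-sum N-body step, and it pairs naturally with the velocity vectors from the previous version. A Python sketch of one time step (again, my model is VBA; the function name, the softening length, and the simple velocity-then-position update are my own illustrative choices):

```python
import math

def step(pos, vel, mass, dt, G=1.0, soft=0.05):
    """One time step: accumulate every particle's pull on every other (N^2 pairs),
    then update velocity first and position with the new velocity."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            dz = pos[j][2] - pos[i][2]
            r2 = dx*dx + dy*dy + dz*dz + soft*soft  # softened distance, no blow-ups
            f = G * mass[j] / (r2 * math.sqrt(r2))
            acc[i][0] += f * dx
            acc[i][1] += f * dy
            acc[i][2] += f * dz
    for i in range(n):
        for k in range(3):
            vel[i][k] += acc[i][k] * dt
            pos[i][k] += vel[i][k] * dt
    return pos, vel
```

A quick sanity check on a two-body case: equal masses should accelerate toward each other with equal and opposite velocities, so total momentum stays zero.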

I've considered whether I should change the 3-D array to a 4-D array with time as the fourth dimension, but I really only need to know how the current matrix maps onto the next. Once the next generation is built, I no longer have any use for the previous matrix. Funny how much that really works like time.
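In programming terms that current-to-next mapping is just double-buffering: keep two generations, never the whole history. The pattern is tiny (Python sketch, with a toy 1-D smoothing rule standing in for the real gravity update - both names invented):

```python
def run(grid, advance, steps):
    """Two-buffer time stepping: build 'next' from 'current', then let the
    old generation go. No 4-D time axis needed."""
    for _ in range(steps):
        nxt = advance(grid)   # next generation, built from the current one
        grid = nxt            # previous generation is now discarded
    return grid

def smooth(g):
    """Toy update rule: periodic 3-point average (stands in for the gravity step)."""
    n = len(g)
    return [(g[(i - 1) % n] + g[i] + g[(i + 1) % n]) / 3.0 for i in range(n)]
```

A nice property of a rule like this is that it conserves the total "mass," which makes a cheap regression test between generations.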

I'll also want to add collisions into the mix, which could have varying levels of stickiness depending on velocities and other factors. I'm not certain whether charge would be significant as well, but I'll keep it in mind.

The characteristics of each particle are limited only by creativity, but each new per-particle variable multiplies the work: with eight million particles, every sampled location already requires eight million calculations, and with four variables the results take four times as long to process.

I'm still working on the display feature; right now I plan to use color and intensity to show visually what is where. Since it's a 3-D data model on a 2-D display, it tends to look washed out because each pixel represents many particles. Any helpful suggestions would be appreciated - maybe show depth via a line of pixels, or a hexagon. Also, since I'm thinking about it now, I need to add a save-and-restart capability, as I may want to run the model for as long as weeks at a time. Ever had that disappointment of a glitch after hours of processing time, knowing you have to start all over from the beginning?
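One trick I'm considering for the washed-out look: compress the per-pixel particle counts logarithmically before mapping them to intensity, so the dense core doesn't saturate everything else while sparse regions stay visible. A Python sketch, using ASCII shades in place of real pixels (every name here is made up for illustration):

```python
import math

def project(points, size):
    """Collapse 3-D points onto the x-y plane, log-compressing the counts so
    a pixel holding thousands of particles doesn't drown out one holding a few."""
    counts = [[0] * size for _ in range(size)]
    for x, y, z in points:
        counts[int(y)][int(x)] += 1   # z is flattened away
    peak = max(max(row) for row in counts) or 1
    shades = " .:-=+*#%@"             # dark-to-bright intensity ramp
    return ["".join(
        shades[int(math.log1p(c) / math.log1p(peak) * (len(shades) - 1))]
        for c in row) for row in counts]
```

With linear scaling, a cell holding 1 particle next to one holding 100 is invisible; with log scaling it still registers as a faint dot.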

Other future enhancements I'd like to try involve an inflow of particles, perhaps somewhat off-center to the galactic sphere. Or how about two passing streams of particles on non-parallel trajectories, with no initial galactic sphere - could that generate a spinning galaxy at the vortex? Time - and patience - will tell.
