From this thread:
This came up in a past discussion, and I think I gave a presentation about radiation at a past Mars Society convention. It started when one woman made the observation that neutron spectrometers on Mars Odyssey measured neutrons coming up from Mars. That means neutron radiation: interaction of radiation from the solar wind with Mars regolith generates neutron radiation. So through contacts I got a student at MIT to estimate the dose. The result was 0.5 REM per year (just neutrons).
Something we don't normally consider is that the soil itself on Mars might be radioactive. I don't know enough about the radiation environment on Mars to say, but it seems to me there may be a significant neutron flux, which would make it quite difficult to make areas that are truly immune to radiation (if you're building enclosures from radioactive steel and radioactive bricks, your enclosures will be radioactive). It is fortunately unlikely that the air or water will become radioactive, because Hydrogen, Carbon, Nitrogen, and Oxygen each have a couple of stable isotopes between them and radioactivity. Argon will decay in a short amount of time to Potassium, so that's not an issue.
RobertDyck, do you have more information on the matter, particularly on the actual neutron flux in neutrons/(m^2*year)? This is potentially a major issue from a radiation standpoint: incoming radiation can be blocked, but if the very material around you is "hot", that's much harder to deal with.
-Josh
I have a set of papers published by the team for the MARIE instrument on Mars Odyssey. I saved a copy on the local chapter website.
Radiation Climate Map
Mars Flux Paper
FC-Nara-Paper
PS-Nara-Paper
The first two papers have the same charts at the end. The charts give numbers. With total radiation in the range of 20 - 24 REM/year for any place we would want to go, a result of 0.5 REM/year from neutrons is relatively small. US nuclear reactor workers are allowed 5 REM/year, and radiation from a reactor will be almost exclusively neutron.
There is a paper from Curiosity, but I don't have a current subscription to the journal Science.
Mars’ Surface Radiation Environment Measured with the Mars Science Laboratory’s Curiosity Rover
My question is more regarding residual radiation, e.g. neutrons absorbed by the regolith and thus transmuting it to radioactive material. For example, if Calcium-40 (the most common isotope of Calcium) absorbs a neutron, it turns into Ca-41, which is slightly radioactive (half life ~10^5 years) and would contribute to radiation within the habitat. I'm just curious about what the neutron flux is, because the amount of these radioisotopes is determined primarily by that. This, in turn, has very significant effects: Calcium, for example, becomes part of your bones. Will our colonists be at great risk of contracting leukemia because the calcium in their food is radioactive?
I don't know, and that's why I'm interested in Neutron counts rather than the immediate radiation dose.
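To make the "amount of radioisotope is determined by the flux" point concrete, here's a back-of-envelope sketch. Every number in it is my own assumption for illustration (an assumed flux of 10 n/cm^2/s, a ~0.4 barn thermal capture cross-section for Ca-40, and a rough atom count for calcium-rich regolith), not a measurement from this thread:

```python
import math

# Back-of-envelope: Ca-41 activity built up in regolith by a neutron flux.
# All three inputs below are assumptions chosen for illustration only.
PHI = 10.0         # neutrons/(cm^2*s), assumed surface flux
SIGMA = 0.4e-24    # cm^2 (~0.4 barn), thermal capture on Ca-40
N_CA40 = 1e22      # Ca-40 atoms per gram of calcium-rich regolith (rough)

T_HALF_S = 1.03e5 * 3.156e7    # Ca-41 half-life (~10^5 yr) in seconds
LAM = math.log(2) / T_HALF_S   # decay constant, 1/s

def ca41_activity(t_seconds):
    """Induced Ca-41 activity (Bq per gram) after irradiation time t."""
    n41 = PHI * SIGMA * N_CA40 / LAM * (1.0 - math.exp(-LAM * t_seconds))
    return LAM * n41

# After ~10 half-lives of steady exposure the activity saturates at
# PHI*SIGMA*N_CA40, here on the order of 0.04 Bq/g -- small next to the
# few hundred Bq/kg typical soils carry naturally from K-40 and friends.
print(ca41_activity(1e6 * 3.156e7))
```

The takeaway of the sketch is that with a long half-life the activity saturates at flux times cross-section times target atoms, so the answer really does hinge on the neutron flux number.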
-Josh
By the way, I have access to the Curiosity paper through my university and I'd be glad to email it to you if you're interested. I believe you have my email?
One interesting table discussed the radiation under the surface. According to the paper, under 2 m of regolith the annual GCR dose will be 15 mSv, 1.5 Rem. While not trivial, this is well within acceptable limits. Although their models are more complex than this, I found the halving distance to be about .45 m at depths past the peak radiation level, which occurs .1 m below the surface. Overall, for depths greater than 10 cm, a good approximation is:
R(D)=R_0*1.5*.5^(D/.45)
Where R(D) is the radiation that would be experienced by a person under D meters of shielding and R_0 is the radiation that they would experience with no shielding. Using the density estimate from the Curiosity paper (2800 kg/m^3), and assuming a habitat pressure of 500 mb, the depth of shielding would have to be 4.8 m. At this depth, the annual radiation dose for a person who remained inside would probably be about .2 mSv, or .02 Rem. Compared to .3 Rem per year on Earth, this is negligible.
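For anyone who wants to check the arithmetic, here's the calculation in a few lines. The 4.8 m figure comes from requiring the regolith's weight (at Mars gravity, 3.71 m/s^2) to balance the 500 mb internal pressure; the only number I'm adding is an unshielded surface dose of 22 REM/yr, the midpoint of the 20 - 24 range quoted earlier:

```python
# Sketch of the shielding-depth calculation above.
G_MARS = 3.71      # m/s^2, Mars surface gravity
RHO = 2800.0       # kg/m^3, regolith density from the Curiosity paper
P_HAB = 500e2      # Pa (500 mb habitat pressure)

# Regolith depth whose weight just balances the internal pressure:
depth_m = P_HAB / (RHO * G_MARS)    # ~4.8 m

def dose_rem(depth, r0=22.0):
    """Annual dose (REM) under `depth` metres of regolith, for depths > 10 cm."""
    return r0 * 1.5 * 0.5 ** (depth / 0.45)

print(f"required depth: {depth_m:.1f} m")
print(f"annual dose at that depth: {dose_rem(depth_m):.3f} REM")
```

Running this reproduces the numbers in the post: a required depth just under 5 m, and a residual dose around .02 Rem per year.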
However, this calculation is meaningless if the regolith used for shielding is itself radioactive.
-Josh
I live in Rome, one of the most radioactive cities in the world: we get almost 0.2 REM/year, but the frequency of neoplasias is almost the same as in other, less radioactive cities. I suppose that 1.5 REM/yr for 500 days is less harmful than 0.5 REM/yr for 80-90 years.
Last edited by Quaoar (2014-04-11 02:51:10)
The average American is exposed to about .6 REM in any given year from a mix of environmental and occupational factors, and it doesn't seem to be affecting us too badly.
In any case, though, that's not what I'm worried about so much as how possible it really is to get radiation levels in the hab down to truly negligible levels, in order to allow relatively high occupational exposure from the greenhouse, from journeys outside in spacesuits, and from solar storms that we are not truly prepared for. In one mission this is not much of an issue, but for colonists who are in it for the long haul it becomes something pretty significant.
-Josh
The US 0.6 REM figure surprises me. I thought the US average background was closer to 0.3 to 0.4 REM, with about a third of that coming from coal plant plumes. It would be up to a factor of 10 higher at high-altitude locations in the presence of radioactive ores. Yet folks in Denver seem to be about as healthy as the rest of us. So, civilians seem to do mostly fine with lifetime-long annual exposures up to 3 full REM. Maybe more, if the 0.6 figure is better than mine.
I'd still set the standard a bit lower, nearer 1 REM max annual, but that 3 would also likely be more or less OK. Nobody really knows for sure. Astronauts are allowed 50 REM max annual, with a career limit that is age and gender dependent. They can only absorb that max a very few times in their lifetimes.
I'd think a standard somewhat like that astronaut standard would work for civilians going on colonization voyages, with a long-term domestic exposure in their new homes nearer 1 (or at most 3) REM annual. And BTW, unshielded GCR exposure in the vicinity of Earth varies sinusoidally with the solar activity cycle from a min of 24 REM annual to a max of 60 REM annual. That's actually just about the same as the astronaut's max allowed dose rate.
The only real risk on a voyage is being hit by an X-class coronal mass ejection, which is about like going outside in the max fallout after a nuclear bomb goes off on or near the surface (100's to 1000's of REM per hour, for several hours). Yet the energy is lower, and shielding is possible. 20+ cm of water is an effective shield for that.
What would the source be for radioactivity in Martian soil? Induced by GCR or solar wind/flare interaction? I know there can be induced radioactivity in Earthly dirt from near-, on-, or sub-surface atomic explosions, but that's the extreme case. It's where most bomb fallout comes from. Air bursts that do not pull dirt up through the fireball are much cleaner: just the bomb fragments.
Lesser sources like reactors, even reactor accidents, seem not to induce much of anything in the basic soil, they just contaminate it. That's different.
If Martian dirt really is radioactive, could that be a sign of abundant nuclear fuel ores there?
GW
Last edited by GW Johnson (2014-04-11 09:35:41)
GW Johnson
McGregor, Texas
"There is nothing as expensive as a dead crew, especially one dead from a bad management decision"
I was specifically referencing neutron radiation. If some amount of "reflected" neutron radiation was measured from orbit, that means there is a neutron flux at the surface, which could transmute elements into heavier, radioactive isotopes. Cosmic rays also have this potential; think spallation.
The existence of secondary radiation means that we can say for a fact that cosmic rays do something to the material they interact with, but that's mostly electrons, I think. Neutrons are much more likely to interact directly with nuclei.
If interaction cross sections are large, maybe only the uppermost level is radioactive, which has unfortunate consequences because there's all that fine dust blowing around everywhere.
If they're small, maybe Mars is radioactive all the way down to a few meters. In that case, it will be necessary to dig deeper than that to obtain most materials.
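The "large vs. small cross-section" question above comes down to the neutron mean free path in regolith. A rough estimate, using the Curiosity paper's density and two numbers I'm assuming myself (a ~22 g/mol mean atomic mass for silicate regolith and a ~3 barn total interaction cross-section per atom):

```python
# Rough neutron mean free path in regolith -- a sketch of why the depth of
# activation depends on the interaction cross-section. A_MEAN and SIGMA_TOT
# below are assumed values for illustration, not measurements.
AVOGADRO = 6.022e23
RHO_G_CM3 = 2.8      # g/cm^3, density from the Curiosity paper
A_MEAN = 22.0        # g/mol, assumed mean atomic mass of silicate regolith
SIGMA_TOT = 3e-24    # cm^2 (~3 barn), assumed total cross-section per atom

n_atoms = RHO_G_CM3 / A_MEAN * AVOGADRO    # atoms per cm^3
mfp_cm = 1.0 / (n_atoms * SIGMA_TOT)       # cm between interactions

print(f"mean free path: {mfp_cm:.1f} cm")
```

With those assumptions the mean free path works out to a few centimetres, i.e. neutrons scatter and get absorbed within the top several tens of centimetres rather than metres down, which would put us in the "only the uppermost layer" case.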
By the way, the .6 REM per year figure is about 50/50 natural and artificial (particularly career related) sources, which explains the discrepancy.
-Josh
Some solar radiation gets through the Martian atmosphere. Neutron radiation is produced as secondary radiation. That was the basis of instruments on Mars Odyssey that searched for water. The same instruments that Lunar Prospector used for the Moon. Hydrogen absorbs epithermal neutrons, so a dip in those emissions means the soil has hydrogen in some form. Large chunks of ice absorb high speed neutrons, so a dip in that detects large chunks of ice. There were no dips in high speed neutrons on the Moon, so no large chunks of ice, but there were dips in epithermal neutrons at the Lunar poles, so there is hydrogen there. It could be clay or gypsum or tar, or could be ice the size of a grain of salt.

But the reason it was raised here is the fear that neutron radiation will "infect" Mars soil. That is, cause radioactive isotopes through neutron absorption. There isn't much of that, or it would have been detected.
Oh, there are nuclear ores. Mars Odyssey mapped many things. One map was thorium. Thorium is considered an indicator for uranium, but thorium itself can be used as nuclear reactor fuel. Looks like Elysium Planitia has low thorium.
Last edited by RobertDyck (2014-04-11 22:34:32)
To be clear, I believe these results showed that even "high" thorium is parts-per-million.
I'm glad to hear that there's a reason to believe that things aren't particularly radioactive. It was a concern, but it's probably true that one or many of the sensors we've sent would recognize it.
-Josh
So there's some radioactivity in Martian dirt, it's around 0.5 REM annual, and seems to be neutron radiation. And we have no idea how deep it goes. That's not a killer dose, even for lives spent there, but it is rather inconvenient to build a radiation shelter out of slightly-radioactive material. I suppose therefore that determining how deep this goes is crucial data. Another reason to drill meters down....
GW
Most of that 0.5 REM/year is secondary radiation from radiation arriving from space. Only a tiny fraction is from radioactive dirt. And space radiation only penetrates 1 metre, according to the NASA team for the instrument on Mars Odyssey. But yeah, drilling is a good idea.
Were the lunar rocks returned by the Apollo missions appreciably radioactive? Do the space station modules suffer from high levels of induced radioactivity? From what I understood the problem with GCR was direct damage not induced radioactivity.