The optics of mirrors and lenses is settled physics; it is no longer changing at all. We have solutions that work for chromatic and spherical aberrations.
Digital camera pixel technology, being newer, improved dramatically every year for a while, but has since more-or-less leveled off. That improvement was driven by both optical cameras and staring-array infrared seekers.
That's pretty much how technology improvement behaves generally. It's not linear; it's more of an S-curve. There are fits-and-starts of minor improvement "bursts" along the way, as this-or-that supporting technology gets a significant improvement.
So, the technology for cameras has matured somewhere in the vicinity of 1-meter resolution at 600 miles (1000 km).
Now military technology usually "gets there" before civilian by a decade or two. I remember a security breach regarding a proposed pole-sitter Earth observation satellite in the late 1970s or early 1980s. Its camera was said to resolve 1 km objects from a million-km orbital altitude, and that ratio itself was the security breach at the time.
Do you see the same ratio? 1:1,000,000 in both cases. That's 1 microradian resolution for the very, very best systems, which are quite large and massive, even today.
That's the limit of resolution using the absolute best digital camera and optical focusing technology there is. You can do worse, but you cannot do better with anything we have or know about.
From a 300 km altitude orbit around Mars, the very best technology might resolve a 30 cm object as one pixel. One and only one pixel, not a recognizable image! A recognizable blur takes at least 10 pixels across. If the orbit is 600 km, one pixel is 60 cm, but only with the very best technology.
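To make the arithmetic explicit, here is a minimal sketch in Python of the ground-sample-distance calculation implied above; the 1-microradian figure and the 10-pixel rule of thumb are the ones quoted in this post, not independently sourced numbers.

```python
# Minimal sketch: ground sample distance (GSD) from angular resolution.
# Assumes the ~1 microradian figure quoted above for the very best systems.

ANGULAR_RESOLUTION_RAD = 1e-6  # 1 microradian (1 m resolved at 1000 km)

def ground_sample_distance(altitude_m: float) -> float:
    """Smallest object resolved as a single pixel, in meters."""
    return ANGULAR_RESOLUTION_RAD * altitude_m

for altitude_km in (300, 600, 1000):
    gsd = ground_sample_distance(altitude_km * 1000)
    # ~10 pixels across are needed before an object is even a recognizable blur
    recognizable = 10 * gsd
    print(f"{altitude_km} km orbit: 1 pixel ~ {gsd*100:.0f} cm, "
          f"recognizable blur needs an object ~ {recognizable:.1f} m across")
```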
And that best-technology package is STILL the size of the space shuttle cargo bay, because of the mature folded-optics technology (which dates from the late 1960s). Such a package was NOT sent to Mars. Not yet, anyway.
So, the notion of seeing "at best a 2-m boulder" from a 200-300 km orbit about Mars is actually quite realistic for the craft we have sent there, and it will be a blur, not a sharp image. And THAT is why you need to send the pathfinder rover I have described in post #68 above, before you try to land a BFR/BFS there.
Physics is just physics. It's the same everywhere for everyone.
GW
GW Johnson
McGregor, Texas
"There is nothing as expensive as a dead crew, especially one dead from a bad management decision"
GW -
I don't understand what you are saying in view of this article:
https://www.gizmodo.com.au/2016/04/new- … s-on-mars/
The article shows the digitally enhanced photos where you can easily make out objects well under 1 metre.
The Professor who produced the images states: "We now have the equivalent of drone-eye vision anywhere on the surface of Mars where there are enough clear repeat pictures. "
So, unless you are saying the Prof is spreading misinformation, I am not quite sure what you are trying to say.
Let's Go to Mars...Google on: Fast Track to Mars blogspot.com
Image dithering to clear up the blurring....
One thing is for sure: we will have images of a possible first landing site when InSight lands and starts its mission. Landing site: Elysium Planitia, Mars, on Nov 26th.
My understanding is it's averaging out differences between numerous images to create clear image boundaries.
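A crude way to picture that averaging idea is plain frame stacking: align several noisy exposures of the same scene and average them. The sketch below is only a toy version of the stacking step, assuming the frames are already aligned; the real super-resolution pipelines are far more elaborate, and the arrays here are made up for illustration.

```python
import numpy as np

# Toy sketch of the "average many frames" idea behind stacking.
# frames: list of 2-D arrays, already registered (aligned) to each other.
# Real super-resolution pipelines also exploit sub-pixel offsets between
# frames; this only shows why averaging suppresses noise.

def stack_frames(frames):
    """Average a list of aligned 2-D images to reduce per-pixel noise."""
    stack = np.stack(frames, axis=0).astype(np.float64)
    return stack.mean(axis=0)

# Made-up demo: a flat grey scene observed through heavy noise.
rng = np.random.default_rng(0)
truth = np.full((64, 64), 0.5)
frames = [truth + rng.normal(0, 0.2, truth.shape) for _ in range(25)]

single_err = np.abs(frames[0] - truth).mean()
stacked_err = np.abs(stack_frames(frames) - truth).mean()
print(f"mean error, single frame: {single_err:.3f}")
print(f"mean error, 25-frame stack: {stacked_err:.3f}")  # roughly 1/5 of the above
```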
Let's Go to Mars...Google on: Fast Track to Mars blogspot.com
Well, these are semi-experimental image processing techniques. This is based on monkeying with the data digitally. To one extent or another, what these algorithms find is the answer that they assume is there. While it may work most of the time (and then again it may not), it always runs the ASS-U-ME risk.
You can trust stuff like that if you want. I don't. Not when it partially assumes the answers it is looking for.
GW
GW Johnson
McGregor, Texas
"There is nothing as expensive as a dead crew, especially one dead from a bad management decision"
I vote we deploy an airship drone to take photos from far closer to the ground. Maybe we could send a (literal) tonne of small, 10kg drones to map the site from the air.
Speaking of image processing, we can see how accurate it is quite easily on Earth. Compare processed satellite imagery with aerial photography.
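One way to make that comparison concrete is a per-pixel error metric between the processed satellite product and the aerial reference, once both are resampled to the same grid. A minimal sketch, assuming both are already co-registered greyscale arrays of the same shape; the file names in the usage comment are placeholders, not real products.

```python
import numpy as np

def rmse(processed: np.ndarray, ground_truth: np.ndarray) -> float:
    """Root-mean-square error between two co-registered greyscale images."""
    diff = processed.astype(np.float64) - ground_truth.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

def psnr(processed: np.ndarray, ground_truth: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    return float(20.0 * np.log10(peak / rmse(processed, ground_truth)))

# Hypothetical usage, with placeholder names standing in for the two products:
# sat = load_greyscale("super_resolved_satellite.png")   # placeholder loader
# air = load_greyscale("aerial_reference.png")
# print(f"RMSE: {rmse(sat, air):.2f}, PSNR: {psnr(sat, air):.1f} dB")
```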
Use what is abundant and build to last
A similar scout-mission helicopter was to be a ride-along side mission, but little seems to be going forward with it. Even balloons/blimps would work effectively towards that goal of aerial observation of the landscape.
Here is that topic Scouting Mars by Helicopter
You watch too much TV. The National Geographic Channel had a TV show called "Mars". In that program they showed UAV helicopters scouting the land. But how does a helicopter produce enough lift in Mars' thin atmosphere? An aircraft can work, with sailplane wings. But the aircraft must be very light with very large wings. But a helicopter? I doubt that would ever work.
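A back-of-the-envelope check shows why the lift question is fair. Rotor lift scales with atmospheric density, which at the Martian surface is roughly 1/60th of Earth's sea-level value, partly offset by gravity being about 0.38 g. The figures below are rough textbook values, not mission numbers.

```python
# Rough lift comparison for a hovering rotor on Earth vs. Mars.
# Lift ~ 0.5 * rho * v^2 * blade area * C_L, so at the same rotor speed and
# geometry, lift scales directly with atmospheric density rho.

RHO_EARTH = 1.225   # kg/m^3, sea level
RHO_MARS  = 0.020   # kg/m^3, typical Mars surface value
G_EARTH   = 9.81    # m/s^2
G_MARS    = 3.71    # m/s^2

density_ratio = RHO_MARS / RHO_EARTH   # ~1/60: that much less lift
weight_ratio  = G_MARS / G_EARTH       # ~0.38: that much less weight to carry

# Net: at identical rotor speed and geometry, a Mars rotor supports only this
# fraction of the mass it could support on Earth.
print(f"lift fraction:                 {density_ratio:.3f}")
print(f"weight fraction:               {weight_ratio:.3f}")
print(f"net mass fraction supportable: {density_ratio / weight_ratio:.3f}")
# The shortfall has to be made up with very large, very fast, very light rotors,
# which is exactly the design problem a Mars scout helicopter faces.
```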
I pick on SpaceNut quite a bit. So I will support SpaceNut this time.
https://en.wikipedia.org/wiki/NASA_Mars … pter_Scout
Nothing against you either, Robert.
End
To quote from the paper: "We assess the “true” resolution of the 5 cm super-resolution restored images using contemporaneous rover Navcam imagery on the surface and an inter-comparison of landmarks in the two sets of imagery."
So it is not untested.
Let's Go to Mars...Google on: Fast Track to Mars blogspot.com
It may not be untested, but it is manipulation of the raw data. Such algorithms can only find what they are programmed to find. THAT is the pitfall. Especially when landing in a place you've never been to before.
GW
GW Johnson
McGregor, Texas
"There is nothing as expensive as a dead crew, especially one dead from a bad management decision"
It is a similar process to the visual one used with the photographic plates that led to Pluto's discovery, just with more pixels. But you are correct that the algorithm can cause false blurring or over-correction; it has more to do with being very precise with each image used. Remember the satellites are moving, so timing is everything.
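The "timing is everything" point is really an image-registration problem: each pass views the scene from a slightly different offset, and that offset has to be recovered precisely before frames can be combined. Below is a bare-bones sketch of one standard way to estimate a whole-pixel shift between two frames (phase correlation via the FFT); the "scene" here is random stand-in data, not mission imagery.

```python
import numpy as np

def estimate_shift(ref: np.ndarray, moved: np.ndarray):
    """Estimate the (row, col) integer shift of `moved` relative to `ref`
    by phase correlation: the cross-power spectrum peaks at the shift."""
    F_ref = np.fft.fft2(ref)
    F_mov = np.fft.fft2(moved)
    cross_power = F_mov * np.conj(F_ref)
    cross_power /= np.abs(cross_power) + 1e-12   # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map the peak position back to a signed shift (account for wrap-around)
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))

# Stand-in demo: shift a random "scene" by (3, -5) pixels and recover it.
rng = np.random.default_rng(1)
scene = rng.random((128, 128))
shifted = np.roll(scene, shift=(3, -5), axis=(0, 1))
print(estimate_shift(scene, shifted))   # expect a shift of (3, -5)
```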
As I said, such a data-refining algorithm does a really good job finding exactly what it was intended to find. But it inherently cannot find what was never included in the programming.
Example: an odd lighting perspective in dusty conditions makes a boulder's shadows fall outside the range of contrasts the program has embedded in it.
Myself, I prefer to examine the raw data with my own eyes. If that needs to be taken from closer up, then that's what we've got to do. Simple as that.
GW
GW Johnson
McGregor, Texas
"There is nothing as expensive as a dead crew, especially one dead from a bad management decision"
Aren't all photos from Mars digital? Digital also involves averaging, I am pretty sure: you (or rather the software) elect one pixel to be a particular colour even though in reality it is not a single colour.
Let's Go to Mars...Google on: Fast Track to Mars blogspot.com
https://en.wikipedia.org/wiki/Digital_camera
Filter mosaics, interpolation, and aliasing
https://www.dpreview.com/articles/14711 … onyrgbeccd
https://en.wikipedia.org/wiki/Bayer_filter
https://www.cambridgeincolour.com/tutor … ensors.htm
Not a single pixel, but a combination of 4 sensor cells under the different mask colors.
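As a rough illustration of that "combination of 4" point, here is a toy reconstruction of one RGB value from a single 2x2 RGGB Bayer cell. Real demosaicing also interpolates across neighbouring cells; this only shows the per-cell combination, and the raw values are made up.

```python
import numpy as np

# Toy demosaic of a single 2x2 RGGB Bayer cell:
#   R  G
#   G  B
# Each sensor cell records only one color; the RGB pixel is rebuilt by
# combining the four, with the two green samples averaged.

def demosaic_rggb_cell(cell: np.ndarray) -> tuple:
    """cell is a 2x2 array of raw sensor values laid out R G / G B."""
    r = cell[0, 0]
    g = (cell[0, 1] + cell[1, 0]) / 2.0   # two green samples per cell
    b = cell[1, 1]
    return (r, g, b)

# Made-up raw values, for illustration only
raw_cell = np.array([[200.0,  90.0],
                     [110.0,  40.0]])
print(demosaic_rggb_cell(raw_cell))   # -> (200.0, 100.0, 40.0)
```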
Seems the answer is in the new topic...