Thread: ok Arthur, why do reds oversaturate?

  1. #21
    http://steveaxford.smugmug.com/
    Threadstarter

    Join Date
    19 Nov 2007
    Location
    About in the middle between Byron Bay, Ballina and Lismore
    Posts
    3,150
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Geez Arthur, you do go on. As I said, I do understand RAW files and I really don't need your lesson. I was just trying to change the subject back to the real issue, and maybe I glossed over colour spaces, which are an exact science and very well understood, unlike colour creation and colour perception.

  2. #22
    A royal pain in the bum! arthurking83's Avatar
    Join Date
    04 Jun 2006
    Location
    the worst house, in the best street
    Posts
    8,777
    Mentioned
    4 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by Steve Axford View Post
    Geez Arthur, you do go on. .....
    Apologies.

    But if you introduce a variable (in this instance, colour space), I have to assume that you may not know that it has nothing to do with the issue.
    So if you understand the raw file so well, why introduce a totally irrelevant topic?

    A topic is only worthy of discussion with somebody once you have determined that the other party truly understands what's being discussed.

    So try to see it from my point of view.

    You commented on colour space as a possible variable in this discussion.

    I explained that colour space has nothing to do with the issue if you're shooting in raw.

    Then you introduce colour space as a variable again, claiming that Canon don't offer a colour space wider than AdobeRGB!

    ie. (from my point of view) I don't get what you're trying to say. I don't get why you think Canon needs to provide you with a wider colour gamut. So I try to explain it again, and I do try to use as few words as possible, even though the opposite appears to be the case!


    So the end result is that:
    You see my comments as: 'Geez, he can go on.'
    I see your comments as: 'What is it he doesn't understand about colour spaces and raw?'
    Nikon D800E, D300, D70s
    {Nikon}; -> 50/1.2 : 500/8 : 105/2.8VR Micro : 180/2.8 ais : 105mm f/1.8 ais : 24mm/2 ais
    {Sigma}; ->10-20/4-5.6 : 50/1.4 : 12-24/4.5-5.6II : 150-600mm|S
    {Tamron}; -> 17-50/2.8 : 28-75/2.8 : 70-200/2.8 : 300/2.8 SP MF : 24-70/2.8VC

    {Yongnuo}; -> YN35/2N : YN50/1.8N


  3. #23
    Member
    Join Date
    15 Mar 2014
    Location
    Currambine, Perth
    Posts
    445
    Mentioned
    1 Post(s)
    Tagged
    0 Thread(s)
    This is an interesting discussion, I must say I've never really paid enough attention to one colour blowing out more than the others.

    From the electromagnetic spectrum you would think the opposite would be true, as the blue end has more energy than the reds and the sensor is only collecting energy.

    Mind you, I play with light nearly every work day, and the eye can't see blue as well as it sees red.

    I would also expect some fluorescence adding to the red energy levels, remembering that fluorescing subjects emit red-shifted light. So if a component of sunlight is causing a subject to fluoresce, then that light will be a red-shifted emission relative to the component causing the fluorescence. Does that make sense to anyone but me?

    Must stop, or you'll be calling me Arthur soon, without the depth of knowledge




    Sent from my iPhone using Tapatalk
    Regards

    Wayne

    Nikon D610, Samyang 24mm 1.4, Tamron 24-70 2.8, Nikkor 50mm 1.4G, Nikkor 70-300mm 4.5, Manfrotto & MeFOTO tripods, Ninja pano head & LEE filters


  4. #24
    A royal pain in the bum! arthurking83's Avatar
    Join Date
    04 Jun 2006
    Location
    the worst house, in the best street
    Posts
    8,777
    Mentioned
    4 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by wayn0i View Post
    .....

    From the electromagnetic spectrum you would think the opposite would be true, as the blue end has more energy than the reds and the sensor is only collecting energy.

    .....
    Yeah, that's something that'd be easy to 'assume', but the reflective/non-reflective idea actually makes more sense in a way.

    The bluer the light, the shorter the wavelength .. hence more penetrative power.
    If it penetrates more, then it's obviously not going to reflect as much.

    If we imagine the surface (subject) as a sieve, and the wavelength of light as the material being sifted: the smaller particles (wavelengths) get through the sieve, the larger ones are stopped/reflected/collected (or whatever).

    ps. I have absolutely no idea if this holds true in any way (in quantum physics) .. it just kind'a makes sense from a physical perspective.

  5. #25
    Member
    Join Date
    15 Mar 2014
    Location
    Currambine, Perth
    Posts
    445
    Mentioned
    1 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by arthurking83 View Post
    Yeah, that's something that'd be easy to 'assume', but the reflective/non-reflective idea actually makes more sense in a way. .....
    I'm not sure that's how it works. Surface reflection occurs when the surface colour matches the incident light colour, hence shining a red light on a red surface lightens the red. White light contains the entire visible spectrum, so it lightens any colour; obviously, if the surface and incident light are different, the surface is either unchanged or darkened.

    Is anyone interested in the colour wheel and absorption? It's not a big deal for non-technical photography, I think.




    Sent from my iPhone using Tapatalk

  6. #26
    http://steveaxford.smugmug.com/
    Threadstarter

    Join Date
    19 Nov 2007
    Location
    About in the middle between Byron Bay, Ballina and Lismore
    Posts
    3,150
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    It makes sense, Wayne. One of the situations where this effect is apparent is with Scarlet Honeyeaters in sunshine. If the red pigment (I assume that it is a pigment) fluoresces, then this could be part of the problem. In general, it could be another factor that makes red the most likely colour to blow.

    Arthur. My apologies too. I shouldn't show my frustration like that. The reason I mentioned colour spaces is that every photo I take ends up as either AdobeRGB or sRGB. If I convert to sRGB, then I might make the problem worse, as reds which are nearly saturated before the conversion may be oversaturated after. I lazily called the before space AdobeRGB as it is the colour space I usually convert to and it causes no extra problems. It would have been more correct to call it the Canon 5D III colour space. Since I have no way to display or print any colour spaces other than AdobeRGB or sRGB, I tend to (erroneously) call all colour spaces one or the other. This is a side issue for me as I always use AdobeRGB, except when I am selling photos that may be used with sRGB. Then I have to be careful with photos that have very bright reds. I don't need to worry so much with blues or greens as they have never caused a problem.
    Since I am happy with my conclusions on the main part of the subject (why reds and not blues or greens), I guess you don't need to comment on that.

    - - - Updated - - -

    Wayne, I must admit to being fascinated by colour. The whole idea of a colour wheel is peculiar as it relates to our visual system and not at all to wavelengths of light.

  7. #27
    Member
    Join Date
    22 Feb 2013
    Location
    Sydney
    Posts
    152
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    If the histogram seems right then how are people judging the reds to be oversaturated? Is it a measurable thing, or just perception?

    Red is a colour that stands out more to most humans (assuming good colour vision) - our brains flag it as important (ancient danger sense from blood/fire??). Maybe it's just that.
    -- Mister Q

  8. #28
    Arch-Σigmoid Ausphotography Regular ameerat42's Avatar
    Join Date
    18 Sep 2009
    Location
    Nthn Sydney
    Posts
    23,522
    Mentioned
    24 Post(s)
    Tagged
    0 Thread(s)
    A 2ple of points from the immediate foregoing...

    1. Blue light has a higher frequency than red light... The energies of the various frequencies are IMO beyond the scope of the discussion.
    2. Sieves aren't necessary EXCEPT as the idea of a colour filter (ie, the Bayer filter), and therefore EVERYthing that GETS THRU the filter gets absorbed by the sensor (and obviously, and by definition, only what the sensor is "sensitive" to).

    My own opinion favours two things: some compromise in the A/D conversion in the camera electronics, AND some compromise in any logical handling of the digital info afterwards.

    - - - Updated - - -

    Addendum:
    Somewhere in the discussion a description of "oversaturation" might be handy.
    CC, Image editing OK.

  9. #29
    A royal pain in the bum! arthurking83's Avatar
    Join Date
    04 Jun 2006
    Location
    the worst house, in the best street
    Posts
    8,777
    Mentioned
    4 Post(s)
    Tagged
    0 Thread(s)
    From the outset, I have to say I have no idea on how it works either!
    So the only thing I'm sure about is that I'm not sure how it works either

    Quote Originally Posted by wayn0i View Post
    I'm not sure that's how it works. Surface reflection occurs when the surface colour matches the incident light colour, hence shining a red light on a red surface lightens the red. .....
    But I do know that it's not as simple as you describe.
    I don't know this because I'm a theoretical particle physicist.
    I know it because some of the stuff I've amassed over the years has shown me that it's not so simple.

    As one example: I have this old lens. It's not rare, it's very cheap .. simply an obscure thing that 99.999% of photographers would ignore or avoid.
    It has red and orange markings, just as many other lenses do.
    When I shine any old torch onto it, it just gets brighter.
    When I shine either of my UV torches onto it, not much happens other than the red and orange markings fluoresce (they glow massively bright), to the point where even my eyes can't make out the actual numbers of the markings because they're overexposing.

    So the blue light (UV) is making the two red-shifted colours (orange and red) even brighter than they otherwise normally are.

    In another thread, Steve writes about 'structured colours'.
    I've looked into it on only a couple of sites (I found the Wikipedia entry easy enough to understand).

    And your description seems to hold true:
    If you look for UV, visible and IR versions of images of human skin, the UV version can look almost black (definitely much darker), the visible rendering will look normal to us (for a Caucasian, a slightly red-shifted hue), and under IR the skin will look pretty much porcelain white.

    So from that, I think it's the underlying structure of the material and/or the structure of the pigmentation that is the cause of some of these effects.
    Once again, I don't know this, but simply think it's a possible reason. I suppose it's something else for me to look into as well.

    As an interesting side note too: did you know that bluer light can render a sharper image in terms of detail?
    It probably doesn't hold true for the very slight relative differences between visible blue and visible red light.
    If the detail is there to be captured, a more UV-oriented lighting setup can capture more of it than a setup using lower-frequency lighting (ie. visible or IR).

    - - - Updated - - -

    Quote Originally Posted by ameerat42 View Post
    .....
    Somewhere in the discussion a description of "oversaturation" might be handy.
    What you say makes sense Am.

    Now firstly I'm not claiming to be an expert on the topic.
    Secondly, I keep coming back to the topic of software (as a partial cause of this issue).

    I can't find enough material yet, but there's a quick point I want to make.
    I have so many images where, in one software editor (remember, we're referring to raw files here!), the red channel is blown out by about +1Ev, but the same unedited image is rendered about -1Ev below full saturation in another editor.
    This is going by the respective histograms.
    Then, using yet another editor (for verification), the same file can display a perfect exposure in the red channel.

    A 1Ev shift either way in the respective editor(s) can produce almost identical histograms to the other version.
    There's no change in the actual raw file to explain such variations, so the only plausible explanation (to me) is that this is primarily a software issue.
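    For an editor-independent reference point, here is a minimal sketch (assuming a Python environment with the rawpy and numpy packages; the file name is hypothetical) that counts near-clipped pixels per channel straight from a neutral demosaic of the raw file, rather than from any editor's rendering:

        # Sketch: count near-clipped pixels per channel from a neutral demosaic,
        # independent of any particular raw editor's rendering choices.
        import numpy as np
        import rawpy

        def channel_clipping(path, threshold=0.99):
            with rawpy.imread(path) as raw:
                # Linear output, no auto-brightening, camera white balance, 16-bit.
                rgb = raw.postprocess(gamma=(1, 1), no_auto_bright=True,
                                      use_camera_wb=True, output_bps=16)
            rgb = rgb.astype(np.float64) / 65535.0
            for name, idx in (("R", 0), ("G", 1), ("B", 2)):
                frac = np.mean(rgb[..., idx] >= threshold)
                print(f"{name}: {frac:.2%} of pixels at or above {threshold:.0%}")

        # channel_clipping("DSC_0001.NEF")

    If this reports only a tiny clipped fraction while one editor's histogram shows a large blown red spike, the difference lies in that editor's rendering rather than in the raw data itself.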

    Steve made a comment earlier about histograms that don't display overexposure even though the image appears overexposed, and I've seen this myself too.
    Again, the software used in such situations has to be questioned too (one of my dislikes of Adobe raw conversion software).

    The problem at the moment is that I haven't reinstalled much of the software needed to show this, but I will soon.
    Then I'll have to find more images to post as samples .. hopefully to confirm that I'm not seeing things, or going mad.

  10. #30
    http://steveaxford.smugmug.com/
    Threadstarter

    Join Date
    19 Nov 2007
    Location
    About in the middle between Byron Bay, Ballina and Lismore
    Posts
    3,150
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by MrQ View Post
    If the histogram seems right then how are people judging the reds to be oversaturated? Is it a measurable thing, or just perception?

    Red is a colour that stands out more to most humans (assuming good colour vision) - our brains flag it as important (ancient danger sense from blood/fire??). Maybe it's just that.
    The frame may not be overexposed on a normal histogram, but it may be on the red channel with an RGB histogram. The average metering says it is OK, but you need to use the RGB histogram. Does that make sense?
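    To illustrate that in code, a minimal sketch (assuming numpy and an 8-bit RGB image array; the names here are hypothetical) of how an averaged/luminance histogram can miss a clipped red channel:

        # Sketch: a luminance histogram can look fine while the red channel
        # alone is clipped. 'img' is an 8-bit RGB numpy array (H x W x 3).
        import numpy as np

        def clipping_report(img, clip=255):
            r, g, b = img[..., 0], img[..., 1], img[..., 2]
            # Approximate the "average" histogram with a standard luma weighting.
            luma = 0.299 * r + 0.587 * g + 0.114 * b
            print("luma  clipped:", np.mean(luma >= clip))
            for name, ch in (("red  ", r), ("green", g), ("blue ", b)):
                print(f"{name} clipped:", np.mean(ch >= clip))

        # A saturated pure-red patch: R is pinned at 255, but the luma value sits
        # around 76, so the averaged histogram shows no clipping at all.
        patch = np.zeros((10, 10, 3), dtype=np.uint8)
        patch[..., 0] = 255
        clipping_report(patch)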

    - - - Updated - - -

    Arthur, you said
    "In another thread, Steve writes about 'structured colours'.
    I've looked into it on only a couple of sites (I found the Wikipedia entry easy enough to understand)."
    Structural colours are very common in nature, but probably have no bearing on this discussion as reds are, it seems, usually pigments. It is common to get good red pigments, but very uncommon to get good blue pigments. Consequently, blues are often structural colours, and greens are often a combination of a yellow pigment and a blue structural colour. While structural colours can be so bright that they are almost painful (eg some blue butterflies), they are usually not that bright. We could have a thread dedicated to structural colours, but I think I tried that and there were few takers. Suffice to say, we can safely ignore structural colours in this thread.

    - - - Updated - - -

    Again, to Arthur, you said
    "Steve made a comment earlier about histograms that don't display overexposure while the image appears too, and I've seen this myself too.
    Again, the software used in such situations has to be questioned too(one of my dislikes of Adobe raw conversion software)."

    I think that, compared to the RGB histogram on the camera, there may be a small shift towards overexposure in the RAW processor, but the effect, if any, is very small. The main effect is when using the camera light meter (pre-photo) or the average histogram after the photo, and that difference shows up in camera as well as in RAW processing software. I really see no indication that this is merely a software problem. Also, it occurs with C1 as well as Adobe.

  11. #31
    Member
    Join Date
    15 Mar 2014
    Location
    Currambine, Perth
    Posts
    445
    Mentioned
    1 Post(s)
    Tagged
    0 Thread(s)
    Arthur,
    Just quickly on your lens red and orange markings example: these are two different responses, the white light being reflection and the UV light being fluorescence, so completely different modes, mate.




    Sent from my iPhone using Tapatalk

  12. #32
    Administrator bitsnpieces's Avatar
    Join Date
    01 May 2014
    Location
    St Albans
    Posts
    1,285
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Simple solution, and I dare say it: switch to Sony. Live view with custom white balance/kelvin, and I mean custom, not just picking a mode like sunset, cloudy, incandescent, etc. Each one can be further customised: make it less red, add more blue, more green; or more red, less blue, more green, etc. I've found it very useful.

    *now tries to find Ricktas to see where he's run off to hide also*
    David Tran
    Sony a55
    Sony DT 18-70mm f/3.5-5.6
    Now sits as an antique as it no longer focuses properly.

    Wishlist: Sony RX10iv (or RX10v if it ever comes out)

  13. #33
    Ausphotography Regular
    Join Date
    18 May 2007
    Location
    Singapore
    Posts
    1,703
    Mentioned
    1 Post(s)
    Tagged
    0 Thread(s)
    I'll have to read through the entire thread in more detail when I get more time, as I've been moving the past weekend, but just skimming through I didn't see anyone touch on Bayer filtration.
    Not sure if this is at all relevant, but could some of the issue be due to Bayer demosaicing of an area containing almost entirely one colour, since Bayer requires neighbouring pixels for RGB data? (See the sketch below.)
    I haven't dabbled in Sigma Foveon sensors yet, but do they have the same red (or other colour channel, for that matter) saturation issue?
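    Not the in-camera algorithm, but a minimal sketch of plain bilinear demosaicing for the red plane (assuming Python with numpy and scipy, and an RGGB layout) showing how every non-red photosite borrows its red value from neighbours:

        # Sketch: in an RGGB Bayer mosaic only one photosite in four is red, so
        # the red value everywhere else is interpolated from neighbouring reds.
        import numpy as np
        from scipy.ndimage import convolve

        def interpolate_red(mosaic):
            """mosaic: 2D array of raw values laid out RGGB."""
            red_mask = np.zeros_like(mosaic, dtype=float)
            red_mask[0::2, 0::2] = 1.0          # red photosites in an RGGB layout
            red_samples = mosaic * red_mask
            kernel = np.array([[0.25, 0.5, 0.25],
                               [0.5,  1.0, 0.5 ],
                               [0.25, 0.5, 0.25]])
            # Bilinear interpolation: weighted sum of the sampled reds, normalised
            # by the weight of the red photosites actually present nearby.
            return convolve(red_samples, kernel) / convolve(red_mask, kernel)

        # On a uniformly red subject, every interpolated value leans on the same
        # few saturated red photosites, so clipping spreads across the whole plane.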
    Last edited by swifty; 10-08-2015 at 12:15pm.
    Nikon FX + m43
    davophoto.wordpress.com

  14. #34
    http://steveaxford.smugmug.com/
    Threadstarter

    Join Date
    19 Nov 2007
    Location
    About in the middle between Byron Bay, Ballina and Lismore
    Posts
    3,150
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Swifty, I have heard this mentioned on other sites, but it never developed much traction. That may be because most people, like me, don't really understand it. Still, it doesn't explain why only red, or why it is correctable by underexposing. I did read a post by one person who said that this type of thing had always occurred, even with film. Good photographers have always understood that light meter readings from a distance are not always enough. You have to spot meter possible highlights separately. Of course, doing this with a flighty little bird isn't possible, but then, there were very few flighty little bird photos in the days of film.

  15. #35
    Ausphotography Regular
    Join Date
    18 May 2007
    Location
    Singapore
    Posts
    1,703
    Mentioned
    1 Post(s)
    Tagged
    0 Thread(s)
    Yeah.. I must admit I don't understand it that well either. Since the histograms are generated from the jpegs, if the reds are around 250-255 and the green and blue channels are very low, let's say <20, then what's the value of the entire pixel?

    Also, what if you shot a colour chart under controlled conditions? As you increase exposure closer to the highlight limits, is it just the pure red patch that has problems, or the pure blue and green patches too?

  16. #36
    http://steveaxford.smugmug.com/
    Threadstarter

    Join Date
    19 Nov 2007
    Location
    About in the middle between Byron Bay, Ballina and Lismore
    Posts
    3,150
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    So it's just the jpeg that is used for histograms? That should have a definite colour space (Arthur???).
    The blown highlights seem to be confined to reds, at least in nature. I'm sure there are examples of blues and greens (people have commented on blue fairy-wrens), but I do not see it. I have photos of red fungi, taken in shadow, that register as pure red, eg 180 red, 0 green, 0 blue. I never see that with blues or greens.

    This one is very red.


    I understand that the algorithm for calculating total exposure is: green x 0.6, red x 0.3, and blue x 0.1, based on our eyes' sensitivity. I'm not sure that quite makes sense, but that's what I have heard (seen in posts).
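    For what it's worth, a quick numeric check (Python) using the commonly cited Rec.601 luma coefficients, which are close to the 0.3/0.6/0.1 figures quoted above:

        # Rec.601 luma: Y = 0.299 R + 0.587 G + 0.114 B (roughly 0.3/0.6/0.1).
        def luma_601(r, g, b):
            return 0.299 * r + 0.587 * g + 0.114 * b

        # A fully saturated red pixel contributes far less to this "total exposure"
        # value than a saturated green one, which is why an averaged meter or
        # histogram can under-report a blown red channel.
        print(luma_601(255, 0, 0))    # ~76  -> reads as far from clipping
        print(luma_601(0, 255, 0))    # ~150
        print(luma_601(250, 20, 20))  # ~89  -> still reads as a midtone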
    Last edited by Steve Axford; 10-08-2015 at 1:59pm.

  17. #37
    A royal pain in the bum! arthurking83's Avatar
    Join Date
    04 Jun 2006
    Location
    the worst house, in the best street
    Posts
    8,777
    Mentioned
    4 Post(s)
    Tagged
    0 Thread(s)
    I found a couple of images of the 'software issue' I previously referred to.

    ViewNX2:
    ViewNX2_red channel.jpg

    This screen cap is set with ViewNX2 to indicate the blown highlights referred to in the histogram window; hence the image is black with the bright red blotch. The black hides the image and simply displays the lost highlights. (For any Nikon/ViewNX2 user interested in how this works: with the image loaded, press H for highlight indication and S for shadow losses.)

    Note the histogram, though.

    ACR (version I can't remember, but not too distant):
    ACR_red channel.jpg

    The histogram of the supposedly blown-out red channel is not just different but wildly different on the same raw file.
    It should be noted that no processing at all has been done in either of the raw editors here.

    The image is straight from the camera (raw, of course).

    Some of the processing I've tried:
    In ViewNX2, setting Picture Control to Portrait drops the blown-out area to a very few micro dots (pixels), with no WB or any other edit. Blown highlights gone.
    Changing WB to an appropriate fluoro setting (high colour rendering in this case) balances the rest of the image and looks quite natural. For reference, that WB temperature turns out to be 4100K with zero R-G shift.

    In ACR (which I'm quite hopeless with .. but that's not the point):
    While the image needs no editing to recover any detail, it is quite dark by comparison in some areas.
    To my eye (at the time .. the image being shot in 2010, though!) the ACR rendering was pretty much wrong: too contrasty, and it lost shadow detail. Colours were generally OK; even the large red window blind that the image references looked similarly coloured and exposed (ie. very bright).
    But the Nikon camera, and subsequently the VNX2 rendering, were closer to reality overall.
    The same (or as similar as can be) settings for WB, contrast and so on were trialled in ACR as in VNX2, to no real avail in matching the VNX2 look.

    The point isn't which editor created a better rendering or image .. or works better.
    It's more about why there is such a massive difference in the red channel on the exact same image.
    Colour spaces don't explain it either, as I've tried various colour spaces too, up to ProPhoto (via CaptureNX2). The difference in software (CNX2 vs VNX2) also doesn't explain it, as they render identically.
    Nikon's latest software, CNX-D, also renders the images identically to Nikon's older software.

    So (at least part of) the answer lies in the software's rendering capability.
    It should be remembered that for a software manufacturer to allow compatibility with the various hardware (ie. cameras/sensors), they need to produce profiles.
    They can only do so much for various conditions, and for conditions outside of those boundaries the user should create appropriate camera profiles.

    At the risk of 'going on again', the issue can be as simple as this:

    The manufacturer provides their set profiles. Those profiles were made in lab conditions, under various lighting types (eg. fluoro/incandescent/halogen/simulated sunlight, etc).
    But such a profile isn't as accurate as a real profile made by the user under real conditions. As an example: a landscape shot outdoors at sunset under a cloudy sky filled with smoke, or something weird like that.
    The specifics aren't important, but I'm just trying to highlight that the differences may be.

    Of course, in some situations, such as the structural colours previously referred to, a red channel will be hard to capture with a good overall exposure balance .. butterfly wings, birds' feathers, etc.
    Same with the UV fluorescence too.

    But in many instances, the more common ones not explained by the extreme situations above, the solution could be as simple as software.
    Again, I centre my experiences around a Nikon environment and have little to no experience with many other brands.

    I used to do the underexposing thing myself too, until I noticed that the issue wasn't as real as I thought it was.
    I just changed the way I process the images .. via a few meagre edit steps.

    ps. hopefully that wasn't too arduous for a reply

    - - - Updated - - -

    Quote Originally Posted by Steve Axford View Post
    So it's just the jpeg that is used for histograms? That should have a definite colour space (Arthur???).....
    Yep! Gotcha.

    But how does this affect the raw file back home on your computer?

    This is only relevant on the camera's review screen (to a degree).

    The jpg file on the camera's review screen has been common knowledge for many years. It's based on two of the jpg images within your raw file (of which there are, or should be, three).

    I think that only recently some manufacturers have used aRGB-capable LCDs on their cameras (or something to that effect), or screens with adjustable hue capability (or whatever).
    I just can't really remember all the newest tech stuff with respect to this.

    But I did mention that, yes, a colour space setting does affect a raw file in a very limited and strange way, and that is only in the embedded jpg files in the raw file.
    Once again, if I have to reiterate it, I used to shoot in aRGB mode on the camera too, because all these supposed experts used to shout loudly that it's the widest gamut and best quality, blah, blah, blah ....

    But on the file you're actually interested in, it has no real-life bearing .. it just creates potential issues later .. and, as I tried to explain, use the lowest common denominator to eliminate these issues completely.

    Set the camera to sRGB, set your software to ignore this setting for any raw files, and use a wider colour space there.

    But, I have to reiterate that for the purpose of this discussion, colour space(in camera) has no bearing.

    I should explain too that, long ago, I also looked into this colour space and embedded preview file topic.
    I shot in one colour space, converted to another, extracted the jpg preview file, reset the colour space on that preview file, inserted it back into the raw file, and totally screwed the thumbnail of the raw file whilst not affecting the actual raw file rendering.
    The only thing that looked like rubbish on those raw files was the thumbnail, so in any software that renders the raw file via the embedded jpg files, the initial view looked like rubbish .. until the file was opened properly.
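    A minimal sketch of that preview-extraction step (assuming Python with the rawpy package; the file name is hypothetical), pulling the embedded JPEG out of a raw file so it can be inspected, or have its colour space tag examined, separately from the raw data:

        # Sketch: save the embedded JPEG preview from a raw file to disk.
        import rawpy

        def save_embedded_preview(raw_path, out_path="preview.jpg"):
            with rawpy.imread(raw_path) as raw:
                thumb = raw.extract_thumb()
            if thumb.format == rawpy.ThumbFormat.JPEG:
                with open(out_path, "wb") as f:
                    f.write(thumb.data)   # the embedded JPEG bytes, untouched
                print("wrote", out_path)
            else:
                print("embedded preview is not a JPEG in this file")

        # save_embedded_preview("DSC_0001.NEF")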

  18. #38
    http://steveaxford.smugmug.com/
    Threadstarter

    Join Date
    19 Nov 2007
    Location
    About in the middle between Byron Bay, Ballina and Lismore
    Posts
    3,150
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Still not sure this has any meaning today, Arthur, particularly as it only seems to apply to Nikon. I am really talking about real-world examples. I'll show you four photos that show the strongest red, blue and green of all my fungi photos (I have perhaps 10,000 to choose from). The red is way out in front in the purity of the colour, followed by the blue, with the green last. The green bioluminescent fungus is quite strong, about the same as the blue, but nothing like the red. The natural-light green is very muddy. This, by itself, can explain why this problem occurs mainly with red.

    RED most saturated sample - R211, G0, B0


    GREEN most saturated sample - R133, G213, B214


    GREEN bioluminescent most saturated sample - R4, G222, B109


    BLUE most saturated sample - R8, G131, B252
    Last edited by Steve Axford; 10-08-2015 at 3:45pm.

  19. #39
    Arch-Σigmoid Ausphotography Regular ameerat42's Avatar
    Join Date
    18 Sep 2009
    Location
    Nthn Sydney
    Posts
    23,522
    Mentioned
    24 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by swifty View Post
    ...but could some of the issue be due to Bayer demosaicing of an area containing almost entirely one colour, since Bayer requires neighbouring pixels for RGB data?
    I haven't dabbled in Sigma Foveon sensors yet, but do they have the same reds...
    Toward this end, I have taken a few shots of red objects with same - just what was to hand in the late afternoon and sun-lit.
    Will post 'em up later, but will do minimal processing - basically just raw -> jpegs.

  20. #40
    A royal pain in the bum! arthurking83's Avatar
    Join Date
    04 Jun 2006
    Location
    the worst house, in the best street
    Posts
    8,777
    Mentioned
    4 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by Steve Axford View Post
    Still not sure this has any meaning today, Arthur, particularly as it only seems to apply to Nikon. I am really talking about real world examples. .....
    1. Which part? The raw file/embedded jpg files, or the software difference issue (or the entire thread)?

    2. The problem with real-world files is that they vary so much in composition and processing that you can't really use them as study cases for the topic of why the red channel blows out .. or whether it actually does.
    A reference point is needed.
    As for images with colour purity, it's a trivial matter to print out a pure red, green or blue swatch, photograph it, and present it as an instance of colour purity and possibly a blown channel if that is desired (see the sketch below).
    I realise that this is not a real-world situation for many of us (but it may be, say, for a product photographer in a studio shooting certain products).
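    The swatch idea mentioned above, as a minimal sketch (assuming Python with the Pillow package):

        # Sketch: generate pure R, G and B patches that can be displayed or
        # printed and then photographed as a fixed reference for clipping tests.
        from PIL import Image

        for name, colour in (("red", (255, 0, 0)),
                             ("green", (0, 255, 0)),
                             ("blue", (0, 0, 255))):
            Image.new("RGB", (800, 800), colour).save(f"swatch_{name}.png")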

    I understand your point about real-world images, though .. for me that used to be predominantly landscapes captured at sunset .. where reds would blow out wildly, causing me to underexpose, only to find out that if I used X brand software the issue was non-existent. Then I'd follow that up with a -2% contrast reduction, which would miraculously recover the blown red channel.

    We know from this thread that the red channel is more sensitive. The reason is almost certainly manifold:

    1. Red simply reflects more strongly (as per the UV vs IR comparison).
    2. Image sensors are almost certainly more sensitive in the red channel as well (I know the D70s, and hence the D100, had a sensor very sensitive to the red/IR channel way back in their day).
    3. Software can alter this perception (because vendors produce their own specific raw conversion algorithms).

    From some of my observations over time as well, it's easier to recover lost shadow detail in blue, but harder to recover lost blue highlight detail.
    It's exactly the opposite for reds, though; in saying that, I've only seen very few images of mine where I've grossly underexposed the red channel (that cyan-ish green colour isn't one I particularly look for).

    Just one last thing re colour spaces too: have you ever tried ProPhoto as an alternative?
    Like I said earlier, I'm not all that well versed in Adobe, nor Capture 1 either.
    But I was under the impression that Lr was set by default to work in the ProPhoto colour space.
