taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

I thought they meant 'linear' like you had to play it from end to end (or use fastforward or rewind) and couldn't seek to arbitrary points. But you could just make a CD with one track that would be somewhat similar to that.


Shame Boy
Mar 2, 2010

taqueso posted:

I thought they meant 'linear' like you had to play it from end to end (or use fastforward or rewind) and couldn't seek to arbitrary points. But you could just make a CD with one track that would be somewhat similar to that.

That's entirely possible, though that'd be a sorta unusual use of the word linear in an audio context :shrug:

Also they made plenty of tape players with track skip :colbert:

taqueso
Mar 8, 2004



and you could push ff and play simultaneously and just listen for the empty spot

and miss the start and then go back a lil too far

skooma512
Feb 8, 2012

You couldn't grok my race car, but you dug the roadside blur.

Cold on a Cob posted:

i know, poor little rich computer touchers, but lmao i was wondering which tech companies would do this first

https://twitter.com/Fullcarry/status/1304491784689471488

the replies are golden too

"if they live somewhere cheaper than it's fair they get paid less"

ok let's move Bezos to Milwaukee then so amazon can pay out more dividends hm?

Of course labor gets its price adjusted downward at the drop of a hat. Not upward though, that requires paperwork and we only have 300% profits this quarter and...

And then if you live in another country, nope no price arbitrage for you, pay the American price or gently caress off :smug:

ContinuityNewTimes
Dec 30, 2010

I'm completely made up

Johnny Five-Jaces
Jan 21, 2009


Cold on a Cob posted:

ok let's move Bezos to Milwaukee then so amazon can pay out more dividends hm?

stay out of my town, techfuckers

Paradoxish
Dec 19, 2003

Will you stop going crazy in there?

Cold on a Cob posted:

"if they live somewhere cheaper than it's fair they get paid less"

legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy"

voiceless anal fricative
May 6, 2007

Paradoxish posted:

legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy"

It's usually the other way around, where people get a location allowance or something to compensate for increased cost of living. IMO it makes sense in big professions that have collective agreements like teachers or nurses.

Shame Boy
Mar 2, 2010

bike tory posted:

It's usually the other way around, where people get a location allowance or something to compensate for increased cost of living. IMO it makes sense in big professions that have collective agreements like teachers or nurses.

If you get money for moving somewhere expensive why wouldn't you lose money when you move somewhere cheaper huh smartypants :colbert:

Ruffian Price
Sep 17, 2016

Paradoxish posted:

legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy"

Friend of mine was working in a regional branch of a tech company that generally paid the local average for their offices in different countries, but everybody got the same per diem allowance for travel. HQ coders made a loving petition and got that lowered for all the others because they didn't think it was fair.

Shame Boy
Mar 2, 2010

Ruffian Price posted:

Friend of mine was working in a regional branch of a tech company that generally paid the local average for their offices in different countries, but everybody got the same per diem allowance for travel. HQ coders made a loving petition and got that lowered for all the others because they didn't think it was fair.

Lmao I can't imagine the kind of entitled idiot that has the balls to publicly petition their company to pay their coworkers less so they feel better about themselves.

I hope they got their tires slashed every time people had to fly in to HQ from the branch offices.

Hodgepodge
Jan 29, 2006

Antonymous posted:

Dynamic range is the difference between the darkest and brightest detail a camera can capture in one image, defined in 'stops', or doublings of light, so 3 stops = 8x more light (3 doublings). What's the brightest light a camera can see, and how many times do we cut that brightness in half before it's the darkest the camera can see? Right now digital can do about 14 stops. You'll see all kinds of numbers, from 12 to 15; although there is an official standard, basically everyone makes up their own. It was common knowledge that film negative had 12-13 stops, so digital has been claiming 13+ for decades despite wildly differing results between cameras. You always have to test for yourself.

Digital is already scraping the physical limit on underexposure. Photons arrive statistically, so there's some noise there called shot noise, plus random noise from heat and amplification, so noise cannot go to 0, and (I forget the exact number) we're already counting slightly more than 50% of the photons that arrive on most digital sensors, so there's less than a stop of signal left to gain in underexposure. The big difference in underexposure is signal-to-noise ratio: some cameras use a lot of processing and expensive components to remove noise, but they don't see more signal. This makes digital great at handling underexposure. The drive for higher resolution has made each 'pixel' (photosite) of the sensor pretty small, and unfortunately the size of the photosite determines how many photons it can count before it gets full and clips, making digital awful at overexposure: the image simply clips white and it's a sudden, hard limit. There are also quantization issues, because all sensors are analogue and somewhere in the camera the analogue voltage has to be made into a digital number, which can add noise and clip values. If you want more dynamic range with a given sensor tech, you need bigger pixels and better analogue-to-digital conversion, and therefore less resolution and maybe fewer photos per second. But resolution and fast shutters sell cameras more than dynamic range does.
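(Nerd aside: the stop arithmetic above is easy to play with. A minimal sketch, with made-up full-well and read-noise numbers rather than any real camera's specs:)

```python
import math

def dynamic_range_stops(full_well_electrons: float, read_noise_electrons: float) -> float:
    """Engineering dynamic range: ratio of the largest recordable signal
    (full-well capacity) to the noise floor, expressed in stops (doublings)."""
    return math.log2(full_well_electrons / read_noise_electrons)

# Illustrative numbers only, not any specific camera: a large photosite
# holding ~90,000 e- with ~1.5 e- read noise gives ~15.9 stops, while a
# small phone-sized pixel (~6,000 e-, 2 e- read noise) gives ~11.6.
print(round(dynamic_range_stops(90_000, 1.5), 1))
print(round(dynamic_range_stops(6_000, 2.0), 1))
```

Bigger photosites raise the numerator, which is exactly the overexposure headroom the post is talking about.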

Black and white film uses a silver salt that, when exposed to light and then developer, becomes metallic silver; without exposure it remains a salt that dissolves in the fixer, so unexposed areas wash away and become clear, making a negative. This process has its own statistics: the reaction takes more and more light as it goes, so it's very hard to react all of the silver atoms, making it almost impossible to lose detail in overexposed areas. It also takes (iirc) about 4 photons arriving in a very short period of time to react at all, so very dim light, even in very long exposures, might not activate any silver; that's called reciprocity failure. Even in normal exposure, this makes the film exponentially less sensitive in the shadows of the image, so underexposure is bad, almost exactly the opposite of digital.
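(The shadow insensitivity described above falls out of Poisson photon statistics; a toy sketch, taking the iirc ~4-photon activation threshold from the post at face value:)

```python
from math import exp, factorial

def p_at_least(k: int, mean_photons: float) -> float:
    """Probability a grain receives >= k photons in its reaction window,
    assuming Poisson arrivals with the given mean."""
    p_less = sum(mean_photons**i * exp(-mean_photons) / factorial(i) for i in range(k))
    return 1.0 - p_less

# With a 4-photon threshold, halving the light does much worse than halving
# the grain's chance of activating: the response collapses in the shadows,
# which is the exponential insensitivity the post describes.
for mean in (8.0, 4.0, 2.0, 1.0):
    print(mean, round(p_at_least(4, mean), 3))
```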

For color film, the silver activates a dye and then all of the silver is washed away with bleach. While black and white sees maybe a little more dynamic range, color negative is really insensitive to changes in the image with overexposure. Black and white will lose contrast with overexposure and the grain will get bigger. Color negative will eventually see a little flattening in contrast and some hue shifting, but seems much less sensitive in my experience. When I shoot Super 16mm I overexpose 2 stops, because then your midtones get all the fine grain and you get a little more perceived resolution / less perceived grain. You can also develop color film without bleach (or with less than necessary), which gives you a black and white image on top of a color image, called 'bleach bypass'. Fight Club, among other films, was developed this way. Leaving silver in the color film makes it somewhat more fragile, and some dust removal tech won't work (doesn't matter for cinema).



Open this image in a new tab and look close. Film has a 'toe' where contrast steepens right before black and a 'fog' where signal simply ends, rather than getting noisier like digital. Some color film stocks fog at different levels, so shadows might look blue or green or magenta, whichever color bottoms out first. At -2 she's losing detail in her black shirt. But from -1 to +6 there's no way to tell if the film was over- or underexposed. At +6 there's some softening, but my guess is that to get +6 they needed to open their lens more or do a longer exposure where subtle motion will soften the image; +6 is hard to get to in an interior. If this scene had a bright window behind her, or a lightbulb in the shot, you might notice contrast in those bright areas flattening and hues shifting. I'm thinking that by +10 her skin highlights would start to shift hues, but maybe not. Definitely less contrast.

http://www.johnnypatience.com/download/johnny_patience_portra_400_0_to_10.jpg
This is someone running the same kind of test, normal exposure to +10 stops (which is 1024x brighter), on two different scanners to see which scanner handles overexposure better. Part of film photography now is also how to digitize it (in some sense film photography is delayed digital photography with an intermediate). You can see one scanner handles +10 just fine. The negative is almost certainly a black square to the naked eye, so the scanner is really picking up the subtleties in that blackness.

Here's a $75,000 cinema camera that's looking good -2 to +5.
https://www.youtube.com/watch?v=ccc9-zhGPbo

I'll also add that for cinema, where I actually work, shooting on film probably has little impact on the image, honestly. People take the dust and grain out of things shot on film and add them to things shot digitally, and all of it gets color corrected; there's really no difference. Tarantino and Nolan make prints, but I don't think they are contact prints: you're still seeing digital effects and color and whatever on the print. (Was Once Upon a Time in Hollywood a contact print, i.e. film-to-film?) Super 35 resolves about 3k-4k, which is where most digital cameras land now as well (a 4k sensor resolves about 3k with good debayer algorithms).

On set working on film changes everything. The workflow removes relying on looking at a screen on set for a lot of departments, director, wardrobe, hair and makeup, production design, lighting. The monitors for film cameras are pretty poo poo and so everyone focuses on using their eyes and being extra careful. That's the real power of shooting on film.

awesome. it kind of sounds as if film is a totally different technology than digital in the end. honestly, i think with all the recent advances in condensed matter physics, film may have more room for improvement at the moment. if anyone cares to explore that avenue and also if we retain the capacity to build this sort of poo poo I guess

Hodgepodge
Jan 29, 2006

Paradoxish posted:

legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy"

tbh i assume at least some of this is just bought and paid for. the tech companies can afford it and at least some of them no doubt simply think of it as social engineering

Kreeblah
May 17, 2004

INSERT QUACK TO CONTINUE


Taco Defender

Paradoxish posted:

legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy"

Same. It's not like the value produced by their work is any less. Why should the company steal more of it just because of where they happen to live?

Encrypted
Feb 25, 2016

Hodgepodge posted:

awesome. it kind of sounds as if film is a totally different technology than digital in the end. honestly, i think with all the recent advances in condensed matter physics, film may have more room for improvement at the moment. if anyone cares to explore that avenue and also if we retain the capacity to build this sort of poo poo I guess
Film's wide dynamic range came from silver grains of various sizes that react at different rates and essentially soak up large amounts of photons. There have been various sensor techs that mimic that property with photosites of various sizes. Or do this, which also captures multiple levels of photons and then combines the data together with fast enough hardware/software.

Shame Boy
Mar 2, 2010

Kreeblah posted:

Same. It's not like the value produced by their work is any less. Why should the company steal more of it just because of where they happen to live?

Well you see the meritocracy only kicks in after the baseline level your employer is paying just to keep you alive and housed enough to come into work, which is unfair on the company if you think about it.

Morbus
May 18, 2004

Antonymous posted:

Dynamic range is the difference between the darkest and brightest detail a camera can capture in one image, defined as 'stops' or doublings of light, so 3 stops = 8x more light, 3 doublings. [...]

On set working on film changes everything. [...] That's the real power of shooting on film.

I just want to add to this offtopic nerdpost with another offtopic nerdpost to point out that, fundamentally, digital image sensors do not have any limits re: overexposure or clipping. In principle, with fast enough readout, you can just "empty" and record the value from each pixel well before it clips, then immediately start collecting again, and repeat this process for as long an integration period as you want. In this way, you can record accurate intensity values at each photosite for arbitrarily long "exposure times" and never clip, even for tiny pixels. Film can't do this because, obviously, you can't read out, un-expose, and re-expose the film in real time; once a particular region of film is fully exposed, that's it.

HDR photography kind of does this (in an often badly implemented way), but consumer imaging sensors & camera software don't implement this kind of functionality more directly for a few reasons. One is that nobody really cares... the 10+ stops of dynamic range afforded by even lovely modern sensors is "good enough" for this just not to be a huge issue most of the time. A second reason is that fast & low noise readout with continuous integration is more complicated, consumes more power, and requires more memory, and you can get similar results in scenes with high contrast just by compositing a couple of different exposures if you really need to. But in, like, telescopes or earth-facing satellites or things like that, nobody ever worries "oh, my sensor will clip if I expose for too long," because that just fundamentally isn't a thing on a sensor that you can read out and re-expose.
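A quick noise-free sketch of the read-before-clipping idea (toy numbers, no shot or read noise, just the clipping behavior):

```python
FULL_WELL = 10_000  # electrons a photosite can hold before clipping (made-up figure)

def single_exposure(flux_e_per_s: float, t: float) -> float:
    """One long integration: the recorded count saturates at full well."""
    return min(flux_e_per_s * t, FULL_WELL)

def summed_subexposures(flux_e_per_s: float, t: float, n: int) -> float:
    """Same total time, but read out and reset n times; each sub-frame
    stays under full well, so the sum never clips."""
    return sum(single_exposure(flux_e_per_s, t / n) for _ in range(n))

# A bright highlight at 50,000 e-/s over 1 s:
print(single_exposure(50_000, 1.0))           # clips at 10,000
print(summed_subexposures(50_000, 1.0, 100))  # recovers the true 50,000 (up to float rounding)
```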

dex_sda
Oct 11, 2012


Kreeblah posted:

Same. It's not like the value produced by their work is any less. Why should the company steal more of it just because of where they happen to live?

When someone else is losing, you're winning, duh

Antonymous
Apr 4, 2009

Motion picture needs precise control of motion blur. Your integration time has to be exactly 1/48th of a second, no pauses, no skips, no double exposure, and ideally completed within 5ms or less (i.e. little rolling shutter).

I'm a camera user not a camera maker, so I'm just teaching myself this stuff, not an expert. but I won't shoot with a camera that uses any of those HDR schemes especially behind the scenes where I can't control them. So the limit for my work is probably full well capacity and will be for a while.
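(For anyone wondering where 1/48th of a second comes from: it's the classic 180° shutter at 24fps. The relation is just:)

```python
def shutter_angle_deg(exposure_s: float, fps: float) -> float:
    """Shutter angle: the fraction of each frame interval the sensor is
    exposed, expressed as degrees of a 360-degree rotary shutter."""
    return 360.0 * exposure_s * fps

# The cinema-standard 1/48 s exposure at 24 fps is the classic 180-degree shutter:
print(round(shutter_angle_deg(1 / 48, 24), 1))  # 180.0
```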

Morbus
May 18, 2004

Antonymous posted:

Motion picture needs precise control of motion blur. Your integration time has to be exactly 1/48th of a second, no pauses, no skips, no double exposure, and ideally completed within 5ms or less (i.e. little rolling shutter).

I'm a camera user not a camera maker but I won't shoot with a camera that uses any of those HDR schemes especially behind the scenes where I can't control them. So the limit for my work is probably full well capacity and will be for a while.

So, fundamentally, there is no difference in recorded scene intensity values or motion blur between:

1.) exposing a sensor for 1/48th of a second exactly, then reading out the recorded counts

2.) exposing a sensor for 1/4800 of a second, reading out (very) quickly, doing this 99 more times, and adding all the recorded counts together,

provided the "gap" due to readout is so brief as to be entirely negligible (which it can be; see for example ultra high speed digital video cameras).

Like, if i give you a high speed video shot at 4800 frames per second instead of 48fps, you can (in principle) reconstruct the 48fps video exactly from the former, provided that the readout and ADC noise are comparable. That last bit is tricky but it's just a cost and engineering complexity issue, not a fundamental barrier.

Antonymous
Apr 4, 2009

Encrypted posted:

Film's wide dynamic range came from silver grains in various sizes that can react at different rates and essentially soak up large amount of photons.

I left a lot of technical stuff out of my other post because I felt I'd said enough. But I think it's not just a random assortment of grain sizes: the color negative emulsion is several layers, with earlier layers blocking light to the following ones, both because grain casts shadows onto the grains below it and because there are color filters built in between the layers of the emulsion, which means really bright light gets modulated down before it reaches them. It's really sophisticated and not something digital can take advantage of, and it causes some of the unique flaws of color film that people like, like the softening of warm colors.

Antonymous
Apr 4, 2009

Morbus posted:

So, fundamentally, there is no difference in recorded scene intensity values or motion blur between:

edit: I know this technique works for stills, so my objection is more about implementing it for motion; I deleted my general technical objection. If it's a matter of quality of components, there is a rental market for $100,000+ cameras on big movies, so I wonder why it's not out there. If something out there uses this for good motion reproduction I'd love to call a rental house and test it.

Antonymous has issued a correction as of 09:35 on Sep 12, 2020

Morbus
May 18, 2004

Antonymous posted:

My understanding is, 1/4800 of a second integrated does not increase sensor dynamic range, you're just under exposing and frame averaging. This works where the sensor is linear, and if read noise is 0 like you said (in reality it isn't), and so fails in shadows where noise is not linear. Underexposing by 100x is going to gently caress up your image, even if you take 100 of them. Those highspeed cameras might take 2400 frames a second but they expose them correctly. I do test this stuff so if there's a camera that you think pulls this off with normal motion blur I'll call up a rental house and go play with it.

Red cameras have this feature, but they have too much latency in the readout reset and you will see banding in the motion blur; I've tested it. I actually think they didn't think their process through, and they just average 48fps or 96fps before compression rather than putting all the samples together temporally. Good for stop motion. Red also lets you shoot a frame at 1/48 and then another at 1/120 (or whatever you want), and this gives your highlights sharp blur and shadows normal blur when you blend them.

sorry to anyone reading this thread who couldn't give less of a gently caress about cameras lol

So, it's possible to create a CMOS or CCD sensor that is remarkably linear even if underexposed by 8 stops as in the above example. You can get very close to the theoretical (shot noise limited) case. This is easier if you optimize your sensor and camera for this from the beginning--knowing that you need very high accuracy, low noise, and linear response for photosites that are collecting a small number of photons repeatedly and very quickly. Sensors based on pixel self-reset and asynchronous readout are some examples (like this https://ieeexplore.ieee.org/abstract/document/7895171)

*Assuming* you have solved the issues with linearity, readout noise, ADC noise, etc. (which, again, is doable), and that you are basically operating in a shot-noise-limited fashion, then "just underexposing and frame averaging" will not gently caress up your image, and will increase the dynamic range, but only insofar as it was limited by highlight clipping. You won't improve the SNR of the image at a given integration time, but you will eliminate the "problem" of photosites clipping.

I'm not aware of any cine cameras that do this, and I'm not surprised. For one, doing it well almost requires compromising either the range or linearity of photosites when operated conventionally. Second, cameras operated conventionally are basically good enough, with dynamic range that exceeds any display or projector, so nobody really cares. Finally, none of this was really practical until maybe the last 10 years, and professional digital cinematography is only about as old. That being said I expect this to become commonplace in still cameras eventually, and eventually cine cameras, because there is no fundamental reason not to do it.
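The frame-summing tradeoff above can be put in numbers: shot noise depends only on total light collected, while read noise is paid once per readout (added in quadrature). A sketch with illustrative values, not any real sensor's:

```python
from math import sqrt

def snr_single(signal_e: float, read_noise_e: float) -> float:
    """SNR of one integration: shot noise sqrt(S) plus one readout's noise."""
    return signal_e / sqrt(signal_e + read_noise_e**2)

def snr_summed(signal_e: float, read_noise_e: float, n: int) -> float:
    """Same total signal split over n readouts: shot noise is unchanged,
    but read noise is injected n times (summed in quadrature)."""
    return signal_e / sqrt(signal_e + n * read_noise_e**2)

# 10,000 e- of light, 3 e- read noise per readout (illustrative numbers):
print(round(snr_single(10_000, 3), 1))        # one readout: essentially shot-noise limited (~100)
print(round(snr_summed(10_000, 3, 100), 1))   # 100 readouts: a small but real penalty (~96)
print(round(snr_summed(10_000, 0.3, 100), 1)) # with very low-noise readout the penalty vanishes
```

This is the "you won't improve the SNR, but you eliminate clipping" point: the summed version never loses highlights, and costs only a little shadow noise if the per-readout noise is low enough.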

Encrypted
Feb 25, 2016

Antonymous posted:

I left a lot of technical stuff out of my other post because I felt I'd said enough. But I think it's not just a random assortment of grain sizes, the color negative emulsion is several layers, with previous layers blocking light to following layers both because grain casts shadows onto grains below it and because there are color filters built in between layers of the emulsion, which means really bright light gets modulated down before it reaches them. It's really sophisticated and not something digital can take advantage of, and causes some of the unique flaws of color film that people like, like the softening of warm colors.

Oh cool, I wasn't really thinking about it in a 3D sort of way but it makes sense.

Guess that also explains why some people went nuts over the Foveon sensors, where the photosites for the three colors are stacked vertically and separated by how deep each wavelength penetrates the silicon, instead of using the traditional Bayer pattern. The per-pixel sharpness is just stunning on those, without any sort of moiré defects.


That being said, the future seems to be something like high-refresh 8K on largeish TVs with HDR, where our eyes become the limit and cannot discern any more pixels or brightness levels beyond that. And then we can finally stop buying new TVs, because there's nothing much else beyond those metrics to improve on.

This but with regular TV size and price.
https://www.tomshardware.com/uk/news/sharp-unveils-new-120inch-8k-display-up-to-120hz-hdmi-21-2048-dimming-zones

dex_sda
Oct 11, 2012


All I know about CCD tech is the stuff we use in telescopes. Those sensors are state-of-the-art stuff that can detect individual photons and are designed for brutally long exposures (we're talking literal hours), but they need to be hooked up to denoisers up the wazoo; a lot of setting up a telescope these days is getting an accurate background so you can filter the noise out. If the background is overwhelming (light pollution being the prime problem) then there's nothing you can do, of course, but you'd be surprised how much you can ENHANCE the image if you do things right

I've taken really sharp photos of very faint star clusters while just outside a city with a relatively small in-building telescope by working the photo all night, and our campus had a background analyst wizard who would spend weeks working the background and analysing the light pollution dependent on the day of the week so he could squeeze out more fidelity when he actually took a photo

You can't really do that with analog plates because you need quantized, discrete data to do statistical denoising at the level required for astronomy.
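That statistical denoising is, at its simplest, stacking repeated exposures and rejecting outliers. A toy sigma-clipped mean over one pixel's samples (real pipelines add dark/flat calibration, background modelling, and far more care; the numbers here are invented):

```python
from statistics import mean, stdev

def sigma_clipped_mean(values: list[float], nsigma: float = 2.5, iters: int = 3) -> float:
    """Average repeated measurements of one pixel, iteratively throwing out
    samples more than nsigma standard deviations from the current mean
    (e.g. cosmic-ray hits or satellite trails)."""
    kept = list(values)
    for _ in range(iters):
        m, s = mean(kept), stdev(kept)
        kept = [v for v in kept if abs(v - m) <= nsigma * s] or kept
    return mean(kept)

# Ten exposures of the same sky pixel; one frame caught a cosmic ray:
frames = [101.0, 99.0, 100.5, 98.5, 100.0, 99.5, 101.5, 100.0, 99.0, 5000.0]
print(round(mean(frames), 1))                # naive mean is ruined by the outlier
print(round(sigma_clipped_mean(frames), 1))  # close to the true ~100
```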

dex_sda has issued a correction as of 10:58 on Sep 12, 2020

atomicgeek
Jul 5, 2007

noony noony noony nooooooo

dex_sda posted:

All I know about CCD tech is the stuff we use in telescopes. Those sensors are state of the art stuff that can detect individual photons and is designed for brutally long exposition (we're talking literal hours), but it needs to be hooked up to denoisers up the wazoo - a lot of setting up a telescope these days is getting accurate background so you can filter the noise out. If the background is overwhelming (light pollution being the prime problem) then there's nothing you can do of course, but you'd be surprised how much you can ENHANCE the image if you do things right

Do you actually have to chant "ENHANCE" as you do it, because that would be rad

(I love this photography derail!)

dex_sda
Oct 11, 2012


atomicgeek posted:

Do you actually have to chant "ENHANCE" as you do it, because that would be rad

(I love this photography derail!)

it's more drinking a lot of coffee staring at a laptop screen in a shroud while you're shivering from the brutal cold in the dark because you open the dome so that it cools the ccd so there's less circuitry-induced noise in the detectors, all while you hope you calibrated everything right when taking the background photo

Antonymous
Apr 4, 2009


4K is already past visual sensitivity to real-world images with motion, depth of field and so on at normal viewing distances. Usually they define 4K viewing distance as being able to tell a white pixel from a black pixel, e.g. read text. HDR is probably the one place left to make a big change, but idk if I want to feel my eyes sting when an actor rakes a flashlight across the lens in a dark scene (I've watched footage on prototype HDR screens that do exactly this; even footage of christmas lights hurt to look at). Color gamut and bit depth don't yet reproduce real-world stuff at the limit of human sensitivity, but color is where humans are much less sensitive, and there are practical reasons not to work on that problem. I'd rather have less lossy compression and a 2K image any day.

We're already in the age of the biological limit. A modern Blu-ray will not be obsolete the way a VHS is. When 8K is standard they will tell you 16K is needed, but it's not (for watching movies/TV).


Yes of course you are right, and anyway in real-world cases you gain more in the highlights than you lose in the shadows, and that extended dynamic range is the point we were talking about, which I forgot.
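Back-of-envelope check of the viewing-distance claim above (a sketch with assumed numbers: a 65" panel, a 2.5 m couch, and the common ~1 arcminute figure for 20/20 acuity):

```python
import math

def pixel_arcmin(diag_inches, h_pixels, distance_m, aspect=(16, 9)):
    """Angular size of one pixel, in arcminutes, for a viewer at distance_m."""
    w, h = aspect
    width_m = diag_inches * 0.0254 * w / math.hypot(w, h)  # panel width
    pitch_m = width_m / h_pixels                           # one pixel
    return math.degrees(math.atan2(pitch_m, distance_m)) * 60

# 65" panel viewed from 2.5 m
px_4k = pixel_arcmin(65, 3840, 2.5)
px_8k = pixel_arcmin(65, 7680, 2.5)
```

With those numbers a 4K pixel subtends about half an arcminute, already below the usual acuity figure, which is the poster's point; the 8K pixel is half that again.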

dex_sda
Oct 11, 2012


btw gently caress elon musk, motherfucking moron made ground-based astronomy, esp. radio astronomy, so much harder. you already have so much background to keep in mind and they introduced another really stupid source of it

Shame Boy
Mar 2, 2010

dex_sda posted:

it's more drinking a lot of coffee staring at a laptop screen in a shroud while you're shivering from the brutal cold in the dark because you open the dome so that it cools the ccd so there's less circuitry-induced noise in the detectors, all while you hope you calibrated everything right when taking the background photo

I thought astronomy cameras all had Peltier or Stirling cryocoolers these days, I know the one mikeselectricstuff took apart did anyway

...or well, that was a DNA camera (yes really, apparently that's how a lot of DNA sequencers work: they make the DNA base pairs glow different colors and then film the flashes of light), but he mentioned it was the same kind of camera used for astronomy in that it could detect ultra-low light levels, had an absurd resolution, and needed poo poo like the cryocooler to get the noise floor down.

Morbus
May 18, 2004

Encrypted posted:

Oh cool, I wasn't really thinking about it in a 3D sort of way but it makes sense.

Guess that also explains why some people went nuts over the Foveon sensors, where the photosites for each color are stacked by wavelength instead of using the traditional Bayer pattern. The per-pixel sharpness is just stunning on those, without any sort of moiré artifacts.


That being said, the future seems to be something like high-refresh 8K on largeish TVs with HDR, where our eyes become the limit and can't discern any more pixels or brightness levels. And then we can finally stop buying new TVs because there's not much else beyond those metrics to improve on.

This but with regular TV size and price.
https://www.tomshardware.com/uk/news/sharp-unveils-new-120inch-8k-display-up-to-120hz-hdmi-21-2048-dimming-zones

If you make the pixel pitch really small, comparable to the wavelength of light, you can reproduce not just the amplitude but also the phase of a recorded scene and do real-time holography. So we won't be able to stop buying new TVs until we're at around 1000K...
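Rough arithmetic behind that pixel count (illustrative numbers only — green light and a ~15° viewing cone are my assumptions, not the poster's): the grating equation ties the pixel pitch to the largest angle the display can diffract light into.

```python
import math

# Grating equation: a pixel pitch p can steer light out to an angle theta
# where sin(theta) = wavelength / (2 * p).  Solve for p, then count how
# many pixels fit across the screen.
wavelength = 550e-9        # green light, meters
theta = math.radians(15)   # modest viewing-cone half-angle (assumed)
screen_width = 1.2         # meters, roughly a 55" 16:9 panel

pitch = wavelength / (2 * math.sin(theta))  # ~1 micron
pixels_across = screen_width / pitch        # horizontal pixel count
```

With those numbers the pitch comes out around a micron and the horizontal count lands in the low millions, i.e. the same order as the 1000K figure; demand a wider diffraction angle and the count climbs fast.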

On the sensor side, some people would say we have enough megapixels, but oh ho ho. If you had a sensor of a few million megapixels, divided into a few million photosites, each consisting of its own multi-megapixel sensor in the focal plane of a microlens, you could record the light field at the sensor surface and reconstruct the image that would have been created by any lens at any aperture equal to or smaller than the sensor area. So you could have a camera the size of your phone, with a medium-format sensor, capable of taking the same image as an 8mm or a 600mm lens, with whatever depth of field you like, without any physical lens.
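A toy sketch of the reconstruct-any-image idea (a hypothetical shift-and-add refocus over sub-aperture views, assuming numpy — real plenoptic pipelines interpolate subpixel shifts and are far more involved):

```python
import numpy as np

def refocus(subviews, shift_per_view):
    """Shift-and-add refocusing over a grid of sub-aperture views.

    subviews: dict mapping (u, v) aperture coordinates -> 2-D image.
    shift_per_view: pixels of shift per unit of (u, v); choosing this
    value picks the synthetic focal plane after the fact.
    """
    acc = None
    for (u, v), img in subviews.items():
        shifted = np.roll(img, (int(round(v * shift_per_view)),
                                int(round(u * shift_per_view))), axis=(0, 1))
        acc = shifted if acc is None else acc + shifted
    return acc / len(subviews)

# toy light field: 3x3 grid of views of a point source whose apparent
# position shifts 1 px per unit of aperture offset (its disparity)
views = {}
for u in (-1, 0, 1):
    for v in (-1, 0, 1):
        img = np.zeros((16, 16))
        img[8 - v, 8 - u] = 1.0   # point shifts opposite the aperture offset
        views[(u, v)] = img

in_focus = refocus(views, shift_per_view=1.0)   # shifts cancel the disparity
out_focus = refocus(views, shift_per_view=0.0)  # point smears across 9 px
```

Picking `shift_per_view` selects which depth lands in focus; points at other depths get averaged across views into blur, which is the "whatever depth of field you like" trick.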

Shame Boy
Mar 2, 2010

dex_sda posted:

btw gently caress elon musk, motherfucking moron made ground-based astronomy, esp. radio astronomy, so much harder. you already have so much background to keep in mind and they introduced another really stupid source of it

Yeah, I heard there's a lot of images being ruined by his stupid constellation of satellites flying across the field of view and leaving a big light skidmark. I hadn't heard about the radio astronomy thing, but it makes sense they'd mess that up too.

I think my favorite radio astronomy interference case is this one though:

quote:

After more than four years of searching, researchers using the Parkes radio telescope in New South Wales, Australia, have identified the source of some mysterious signals: a microwave oven in the facility’s break room.

...

Writing in a report published in April on the preprint server arXiv, the astronomers at Parkes described the meticulous hunt for the source of “perytons”, fleeting bursts of radio signals that have been intermittently detected by the telescope since 1998. The noise was interfering with their main mission: to understand the source of fast radio bursts (FRBs), brief and elusive signals with possible extragalactic origins.

Emily Petroff at Swinburne University of Technology in Melbourne, Australia, who is first author of the paper, says that the team knew early on that the perytons had an earthly explanation. But they nailed the kitchen appliance as the culprit when tests confirmed that the peryton signals — but not the FRBs — could be replicated by simply opening the door of the microwave as the oven was running and if the telescope was pointing in a certain direction.

If someone opens the door of the microwave in the break room before the timer runs out, enough energy leaks in the split second before it shuts off to swamp the telescope. And they were probably microwaving fish too I bet :argh:

dex_sda
Oct 11, 2012


Shame Boy posted:

I thought astronomy cameras all had Peltier or Stirling cryocoolers these days, I know the one mikeselectricstuff took apart did anyway

...or well, that was a DNA camera (yes really, apparently that's how a lot of DNA sequencers work: they make the DNA base pairs glow different colors and then film the flashes of light), but he mentioned it was the same kind of camera used for astronomy in that it could detect ultra-low light levels, had an absurd resolution, and needed poo poo like the cryocooler to get the noise floor down.

They do, but it was a CCD-retrofitted telescope so the cooling could have trouble sometimes. You gotta open the dome anyway, so we just opened the doors to get more air in too. Also you want the optics themselves to cool down close to the outside temp so you don't get stuff like condensation in the middle of the photo.

50 meters over, there were (tenured) people working on a bigger telescope with remote control, and us students had to work in really crappy conditions lmfao.

God I miss it. I loved the cold, loved the awful campus provided coffee, loved the time when the rest of the group would go home after the mandatory class and I would be alone with the sound of crickets. Also hooked up with an astronomer lady for a while thanks to observation nights we shared. Good times

dex_sda
Oct 11, 2012


Shame Boy posted:

Yeah, I heard there's a lot of images being ruined by his stupid constellation of satellites flying across the field of view and leaving a big light skidmark. I hadn't heard about the radio astronomy thing, but it makes sense they'd mess that up too.

I think my favorite radio astronomy interference case is this one though:


If someone opens the door of the microwave in the break room before the timer runs out, enough energy leaks in the split second before it shuts off to swamp the telescope. And they were probably microwaving fish too I bet :argh:

Skidmark is right, you try to collect individual photons and suddenly you've got a line of a dozen satellites fingerpainting your drat data. I haven't taken a photo in a long while, but even before Starlink it was a problem, and the non-Musk sats were actually made not to mess astronomy up.

The worst part is that it's primarily a problem for aspiring students working on mediocre equipment. Large telescopes have software ready to remove these kinds of artifacts, but a fresh student getting cold on a summer's night will get ruined photos without knowing how to fix them, and lose patience.

CommonShore
Jun 6, 2014

A true renaissance man


Antonymous posted:

digital is way more linear than tape

Not in the sense that tape is harder to seek and search through. He writes long instrumentals and wants people to just sit and listen.

You're also arguing with a guy (him) who isn't here, and who writes and records music in an isolated log cabin and then sells the limited-edition tapes under pseudonyms. I'm just pointing out that some artists have preferences for tape.

Ruffian Price
Sep 17, 2016

You were talking about time progression, they were talking about the dynamic response.

indigi
Jul 20, 2004

how can we not talk about family
when family's all that we got?
cameras are boring and suck poo poo

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

indigi posted:

cameras are boring and suck poo poo

turn on your viewfinder

JackBandit
Jun 6, 2011
I like to do all my photography with a 2003 flip phone


BonHair
Apr 28, 2007

I like how we started at "photography is 90% skill, all that gear is just icing on top" and went full gear mode. Over in the guitar thread we had a pretty good discussion on how discussing gear is basically useless; the point is playing something cool. But discussing gear is so much easier, so talk about gear dominates discussion, leading people to buy more gear to sound good instead of figuring out how to sound good by playing cool stuff.

My point is that the nature of discussing hobbies leads to superfluous spending. Capitalism is cool and good.
