|
I thought they meant 'linear' like you had to play it from end to end (or use fastforward or rewind) and couldn't seek to arbitrary points. But you could just make a CD with one track that would be somewhat similar to that.
|
# ? Sep 11, 2020 22:43 |
|
taqueso posted:I thought they meant 'linear' like you had to play it from end to end (or use fastforward or rewind) and couldn't seek to arbitrary points. But you could just make a CD with one track that would be somewhat similar to that. That's entirely possible, though that'd be a sorta unusual use of the word linear in an audio context. Also, they made plenty of tape players with track skip
|
# ? Sep 11, 2020 22:46 |
|
and you could push FF and play simultaneously and just listen for the empty spot, miss the start, and then go back a lil too far
|
# ? Sep 11, 2020 22:49 |
Cold on a Cob posted:i know, poor little rich computer touchers, but lmao i was wondering which tech companies would do this first Of course labor gets its price adjusted downward at the drop of a hat. Not upward though, that requires paperwork and we only have 300% profits this quarter and... And then if you live in another country, nope no price arbitrage for you, pay the American price or gently caress off
|
|
# ? Sep 11, 2020 22:56 |
|
Cold on a Cob posted:ok let's move Bezos to Milwaukee then so amazon can pay out more dividends hm? stay out of my town, techfuckers
|
# ? Sep 12, 2020 03:06 |
|
Cold on a Cob posted:"if they live somewhere cheaper than it's fair they get paid less" legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy"
|
# ? Sep 12, 2020 04:09 |
|
Paradoxish posted:legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy" It's usually the other way around, where people get a location allowance or something to compensate for increased cost of living. IMO it makes sense in big professions that have collective agreements like teachers or nurses.
|
# ? Sep 12, 2020 04:30 |
|
bike tory posted:It's usually the other way around, where people get a location allowance or something to compensate for increased cost of living. IMO it makes sense in big professions that have collective agreements like teachers or nurses. If you get money for moving somewhere expensive why wouldn't you lose money when you move somewhere cheaper huh smartypants
|
# ? Sep 12, 2020 04:56 |
|
|
Ruffian Price posted:Friend of mine was working in a regional branch of a tech company that generally paid the local average for their offices in different countries, but everybody got the same per diem allowance for travel. HQ coders made a loving petition and got that lowered for all the others because they didn't think it was fair. Lmao I can't imagine the kind of entitled idiot that has the balls to publicly petition their company to pay their coworkers less so they feel better about themselves. I hope they got their tires slashed every time people had to fly in to HQ from the branch offices.
|
# ? Sep 12, 2020 06:57 |
|
Antonymous posted:Dynamic range is the difference between the darkest and brightest detail a camera can capture in one image, defined as 'stops' or doublings of light, so 3 stops = 8x more light, 3 doublings. What's the brightest light a camera can see, and how many times do we cut that brightness in half before it's the darkest a camera can see? Right now digital can do about 14 stops. You'll see all kinds of numbers, from 12 to 15; although there is an official standard, basically everyone makes up their own. It was common knowledge that film negative had 12-13 stops, so digital has been claiming 13+ for decades despite wildly differing results between cameras; you always have to test for yourself. awesome. it kind of sounds as if film is a totally different technology than digital in the end. honestly, i think with all the recent advances in condensed matter physics, film may have more room for improvement at the moment. if anyone cares to explore that avenue and also if we retain the capacity to build this sort of poo poo I guess
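(Not from the post, just my own sanity check of the stop arithmetic in Python; the function names are made up for illustration:)

```python
import math

# Each "stop" is a doubling of light, so the contrast ratio between
# the brightest and darkest recordable detail is 2 ** stops.
# These helper names are illustrative, not from any camera API.

def stops_to_ratio(stops: float) -> float:
    return 2.0 ** stops

def ratio_to_stops(ratio: float) -> float:
    return math.log2(ratio)

print(stops_to_ratio(3))     # 8.0, i.e. "3 stops = 8x more light"
print(stops_to_ratio(14))    # 16384.0, ballpark for a modern digital sensor
```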
|
# ? Sep 12, 2020 07:14 |
|
Paradoxish posted:legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy" tbh i assume at least some of this is just bought and paid for. the tech companies can afford it and at least some of them no doubt simply think of it as social engineering
|
# ? Sep 12, 2020 07:17 |
|
Paradoxish posted:legit do not understand the kind of brain damage that makes someone think this way, especially because it is absolutely, 100%, beyond a shadow of a doubt certain that this person believes in the concept of "meritocracy" Same. It's not like the value produced by their work is any less. Why should the company steal more of it just because of where they happen to live?
|
# ? Sep 12, 2020 07:50 |
|
|
Kreeblah posted:Same. It's not like the value produced by their work is any less. Why should the company steal more of it just because of where they happen to live? Well you see the meritocracy only kicks in after the baseline level your employer is paying just to keep you alive and housed enough to come into work, which is unfair on the company if you think about it.
|
# ? Sep 12, 2020 08:14 |
|
Antonymous posted:Dynamic range is the difference between the darkest and brightest detail a camera can capture in one image, defined as 'stops' or doublings of light, so 3 stops = 8x more light, 3 doublings. What's the brightest light a camera can see, and how many times do we cut that brightness in half before it's the darkest a camera can see? Right now digital can do about 14 stops. You'll see all kinds of numbers, from 12 to 15; although there is an official standard, basically everyone makes up their own. It was common knowledge that film negative had 12-13 stops, so digital has been claiming 13+ for decades despite wildly differing results between cameras; you always have to test for yourself. I just want to add to this offtopic nerdpost with another offtopic nerdpost to point out that, fundamentally, digital image sensors do not have any limits re: overexposure or clipping. In principle, with fast enough readout, you can just "empty" and record the value from each pixel well before it clips, then immediately start collecting again and repeat this process for as long an integration period as you want. In this way, you can record accurate intensity values at each photosite for arbitrarily long "exposure times" and never clip, even for tiny pixels. Film can't do this because, obviously, you can't read out, un-expose, and re-expose the film in real time; once a particular region of film is fully exposed, that's it. HDR photography kind of does this (in an often badly implemented way), but consumer imaging sensors & camera software don't implement this kind of functionality more directly for a few reasons. One is that nobody really cares; the 10+ stops of dynamic range afforded by even lovely modern sensors is "good enough" for this just not to be a huge issue most of the time.
A second reason is that fast & low noise readout with continuous integration is more complicated, consumes more power, and requires more memory, and you can get similar results in scenes with high contrast just by compositing a couple different exposures if you really need to. But in like, telescopes or earth-facing satellites or things like that, nobody ever worries about "oh my sensor will clip if I expose for too long" because that just fundamentally isn't a thing on a sensor that you can read out and re-expose.
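A toy sketch of that "empty the well before it clips" idea, with made-up numbers for full-well capacity and photon flux, and ignoring the read noise that each extra readout would add in a real sensor:

```python
# Toy model: a photosite clips at FULL_WELL electrons. One long
# exposure loses everything above the well, but reading out and
# resetting ten times during the same exposure, then summing the
# counts, records the true value. All numbers are illustrative.

FULL_WELL = 1000     # electrons before the pixel well clips
true_signal = 5000   # photoelectrons arriving over the full exposure

single_exposure = min(true_signal, FULL_WELL)             # clips at 1000

sub_reads = [min(true_signal / 10, FULL_WELL) for _ in range(10)]
summed = sum(sub_reads)                                   # exact: 5000.0

print(single_exposure, summed)
```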
|
# ? Sep 12, 2020 08:30 |
|
Kreeblah posted:Same. It's not like the value produced by their work is any less. Why should the company steal more of it just because of where they happen to live? When someone else is losing, you're winning, duh
|
# ? Sep 12, 2020 08:31 |
|
Motion picture needs precise control of motion blur. Your integration time has to be 1/48th of a second, no pauses, no skips, no double exposure, and ideally completed within 5ms or less (i.e. little rolling shutter). I'm a camera user, not a camera maker, so I'm just teaching myself this stuff, not an expert, but I won't shoot with a camera that uses any of those HDR schemes, especially behind the scenes where I can't control them. So the limit for my work is probably full well capacity and will be for a while.
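(For anyone wondering where 1/48th comes from: it's the conventional 180-degree shutter at 24 fps, i.e. each frame exposed for half the frame interval. Quick sketch of the arithmetic, my own framing:)

```python
# Shutter-angle arithmetic: exposure time = (angle / 360) / fps.
# At 24 fps with a 180-degree shutter that's half the frame
# interval, i.e. 1/48 s.

fps = 24
shutter_angle_deg = 180

exposure_s = (shutter_angle_deg / 360) / fps
print(exposure_s)   # 0.0208333... = 1/48 s
```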
|
# ? Sep 12, 2020 08:42 |
|
Antonymous posted:Motion picture needs precise control of motion blur. Your integration time has to be for 1/48th of a second, no pauses no skips no double exposure, and ideally completed within 5ms or less (i.e. little rolling shutter). So, fundamentally, there is no difference in recorded scene intensity values or motion blur between 1.) exposing a sensor for 1/48th of a second exactly, then reading out the recorded counts, and 2.) exposing a sensor for 1/4800 of a second, reading out (very) quickly, doing this 99 more times, and adding all the recorded counts together, provided the "gap" due to readout is so fast as to be entirely negligible (which it can be; see for example ultra high speed digital video cameras). Like, if I give you a high speed video shot at 4800 frames per second instead of 48fps, you can (in principle) reconstruct the 48fps video exactly from the former, provided that the readout and ADC noise are comparable. That last bit is tricky, but it's just a cost and engineering complexity issue, not a fundamental barrier.
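The reconstruction claim is just addition; here's a toy sketch, assuming zero readout gaps and zero noise (frame values are arbitrary photon counts, not real footage):

```python
# Sum consecutive groups of 100 sub-frames from a 4800 fps capture to
# rebuild the 48 fps frames exactly, motion blur included, under the
# zero-gap, zero-noise assumption stated above.

high_speed = [i % 7 + 1 for i in range(4800)]   # one second at 4800 fps

frames_48 = [sum(high_speed[k:k + 100]) for k in range(0, 4800, 100)]

print(len(frames_48))                     # 48 reconstructed frames
assert sum(frames_48) == sum(high_speed)  # no photons lost in regrouping
```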
|
# ? Sep 12, 2020 08:56 |
|
Encrypted posted:Film's wide dynamic range came from silver grains in various sizes that can react at different rates and essentially soak up large amount of photons. I left a lot of technical stuff out of my other post because I felt I'd said enough. But I think it's not just a random assortment of grain sizes; the color negative emulsion is several layers, with previous layers blocking light to following layers, both because grain casts shadows onto grains below it and because there are color filters built in between layers of the emulsion, which means really bright light gets modulated down before it reaches them. It's really sophisticated and not something digital can take advantage of, and it causes some of the unique flaws of color film that people like, such as the softening of warm colors.
|
# ? Sep 12, 2020 08:56 |
|
Morbus posted:So, fundamentally, there is no difference in recorded scene intensity values or motion blur between: edit: I know this technique does work for stills, so my objection is more about implementing it for motion; I deleted my general technical objection. If it's a matter of quality of components, there is a rental market for $100,000+ cameras on big movies, so I wonder why it's not out there. If something out there is using this for good motion reproduction, I'd love to call a rental house to test it. Antonymous has issued a correction as of 09:35 on Sep 12, 2020 |
# ? Sep 12, 2020 09:07 |
|
Antonymous posted:My understanding is, 1/4800 of a second integrated does not increase sensor dynamic range, you're just under exposing and frame averaging. This works where the sensor is linear, and if read noise is 0 like you said (in reality it isn't), and so fails in shadows where noise is not linear. Underexposing by 100x is going to gently caress up your image, even if you take 100 of them. Those highspeed cameras might take 2400 frames a second but they expose them correctly. I do test this stuff so if there's a camera that you think pulls this off with normal motion blur I'll call up a rental house and go play with it. So, it's possible to create a CMOS or CCD sensor that is remarkably linear even if underexposed by 8 stops as in the above example. You can get very close to the theoretical (shot noise limited) case. This is easier if you optimize your sensor and camera for this from the beginning, knowing that you need very high accuracy, low noise, and linear response for photosites that are collecting a small number of photons repeatedly and very quickly. Sensors based on pixel self-reset and asynchronous readout are some examples (like this: https://ieeexplore.ieee.org/abstract/document/7895171). *Assuming* you have solved the issues with linearity, readout noise, ADC noise, etc. (which, again, is doable), and that you are basically operating in a shot noise limited fashion, then "just underexposing and frame averaging" will not gently caress up your image, and will increase the dynamic range, but only insofar as it was limited by highlight clipping. You won't improve the SNR of the image at a given integration time, but you will eliminate the "problem" of photosites clipping. I'm not aware of any cine cameras that do this, and I'm not surprised. For one, doing it well almost requires compromising either the range or linearity of photosites when operated conventionally.
Second, cameras operated conventionally are basically good enough, with dynamic range that exceeds any display or projector, so nobody really cares. Finally, none of this was really practical until maybe the last 10 years, and professional digital cinematography is only about as old. That being said I expect this to become commonplace in still cameras eventually, and eventually cine cameras, because there is no fundamental reason not to do it.
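A sketch of the shot-noise bookkeeping behind "you won't improve SNR, but you won't clip either", assuming zero read noise as stipulated above (the photon counts are made up):

```python
import math

# With only shot noise, SNR = N / sqrt(N) = sqrt(N) for N photons,
# regardless of whether they arrive in one exposure or 100 summed
# sub-exposures: signal adds linearly, shot noise adds in quadrature.

total_photons = 1_000_000

snr_single = total_photons / math.sqrt(total_photons)

per_frame = total_photons / 100
snr_summed = (100 * per_frame) / math.sqrt(100 * per_frame)

print(snr_single, snr_summed)   # both 1000.0
```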
|
# ? Sep 12, 2020 09:39 |
|
Antonymous posted:I left a lot of technical stuff out of my other post because I felt I'd said enough. But I think it's not just a random assortment of grain sizes, the color negative emulsion is several layers, with previous layers blocking light to following layers both because grain casts shadows onto grains below it and because there are color filters built in between layers of the emulsion, which means really bright light gets modulated down before it reaches them. It's really sophisticated and not something digital can take advantage of, and causes some of the unique flaws of color film that people like, like the softening of warm colors. Oh cool, I wasn't really thinking about it in a 3D sort of way but it makes sense. Guess that also explains why some people went nuts over the Foveon sensors, where the photosite for each color is stacked depending on the wavelength instead of using the traditional Bayer pattern. The per-pixel sharpness is just stunning on those, without any sort of moiré defects. That being said, the future seems to be something like high refresh 8K on largeish TVs with HDR, where our eyes will become the limit and cannot discern any more pixels or brightness levels beyond those. And then we can finally stop buying new TVs because there's nothing much else beyond those metrics to improve on. This but with regular TV size and price. https://www.tomshardware.com/uk/news/sharp-unveils-new-120inch-8k-display-up-to-120hz-hdmi-21-2048-dimming-zones
|
# ? Sep 12, 2020 09:59 |
|
All I know about CCD tech is the stuff we use in telescopes. Those sensors are state of the art stuff that can detect individual photons and are designed for brutally long exposures (we're talking literal hours), but they need to be hooked up to denoisers up the wazoo - a lot of setting up a telescope these days is getting an accurate background so you can filter the noise out. If the background is overwhelming (light pollution being the prime problem) then there's nothing you can do of course, but you'd be surprised how much you can ENHANCE the image if you do things right. I've taken really sharp photos of very faint star clusters while just outside a city with a relatively small in-building telescope by working the photo all night, and our campus had a background analyst wizard who would spend weeks working the background and analysing the light pollution depending on the day of the week so he could squeeze out more fidelity when he actually took a photo. You can't really do that with analog plates because you need quantized, discrete data to do statistical denoising at the level required for astronomy. dex_sda has issued a correction as of 10:58 on Sep 12, 2020 |
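A toy version of that calibrate-then-stack workflow, with made-up ADU counts (not from any real instrument): subtract the measured background, then average many exposures so random noise shrinks roughly as 1/sqrt(N):

```python
import random
import statistics

# Subtract a calibrated background frame, then stack (average) many
# exposures so the random noise averages down toward zero and the
# faint source emerges. Values are illustrative ADU counts.

random.seed(1)       # deterministic for the example

BACKGROUND = 50.0    # calibrated sky + sensor background
STAR = 12.0          # faint source buried in the noise

frames = [BACKGROUND + STAR + random.gauss(0, 5) for _ in range(200)]

estimate = statistics.mean(f - BACKGROUND for f in frames)
print(round(estimate, 1))   # lands near the true 12.0
```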
# ? Sep 12, 2020 10:54 |
|
dex_sda posted:All I know about CCD tech is the stuff we use in telescopes. Those sensors are state of the art stuff that can detect individual photons and is designed for brutally long exposition (we're talking literal hours), but it needs to be hooked up to denoisers up the wazoo - a lot of setting up a telescope these days is getting accurate background so you can filter the noise out. If the background is overwhelming (light pollution being the prime problem) then there's nothing you can do of course, but you'd be surprised how much you can ENHANCE the image if you do things right Do you actually have to chant "ENHANCE" as you do it, because that would be rad (I love this photography derail!)
|
# ? Sep 12, 2020 10:55 |
|
atomicgeek posted:Do you actually have to chant "ENHANCE" as you do it, because that would be rad it's more drinking a lot of coffee staring at a laptop screen in a shroud while you're shivering from the brutal cold in the dark because you open the dome so that it cools the ccd so there's less circuitry-induced noise in the detectors, all while you hope you calibrated everything right when taking the background photo
|
# ? Sep 12, 2020 11:00 |
|
4k is already past visual sensitivity to real world images with motion, depth of field and so on for normal viewing. Usually they define 4k viewing distance as being able to tell a white pixel from a black pixel, e.g. read text. HDR is probably the one place to make a big change, but idk if I want to feel my eyes sting when an actor rakes a flashlight across the lens in a dark scene (I've watched footage on prototype HDR screens that do exactly this; even footage of christmas lights hurt to look at). Color gamut and bit depth are not yet reproducing at human sensitivity for real world stuff, but color is where humans are much less sensitive and there are practical reasons not to work on this problem. I'd rather have less lossy compression and a 2k image any day. We're already in the age of biological limit. A modern bluray will not be obsolete the way a VHS is. When 8k is standard they will tell you 16k is needed, but it's not (for watching movies / tv). Yes of course you are right, and anyway you can gain more highlight than you lose in shadows in real world cases, and that extended dynamic range is the point we were talking about, which I forgot.
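Back-of-envelope for the "past visual sensitivity" claim, with an assumed screen size and viewing distance (both mine, not from the post), against the rough ~1 arcminute the eye resolves:

```python
import math

# A 4K pixel at couch distance subtends well under ~1 arcminute.
# Screen width and viewing distance below are assumptions for the
# example, not measurements.

screen_width_m = 1.2    # a 55" 16:9 panel is about 1.2 m wide
pixels_across = 3840    # UHD "4K"
distance_m = 2.5        # typical living-room viewing distance

pixel_pitch_m = screen_width_m / pixels_across
pixel_arcmin = math.degrees(math.atan2(pixel_pitch_m, distance_m)) * 60

print(round(pixel_arcmin, 2))   # ~0.43 arcmin, below the eye's limit
```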
|
# ? Sep 12, 2020 11:06 |
|
btw gently caress elon musk, motherfucking moron made ground-based astronomy, esp. radio astronomy, so much harder. you already have so much background to keep in mind and he introduced another really stupid source of it
|
# ? Sep 12, 2020 11:06 |
|
dex_sda posted:it's more drinking a lot of coffee staring at a laptop screen in a shroud while you're shivering from the brutal cold in the dark because you open the dome so that it cools the ccd so there's less circuitry-induced noise in the detectors, all while you hope you calibrated everything right when taking the background photo I thought astronomy cameras all had Peltier or Stirling cryocoolers these days, I know the one mikeselectricstuff took apart did anyway ...or well, that was a DNA camera (yes really, apparently that's how a lot of DNA sequencers work: they make the DNA base pairs glow different colors and then film the flashes of light) but he mentioned it was the same kind of camera used for astronomy, in that it could detect ultra-low light levels, had an absurd resolution, and needed poo poo like the cryocooler to get the noise floor down.
|
# ? Sep 12, 2020 11:50 |
|
Encrypted posted:Oh cool, I wasn't really thinking about it in 3D sort of way but it make sense. If you make the pixel resolution really high, comparable to the wavelength of light, you can reproduce not just the magnitude but also the phase of a recorded scene and do real time holography. So we won't be able to stop buying new TVs until we're at around 1000K... On the sensor side, some people would say we have enough megapixels, but oh ho ho. If you had a few-million-megapixel sensor, divided into a few million photosites, each consisting of its own multi-megapixel sensor in the focal plane of a microlens, you could record the light field at the sensor surface and reconstruct the image that would have been created by any lens at any aperture equal to or smaller than the sensor area. So you could have a camera the size of your phone, with a medium format sensor, capable of taking the same image as an 8mm or 600mm lens, with whatever depth of field you like, without any physical lens.
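Rough arithmetic behind that "around 1000K" remark, using round assumed numbers (a 1 m wide display and ~500 nm green light), so it's an order-of-magnitude figure, not a spec:

```python
# Pixels on the order of a green-light wavelength (~500 nm) across a
# 1 m wide display: about two million columns, i.e. roughly "2000K",
# the same ballpark as the post's 1000K. Both inputs are assumptions.

wavelength_m = 500e-9
screen_width_m = 1.0

columns = screen_width_m / wavelength_m
print(int(columns))   # 2000000 columns
```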
|
# ? Sep 12, 2020 11:51 |
|
dex_sda posted:btw gently caress elon musk, motherfucking moron made ground-based astronomy esp. radio astronomy so much harder. you already have so much background to keep in mind and the introduced another really stupid source of it Yeah, I heard there's a lot of images being ruined by his stupid constellation of satellites flying across the field of view and leaving a big light skidmark. I didn't hear about the radio astronomy thing, though it makes sense they'd mess that up too. I think my favorite radio astronomy interference case is this one though: quote:After more than four years of searching, researchers using the Parkes radio telescope in New South Wales, Australia, have identified the source of some mysterious signals: a microwave oven in the facility’s break room. If someone opens the door of the microwave in the break room before the timer runs out, enough energy leaks in the split second before it shuts off to swamp the telescope. And they were probably microwaving fish too, I bet
|
# ? Sep 12, 2020 12:00 |
|
Shame Boy posted:I thought astronomy cameras all had peltier or sterling cryocoolers these days, I know that one mikeselectricstuff took apart did anyway They do, but it was a CCD-retrofitted telescope so the cooling could have trouble sometimes. You gotta open the dome anyway, so we just opened the doors to get more air in too. Also you wanted the optics themselves to cool down close to outside temp so you wouldn't get stuff like condensation in the middle of the photo. 50 meters over, there were (tenured) people working on a bigger telescope with remote control, and us students had to work in really crappy conditions lmfao. God I miss it. I loved the cold, loved the awful campus-provided coffee, loved the time when the rest of the group would go home after the mandatory class and I would be alone with the sound of crickets. Also hooked up with an astronomer lady for a while thanks to observation nights we shared. Good times
|
# ? Sep 12, 2020 12:21 |
|
Shame Boy posted:Yeah I heard there's a lot of images being ruined by his stupid constellation of satellites flying across the field of view and leaving a big light skidmark, didn't hear about the radio astronomy thing though but it makes sense they'd mess up that too. Skid mark is right, you try to collect individual photons and suddenly you've got a line of a dozen satellites fingerpainting your drat data. I haven't taken a photo in a long while, but even before starlink it was a problem, and the non-musk sats were actually made not to mess astronomy up. The worst is that it's primarily a problem for aspiring students who work on mediocre equipment. Large telescopes have programming ready to remove these kinds of artifacts, but a fresh student getting cold on a summer's night will get ruined photos without knowing how to fix them, and lose patience.
|
# ? Sep 12, 2020 12:27 |
|
Antonymous posted:digital is way more linear than tape Not in the sense that tape is harder to seek and search. He writes long instrumentals and wants people to just sit and listen. You're also arguing with a guy (him) who isn't here, and who writes and records music in an isolated log cabin and then sells the limited edition tapes under pseudonyms. I'm just pointing out that some artists have preferences for tape.
|
# ? Sep 12, 2020 14:56 |
|
You were talking about time progression, they were talking about the dynamic response.
|
# ? Sep 12, 2020 15:40 |
|
cameras are boring and suck poo poo
|
# ? Sep 12, 2020 16:42 |
|
indigi posted:cameras are boring and suck poo poo turn on your viewfinder
|
# ? Sep 12, 2020 16:45 |
|
I like to do all my photography with a 2003 flip phone
|
# ? Sep 12, 2020 16:49 |
|
I like how we started at "photography is 90% skill, all that gear is just icing on top" and went fully into gear mode. Over in the guitar thread we had a pretty good discussion on how discussing gear is basically useless; the point is playing something cool. But discussing gear is so much easier, so talk about gear dominates discussion, leading people to buy more gear to sound good instead of figuring out how to sound good by playing cool stuff. My point is that the nature of discussing hobbies leads to superfluous spending. Capitalism is cool and good.
|
# ? Sep 12, 2020 18:01 |