|
Perhaps they have some reason to avoid allocating many small chunks of memory and want the caller to simply provide one huge blob they can stuff everything into? Such a god-object is a useful pattern if you are implementing a C library with a fair amount of state and you want the user of the library to take care of the allocation (e.g. so they can stuff it in a secure memory area on devices where such a thing exists). It avoids having to design some fancy allocation system. But okay - you speak of a game, so yeah... game programmers, man.
|
# ? Jan 3, 2014 08:18 |
|
QuarkJets posted:Why doesn't it scale? You travel with your own atomic clock that stays in your reference frame. From time to time your computer checks in with the space system's central atomic clock to see how far you're off by and makes an adjustment accordingly.

QuarkJets posted:It's not going to be exact, but time-keeping between two non-inertial frames where you don't necessarily know your velocity relative to the central clock that well is going to be impossible. The atomic clock is basically like having a clock tower; each town (space system) gets its own and from time to time some people get together to make sure that all of the clocks are running about right. I don't see us being able to do much better than that with the technology that exists today.

QuarkJets posted:And I hardly see what's terrestrial about a system of atomic clocks orbiting stars so that spaceship pilots can have some reasonably accurate way of telling time, but okay
|
# ? Jan 3, 2014 11:45 |
|
ninjeff posted:What do you do when you've been recording events for one hundred years since your last sync (and sending those on to other systems), the spaceship from Earth arrives, and the updated clock tells you only fifty years have passed?

ninjeff posted:This was my point. Measurable time is a local phenomenon and you can't get anywhere while holding onto it as a universal concept.

ninjeff posted:The terrestrial part is the requirement to have a central universal clock. Let's say the people of Sirius V don't receive a clock from Earth for a lot longer than expected. At what point should they decide that the Earth is destroyed (or conquered by anti-space extremists) and that they need to organise a new universal clock with the other major planets?
|
# ? Jan 3, 2014 15:07 |
|
Also cool how if we move to another planet one day, or find other life, you expect to just say "okay, we're keeping our time scale" even though their day cycle is nothing similar. Nothing makes sense, since our entire notion of time is built around the current time of day on Earth, so one second or hour or "day" is meaningless anywhere but Earth, more so if you're in space. The only "correct" time is time relative to the big bang or whatever, but there's no standardised format for that, and it'd probably be a hard number to store with fine precision without taking up a lot more space than a standard UNIX timestamp.

Edit: Actually I think that's my problem. I don't like how "Time" means "Time Of Day" instead of "Time". Time of day is a useless marker that serves only to guide us in activities like work, sleep, eating, etc, rather than something that should be used in computers.

Jewel fucked around with this message at 16:29 on Jan 3, 2014 |
# ? Jan 3, 2014 15:16 |
|
2^64 seconds is 584 billion years, so we could actually use "seconds since the big bang". Figuring out when the big bang was might be a bit harder though.
|
# ? Jan 3, 2014 16:24 |
|
evensevenone posted:2^64 seconds is 584 billion years, so we could actually use "seconds since the big bang". Easy, it was t=0. Psh, next question please.
|
# ? Jan 3, 2014 16:43 |
|
Factor Mystic posted:Easy, it was t=0. Psh, next question please.

Then I guess the real challenge is figuring out what time it is.

"Hey, what time is it?"
"We... don't know yet."
|
# ? Jan 3, 2014 16:45 |
|
The internet should obviously use Swatch Internet Time, the best time.
|
# ? Jan 3, 2014 16:45 |
|
ninjeff posted:What do you do when you've been recording events for one hundred years since your last sync (and sending those on to other systems), the spaceship from Earth arrives, and the updated clock tells you only fifty years have passed?
|
# ? Jan 3, 2014 16:50 |
|
Otto Skorzeny posted:Sure you can. Make a distinction between physical seconds and logical seconds.

evensevenone posted:2^64 seconds is 584 billion years, so we could actually use "seconds since the big bang".

My general relativity is a little spotty, but isn't this impossible?
|
# ? Jan 3, 2014 16:59 |
|
The time in any frame of reference can be converted to the time in any other frame. The problem for software is figuring out what the frames are. GPS satellites have it easy, because they can do everything relative to Earth. So pretty much this:

ManoliIsFat posted:Your db transaction log backups are gonna be so hosed.
|
# ? Jan 3, 2014 17:55 |
|
Jewel posted:Also cool how if we move to another planet eventually one day or find other life you expect to just say "okay we're keeping our time scale" even though their day cycle is nothing similar and nothing makes sense since our entire time is built around current time of day on earth, so one second or hour or "day" is meaningless anywhere but earth, more so if you're in space. The only "correct" time is time relative to the big bang or whatever, but there's no standardised formats of that, and generally it'd probably be a hard number to store with small precision without taking up a lot more space than a standard UNIX timestamp.

Seconds are based on vibrations of the cesium atom. These vibrations are accurate regardless of gravity. It's a universal constant.
|
# ? Jan 3, 2014 18:16 |
|
Jewel posted:Edit: Actually I think that's my problem. I don't like how "Time" means "Time Of Day" instead of "Time". Time of day is a useless marker that serves only to guide us in [a bunch of useful things], rather than something that should be used in computers.
|
# ? Jan 3, 2014 18:48 |
|
ninjeff posted:What do you do when you've been recording events for one hundred years since your last sync (and sending those on to other systems), the spaceship from Earth arrives, and the updated clock tells you only fifty years have passed?

Get low-resolution time information from a common reference frame, like observing some pulsars, until you get a signal that allows you to correct for the much more manageable difference. Only a serious accident or equipment failure should cause more than a few minutes of skew from the 'universal standard absolute time' or whatever.
|
# ? Jan 3, 2014 19:32 |
|
SavageMessiah posted:Then I guess the real challenge is figuring out what time it is "gently caress it, lets just count from 1 January 1970!"
|
# ? Jan 3, 2014 19:55 |
|
pseudorandom name posted:"gently caress it, lets just count from 1 January 1970!" But when is that?
|
|
# ? Jan 3, 2014 21:15 |
|
Manslaughter posted:But when is that? 1388782132 seconds ago from this post.
|
# ? Jan 3, 2014 21:48 |
|
evensevenone posted:2^64 seconds is 584 billion years, so we could actually use "seconds since the big bang".

64 bits clearly isn't enough: http://en.wikipedia.org/wiki/Timeline_of_the_far_future
|
# ? Jan 3, 2014 22:11 |
baquerd posted:1388782132 seconds ago from this post.

Impossible - the global timezone range spans 26 hours (UTC-12 to UTC+14). There is no one point in time that can be considered January 1, 1970 for every place on Earth.
|
|
# ? Jan 3, 2014 22:40 |
|
That sure was a funny joke you've drilled through the floor.
|
# ? Jan 3, 2014 22:52 |
|
Okay yes you say a bunch of useful things, but not in terms of computers/programming. I'm not saying get rid of time of day, I'm just saying stop using time of day in things like transaction databases.
|
# ? Jan 4, 2014 02:02 |
|
You guys may laugh, but I'm going to be the smuggest skeleton in a billion years.
|
# ? Jan 4, 2014 02:14 |
|
Jewel posted:Okay yes you say a bunch of useful things, but not in terms of computers/programming. I'm not saying get rid of time of day, I'm just saying stop using time of day in things like transaction databases.

But that doesn't solve the problem, it just moves it. Humans like to think in terms of ambiguous, terrible time. "Two weeks from now", not "12408944134 seconds since the big bang". "Tomorrow at 4:30PM", not "2246453 seconds since January 1st, 1970".

So, if for instance I'm making an appointment book app, yes, I have to deal with the fact that an hour will appear twice in the list, and design my UI around that fact. Maybe add a simple hint telling them "no, this isn't a bug, this is the second hour because this is when DST changes". When they enter "2:30AM on Saturday in China", warn them to be more specific, because Tibet and Xinjiang traditionally use UTC+6, not UTC+8, even though that's not the "official" timezone.
|
# ? Jan 4, 2014 02:28 |
|
i think we can all agree that programmers would cherish the day computers stopped having to interact with humans or the extant universe
|
# ? Jan 4, 2014 02:32 |
|
JawnV6 posted:i think we can all agree that programmers would cherish the day computers stopped having to interact with humans or the extant universe I would finally be able to write everything in Haskell.
|
# ? Jan 4, 2014 03:56 |
|
ninjeff posted:What do you do when you've been recording events for one hundred years since your last sync (and sending those on to other systems), the spaceship from Earth arrives, and the updated clock tells you only fifty years have passed?

You're extending this to have problems that my post didn't have: individual ships wouldn't be broadcasting times to systems, and systems wouldn't be broadcasting times between each other. Each star system has its own clock, and that's the local time. It'd be impossible and pointless to try to sync up all of the star systems to each other perfectly. Using a series of atomic clocks does allow you to keep extremely accurate time within each system that has one, which is important. Regular sync-ups (the kind that I described in my post) prevent the problem that you've stated, and even if you were going for 100 years without a sync-up for some reason and then get told that 50 years have passed then you just need to modify your records for time dilation: compress all of your timestamps once.

quote:This was my point. Measurable time is a local phenomenon and you can't get anywhere while holding onto it as a universal concept

I didn't say central universal clock. Even in the post that you're quoting I said that each system would have its own.

e: I don't know why you keep trying to put words in my mouth, but stop it

QuarkJets fucked around with this message at 05:45 on Jan 4, 2014 |
# ? Jan 4, 2014 05:35 |
|
QuarkJets posted:Regular sync-ups (the kind that I described in my post) prevent the problem that you've stated, and even if you were going for 100 years without a sync-up for some reason and then get told that 50 years have passed then you just need to modify your records for time dilation: compress all of your timestamps once.

But what if the compression isn't uniform?
|
# ? Jan 4, 2014 05:46 |
|
The Gripper posted:But what if the compression isn't uniform?

If you're keeping accurate records of your acceleration with respect to the star's clock (not hard), then you can calculate what the time dilation was at any given point. If you're not, then you just have to use the average velocity of your reference frame (calculated from the difference between the two clocks), which is less accurate, but there's not really an alternative.
|
# ? Jan 4, 2014 06:18 |
|
What if like, what one second is for me isn't the same as one second for you, though? Really makes you think.
|
# ? Jan 4, 2014 06:32 |
|
JawnV6 posted:i think we can all agree that programmers would cherish the day computers stopped having to interact with humans or the extant universe Yeah, then we can get started on the real issues like endianness
|
# ? Jan 4, 2014 06:41 |
|
JawnV6 posted:i think we can all agree that programmers would cherish the day computers stopped having to interact with humans or the extant universe Isn't that why people enter academia?
|
# ? Jan 4, 2014 07:25 |
|
evensevenone posted:Isn't that why people enter academia? Yep. Academia here, paid (poorly) to write in Haskell. Of course, some of that Haskell is written by a post-doc who used to be a Fortran programmer. I could probably grab a few things for this thread...
|
# ? Jan 4, 2014 10:22 |
|
Ithaqua posted:Watch this video by Jon Skeet. He talks about time starting at about 17 minutes, but the entire thing is pretty interesting.

His combining character comes *before* the letter and not *after*:

>>> print("Les Mis\u0301erables")
Les Miśerables
|
# ? Jan 4, 2014 12:27 |
|
JawnV6 posted:i think we can all agree that programmers would cherish the day computers stopped having to interact with humans or the extant universe
|
# ? Jan 4, 2014 15:20 |
|
Related to the discussion we had a couple of weeks ago about how successful a decompiler can be, this appeared today on /.: http://valverde.me/2014/01/03/reverse-engineering-my-bank's-security-token/#.UslsCXnGyvJ

tl;dr: some Brazilian guy didn't like the Android app his bank provides to generate OTP codes, so he decompiled it and re-implemented the whole thing to run on an Arduino-compatible board, showing the codes on a small LCD.
|
# ? Jan 5, 2014 15:32 |
|
So now that we've established that timezones are awful, how about people's terrible understanding of how computers encode human language? http://lucumr.pocoo.org/2014/1/5/unicode-in-2-and-3/

It seems there are some terribly brain-damaged people in Python land, and they're getting to influence the Unicode handling.
|
# ? Jan 5, 2014 21:34 |
|
The entire Unicode system is a minefield, consider that Python works on both Windows and OSX and both of those take different interpretations of Unicode coding patterns.
|
# ? Jan 5, 2014 21:40 |
|
It's always annoying that Windows solely uses UTF-16 for anything that handles text in Win32. WideCharToMultiByte and MultiByteToWideChar work flawlessly for doing conversions to and from UTF-16, but it's still a pain in the butt to not just be able to say "all our text is UTF-8" and throw everything into char strings.
|
# ? Jan 5, 2014 22:01 |
|
|
MrMoo posted:The entire Unicode system is a minefield, consider that Python works on both Windows and OSX and both of those take different interpretations of Unicode coding patterns. Wat?
|
# ? Jan 5, 2014 22:03 |