|
AFAIK the line of thought went like this:
1. Computers will glitch out when 2000 hits.
2. Computers have absolute control over Important Things.
3. Glitching computers will bring about THE END OF CIVILIZATION AS WE KNOW IT.
4. Therefore stockpile, because I WILL RUN BARTERTOWN.
|
# ? Feb 18, 2015 18:28 |
|
|
|
# ? Feb 18, 2015 18:44 |
|
Update on the McLovin campaign: Despite small % increases in consumers talking about McD, actual consumer attitudes didn't budge and in some cases declined.
|
# ? Feb 18, 2015 19:58 |
|
Barudak posted:Update on the McLovin campaign: Despite small % increases in consumers talking about McD actual consumer attitudes didn't budge and in some cases declined. My store took part in that poo poo. Our store is already loved in our area, so it was a huge waste of time that I managed to avoid by working third shift. Also, it only lasted 14 days, which is a tiny blip in consumers' minds.
|
# ? Feb 18, 2015 20:06 |
|
pentyne posted:No joke, tons of people were losing their minds over Y2K, and the instant the clocks turned over the news were interviewing Australians about whether or not their computers went down. People were being interviewed about their disaster-prep extremes and treated like rational, sane people rather than the nuts that they were. I know this isn't the place for it, but can anyone give a rundown as to why it was actually a problem from a semi-technical standpoint? Because computers don't store the year, they just store the seconds since whatever epoch you're using, don't they?
|
# ? Feb 18, 2015 20:16 |
|
Praseodymi posted:I know this isn't the place for it, but can anyone give a rundown as to why it was actually a problem from a semi technical standpoint? Because computers don't store the year, it just stores the seconds since whatever epoch you're using doesn't it? The computer would probably be fine on a low level since yes, it gets the current date by counting seconds from...some time in the 70s, I want to say. But imagine a piece of software designed to handle payroll. It stored the date as two numbers representing decade and year, while the millennium and century were assumed to be 1 and 9 respectively. When 2000 hit, the decade and year went to 00, making the computer think it was 1900. It wasn't exactly going to send nukes flying or anything, but employees might not get paid, and it would think people are negative years old and all sorts of weird stuff. Of course, even systems that weren't designed with the year 2000 in mind really should have had some sanity checking and defaulted to doing the safest thing when weird poo poo like a negative age turned up, just in case someone made a typo on someone's birthdate or something like that. tl;dr: The Y2K bug would have been a nuisance and probably caused a LOT of problems, but it wasn't apocalyptic by any means.
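The two-digit-year trap described above fits in a few lines of Python (the function and field names here are made up for illustration): the century is a baked-in assumption, so once the stored year rolls over to 00 the software reads it as 1900, and derived values like ages or service lengths go negative.

```python
# Hypothetical sketch of the classic two-digit-year bug.
# The century is hard-coded, so a stored "00" reads back as 1900.

def stored_year(two_digit_year: int) -> int:
    # The "19" prefix is assumed -- fine right up until the year 2000
    return 1900 + two_digit_year

def years_of_service(hired_yy: int, current_yy: int) -> int:
    # Subtract reconstructed years; goes negative after the rollover
    return stored_year(current_yy) - stored_year(hired_yy)

print(years_of_service(85, 99))  # 14 -- correct in 1999
print(years_of_service(85, 0))   # -85 -- in "2000" the employee has negative tenure
```

The fixes most shops actually applied were either widening the field to four digits or "windowing" (treating, say, 00-29 as 20xx and 30-99 as 19xx).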
|
# ? Feb 18, 2015 20:23 |
|
When I worked there, the staff meetings were named and posted as "KKK Meets" (Krispy Kreme Krew). Everybody thought it was weird but nobody did anything about it. AMA about being a former KKK member.
|
# ? Feb 18, 2015 20:23 |
|
Slime posted:The computer would probably be fine on a low level since yes, it gets the current date by counting seconds from...some time in the 70s, I want to say. But imagine a piece of software designed to handle the payroll. It stored the date as two numbers representing decade and year, while the millennium and century were assumed to be 1 and 9 respectively. When 2000 hit the decade and year went to 00, making the computer think it was 1900. It wasn't exactly going to send nukes flying or anything, but employees might not get paid and it would think people are negative years old and all sorts of weird stuff. Of course, even systems that weren't designed to the year 2000 in mind really should have some sanity checking and default to doing the safest thing when weird poo poo like a negative age turned up, just in case someone made a typo on someone's birthdate or something like that. Like this guy said, it would cause a lot of little problems. The issue there is that a lot of little problems can be REALLY hard to ferret out, and can cause compounding issues. It definitely wouldn't've launched nukes or anything, but it could've hosed with, say, the power grid a bit if there were two-digit dates in the controllers, or if they predicted output based on years and generated a certain amount (for 1900 it would've been 0, for 2000 it would've been way more than 0). It could've also paralyzed parts of our economy that used bad software, and it potentially could've taken months to fix. In the end, there was enough foresight on the issue that it was solved with quite a bit of time to spare, and even if it HAD gone bad, it wouldn't've wrecked our society, just been an annoying few years while everything was dealt with. edit: In terms of scale, it was a potential economic Hurricane Katrina, not a Krakatoa.
|
# ? Feb 18, 2015 20:27 |
|
Praseodymi posted:I know this isn't the place for it, but can anyone give a rundown as to why it was actually a problem from a semi technical standpoint? Because computers don't store the year, it just stores the seconds since whatever epoch you're using doesn't it? Basically, some programs in Windows didn't account for the change in millennium because storing two-digit years saved space in memory. This could cause errors for, say, accounting programs and poo poo, but not for 90% of what people thought it would. There is actually a similar issue with Linux that will occur in 2038, which people don't ever talk about because most consumers don't even know what Linux is.
|
# ? Feb 18, 2015 20:30 |
|
So the gist is it was people not being safe when they did date comparisons, fair enough. Lumberjack Bonanza posted:There is actually a similar issue with Linux that will occur in 2038, which people don't ever talk about because most consumers don't even know what Linux is. That's what I was comparing it to, and that makes more sense, as anything more than 1970 + 2^31 seconds can't be represented if you're keeping track of time as a signed 32-bit count like that.
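The 2038 rollover date is easy to verify; here's a quick Python sketch (the constant names are mine) of where a signed 32-bit seconds counter tops out:

```python
# The 2038 problem in one calculation: a signed 32-bit time_t can hold
# at most 2**31 - 1 seconds past the Unix epoch (1970-01-01 00:00:00 UTC).
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_SIGNED_32 = 2**31 - 1  # 2,147,483,647 seconds

rollover = EPOCH + timedelta(seconds=MAX_SIGNED_32)
print(rollover)  # 2038-01-19 03:14:07+00:00 -- one second later, the counter wraps
```

Systems that have since moved to a 64-bit time_t push the overflow out by roughly 292 billion years, which is why this mostly threatens old 32-bit hardware and embedded devices.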
|
# ? Feb 18, 2015 20:33 |
|
Praseodymi posted:So the gist is it was people not being safe when they did date comparisons, fair enough. Yeah, that's exactly it. Just some minor issue that can be (and was) resolved through very small updates. I remember being really freaked out at the time, though, 'cause I didn't understand computers well and good lord did the media lose their poo poo about it. Arsonist Daria has a new favorite as of 20:41 on Feb 18, 2015 |
# ? Feb 18, 2015 20:38 |
|
The KKK took my donuts away.
|
# ? Feb 18, 2015 22:00 |
|
Phanatic posted:Well, a few unnecessary abortions happened because tests for birth defects came back as false positives. But you know, whatever. I hadn't actually heard about that and yeah, that's definitely worse. Still isn't "world got nuked into Fallout" bad. FreudianSlippers posted:Was that seriously the problem? This was the late 90's, where most people didn't have PCs yet and the only people that really knew computers were huge nerds, and being a nerd was A Bad Thing. I remember telling people I wanted to go to college to become a programmer and they'd think I was stupid because "computers won't last." Aside from that, like was said, people assumed computers could do all sorts of stupid poo poo they ultimately couldn't, or thought they were smarter than they were. There was just a lot of technological ignorance at the time. You had a great many people who had never interacted with a computer directly, had only seen computers in the movies, and thought you could do crap like launch nukes with a laptop if you got close enough to the Pentagon. So you had people saying that all of the missiles were going to launch or all the power plants would simultaneously explode and crap. And again, as was said, there were some minor hiccups here and there, but absolutely nothing apocalyptic happened, largely because everybody kind of, you know, saw it coming and could patch the software years ahead of time. Which was another issue; a lot of people had no idea you could do things like patch software.
|
# ? Feb 18, 2015 22:24 |
|
I wonder what it's like to be one of the tech dudes at a production company that makes news reports or CSI 'Hacker' segments. It would either be a nightmare or god drat hilarious.
|
# ? Feb 18, 2015 23:19 |
|
darkhand posted:I wonder what it's like to be one of the tech dudes at a production company that makes News reports or CSI 'Hacker' segments. It would either be a nightmare or god drat hilarious. I used to live with some people who really, really liked those shows and I just flat out couldn't watch them. With the crap those shows tell the viewer computers can do, the characters might as well be wearing goddamned wizard hats and waving wands and going "TADAAAA!" while the criminal just appears right in front of them.
|
# ? Feb 18, 2015 23:27 |
|
ToxicSlurpee posted:I used to live with some people who really, really liked those shows and I just flat out couldn't watch them. With the crap those shows tell the viewer computers can do the things might as well be wearing goddamned wizard hats and waving wands and going "TADAAAA!" while the criminal just appears right in front of them. https://www.youtube.com/watch?v=Vxq9yj2pVWk
|
# ? Feb 19, 2015 00:10 |
|
Oh man, I was hoping it was going to be the NCIS "two people/one keyboard" hacking scene.
|
# ? Feb 19, 2015 00:46 |
|
DrBouvenstein posted:Oh man, I was hoping it was going to be the NCIS "two people/one keyboard" hacking scene. Hey guess what, I have good news https://www.youtube.com/watch?v=u8qgehH3kEQ
|
# ? Feb 19, 2015 01:09 |
|
In last week's Arrow the hot nerd hacker not only zoomed in and enhanced a camera still but also changed the camera angle.
|
# ? Feb 19, 2015 01:20 |
|
Lumberjack Bonanza posted:There is actually a similar issue with Linux that will occur in 2038, which people don't ever talk about because most consumers don't even know what Linux is. I remember reading about a potential problem somewhere around then (2038 or so) that will affect 32-bit computers. Basically, something about running out of storage for the number of seconds since 1970-something and just not working anymore. Obviously a lot of computers now are 64-bit and I suspect it's an easily fixable problem, but it does exist. I remember testing it out with an old Palm Pilot by setting the year forward until it froze (around 2020-something).
|
# ? Feb 19, 2015 01:36 |
|
ToxicSlurpee posted:This was the late 90's where most people didn't have PCs yet and the only people that really knew computers were huge nerds and being a nerd was A Bad Thing. I remember telling people I wanted to go to college to become a programmer and they'd think I was stupid because "computers won't last." Aside from that like was said people assumed computers could do all sorts of stupid poo poo they ultimately couldn't or thought they were smarter than they were. There was just a lot of technological ignorance at the time. You had a great many people who had never interacted with a computer directly that had only seen computers in the movies and thought you could do crap like launch nukes with a laptop that you got close enough to the Pentagon. The 90s weren't the 80s. 51 percent of Americans had a computer at home by 2000, 80 percent of children had access at least at school, and programming/computer science were constantly mentioned as a future high-demand field that students should go into. Sorry about your nerd persecution complex but it doesn't have any basis in reality.
|
# ? Feb 19, 2015 01:45 |
|
Re: McDonald's Pizza chat McDonald's Canada launched pizza at all of its franchises at the same time in 1992. They made franchisees pay for expensive ovens, freezers and prep equipment. They also made franchisees add new signs under the golden arches to show that the store did indeed sell pizza. The ad agency had the brilliant idea to use the McDonald's "M" turned on an angle as a "Z": Not only that, they did something that very few companies have the stones to do: buy ad time on all stations, at the same time, during prime time. Meaning right before the Simpsons or something, every single station in the country showed the same McDonald's pizza commercial. It was famous at the time for how much money they spent. This may have been one variation on it: https://www.youtube.com/watch?v=LeQQfI8Xg3o (Always a great idea to put your CEO in the ad, right?) Here are two very Canadian ads from 1992 after the big launch: https://www.youtube.com/watch?v=EEhUjabIE8w https://www.youtube.com/watch?v=n4yXsoqtOVY Now it seems the limitations of having pre-frozen pies meant that the result was greasy and chemical-tasting, and the taste did not rival any of the major pizza chains in any way. It also took a long time to prepare (they claimed 5 minutes, but in reality it was more like 10-15). You would order the pizza and then, once it was ready, one of the counter people would carry it out and put it on a stand in the middle of your table with shakers of parmesan and chili flakes. Anyways, the experiment only lasted till 1999, and by that time they had discontinued the family size pizza and only did the personal size. They stopped selling pizza entirely and took down the expensive signs. I think there might have been a bit of a franchisee revolt due to the money lost. Cleverly, they retrofitted the freezers and ovens to bake fresh (from frozen) bagels and muffins from then on.
|
# ? Feb 19, 2015 02:04 |
|
Is it too early to call the Kinect a failed peripheral? Because there are very few Kinect-required titles for Xbox One, and it appears that only one is any good.
|
# ? Feb 19, 2015 02:18 |
|
canyoneer posted:Is it too early to call the Kinect a failed peripheral? Because there are very few Kinect-required titles for Xbox One, and it appears that only one is any good. Kinect was already a failed peripheral last gen, and now even more so. The amount Microsoft hosed over people like Harmonix by pulling that thing out of the box to cut costs is astounding. PS. What is the one good Kinect title? (Dance Central Spotlight?) Also, never forget the poncho reveal starring Cirque du Soleil for a video game peripheral.
|
# ? Feb 19, 2015 02:20 |
|
Tiny Brontosaurus posted:The 90s weren't the 80s. 51 percent of Americans had a computer at home by 2000, 80 percent of children had access at least at school, and programming/computer science were constantly mentioned as a future high-demand field that students should go into. Sorry about your nerd persecution complex but it doesn't have any basis in reality. I also live in rural Pennsylvania, which is perpetually behind the rest of the nation. It has nothing to do with nerd persecution complexes. Aside from that, "most people having a computer by 2000" does not mean most people had them in the 90's. Not everybody that had one even used it all that much, and most of the kids in school looked at me like I was insane when they found out I had an e-mail address in 1995. We had computers at school but didn't really use them for all that much. Some classes used them exactly never. They were there, sure, but people weren't really using them for much; when they caught on, though, they caught on something fierce. The main reason they caught on among people I knew in high school was when the school got a T1 line and people realized you could download music and put it on a zip/external drive. Then the nerdy computer kids suddenly became more popular.
|
# ? Feb 19, 2015 03:36 |
|
canyoneer posted:Is it too early to call the Kinect a failed peripheral? Because there are very few Kinect-required titles for Xbox One, and it appears that only one is any good. The Kinect is absolutely a failed peripheral; it's just that Microsoft was keeping the drat thing on life support long after it should have died. What's most puzzling is that it seems to actually have utility when modded for decidedly non-video game things, but Microsoft either doesn't know or doesn't care and isn't bringing the product to other divisions where it might see some real use.
|
# ? Feb 19, 2015 03:48 |
|
ToxicSlurpee posted:Aside form that "most people having a computer by 2000" does not mean most people had them in the 90's. Do you understand which year came before 2000.
|
# ? Feb 19, 2015 03:56 |
|
Phanatic posted:Well, a few unnecessary abortions happened because tests for birth defects came back as false positives. But you know, whatever. edit: Hell, there was another page. Well.
|
# ? Feb 19, 2015 04:10 |
|
Tiny Brontosaurus posted:Do you understand which year came before 2000. http://www.nsf.gov/statistics/seind00/c8/c8s3.htm: quote:A number of indicators show the growing and widespread use of computers and computer-based technologies in the late 1990s. The increase in the number of home computers is particularly noteworthy.[24] In 1999, for the first time ever, a majority of American adults (54 percent) had at least one computer in their homes. The percentage has been rising steadily since 1983, when only 8 percent had them. (See figure 8-16 and appendix table 8-30.) In addition, among all adults, the late '90s - the second half of the decade, the period from 1995 through 1999, which is not the same as "the year 2000 specifically" - were a time of growth and change for Americans' relationship with home computers, and numbers from the year 2000 are not actually indicative of numbers and standards in rural areas throughout the preceding decade
|
# ? Feb 19, 2015 04:57 |
|
InediblePenguin posted:"in 1995 most people didn't have computers" and "technically barely over half of americans had computers in 2000" aren't mutually exclusive even theoretically, why are you being such a dipshit about it? Don't take your weird hick anger out on me when your own cites prove you wrong
|
# ? Feb 19, 2015 05:01 |
|
You still seem to think "barely over half of Americans owned a computer" somehow negates "most Americans didn't understand how computers worked and poo poo," so whatever, dude. I'm also not the person you were arguing with before, so your "hick anger" comment is pretty out of the blue tbh, but ok
|
# ? Feb 19, 2015 05:08 |
|
Most people now don't understand how computers work.
|
# ? Feb 19, 2015 05:12 |
|
Tiny Brontosaurus posted:The topic had been Y2K, which is 1999 specifically, and oh hey look In 1999, for the first time ever, a majority of American adults (54 percent) had at least one computer in their homes. In 1999, for the first time ever, the majority of adults had computers in their homes. Which means by definition that from 1990 to 1998 most homes did not have computers, which is kind of, you know, 90% of the decade. Which does in fact justify my argument that the 90's were a time when computers were becoming increasingly popular and less the exclusive territory of nerds. The years around 2000 were the big turning point. Piell posted:Most people now don't understand how computers work. These days most people I've met are at least vaguely aware of how stupid computers are and that stuff like the Y2K bug can't launch nukes. Yeah, people are still technologically illiterate overall, but it isn't nearly as bad as 20 years ago.
|
# ? Feb 19, 2015 05:13 |
|
ToxicSlurpee posted:In 1999, for the first time ever, the majority of adults had computers in their homes. Which means by definition that from 1990 to 1998 most homes did not have computers which is kind of, you know, 90% of the decade. Which does in fact justify my argument that the 90's were a time when computers were becoming increasingly popular and less the exclusive territory of nerds. The years around 2,000 were the big turning point. But we were talking about 1999, do you understand when 1999 is.
|
# ? Feb 19, 2015 05:17 |
|
Tiny Brontosaurus posted:But we were talking about 1999, do you understand when 1999 is. I think there were crusades happening.
|
# ? Feb 19, 2015 05:18 |
|
mind the walrus posted:The Kinect is absolutely a failed peripheral, it's just that Microsoft was keeping the drat thing on life support long after it should have died. Nintendo opened a Pandora's Box of attempted rip-offs with the Wii. The Miis had the same effect as the motion controls; XBox Live avatars and Playstation Home tried and failed to draw that one in. It's really only Nintendo that can do stuff like that, and it's because they can still assure quality. Their first-party games will use their gimmick in the way they intended, usually doing quite well with it (as well as can be expected, at least), while also being a good game. The Playstation/Xbox big titles are things that other companies make, so they'll rarely be as willing to incorporate the peripheral, and quality of both the game and the integration will be less consistent. Really, without Wii Sports and Twilight Princess showing people both 'motion controls should work like this' and 'motion controls won't get in the way of us making what you want from us' the Wii would have failed too. The only other company that's managed to do similar is Valve, and they often do it the other way around. They use TF2 and Dota 2 to test possible new meta-game features, and then incorporate it into Steam as a whole if it works. That way, when other developers get their hands on it, they've seen how to make it work and can follow suit.
|
# ? Feb 19, 2015 05:20 |
|
mind the walrus posted:The Kinect is absolutely a failed peripheral, it's just that Microsoft was keeping the drat thing on life support long after it should have died. The original Kinect was a failure by every measure except sales; Microsoft sold a shitload of those things. Sure, in a year they sat covered in dust in the back of a TV stand next to a Wii, but what mattered is people bought them. I think they put out SDKs for the new Kinect, but I haven't really followed what is going on with that thing; I'm not even sure you can buy them separately yet.
|
# ? Feb 19, 2015 05:21 |
|
Tiny Brontosaurus posted:But we were talking about 1999, do you understand when 1999 is. Giving e/n a special Tiny Brontosaurus Pointless Slapfight Quarantine Thread was not enough. Every subforum should have one.
|
# ? Feb 19, 2015 05:22 |
|
Tiny Brontosaurus posted:But we were talking about 1999, do you understand when 1999 is. We were talking about the way people in the late '90s viewed computers and the roots of their ignorance about them, actually, and you're the only one who is obsessed with limiting the discussion to solely the year 1999 (as if this year took place in a vacuum, and the "y2k virus" was never discussed prior to that nor by people whose lives and interactions with computers had been During The Years Leading Up To 1999)
|
# ? Feb 19, 2015 05:23 |
|
|
Cleretic posted:Nintendo opened a Pandora's Box of attempted rip-offs with the Wii. The Miis had the same effect as the motion controls; XBox Live avatars and Playstation Home tried and failed to draw that one in. To be fair, the Wii was also the dumping ground for lovely bargain bin games, and I can't name a third-party title Also, Microsoft released an SDK for Kinect on Windows that is intended for actual business use. For one thing, I've seen it used in a FIRST Robotics competition as an alternate control scheme for the robots.
|
# ? Feb 19, 2015 05:31 |