|
Fuzzy Mammal posted:IDF keynote summary looked hella lame. Did I overlook anything? XPoint in consumer products in 2016 is cool
|
# ? Aug 18, 2015 19:54 |
|
|
mayodreams posted:That's apparently why they are calling it 'Intel Optane.' That's loving worse.
|
# ? Aug 18, 2015 20:45 |
|
Saukkis posted:I bought a 4€ Qi receiver for my Galaxy S4 that fits inside the standard back cover and bulges only a little bit. The Galaxy S5 charging pad cost 30€ and works fine with the S4 and Lumia 930, except the Lumia is quite picky about the spot.
|
# ? Aug 18, 2015 20:56 |
|
go3 posted:um yeah thats a pretty horrible way to go about getting a bunch of storage My HTPC storage is not on trial here!
|
# ? Aug 18, 2015 21:02 |
|
Botnit posted:My HTPC storage is not on trial here! Does your HTPC have more space for internal drives? It may be worth opening up those enclosures, though usually voiding the warranties, to put them inside... Unless you live in a cold climate and heat your living room with power bricks.
|
# ? Aug 18, 2015 21:16 |
|
let i hug posted:Am I understanding right that 3D XPoint is going to effectively be both RAM and SSD storage rolled into one? Also why did they give it such a terrible name?
|
# ? Aug 18, 2015 21:34 |
|
Botnit posted:My HTPC storage is not on trial here! I can't tell you how many external enclosures I saw fail when I worked for a film school. Those things, especially the USB powered 'go' drives, are complete poo poo and will die on you. Period. If you are lucky, they will leave the data intact on the drive. However, the GO drives typically do not have a SATA connection on the drive, which means you are hosed if the enclosure shits the bed. We are just trying to look out for your data.
|
# ? Aug 18, 2015 22:04 |
|
Combat Pretzel posted:I thought it wasn't fast enough to replace RAM, we'd be needing memristors for this. It's not as fast as DRAM now, but there are probably applications for which it's fast enough or will be in the next few years.
|
# ? Aug 18, 2015 22:23 |
|
cisco privilege posted:My motorola phone is thin as hell and apparently supports wireless charging. Without any kind of wireless quick charge though it's easier just to plug it into the Quick Charge 2.0 adapter for 15 minutes every couple days. This is a big limiting factor for Qi right now: the power profiles for many of the charging stations are really, really low. Which means either they charge slowly, or quite frequently they don't charge at all because they're "too far away." I'm sorry, lovely Qi charger, but if the 3mm of my slim-profile rubber case is "too far" then you should probably look into sucking less.
|
# ? Aug 18, 2015 22:25 |
|
I do plan to get a wireless charger at some point but they're definitely not something for quick use. Some newer phones are coming out with a quickcharge wireless option that seems neat but for normal use it doesn't seem as versatile as just having a couple cable chargers or a USB cable around.
|
# ? Aug 19, 2015 00:08 |
|
Considering the Apple Watch has wireless charging, I wonder how long it'll be before the iPhone has it. Maybe the 6S?
|
# ? Aug 19, 2015 00:22 |
|
Josh Lyman posted:Considering the Apple Watch has wireless charging, I wonder how long it'll be before the iPhone has it. Maybe the 6S? Considering the charging loop costs all of like 25c, the answer is probably some mix of "when they can get it working reliably through an aluminum back" and "when Qi pays them enough."
|
# ? Aug 19, 2015 00:26 |
|
Josh Lyman posted:Considering the Apple Watch has wireless charging, I wonder how long it'll be before the iPhone has it. Maybe the 6S? The 6s leaked parts are more or less identical to the 6, so I would guess 2016 at the earliest, probably later than that, since it's not really a thing people use outside of (competing) phones.
|
# ? Aug 19, 2015 00:33 |
|
Combat Pretzel posted:I thought it wasn't fast enough to replace RAM, we'd be needing memristors for this. quote:and in a DIMM form factor for Xeon systems for even greater bandwidth and lower latencies DrDork posted:This is a big limiting factor for Qi right now: the power profiles for many of the charging stations are really, really low. Which means either they charge slowly, or quite frequently they don't charge at all because they're "too far away." I'm sorry, lovely Qi charger, but if the 3mm of my slim-profile rubber case is "too far" then you should probably look into sucking less.
|
# ? Aug 19, 2015 01:29 |
|
Saukkis posted:But you're right, the technology should be much more wide spread. I don't think I'll be willing to buy another mobile device without Qi charging. gently caress cables. Josh Lyman posted:Considering the Apple Watch has wireless charging, I wonder how long it'll be before the iPhone has it. Maybe the 6S?
|
# ? Aug 19, 2015 01:49 |
|
mayodreams posted:I can't tell you how many external enclosures I saw fail when I worked for a film school. Those things, especially the USB powered 'go' drives, are complete poo poo and will die on you. Period. If you are lucky, they will leave the data intact on the drive. However, the GO drives typically do not have a SATA connection on the drive, which means you are hosed if the enclosure shits the bed. I know, I was just making a John Oliver joke about British dentistry. The stuff I have on them isn't important, it's just HTPC stuff that I keep instead of deleting because of a 3-day bandwidth limit of 20GB, so any time I download something to watch I keep it. My only other options at this point are to open up those HDDs like Vulgar said and hope I can salvage them by putting them in a NAS, or to replace them with much bigger internals, but 8TB internals on Newegg are $500+ (except a Seagate that apparently has a huge failure rate), so that'd be $750 to $1,500 to replace what I have now. I didn't just go out and buy a poo poo load of HDDs at one time to do this, I've just been adding them over the years as I fill one up. I'd ideally prefer to replace them when/if 3.1 external HDDs come out that power themselves; then I could just get three 10TBs, free up more space and lose a poo poo load of wires, but it's not that important to me since I don't actually see any of them in an entertainment shelf. I can't rule out I'll end up causing the next California wildfire, though.
|
# ? Aug 19, 2015 02:04 |
|
Anyone know if Xeon Phi cards that are currently released will work in HP Z440 or Z460 workstations? HP's technical support is useless.
|
# ? Aug 19, 2015 06:31 |
|
cycleback posted:Anyone know if Xeon Phi cards that are currently released will work in HP Z440 or Z460 workstations? HP's technical support is useless. Wait, everyone hasn't already modified their code to work with CUDA core acceleration? Who buys Xeon Phis?
|
# ? Aug 19, 2015 15:24 |
|
A Bad King posted:Wait, everyone hasn't already modified their code to work with CUDA core acceleration? Who buys Xeon Phis? The Chinese, apparently.
|
# ? Aug 19, 2015 15:28 |
|
Anime Schoolgirl posted:The Chinese, apparently They're hardly worth the money when a GeForce Titan outruns Phi by a 30%-70% margin on code that plays nice with Tesla, and because Nvidia was first, pretty much anyone who has (grant) money relying on their hardware has shifted and recoded to play nice with Tesla. We're getting to FPGA-level performance on GPU hardware; but China wants Intel's me-too?
|
# ? Aug 19, 2015 17:00 |
|
A Bad King posted:They're hardly worth the money when a GeForce Titan outruns Phi by a 30%-70% margin on code that plays nice with Tesla, and because Nvidia was first, pretty much anyone who has (grant) money relying on their hardware has shifted and recoded to play nice with Tesla. We're getting to FPGA-level performance on GPU hardware; but China wants Intel's me-too?
|
# ? Aug 19, 2015 17:14 |
|
I don't know much of anything in this space, but not too long ago I remember hearing the complaints about CUDA were that it sucks to write for, won't give you the performance boost they claim unless you write it correctly, and requires you to hire expensive software guys in order to write and maintain good CUDA code. I'm assuming those complaints were overblown?
|
# ? Aug 19, 2015 18:26 |
|
Also, have fun if you don't use C as your scientific language of choice. CUDA Fortran exists, but only for one proprietary compiler and is just as much of a pain in the dick to code for. The next HPC purchase I make will probably be a Phi.
|
# ? Aug 19, 2015 18:29 |
|
WhyteRyce posted:I don't know much of anything in this space, but not too long ago I remember hearing the complaints about CUDA were that it sucks to write for, won't give you the performance boost they claim unless you write it correctly, and requires you to hire expensive software guys in order to write and maintain good CUDA code. I'm assuming those complaints were overblown? Software engineers will work 120 hours a week for a salary they cannot spend, so those complaints are silly for the beneficiaries of the written code. No end-user cares that it is hard, as long as you can make your currency swap 430 milliseconds faster than the other guy. Also, from my understanding, Phi isn't easier to code for if you want it to be remotely comparable to Tesla -- you're just going from one PITA to another PITA if you want the same performance out of a Phi as you do out of a CUDA board properly caressed by good CUDA code. Grundulum posted:Also, have fun if you don't use C as your scientific language of choice. CUDA Fortran exists, but only for one proprietary compiler and is just as much of a pain in the dick to code for. The next HPC purchase I make will probably be a Phi. C is perfect and best for everything in every way. A Bad King fucked around with this message at 19:05 on Aug 19, 2015 |
# ? Aug 19, 2015 19:02 |
|
WhyteRyce posted:I don't know much of anything in this space, but not too long ago I remember hearing the complaints about CUDA were that it sucks to write for, won't give you the performance boost they claim unless you write it correctly, and requires you to hire expensive software guys in order to write and maintain good CUDA code. I'm assuming those complaints were overblown? I don't know what performance boost they claim. For financial stuff I'm sure CUDA developers cost a fortune. For scientific stuff you're looking at fairly average pay if you can find someone for the job. Grundulum posted:Also, have fun if you don't use C as your scientific language of choice. CUDA Fortran exists, but only for one proprietary compiler and is just as much of a pain in the dick to code for. The next HPC purchase I make will probably be a Phi. If someone wanted me to use CUDA Fortran I'd probably quit my job. Having to deal with Fortran code frequently is unfortunate enough, but taking something that is well suited to one language and bringing it to another language it has no business being associated with would really irk me. There's a reason Fortran isn't used for systems programming and C is. With that said, I'd work on a Fortran project that used CUDA, but it would have to be called through a binding/DLL of some kind and not be done in CUDA Fortran. Khorne fucked around with this message at 19:37 on Aug 19, 2015 |
# ? Aug 19, 2015 19:21 |
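The binding/DLL approach Khorne describes can be sketched roughly like this: the GPU work lives behind a flat C entry point, which Fortran can then call through ISO_C_BINDING without ever touching a CUDA-aware compiler. Everything here is hypothetical illustration — the function name and signature are made up, and a plain C loop stands in for what would really be a CUDA kernel launch:

```c
#include <stddef.h>

/* Hypothetical C wrapper. In a real build this body would copy the
 * buffers to the device and launch a CUDA kernel; a plain loop stands
 * in for it here. The flat, pointer-based signature is what matters:
 * it is directly callable from Fortran via ISO_C_BINDING. */
void saxpy_c(size_t n, float a, const float *x, float *y)
{
    for (size_t i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}
```

On the Fortran side, an `interface` block with `bind(c, name="saxpy_c")` maps this in, so the Fortran project stays ordinary Fortran and only the wrapper's translation unit needs the CUDA toolchain.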
|
A Bad King posted:Wait, everyone hasn't already modified their code to work with CUDA core acceleration? Who buys Xeon Phis? For certain workloads GPUs don't really offer speedups and are just a hassle. The new Phi has potential since, from what I understand, it will just look like a bunch of x86 cores.
|
# ? Aug 20, 2015 16:00 |
|
So I'm running a 4670k in my year-old machine, and I just had to do a system restore to resolve a very odd bug. Ever since my last windows 10 update a couple of days ago, my computer had been running really sluggishly and experiencing a really high CPU-load, even after I ran malware and virus scans. Eventually I got fed up and looked more closely at RealTemp and noticed that, instead of running at 3.4 GHz like it was supposed to, my processor was running at only 785ish MHz - and it was locked there constantly, regardless of what the CPU load was. A system restore seems to have resolved the issue, but I was curious - does anyone know how in the gently caress that sort of thing happens?
|
# ? Aug 20, 2015 18:26 |
|
cycleback posted:For certain workloads GPUs don't really offer speed ups and are just a hassle. The new Phi has potential as from what I understand will just looks like a bunch of x86 cores. A bunch of very, very slow cores (1GHz, no branch prediction or OOO, awkward memory access.) You need to vectorize and multithread your code to get anywhere near optimal performance, which may be no easier than implementing a CUDA kernel. The nice thing is that by the time you optimize your code for the Phi it is usually much faster for regular X86. You could argue that it might make sense to buy Phis now that they're $200 to prototype your apps in native mode for KNL, which is going to address many of the KNC architecture issues, if you are going to be running on one of the KNL-based supers coming online next year or 2017.
|
# ? Aug 20, 2015 18:35 |
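A minimal sketch of what "vectorize and multithread your code" means in practice: a reduction written with unit-stride, branch-free inner work and an OpenMP pragma, so the compiler can spread it across cores and SIMD lanes. This is generic C, not KNC-specific code, and compilers built without OpenMP simply ignore the pragma:

```c
#include <stddef.h>

/* Sum of squares, shaped for vectorization: unit-stride access,
 * no branches in the loop body, and the reduction stated explicitly
 * so the compiler may reorder the partial sums across threads and
 * SIMD lanes. Without OpenMP the pragma is ignored and the loop
 * runs serially with the same result. */
double sum_squares(const double *x, size_t n)
{
    double total = 0.0;
    #pragma omp parallel for simd reduction(+:total)
    for (size_t i = 0; i < n; ++i)
        total += x[i] * x[i];
    return total;
}
```

Code in this shape is also exactly what pays off on regular x86, which is the silver lining of optimizing for the Phi in the first place.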
|
Spiritus Nox posted:So I'm running a 4670k in my year-old machine, and I just had to do a system restore to resolve a very odd bug. Ever since my last windows 10 update a couple of days ago, my computer had been running really sluggishly and experiencing a really high CPU-load, even after I ran malware and virus scans. Eventually I got fed up and looked more closely at RealTemp and noticed that, instead of running at 3.4 GHz like it was supposed to, my processor was running at only 785ish MHz - and it was locked there constantly, regardless of what the CPU load was. A system restore seems to have resolved the issue, but I was curious - does anyone know how in the gently caress that sort of thing happens? Check the power options, make sure it's not set to "power saving" or the like. vv Yeah, you'd think it's only for laptops, but since desktop CPUs also use the same SpeedStep tech these days (for years now), they can also be told to clock down by the OS. It could possibly be that; it was just an idea. HalloKitty fucked around with this message at 18:57 on Aug 20, 2015 |
# ? Aug 20, 2015 18:41 |
|
HalloKitty posted:Check the power options, make sure it's not set to "power saving" or the like Huh. Didn't even realize you could do that on a desktop. Anyway, like I said, I did a system restore to fix the problem, so while it's not on Power Saver now I can't really say what it was like before the restore. That would make the most sense, though.
|
# ? Aug 20, 2015 18:47 |
|
Spiritus Nox posted:So I'm running a 4670k in my year-old machine, and I just had to do a system restore to resolve a very odd bug. Ever since my last windows 10 update a couple of days ago, my computer had been running really sluggishly and experiencing a really high CPU-load, even after I ran malware and virus scans. Eventually I got fed up and looked more closely at RealTemp and noticed that, instead of running at 3.4 GHz like it was supposed to, my processor was running at only 785ish MHz - and it was locked there constantly, regardless of what the CPU load was. A system restore seems to have resolved the issue, but I was curious - does anyone know how in the gently caress that sort of thing happens? Sounds like you have an Intel DPTF driver fuckup. What you're telling us sounds very similar to a problem associated with throttling down to a 9-watt power limit, and that throttle is done by the DPTF driver when Windows 10 thinks you're putting yourself in a low-power state. When you wake from sleep, Windows doesn't talk to the DPTF driver effectively, and so you remain at that 9-watt state. Happens on my Lenovo Yoga Pro 2. I had to edit the batch file for the driver so that the lowest limit would be 11 watts, but that is for a 4210U proc.
|
# ? Aug 20, 2015 19:08 |
|
A Bad King posted:Sounds like you have an Intel DPTF driver fuckup. What you're telling us sounds very similar to a problem associated with throttling down to >9watts of power usage, and that throttle is done by the DPTF driver when Windows 10 thinks you're putting yourself in a low power state. When you wake from sleep, Windows doesn't talk to the DPTF driver effectively, and so you remain at that 9watt state. Interesting. Any reason to think it'll happen again now that I've done the system restore?
|
# ? Aug 20, 2015 19:09 |
|
Spiritus Nox posted:Interesting. Any reason to think it'll happen again now that I've done the system restore? So you went back down to Windows 8.1? In all honesty, check your BIOS. You don't need DPTF to throttle down -- it is meant for chromebooks and other ultraportables to save power by getting lower than Lil Jon is even comfortable with. If you have a BIOS setting for DPTF, turn it off. Maybe that will help.
|
# ? Aug 20, 2015 20:13 |
|
A Bad King posted:So you went back down to Windows 8.1? No. I just restored to a point before Windows downloaded the update that seemingly caused the problem. Still running 10, and at the moment everything's running without a hitch.
|
# ? Aug 20, 2015 20:14 |
|
Spiritus Nox posted:No. I just restored to a point before Windows downloaded the update that seemingly caused the problem. Still running 10, and at the moment everything's running without a hitch. Great, don't do what you did that caused the thing again! The only procs that should ever run below 9W are the -Y and core-M series. A Bad King fucked around with this message at 20:25 on Aug 20, 2015 |
# ? Aug 20, 2015 20:23 |
|
Couldn't take waiting for the 6700 anymore and got a 6600; I'll probably use it until 6700s are either abundant, or wait for an extreme version or a refresh. Paired it with a Maximus Hero because I'm subhuman garbage who wastes money. Also, Newegg eggpoints have to be the dumbest promotion I've ever seen. Almost nothing they sell gives them, what does give them gives you almost none, they're almost completely worthless if you do have them, and they have to be spent within 90 days.
|
# ? Aug 21, 2015 08:53 |
|
A Bad King posted:Great, don't do what you did that caused the thing again! Over the years on various devices I've developed a hatred for software updates. The number of times an update has either broken or removed useful functions... It's probably why Windows 10 has so many non-optional updates.
|
# ? Aug 21, 2015 09:18 |
|
Botnit posted:Couldn't take waiting for the 6700 anymore and got a 6600, probably use it until 6700's are either abundant or wait for an extreme version or a refresh. Paired it with a Maximus Hero because I'm subhuman garbage who wastes money. I don't even know if I'm going to buy into this generation, and I'm sitting on a desktop Llano APU. The Haswell-E now looks like a great buy, what with those 6 cores and 12 threads...
|
# ? Aug 21, 2015 13:31 |
|
A Bad King posted:I don't even know if I'm going to buy into this generation, and I'm sitting on a desktop Llano APU. The Haswell-E now looks like a great buy, what with those 6 cores and 12 threads... Haswell-E overclockability is pisspoor. Anime Schoolgirl fucked around with this message at 14:10 on Aug 21, 2015 |
# ? Aug 21, 2015 14:07 |
|
|
Anime Schoolgirl posted:Haswell-E overclockability is pisspoor. I thought the 3.3GHz 5820K can hit 4.5+GHz; that doesn't seem pisspoor to me.
|
# ? Aug 21, 2015 14:52 |