No Gravitas
Jun 12, 2013

by FactsAreUseless

Mr Chips posted:

cool, I'll try and come up with a recipe for you. Presumably the Intel compiler kit for the Phi includes mpicc and mpirun? I've only ever used the Open MPI toolkit for this, to build x86_64 binaries.

edit: going by this: https://software.intel.com/en-us/articles/how-to-run-intel-mpi-on-xeon-phi, it doesn't look like too much of a deviation from what I've done in the past.

I do not have the Intel compiler available. I only have the MPSS GCC installed, in addition to the host's GCC compiler.

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?

No Gravitas posted:

I do not have the Intel compiler available. I only have the MPSS GCC installed, in addition to the host's GCC compiler.

Ahhh...it might be a bit of a wild goose chase if we don't have the Intel MPI dev tools for Phi.

No Gravitas
Jun 12, 2013

by FactsAreUseless

Mr Chips posted:

Ahhh...it might be a bit of a wild goose chase if we don't have the Intel MPI dev tools for Phi.

Yup, I think so. Sadly, those don't grow on trees, and I've already tried twice in the past few weeks to source them, fruitlessly. (I will earn money on the Phi, so I cannot just get a free/student license. My school was not helpful either.)

If you do get a binary generated, I will be more than happy to take it for a spin and see how well it runs.

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?
no worries, will see if I can get something from Intel, but I'm not working with the HPC team any more

No Gravitas
Jun 12, 2013

by FactsAreUseless

Mr Chips posted:

no worries, will see if I can get something from Intel, but I'm not working with the HPC team any more

If/when you have stuff for me to run, I'm firefly@gmx.ca.

EDIT: And here is my lovely, lovely setup. :barf: territory has been reached.

http://imgur.com/a/ieOfU

Ugh, that carpet alone is vomit-inducing.

EDIT2: I forgot to add to the pictures: The side of the computer is actually closed when I run it. Only the front is popped off because it would really restrict the airflow otherwise. Instant 2C difference and a bit more over time.

No Gravitas fucked around with this message at 05:20 on Dec 19, 2014

No Gravitas
Jun 12, 2013

by FactsAreUseless
The second Noctua together with a better duct did it. I'm running very hot, but no longer throttling at full load (no Intel compiler, so no vector units, however much of a difference that would make).

I'm still going to buy an 80mm PWM fan for extra cooling in case some other load needs it, but for now I'm running fine.

Josh Lyman
May 24, 2009


I'm catching up on the thread and saw all the G3258 excitement over the summer. Is it basically an HTPC special? I'm running a 3570K so I can't imagine it's much of a replacement for a desktop PC.

GokieKS
Dec 15, 2012

Mostly Harmless.
It's obviously not an upgrade to the 3570K (really, nothing is a meaningful upgrade over that at this point), but it's a great option for anything that doesn't require more than 2 cores, especially if you're comfortable with overclocking. HTPC, normal desktop use, even gaming, though recent games that don't work properly on systems without 4 threads (e.g. Dragon Age) have put a bit of a damper on that.

Josh Lyman
May 24, 2009


GokieKS posted:

It's obviously not an upgrade to the 3570K (really, nothing is a meaningful upgrade over that at this point), but it's a great option for anything that doesn't require more than 2 cores, especially if you're comfortable with overclocking. HTPC, normal desktop use, even gaming, though recent games that don't work properly on systems without 4 threads (e.g. Dragon Age) have put a bit of a damper on that.
That's what I figured. That Microcenter bundle with the MSI motherboard is now $99 which certainly isn't bad.

Josh Lyman fucked around with this message at 04:57 on Dec 24, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Josh Lyman posted:

I'm catching up on the thread and saw all the G3258 excitement over the summer. Is it basically an HTPC special? I'm running a 3570K so I can't imagine it's much of a replacement for a desktop PC.

For gaming purposes, once overclocked it comes pretty drat close to a Core i5... Certainly better than an i3. Deadshit developers arbitrarily preventing their game from working on dual-core CPUs have started to put a damper on it though.

Josh Lyman
May 24, 2009


1gnoirents posted:

I have a box of old wafers in my closet. I wonder how much they were worth when they came off the crayon
I took home two 300mm wafers after an internship at a semiconductor firm. One of them fell off a shelf a decade back, but the other one is still safe.

I think. It's been inside a black plastic container all these years.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM

Josh Lyman posted:

That's what I figured. That Microcenter bundle with the MSI motherboard is now $99 which certainly isn't bad.

Mind posting a link to that in here? I couldn't find it on Microcenter's website, unless it already ended :(

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

chocolateTHUNDER posted:

Mind posting a link to that in here? I couldn't find it on microcenters website, unless it already ended :(

It's been a multi-month deal; it doesn't seem like they're going to end it soon:
http://www.microcenter.com/site/brands/G3258Bundle.aspx

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:55 on Mar 23, 2021

Khagan
Aug 8, 2012

Words cannot describe just how terrible Vietnamese are.
Am I correct in assuming that Intel will not have mainstream Broadwell and skip directly to Skylake?

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.

Khagan posted:

Am I correct in assuming that Intel will not have mainstream Broadwell and skip directly to Skylake?

If Intel's own roadmaps are to be believed, then yes. Too bad they won't be skipping directly to Skylake-K; unlocked Broadwell doesn't really make any sense.

Not Al-Qaeda
Mar 20, 2012
is it bad if my CPU avg temp while playing games is ~80°C?

Panty Saluter
Jan 17, 2004

Making learning fun!

Not Al-Qaeda posted:

is it bad if my CPU avg temp while playing games is ~80°C?

That's toasty for sure. Are you overclocking? What CPU? Stock cooler?
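(For anyone wanting to sanity-check their own temps on Linux, here's a minimal sketch that reads whatever thermal zones the kernel exposes under sysfs — zone names and availability vary by machine, and on Windows you'd use a tool like HWMonitor instead:)

```python
import glob

def cpu_temps_c():
    """Read temperatures (in degrees C) from all Linux thermal zones.

    Returns a dict mapping zone type -> temperature. The dict is empty
    on systems with no thermal zones exposed (VMs, non-Linux, etc.).
    """
    temps = {}
    for zone in glob.glob("/sys/class/thermal/thermal_zone*"):
        try:
            with open(zone + "/type") as f:
                name = f.read().strip()
            with open(zone + "/temp") as f:
                # The kernel reports millidegrees Celsius.
                temps[name] = int(f.read().strip()) / 1000.0
        except OSError:
            continue
    return temps

if __name__ == "__main__":
    for name, t in cpu_temps_c().items():
        flag = "  <-- toasty" if t >= 80 else ""
        print(f"{name}: {t:.1f}°C{flag}")
```

Sustained ~80°C under game load is high but not immediately dangerous for most desktop chips; it's mostly a sign the cooler or case airflow isn't keeping up.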

Rime
Nov 2, 2011

by Games Forum

calusari posted:

If Intel's own roadmaps are to be believed, then yes. Too bad they won't be skipping directly to Skylake-K, unlocked Broadwell doesn't really make any sense.

This was just a rumor last I checked, when did they say no K series broadwell?

1gnoirents
Jun 28, 2014

hello :)
Last I heard, K-series Broadwell and some form of Skylake were releasing at the same time. But I hardly trust CPU rumors of that kind, even if they were true at first. I was more interested in Skylake-K but I haven't seen any info on that. I can't imagine it's too long after the official Skylake release, though. I haven't been around much for Intel, but releasing this fast seems unusual to me.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Keep in mind that you cannot reliably overclock the i5-4690K past the i7-4790K's stock turbo clocks. Often enough to make it worth a coinflip? Probably. Often enough to make it an actual wise investment on a computer you're probably going to have for five years? Not so much. Sure, the 4790K is considerably more expensive, but boards and heatsinks for overclocking worth a drat on Haswell aren't cheap either (to the point where it takes a hell of a deal to make it worthwhile).

Don't be surprised if Intel publicly doesn't see the point for K chips past Skylake.

Or possibly for Skylake.

dont be mean to me fucked around with this message at 17:58 on Jan 8, 2015

1gnoirents
Jun 28, 2014

hello :)

Sir Unimaginative posted:

Keep in mind that you cannot reliably overclock the i5-4690K past the i7-4790K's stock turbo clocks. Sure, the 4790K is considerably more expensive, but boards and heatsinks for overclocking worth a drat on Haswell aren't cheap either (to the point where it takes a hell of a deal to make it worthwhile).

Don't be surprised if Intel publicly doesn't see the point for K chips past Skylake.

Or possibly for Skylake.

Oof this hurts. I can totally see this happening.

On the other hand, giving us CPUs that are already near maxed out isn't exactly a bad thing, I guess.

But on the third hand, if I don't want 8 threads, I don't want pre-limited clocks on an i5 if there is no K to be had. I hope they'd at least sell a high-clocked i5 option.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
I was going to post something desperately facetious about the hardware needs of MMO gamers here but got too depressed about power efficiency to do so :smith:

Rime
Nov 2, 2011

by Games Forum
A three month report on the G3258:

- Fantastic for gaming and general purpose when overclocked, still can't believe Intel released this chip when it makes even some i5's completely redundant.

- Terrible for everything else. E.g. Photoshop CC will drive it to 100% usage and lag heavily when painting a simple line.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Rime posted:

- Terrible for everything else. E.g. Photoshop CC will drive it to 100% usage and lag heavily when painting a simple line.

Check your voltage & temperatures. I saw reports of something similar where people would report the Starcraft 2 menu bringing their system to its knees at 2fps, but the game itself would run great. It's likely that Photoshop CC is doing something that's driving the CPU to run so hot it's throttling, which would result in terrible performance.

You may need to turn down your clocks and voltage, because I guarantee that at stock clocks it performs fine in Photoshop CC. Obviously half the speed of something quad core in multithreaded things, but certainly smoothly.
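(The throttling theory is checkable: on Linux, one rough sketch is comparing a core's current cpufreq to its hardware maximum — this assumes the cpufreq sysfs interface exists, which it won't in VMs or on Windows, where you'd watch clocks in HWMonitor or similar instead:)

```python
def throttle_ratio(cpu=0):
    """Return current/max frequency for one core, or None if the
    cpufreq sysfs interface isn't available (non-Linux, VMs, some drivers).

    A ratio well below 1.0 while under full load suggests thermal
    or power throttling.
    """
    base = f"/sys/devices/system/cpu/cpu{cpu}/cpufreq"
    try:
        with open(base + "/scaling_cur_freq") as f:
            cur = int(f.read())
        with open(base + "/cpuinfo_max_freq") as f:
            top = int(f.read())
    except OSError:
        return None
    return cur / top

if __name__ == "__main__":
    ratio = throttle_ratio()
    if ratio is None:
        print("cpufreq interface not available")
    else:
        print(f"core 0 at {ratio:.0%} of max frequency")
```

Note that a low ratio at idle is normal (frequency scaling working as intended); it only indicates throttling if it persists while the CPU is pegged.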

Rime
Nov 2, 2011

by Games Forum
Yeah, reverting to stock was the first thing I did. It just doesn't seem to play nice when rendering the input from a Wacom tablet, for some reason. Spikes to 100% and stays there till it's caught up to the pen. :shrug:

cstine
Apr 15, 2004

What's in the box?!?

Rime posted:

Yeah, reverting to stock was the first thing I did. it just doesn't seem to play nice when rendering the input from a Wacom tablet for some reason. Spikes to 100% and stays there till it's caught up to the pen. :shrug:

Weird. I've got one at 4.2GHz and zero issues with Photoshop CC/Lightroom CC.

I assume you're on 8.1 and current versions of CC?

Col.Kiwi
Dec 28, 2004
And the grave digger puts on the forceps...

Rime posted:

Yeah, reverting to stock was the first thing I did. it just doesn't seem to play nice when rendering the input from a Wacom tablet for some reason. Spikes to 100% and stays there till it's caught up to the pen. :shrug:
This sounds to me like the driver for the tablet is broken/bugged. Or mayyyyybe the tablet has a hardware issue though I'd consider that way less likely. Is it always completely fine if you aren't using the tablet? If so... not a great sign

Rime
Nov 2, 2011

by Games Forum
It's fine if I just use the mouse, but it's also not computing the opacity on the fly based on the pressure reading from the stylus in that case. I'm on Windows 7. I just can't see why bugged hardware would cause the CPU usage to max out until it finished rendering the stroke, since it was working just fine on my old Nehalem rig.

VVV: The problem with FC4 is that it was coded to specifically use cores 2 & 3 since the first core is reserved on the consoles. You'll have to track down a working copy of the injector or you're SOL.

Rime fucked around with this message at 03:25 on Jan 12, 2015

1gnoirents
Jun 28, 2014

hello :)
This might be the wrong place, but is there a way to make your PC report as a quad core when it's a dual core? I'm trying to help my buddy out with a G3258 he just built, but Far Cry 4 won't run because it doesn't meet the quad-core requirement. I tried some kind of injector that didn't work. The other specs are good (970) and other games work great, but I've heard in passing that this kind of arbitrary limiting might become a thing.

Nintendo Kid
Aug 4, 2011

by Smythe

1gnoirents posted:

This might be the wrong place, but is there a way to make your PC report as a quad core when it's a dual core? I'm trying to help my buddy out with a G3258 he just built, but Far Cry 4 won't run because it doesn't meet the quad-core requirement. I tried some kind of injector that didn't work. The other specs are good (970) and other games work great, but I've heard in passing that this kind of arbitrary limiting might become a thing.

This isn't going to help. The game crashes when it doesn't have 4 hardware threads available (either a quad core without hyperthreading, or a dual core with hyperthreading). There is currently nothing to be done to fix that, so the game simply won't run.

No matter what you do, on a pure dual core processor, the game simply doesn't function.
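(The gate the game is apparently applying amounts to counting logical processors, which is why Hyper-Threaded dual cores pass and the G3258 doesn't. A hedged launcher-style sketch — the threshold and behavior here are inferred from the thread, not from FC4's actual code:)

```python
import os

MIN_THREADS = 4  # hypothetical requirement, matching what FC4 reportedly enforces

def meets_thread_requirement(minimum=MIN_THREADS):
    """Return True if the OS reports at least `minimum` logical CPUs.

    os.cpu_count() counts hardware threads, so a 2-core/4-thread chip
    with Hyper-Threading passes while a pure dual core fails — the OS
    reports logical processors either way, which is also why a VM
    presenting 4 virtual cores would pass this kind of check.
    """
    count = os.cpu_count() or 1
    return count >= minimum

if __name__ == "__main__":
    if meets_thread_requirement():
        print("launching game")
    else:
        print(f"refusing to run: need {MIN_THREADS} hardware threads")
```

This also suggests why an injector-style hack can work at all: if the count comes from an OS query, intercepting that query and reporting a bigger number satisfies the gate, though the game may still crash later if it genuinely schedules work across threads that aren't there.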

No Gravitas
Jun 12, 2013

by FactsAreUseless
Would it run on a virtual machine with 4 virtual cores backed by 2 physical cores? I mean, it won't run well, but does it run in any capacity?

Rime
Nov 2, 2011

by Games Forum

Nintendo Kid posted:

This isn't going to help. The game crashes when it doesn't have 4 hardware threads available (either a quad core without hyperthreading, or a dual core with hyperthreading). There is currently nothing to be done to fix that, so the game simply won't run.

No matter what you do, on a pure dual core processor, the game simply doesn't function.

It runs on the G3258 just fine (as fine as an Ubishit game, at least) with the hack that got released a while back; recent patches may have broken it, though.

Nintendo Kid
Aug 4, 2011

by Smythe

Rime posted:

It runs on the G3258 just fine (as fine as an Ubishit game, at least) with the hack that got released a while back; recent patches may have broken it, though.

Yeah from what I've heard the current patches no longer function with that hack.

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE
I still can't fathom why they'd choose to make the game refuse to run on less than four threads. Maybe AMD is paying them under the table?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Instant Grat posted:

I still can't fathom why they'd choose to make the game refuse to run on less than four threads. Maybe AMD is paying them under the table?

It's Ubisoft, so downright stupidity in marketing decisions wouldn't surprise me, but I doubt AMD has the money for this kind of stunt.

It runs fine on a G3258, as seen in online videos and so on, so if they don't fix it, then it could be malice. As if Ubisoft gives a gently caress about their reputation anyway.

HalloKitty fucked around with this message at 13:38 on Jan 12, 2015

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

No Gravitas posted:

Would it run on a virtual machine with 4 virtual cores backed by 2 physical cores? I mean, it won't run well, but does it run in any capacity?
I'm surprised that Microsoft still hasn't implemented a Direct3D proxy for Hyper-V, given that it looks like they're going to use it for sandboxing in the future.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Combat Pretzel posted:

I'm surprised that Microsoft still hasn't implemented a Direct3D proxy for Hyper-V, given that it looks like they're going to use it for sandboxing in the future.
I suspect it's because most people that are doing 3D acceleration in VDI environments are already on VMware and are unlikely to convert anytime soon, so this would put Microsoft in the position of chasing a market that they can't quite estimate so well yet. People that are going to virtualize render farms have already done it (it's a bit of a serious hit or you're GPGPU-bound and using PCI-e passthrough galore), so if anything the reason they'd do it would be to put it into Azure offerings, not so much for Hyper-V.

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.

Rime posted:

This was just a rumor last I checked, when did they say no K series broadwell?

There are K-series unlocked Broadwell desktop CPUs, but there aren't any mainstream Broadwell desktop CPUs (i.e. locked multiplier).

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

necrobobsledder posted:

I suspect it's because most people that are doing 3D acceleration in VDI environments are already on VMware and are unlikely to convert anytime soon, so this would put Microsoft in the position of chasing a market that they can't quite estimate so well yet.
If the feature sheets posted a while ago on their blogs aren't just PR bullshit, they intend to leverage their hypervisor inside the Windows 10 client to sandbox applications. I'd figure they'd like to have accelerated 3D graphics covered by then, so I'm crossing my fingers. RemoteFX is already available, but restricted to the server builds for whatever reason, and to the Quadro and FirePro ranges of cards. I forget what performance was like, too. Given all this enlightened-driver bullshit, you'd think they could manage to abstract things accordingly and pass the Direct3D stuff between host and guest quasi-directly instead of doing RemoteFX and virtual GPUs.
