jackhunter64
Aug 28, 2008

Keep it up son, take a look at what you could have won


https://www.youtube.com/watch?v=mIbEIhizoXY

WoW running at 1200p at around 40-50 FPS with everything turned up to 10. Obviously that's just the starting area and not a busy city or anything, but turning the graphics settings up that high in this game is usually demanding even on a PC GPU, so knocking stuff like shadows and ambient occlusion down to a sensible level means you could very likely hit a constant 60 FPS on a fanless laptop. Which is just absurd.
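For a sense of what a constant 60 actually demands, it's easier to think in frame-time budgets than FPS; quick back-of-the-envelope (illustrative numbers only):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds the CPU+GPU have to finish one frame."""
    return 1000.0 / target_fps

# At the observed 40-50 FPS each frame takes 20-25 ms; a locked 60
# means every frame has to land in under ~16.7 ms. It's the slowest
# frames, not the average, that have to improve.
for fps in (40, 50, 60):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.1f} ms per frame")
```

Which is why dropping the heaviest settings (shadows, AO) matters more than the average FPS number suggests.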

Binary Badger
Oct 11, 2005

Trolling Link for a decade


Ok Comboomer posted:

I hope they make their own add-on GPU card for Mac Pro for purely aesthetic reasons.

They already make AfterBurner.

Afterburner isn't exactly a GPU card; it's an FPGA card, full of reprogrammable logic rather than fixed-function ASICs. It can be a superfast calculation card, which it is, but it's programmed specifically to speed up certain ProRes transcoding operations.

quote:

Make a PCIe GPU, Apple. Make a better GPU than Intel and ship it before intel.

Oddly enough, Imagination Technologies (which developed the PowerVR GPU used in iPhones up to the iPhone 7, and from which Apple is said to have derived some of the technology in its own GPUs from the A11 onward) was shopping around a PCIe card with a PowerVR GPU as far back as CES 2012.

Apple could make a PCIe card with an expanded version of their M1 GPU; it'd probably make more sense to put it on a PCIe card as they get better and better at making GPUs. But I doubt they're going to go whole hog on it, preferring to let AMD make Vega IIIs or Radeon VIIIs.

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

jackhunter64 posted:

https://www.youtube.com/watch?v=mIbEIhizoXY

WoW running at 1200p at around 40-50 FPS with everything turned up to 10. Obviously that's just the starting area and not a busy city or anything, but turning the graphics settings up that high in this game is usually demanding even on a PC GPU, so knocking stuff like shadows and ambient occlusion down to a sensible level means you could very likely hit a constant 60 FPS on a fanless laptop. Which is just absurd.

How much would knocking it down to 1080p help?

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

Binary Badger posted:

Afterburner isn't exactly a GPU card; it's an FPGA card, full of reprogrammable logic rather than fixed-function ASICs. It can be a superfast calculation card, which it is, but it's programmed specifically to speed up certain ProRes transcoding operations.


Oddly enough, Imagination Technologies (which developed the PowerVR GPU used in iPhones up to the iPhone 7, and from which Apple is said to have derived some of the technology in its own GPUs from the A11 onward) was shopping around a PCIe card with a PowerVR GPU as far back as CES 2012.

Apple could make a PCIe card with an expanded version of their M1 GPU; it'd probably make more sense to put it on a PCIe card as they get better and better at making GPUs. But I doubt they're going to go whole hog on it, preferring to let AMD make Vega IIIs or Radeon VIIIs.

I just don’t know how you keep the Mac Pro market happy and get commensurate performance gains by 1) going with an exclusively on-chip GPU (maybe they pair an iGPU with a dGPU?) or 2) having AMD provide dGPU grunt. Unless Apple wants to do that driver stuff themselves, why not just scale up what they’re already doing in-house for the non-AMD-GPU Macs? M1 seems to be proving itself, and its GPU tiles are tiny and relatively few.



Make a Threadripper-sized die you cowards

arbybaconator
Dec 18, 2007

All hat and no cattle

My Air is not arriving until tomorrow, it looks like. Has anyone seen any videos of what iOS apps look like on the M1? I’m really curious about that.

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

arbybaconator posted:

My Air is not arriving until tomorrow, it looks like. Has anyone seen any videos of what iOS apps look like on the M1? I’m really curious about that.

Ugly, with hideous Mac title bars and the stoplight buttons in the corner, but otherwise they look like normal iOS apps.

Fame Douglas
Nov 20, 2013

by Fluffdaddy
double post

Fame Douglas fucked around with this message at 00:05 on Nov 18, 2020

Fame Douglas
Nov 20, 2013

by Fluffdaddy

jackhunter64 posted:

https://www.youtube.com/watch?v=mIbEIhizoXY

WoW running at 1200p at around 40-50 FPS with everything turned up to 10. Obviously that's just the starting area and not a busy city or anything, but turning the graphics settings up that high in this game is usually demanding even on a PC GPU, so knocking stuff like shadows and ambient occlusion down to a sensible level means you could very likely hit a constant 60 FPS on a fanless laptop. Which is just absurd.

While that's pretty impressive for integrated graphics (and I can play it just fine on my fanless i5 Surface, though obviously at lower settings), having multisample anti-aliasing disabled does help performance a great deal. For a machine like this, I'd only use lightweight post-processing anti-aliasing (preferably CMAA).

Another thing to consider is that you'd have to let the game run for a while to ascertain what settings are reasonable, because thermal throttling will certainly be a factor (and even more so in the summer).

In general, I'd expect these machines to run Retail WoW well and Classic WoW without breaking a sweat.

Ok Comboomer posted:

How much would knocking it down to 1080p help?

That'd be knocking the resolution scale down to about 90%, so it should add maybe 5-10 FPS?
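For what it's worth, the pixel arithmetic (WoW's render-scale slider scales the linear resolution):

```python
# 1920x1200 native vs. a 1080p render target: how much less work is it?
native_w, native_h = 1920, 1200
target_w, target_h = 1920, 1080

linear_scale = target_h / native_h                           # slider setting
pixel_ratio = (target_w * target_h) / (native_w * native_h)  # pixels shaded

# Width is unchanged, so both ratios land at 90%: the GPU shades
# about 10% fewer pixels per frame.
print(f"render scale ~{linear_scale:.0%}, pixel work ~{pixel_ratio:.0%}")
```

The FPS gain itself is still a guess; GPU load doesn't scale perfectly linearly with pixel count.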

Fame Douglas fucked around with this message at 00:07 on Nov 18, 2020

Fame Douglas
Nov 20, 2013

by Fluffdaddy
Also, not to make this the WoW thread, but if people are looking for a nice add on updater and aren't afraid of the command line, CurseBreaker is amazing. Works on Classic and Retail, supports all the usual add on sites & GitHub repos: https://github.com/AcidWeb/CurseBreaker

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



jackhunter64 posted:

https://www.youtube.com/watch?v=mIbEIhizoXY

WoW running at 1200p at around 40-50 FPS with everything turned up to 10. Obviously that's just the starting area and not a busy city or anything, but turning the graphics settings up that high in this game is usually demanding even on a PC GPU, so knocking stuff like shadows and ambient occlusion down to a sensible level means you could very likely hit a constant 60 FPS on a fanless laptop. Which is just absurd.

He says it's using Rosetta, but I had thought they'd enabled native support in the maintenance patch today. Is there a way to launch via Rosetta instead of natively?

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:

arbybaconator posted:

My Air is not arriving until tomorrow, it looks like. Has anyone seen any videos of what iOS apps look like on the M1? I’m really curious about that.

The Verge review covered that; they're hit or miss in their usability on a non-touchscreen laptop. Hell, Apple programmed an entire accessibility suite for them. There are some ChromeOS-style teething problems.

@09:40
https://www.youtube.com/watch?v=OEaKQ0pxQsg&t=580s

Fame Douglas
Nov 20, 2013

by Fluffdaddy

SourKraut posted:

He says it's using Rosetta, but I had thought they'd enabled native support in the maintenance patch today. Is there a way to launch via Rosetta instead of natively?

Maybe that person isn't in the NA region. Patch days are Tuesday for NA and Wednesday for the EU, which means the patch isn't out for another three hours in Europe (and all other areas of the world attached to the EU region in Battle.net).

Binary Badger
Oct 11, 2005

Trolling Link for a decade


Ok Comboomer posted:

I just don’t know how you keep the Mac Pro market happy and get commensurate performance gains by 1) going with an exclusively on-chip GPU (maybe they pair an iGPU with a dGPU?) or 2) having AMD provide dGPU grunt. Unless Apple wants to do that driver stuff themselves, why not just scale up what they’re already doing in-house for the non-AMD-GPU Macs? M1 seems to be proving itself, and its GPU tiles are tiny and relatively few.

This is something that's entirely in Apple's court; at this point they could go either way, but I'm not sure 1) is a great idea. It works for laptops and tablets, but desktops are a different story: beefing up an integrated GPU doesn't scale well, and I don't think they'd want to create different CPUs with different iGPU flavors based on need.

Moving the GPU to a PCIe board would let them scale up the Apple GPU and offer upgradability in the form of faster Mx GPUs / Radeons / Vegas; but would Apple want to enter the world of battlin' PCIe GPUs? Of course, their only market would be Macs.

quote:

Make a Threadripper-sized die you cowards

We haven't seen a teardown of the M1 Macs yet, but the die is doubtless not as big as even a 29xx Threadripper.

shrike82
Jun 11, 2005

i'm reading up about iOS app support and apparently it uses a separate storefront from the mobile devices, where developers can opt out of being listed.
most major apps have opted out, so you're not going to see them (e.g. Google apps, Facebook apps, most major games)

Fame Douglas
Nov 20, 2013

by Fluffdaddy

Binary Badger posted:

Moving the GPU to a PCIe board would let them scale up the Apple GPU and offer upgradability in the form of faster Mx GPUs / Radeons / Vegas; but would Apple want to enter the world of battlin' PCIe GPUs? Of course, their only market would be Macs.

I'd rather be able to put in the latest Nvidia GPU, to be honest. I can't imagine Apple would be able to compete in the dGPU market, and even the past Mac Pros have all been hampered by terrible GPU options (at least the latest AMD GPU offerings look to be pretty promising, though Nvidia's CUDA is still an important factor to the professional market)

shrike82
Jun 11, 2005

Fame Douglas posted:

I'd rather be able to put in the latest Nvidia GPU, to be honest. I can't imagine Apple would be able to compete in the dGPU market, and even the past Mac Pros have all been hampered by terrible GPU options.

they're not even willing to support Nvidia cards as eGPUs so you're going to be waiting a while

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!
my MBP has an NVidia GPU :smugdog:



because it’s old

Inept
Jul 8, 2003

shrike82 posted:

they're not even willing to support Nvidia cards as eGPUs so you're going to be waiting a while

I wonder how much of a pain in the rear end Nvidia is to deal with on the manufacturing side if two large companies (Apple and Microsoft) have wanted nothing to do with them for a while now.

squirrelzipper
Nov 2, 2011

Fame Douglas posted:

I'd rather be able to put in the latest Nvidia GPU, to be honest. I can't imagine Apple would be able to compete in the dGPU market, and even the past Mac Pros have all been hampered by terrible GPU options (at least the latest AMD GPU offerings look to be pretty promising, though Nvidia's CUDA is still an important factor to the professional market)

The heat death of the universe might arrive before CUDA on any new Mac.

shrike82
Jun 11, 2005

tbf, gpu tech is a competitive advantage these days in both consumer and enterprise, so it might be less a relationship issue and more these companies deciding they need their own proprietary tech.

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

Fame Douglas posted:

I'd rather be able to put in the latest Nvidia GPU, to be honest. I can't imagine Apple would be able to compete in the dGPU market, and even the past Mac Pros have all been hampered by terrible GPU options (at least the latest AMD GPU offerings look to be pretty promising, though Nvidia's CUDA is still an important factor to the professional market)

i'm not sold on this, not because i don't think it's hard, of course it is, but they've been doing some impressive stuff with their own μarch in recent years so they're hard to predict (also they have made some pretty big pro missteps so ¯\_(ツ)_/¯ )

they'll probably never do it but with all their talk about being a new ecofriendlier apple it would be interesting to see them allow for some upgradeability in that segment. given that the rumours are pointing to "it's like a mac pro but tiny" i'm not holding my breath

Yeast
Dec 25, 2006

$1900 Grande Latte

Rinkles posted:

why does everyone hate the touch bar?

It ships only on 'pro' machines.

Professional users typically have years if not decades of muscle memory for which keys do what.

If you have to look down from the screen to do something, you're literally wasting time, and it's invalidating the point of it.

I have one on my 16". I use it to adjust volume and pause music. That's it.

LionArcher
Mar 29, 2010


well why not posted:

The 'professional' windows laptop segment - computers for people who do presentations, zoom calls, some light creative tasks - now has to explain to buyers why their $1000 XPS, Gram or Zenbook has half the battery life of a Macbook Air and spinning fans.


"disappointing" at best, "embarrassing" would be my opinion

These are the worst Macs they will ever make. They are perfect for casual users and even some power users. I won’t buy them either, because gen 1, and I too need three monitors because... I write novels and that’s how I like it. But the gen 2 and 3 of these computers, not to mention the more pro machines? I can’t wait for them. This poo poo is just going to get more and more impressive over the next 18-24 months.

cowofwar
Jul 30, 2002

by Athanatos
Having a MacBook Pro that I can work on for hours without looking at the battery, and not having it burn the poo poo out of my legs and make me sterile, is very nice.

A discrete GPU will gently caress over the battery performance, so I imagine it's more likely that Apple will just use an M2 or whatever with more memory.

Penisaurus Sex
Feb 3, 2009

asdfghjklpoiuyt
I’m not even sure how much more GPU grunt a ‘pro’ machine would need.

By all accounts editing lighter 4K footage is a breeze with these gen 1 SoCs. If you just account for iterative improvements on the GPU (and maybe more cores) how many generations are we from breezing through heavy 4K editing with lots of post processing or animation work?

I think my perspective is colored by the idea that you really shouldn’t be doing heavy computational work locally, so there’s a limit to what I’d ever ask for in a laptop or desktop. If you’re regularly working with massive computational work, you’re in the kind of business where you should have access to a server to handle that sort of thing.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Fame Douglas posted:

I'd rather be able to put in the latest Nvidia GPU, to be honest. I can't imagine Apple would be able to compete in the dGPU market

Why not, though?

I may be risking a certain person coming back here to whine when I say it, but GPUs are a well known embarrassingly parallel problem. To increase graphics performance, add more GPU cores and memory bandwidth.

And, based on recent announcements by Imagination, with the Imagination TBDR GPU tech, there's a path to making multiple GPUs collaborate on rendering one scene, and it should work far better than the schemes NVidia and AMD have tried (alternate frame rendering, SLI). This is relevant to Apple GPUs because they're the result of Apple performing a hostile takeover of Imagination GPU tech without actually buying the company outright.

So, it's possible that the next M1 series chip (for bigger rMBP 13" / rMBP 16" / iMac) will look like a bigger M1 with an oversized geometry engine to support feeding one or more external tile engines. This wouldn't be the same as a traditional dGPU architecture, where the extra chip is like a complete subsystem unto itself, but should be able to deliver comparable performance.

Take all that with all the salt you want, the above is just me reading possibilities into some recent announcements. I have no inside info.
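To make the "embarrassingly parallel" bit concrete, here's a toy tile renderer: each tile of the frame is shaded independently, so adding workers (read: GPU cores) scales throughput with zero cross-tile coordination. CPU-side sketch of the idea only, obviously, not how any of this hardware actually works:

```python
from concurrent.futures import ThreadPoolExecutor

TILE = 8  # tile edge length in pixels

def shade(x: int, y: int) -> int:
    """Stand-in per-pixel 'shader' (just a cheap gradient here)."""
    return (x * 7 + y * 13) % 256

def render_tile(origin):
    """Shade one tile. It touches no shared state: tiles are fully
    independent, which is what 'embarrassingly parallel' means."""
    ox, oy = origin
    return origin, [[shade(ox + x, oy + y) for x in range(TILE)]
                    for y in range(TILE)]

def render(width: int, height: int, workers: int = 4):
    """Carve the frame into tiles, shade them on a worker pool, stitch."""
    tiles = [(x, y) for y in range(0, height, TILE)
                    for x in range(0, width, TILE)]
    frame = [[0] * width for _ in range(height)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for (ox, oy), pixels in pool.map(render_tile, tiles):
            for y, row in enumerate(pixels):
                frame[oy + y][ox:ox + TILE] = row
    return frame

# Any worker count produces an identical frame; only throughput changes.
assert render(32, 16, workers=1) == render(32, 16, workers=8)
```

More cores (or a second tile engine) just means more tiles in flight; the hard part is feeding them geometry and memory bandwidth, not coordinating them.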

Fame Douglas
Nov 20, 2013

by Fluffdaddy

Penisaurus Sex posted:

I’m not even sure how much more GPU grunt a ‘pro’ machine would need.

By all accounts editing lighter 4K footage is a breeze with these gen 1 SoCs. If you just account for iterative improvements on the GPU (and maybe more cores) how many generations are we from breezing through heavy 4K editing with lots of post processing or animation work?

I think my perspective is colored by the idea that you really shouldn’t be doing heavy computational work locally, so there’s a limit to what I’d ever ask for in a laptop or desktop. If you’re regularly working with massive computational work, you’re in the kind of business where you should have access to a server to handle that sort of thing.

I mean, all kinds of scientific computations happen locally (and on servers built for that purpose as well, of course). But I guess that's a market Apple is pretty much forfeiting. Their definition of "Pro" seems to be "Youtuber editing video". Why buy an expensive Apple machine when all it does is hinder you?

Fame Douglas fucked around with this message at 02:09 on Nov 18, 2020

Fame Douglas
Nov 20, 2013

by Fluffdaddy

BobHoward posted:

Why not, though?

I may be risking a certain person coming back here to whine when I say it, but GPUs are a well known embarrassingly parallel problem. To increase graphics performance, add more GPU cores and memory bandwidth.

And, based on recent announcements by Imagination, with the Imagination TBDR GPU tech, there's a path to making multiple GPUs collaborate on rendering one scene, and it should work far better than the schemes NVidia and AMD have tried (alternate frame rendering, SLI). This is relevant to Apple GPUs because they're the result of Apple performing a hostile takeover of Imagination GPU tech without actually buying the company outright.

So, it's possible that the next M1 series chip (for bigger rMBP 13" / rMBP 16" / iMac) will look like a bigger M1 with an oversized geometry engine to support feeding one or more external tile engines. This wouldn't be the same as a traditional dGPU architecture, where the extra chip is like a complete subsystem unto itself, but should be able to deliver comparable performance.

Take all that with all the salt you want, the above is just me reading possibilities into some recent announcements. I have no inside info.

All we know is some marketing fluff and no usable product. Being skeptical they'll be able to compete with established players like Nvidia seems very sensible.

shrike82
Jun 11, 2005

they've tweaked their mobile solutions for PCs. that's a very far cry from building out dGPUs that don't have mobile applications.

MrBond
Feb 19, 2004

FYI, Cheese NIPS are not the same as Cheez ITS
Is it really though?





The Tomb Raider one isn't even running natively - that's with the CPU side being translated from Intel to ARM. Yes, an AMD 560X is not a new GPU, but we're also talking about laptop-style integrated graphics here.

shrike82
Jun 11, 2005

there's no question that Apple will have a solution that beats other iGPUs but are they even interested in competing at the dGPU level?
they're not interested in games. and i'm skeptical they're interested in competing in the GPU compute space - they'd be offering a desktop solution without a cloud solution which seems backwards.

Fame Douglas
Nov 20, 2013

by Fluffdaddy
Yes, that's a far cry from what current dGPUs are able to achieve. Integrated graphics are always going to be constrained; the M1 not being terrible doesn't mean Apple is suddenly the best at everything, always. And it's hard to believe Apple would suddenly be able to compete with Nvidia if they were to spin out a discrete GPU.

American McGay
Feb 28, 2010

by sebmojo
Only in the Mac Hardware thread can someone post a video of World of Warcraft running at 40fps and people get impressed.

The new chips will probably be a good thing for Macs moving forward, but I don't think it's really going to move any sort of needle in the games industry. Pump the brakes a bit.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

If they were to do Windows-based dGPU and compete there with NVIDIA and AMD (and in some futures Intel) they’d also need to build out the WDDM/DX/etc software stack with all the tricks and workarounds and optimizations for various games that the dGPU vendors have accumulated over the years. Plus a devrel capability in that space.

I think they will settle for “that’s pretty amazing for iGPU” for quite a while.

Binary Badger
Oct 11, 2005

Trolling Link for a decade


At the very least, it's comforting to know that Apple no longer has to settle for Intel's lackluster integrated GPUs in its laptops.

Pulcinella
Feb 15, 2019

Ok Comboomer posted:

I just don’t know how you keep the Mac Pro market happy and get commensurate performance gains by 1) going with an exclusively on-chip GPU (maybe they pair an iGPU with a dGPU?) or 2) having AMD provide dGPU grunt. Unless Apple wants to do that driver stuff themselves, why not just scale up what they’re already doing in-house for the non-AMD-GPU Macs? M1 seems to be proving itself, and its GPU tiles are tiny and relatively few.



Make a Threadripper-sized die you cowards

“Do it themselves” or even “we’ll do it for you” seems to be one of the paths Apple is on. Octane just released their GPU-accelerated 3D renderer for Metal, so it runs on AMD and Apple GPUs (previously it was NVIDIA-only; it’s not even really direct AMD support, since there’s no Windows AMD support until AMD gets their Vulkan drivers working). In a forum post they say they were working on it directly with Apple for years, so I imagine Apple probably sent over some engineers to work on it in their office.

FCKGW
May 21, 2006

How long until we get some reverse Hackintoshes where they try and get Win 10 running on the M1 Macs?

Zil
Jun 4, 2011

Satanically Summoned Citrus


FCKGW posted:

How long until we get some reverse Hackintoshes where they try and get Win 10 running on the M1 Macs?

Time for Windows RT to shine!

squirrelzipper
Nov 2, 2011

Fame Douglas posted:

Yes, that's a far cry from what current dGPUs are able to achieve. Integrated graphics are always going to be constrained; the M1 not being terrible doesn't mean Apple is suddenly the best at everything, always. And it's hard to believe Apple would suddenly be able to compete with Nvidia if they were to spin out a discrete GPU.

I’m a bit curious what you were expecting? Nvidia is never coming back to Apple. I don’t mean just this post, I get your point over the last couple of pages. I’m a person who could use CUDA, I rely on AMD ProRender, I feel the pain, although from a different viewpoint probably. But you have to figure out what you want, ’cause Windows and Linux exist and work great.

Apple thinks they can deliver better performance than Intel/AMD/Nvidia for their market, and a big part of that bet is Metal over OpenCL/GL or CUDA. It’s the way they want to go and they’re going there. I’m personally excited because I think they can do something cool, but if I needed Nvidia I wouldn’t be using a Mac. CUDA is dead on the Mac. It’s never coming back. 3D and FPU acceleration on the Mac is Metal. That’s it. I’m stoked about the M2 or whatever, but it’s a bold play and they know some folks will switch. Some might come back too.

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD

badjohny posted:

If anyone gets one of these and could load up WoW on it, I would love to see some benchmarks or even general FPS. I am sure it will crush my first gen 2014 5k iMac.


If you have the 295X upgrade then the Apple Silicon GPU is probably worse.
