Xarn
Jun 26, 2015

rjmccall posted:

the apple devtools group has been arguing for developer-focused mac hardware for years, up to and including rack-mount servers. we are fully aware that this isn't an internal-only problem

we use a huge number of macs for ci bots and device testing and things like that, so we are basically in the same situation as any third party — including complaining about the price, since like any large company apple does do departmental accounting and budgeting, so dt does in fact have to buy mac pros

This is amazing.

And also incredibly sad.

Zlodo
Nov 25, 2006
the shoemaker always wears the worst shoes

cinci zoo sniper
Mar 15, 2013

Xarn posted:

This is amazing.

And also incredibly sad.

cinci zoo sniper
Mar 15, 2013

though it's kinda ironic to see apple own itself by only being able to afford half the dev compute other tech giants can

Gazpacho
Jun 18, 2004

what kind of clownshow are we talking about if apple can't build an internal system to bypass the constraints of its own licenses

Cybernetic Vermin
Apr 18, 2005

i have to assume it'd look a bit too terrible if they just spun up osx on whitebox servers, despite the fact that they'd obviously be able to do so without much effort

Plorkyeran
Mar 22, 2007

the license isn't really the issue. apple could obviously bypass the "load this highly privileged binary blob from some rando dude" part of hackintoshes, but if they ran macos internally on non-apple hardware they'd need to make macos actually properly support that hardware and that's unlikely to be any simpler politically than getting the hardware people to just make something that can reasonably be used as a server

rjmccall
Sep 7, 2007

you know, nevermind

rjmccall fucked around with this message at 19:51 on Jan 4, 2019

Vanadium
Jan 8, 2005

I'm gonna guess the hard part is not making it work, it's coming to a consensus between the fifty different departments that might or might not have a stake in whether it works.

kitten emergency
Jan 13, 2008


Xarn posted:

This is amazing.

And also incredibly sad.

siri play despacito

Beamed
Nov 26, 2010

uncurable mlady posted:

siri play despacito

I'm Not Sure What You Mean. Here's what I found for Play Desperate Cheeto

Plank Walker
Aug 11, 2005

Beamed posted:

I'm Not Sure What You Mean. Here's what I found for Play Desperate Cheeto

thanks for reminding me how stupid voice interface is

it's like a CLI where instead of saying "command not found" when you make a typo it just executes some random command that shares a few letters
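
a hypothetical sketch of that dispatch loop in c (every command name here is made up, and a sane shell would just print an error instead):

code:
/* hypothetical "helpful" CLI: instead of rejecting an unknown command,
 * run whichever registered command has the smallest edit distance.
 * all command names here are invented for illustration. */
#include <stdio.h>
#include <string.h>

static int min3(int a, int b, int c) {
    int m = a < b ? a : b;
    return m < c ? m : c;
}

/* classic dynamic-programming levenshtein distance; short names only */
static int edit_distance(const char *a, const char *b) {
    size_t la = strlen(a), lb = strlen(b);
    int d[64][64];
    for (size_t i = 0; i <= la; i++) d[i][0] = (int)i;
    for (size_t j = 0; j <= lb; j++) d[0][j] = (int)j;
    for (size_t i = 1; i <= la; i++)
        for (size_t j = 1; j <= lb; j++)
            d[i][j] = min3(d[i-1][j] + 1, d[i][j-1] + 1,
                           d[i-1][j-1] + (a[i-1] != b[j-1]));
    return d[la][lb];
}

int main(void) {
    const char *commands[] = { "list", "delete-everything", "despacito" };
    const char *typed = "lost";            /* the user meant "list" */
    const char *best = commands[0];
    int best_d = edit_distance(typed, best);
    for (size_t i = 1; i < sizeof commands / sizeof *commands; i++) {
        int di = edit_distance(typed, commands[i]);
        if (di < best_d) { best_d = di; best = commands[i]; }
    }
    /* "command not found" would go here in a sane CLI */
    printf("executing '%s' (distance %d from '%s')\n", best, best_d, typed);
    return 0;
}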

DONT THREAD ON ME
Oct 1, 2002


Plank Walker posted:

thanks for reminding me how stupid voice interface is

it's like a CLI where instead of saying "command not found" when you make a typo it just executes some random command that shares a few letters

wow you're right, voice automation is the next evolution in devops automation.

JawnV6
Jul 4, 2004

part of my job at intel was forecasting validation compute needs, it was this weird task of taking next year's chip's expected performance and dividing by the complexity of the next next chip. goofiest dogfooding ever

Deep Dish Fuckfest
Sep 6, 2006

more like coolest. that poo poo is legit impressive and chip designers make me feel really inadequate considering the ridiculous gap in quality and complexity between their output and that of a "software engineer" like me

DONT THREAD ON ME
Oct 1, 2002


Deep Dish Fuckfest posted:

more like coolest. that poo poo is legit impressive and chip designers make me feel really inadequate considering the ridiculous gap in quality and complexity between their output and that of a "software engineer" like me

it's just a different product with different expectations. if my bosses had their product development schedule planned out for the next 10 years and there was an expectation to produce a product with few defects, I would probably plan a lot better and produce better results.

instead i need to write code that I can evolve quickly in response to frequent product changes. if you work on bullshit, it's expected that you write some code with bugs because it's way cheaper to write software with some bugs* (that you fix) than it is to write software with 0 bugs.

* as long as you're careful about catastrophic bugs

DONT THREAD ON ME fucked around with this message at 21:05 on Jan 4, 2019

BobHoward
Feb 13, 2012


Deep Dish Fuckfest posted:

more like coolest. that poo poo is legit impressive and chip designers make me feel really inadequate considering the ridiculous gap in quality and complexity between their output and that of a "software engineer" like me

in current $job i wrangle fpgas (and also c/python/perl/tcl/shell)

the way I would put it is that even when you don't have the pressure to deliver a 99.99% good design the first time it's tried for real (which I don't, because fpga), you are working with bad languages and utterly wretched debugging tools, and the modify-test-debug cycle is several orders of magnitude slower than software. if your design has to run at a decently high clock speed, you are going to be forced to do the equivalent of writing super low level c code where you pay close attention to how coding style affects the compiler's output, maybe even resorting to inline asm to get the best results. even then you may find yourself trying several full rewrites of a block to get performance where it needs to be

and that’s fpga, which is kinda easy mode. cutting edge high perf gpu/cpu cores are insanely difficult and expensive to design

all that said you may be surprised to know that you folks up in the sky routinely and effortlessly do much more complex things than those of us in the deep mines. we don’t have high level abstractions and efficient debugging tools. we’re typically narrowly focused on small optimizations rather than big complicated ideas.

a few years ago i had to correct the mistakes of someone with a software background who wrote a bunch of verilog which I inherited. most of these could be described as “did something in a hardware state machine that should’ve been pushed up to the driver”. good hardware design is about keeping it stupid simple. you only put complicated algorithms into hw when there’s no other way to achieve the desired result

this is so true that, with modern transistor densities, when there’s a call for a state machine that’s complex but also needs hard real-time guarantees, chip designers often throw in a whole embedded CPU core and have a software engineer implement the state machine instead. every modern gpu or cpu or cellphone chip is sprinkled with dozens of cortex-M0 class microcontrollers. these are often not even touted in marketing materials since they typically cannot run user supplied code. they’re just testament to the difficulty of using hardware design techniques to solve complicated problems.
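
a minimal sketch of what that firmware state machine looks like (register addresses, bits, and states all invented for illustration):

code:
/* hypothetical firmware FSM on one of those embedded cores: the complex
 * sequencing lives in software, ticked from a timer interrupt so the
 * hard real-time schedule still holds. every name here is made up. */
#include <stdint.h>

#define HW_STATUS (*(volatile uint32_t *)0x40000000u) /* invented MMIO */
#define HW_CTRL   (*(volatile uint32_t *)0x40000004u)

enum state { IDLE, TRAIN, ACTIVE, RECOVER };
static enum state st = IDLE;

void fsm_tick(void) {              /* called from a periodic interrupt */
    uint32_t status = HW_STATUS;
    switch (st) {
    case IDLE:
        if (status & 1u) { HW_CTRL = 0x10u; st = TRAIN; }
        break;
    case TRAIN:
        if (status & 2u)      st = ACTIVE;   /* link trained */
        else if (status & 4u) st = RECOVER;  /* training error */
        break;
    case ACTIVE:
        if (status & 4u) { HW_CTRL = 0x20u; st = RECOVER; }
        break;
    case RECOVER:
        HW_CTRL = 0;                         /* reset the block, retry */
        st = IDLE;
        break;
    }
}
and changing a transition is a firmware patch instead of a respin, which is kind of the whole point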

DELETE CASCADE
Oct 25, 2017

does the simulator run fast enough for those arm chips to test comprehensively? i remember a computer architecture student in grad school saying when they eventually build a test chip and turn it on for the first time, it runs more instructions in a second than the entirety of testing beforehand

JawnV6
Jul 4, 2004


DELETE CASCADE posted:

i remember a computer architecture student in grad school saying when they eventually build a test chip and turn it on for the first time, it runs more instructions in a second than the entirety of testing beforehand
when $PROJECT tapeout was delayed $X weeks, i made pretty much this exact joke

BobHoward posted:

this is so true that, with modern transistor densities, when there’s a call for a state machine that’s complex
i love reading your posts fyi

check this slide out

im stunned someone put a bloom filter into HW

JawnV6 fucked around with this message at 07:22 on Jan 5, 2019

BobHoward
Feb 13, 2012


DELETE CASCADE posted:

does the simulator run fast enough for those arm chips to test comprehensively? i remember a computer architecture student in grad school saying when they eventually build a test chip and turn it on for the first time, it runs more instructions in a second than the entirety of testing beforehand

test is very much a problem, yes, especially in academia

most commercial chip projects these days use at least one highly accelerated alternative to conventional simulation. there are multimillion dollar boxes based on massively parallel arrays of custom cpu cores designed specifically to accelerate HDL simulation. a cheaper approach which I have been involved with is to implement your asic in a multi-fpga board (or a stack of them if the chip is too large for one board).

neither of these runs at a clock rate anywhere near the final product's, both tend to have reduced visibility of internal signals compared to a classic sim (especially fpga), and both require living with the fact that you're not directly testing your real design. however, the speed makes it all worthwhile as a supplement to conventional sim. 100 Hz would be a super nice speed for a conventional sim iirc; by way of comparison I've worked on one fpga asic emulator which used some tricks to get a major data path up to ~160 MHz, not much slower than it had to run in the real chip. ~1 MHz is much more typical but even that's so, so much better than sim.

the other big win is that you can give fpga systems to software devs and have them start driver bringup work before your chip even tapes out
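
to put rough numbers on the speeds quoted above (the 1 GHz silicon clock is my assumption, not any particular chip):

code:
/* back-of-envelope: cycles per day at each pre-silicon speed, vs how
 * long real silicon (assumed 1 GHz) takes to run the same cycle count */
#include <stdio.h>

int main(void) {
    const double day = 86400.0;                 /* seconds per day */
    const double speeds[]  = { 100.0, 1e6, 160e6 };
    const char  *names[]   = { "conventional sim", "typical fpga",
                               "best-case fpga" };
    const double silicon_hz = 1e9;              /* assumed chip clock */

    for (int i = 0; i < 3; i++) {
        double cycles = speeds[i] * day;
        printf("%-16s %.3g cycles/day = %.3g s of silicon time\n",
               names[i], cycles, cycles / silicon_hz);
    }
    /* a full year of 100 Hz sim is ~3.2e9 cycles; first silicon burns
     * through that in about 3 seconds */
    return 0;
}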

BobHoward
Feb 13, 2012


JawnV6 posted:

check this slide out

im stunned someone put a bloom filter into HW

that’s nuts

Deep Dish Fuckfest
Sep 6, 2006


DONT THREAD ON ME posted:

it's just a different product with different expectations. if my bosses had their product development schedule planned out for the next 10 years and there was an expectation to produce a product with few defects, I would probably plan a lot better and produce better results.

instead i need to write code that I can evolve quickly in response to frequent product changes. if you work on bullshit, it's expected that you write some code with bugs because it's way cheaper to write software with some bugs* (that you fix) than it is to write software with 0 bugs.

* as long as you're careful about catastrophic bugs

i dunno, i really don't know if i'd be able to do that much better. this past year i've basically been left to do whatever i want at work, and i can't say i've produced much that was as polished as i'd have liked. mind you that was mostly r&d work, but still. might just be i'm a terrible programmer all-around though

BobHoward posted:

in current $job i wrangle fpgas (and also c/python/perl/tcl/shell)

the way I would put it is that even when you don't have the pressure to deliver a 99.99% good design the first time it's tried for real (which I don't, because fpga), you are working with bad languages and utterly wretched debugging tools, and the modify-test-debug cycle is several orders of magnitude slower than software. if your design has to run at a decently high clock speed, you are going to be forced to do the equivalent of writing super low level c code where you pay close attention to how coding style affects the compiler's output, maybe even resorting to inline asm to get the best results. even then you may find yourself trying several full rewrites of a block to get performance where it needs to be

i've worked with bad languages and utterly wretched debugging tools on gpu, so i get that. and i will say that having to divine the exact structure of the 10 layers of cache and whatnot that exist in hardware has not, in general, made me a happy person

BobHoward posted:

and that’s fpga, which is kinda easy mode. cutting edge high perf gpu/cpu cores are insanely difficult and expensive to design

all that said you may be surprised to know that you folks up in the sky routinely and effortlessly do much more complex things than those of us in the deep mines. we don’t have high level abstractions and efficient debugging tools. we’re typically narrowly focused on small optimizations rather than big complicated ideas.

maybe this is the mistake i'm making, assuming that everyone who designs a chip is designing the next branch predictor or instruction pipeline in intel's flagship chip. or i just don't have any experience lower than assembly so hardware all seems like magic to me

Qtotonibudinibudet
Nov 7, 2011

Deep Dish Fuckfest posted:

i dunno, i really don't know if i'd be able to do that much better. this past year i've basically been left to do whatever i want at work, and i can't say i've produced much that was as polished as i'd have liked. mind you that was mostly r&d work, but still. might just be i'm a terrible programmer all-around though

r&d work probably should be like that. determining what's possible/exploring theory is very different from implementing robust systems. academic proof of concept code is famously bad and prone to fall over in anything other than its author's bespoke test environment because that can be worked out later if the concept is worth pursuing.

nobody produces truly polished, robust code when initially writing it. most of that polished robustness is going to come from incrementally fixing things that inevitably fail when end users try things that make sense, but that weren't accounted for in the initial write. any software engineer who believes they can account for all that poo poo ahead of time is deluding themselves. for all the crap that agile gets, it was invented for a reason.

if you want to get better at writing poo poo that's more robust to start, don't do r&d/exploratory/new functionality stuff. go back to the mountain of tech debt tickets you have for existing code, work through them, and try to think about why whatever failure occurred wasn't considered when the code was initially written. do that repeatedly rather than write more new code. go find your frontline support people and shadow them working through customer issues. stories like this poo poo are a trope not because engineering and management have some magical technical ability that support doesn't have, but because there's a smokescreen of PMs and engineering managers who either don't give a poo poo or are trying to present an image of poo poo working well to their higher-ups.

it's also okay to produce not-robust code: the aforementioned lovely academic poc code does serve a purpose; basic research is important. if that's what you enjoy and your employer allows you to do it, go for it.

Notorious b.s.d.
Jan 25, 2003


rjmccall posted:

we use a huge number of macs for ci bots and device testing and things like that, so we are basically in the same situation as any third party — including complaining about the price, since like any large company apple does do departmental accounting and budgeting, so dt does in fact have to buy mac pros

this is loving hilarious

for context, i was a contractor at apple many years ago and back then it was a solaris shop... and they seemed to have no trouble building things?

idk

Notorious b.s.d.
Jan 25, 2003


Zlodo posted:

the shoemaker always wears the worst shoes

in this tortured metaphor the shoemaker sells rags and broken needles in the shop window

Cybernetic Vermin
Apr 18, 2005

JawnV6 posted:

check this slide out

im stunned someone put a bloom filter into HW

once you realize that the hashing function is likely hugely ad-hoc, it's one of those designs that i think just evolves over time really

i.e. you start off with a table of k flags to track whether a given address (since this is transactional stuff i'd guess it is a cache for checking address reuse) could have been used before, indexing into it with the log(k) lowest bits. you find that some common software patterns make this inefficient (i.e. array strides of k are common), so in a later rev you change it to index by x^y (except likely not xor but some bespoke one-to-one mapping on {0,1}^log(k) which they grind out based on a collection of traces), where x and y are the log(k) lowest and next-to-lowest bits, and bam, you're not confused by those common strides anymore. unfortunately you are now confused by mixed strides and 1-offsets, so in the next rev you set the flag at both x^y and x^rot(y,1). then you have the bloom filter

one of the neat bits of bloom filters is that they really are extremely simple outside the hash specifics after all
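
roughly this, in c, with the mixing functions as pure stand-ins for whatever they ground out of their traces:

code:
/* sketch of that evolution, ending at the two-probe bloom filter.
 * the hash/mixing choices here are stand-ins, not anyone's real design */
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define K 256                        /* flag bits, power of two */
static uint8_t flags[K / 8];

static void set_bit(unsigned i) { flags[i / 8] |= (uint8_t)(1u << (i % 8)); }
static bool get_bit(unsigned i) { return flags[i / 8] & (1u << (i % 8)); }

/* rev 1 indexed by the low log2(K) bits (confused by strides of K),
 * rev 2 indexes by x^y, rev 3 adds a rotated second probe: bloom filter */
static unsigned probe1(uint64_t addr) {
    unsigned x = addr & (K - 1);
    unsigned y = (addr >> 8) & (K - 1);
    return x ^ y;
}
static unsigned probe2(uint64_t addr) {
    unsigned x = addr & (K - 1);
    unsigned y = (addr >> 8) & (K - 1);
    unsigned rot = ((y << 1) | (y >> 7)) & (K - 1);   /* rot(y,1) */
    return x ^ rot;
}

void mark_used(uint64_t addr) {
    set_bit(probe1(addr));
    set_bit(probe2(addr));
}

/* false positives possible, false negatives impossible: the safe
 * direction when you're checking for possible address reuse */
bool maybe_used(uint64_t addr) {
    return get_bit(probe1(addr)) && get_bit(probe2(addr));
}

void reset_epoch(void) { memset(flags, 0, sizeof flags); }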

Cybernetic Vermin fucked around with this message at 09:58 on Jan 5, 2019

Wheany
Mar 17, 2006


Xarn posted:

This is amazing.

And also incredibly sad.

at least they are dogfooding :shrug:

rjmccall
Sep 7, 2007


Notorious b.s.d. posted:

this is loving hilarious

for context, i was a contractor at apple many years ago and back then it was a solaris shop... and they seemed to have no trouble building things?

idk

we have a ton of non-mac hardware, it’s just that we also need to have mac hardware for various reasons

The MUMPSorceress
Jan 6, 2012

Notorious b.s.d. posted:

this is loving hilarious

for context, i was a contractor at apple many years ago and back then it was a solaris shop... and they seemed to have no trouble building things?

idk

ime they don't let contractors touch like 95% of the infrastructure.

eschaton
Mar 7, 2007


Notorious b.s.d. posted:

this is loving hilarious

for context, i was a contractor at apple many years ago and back then it was a solaris shop... and they seemed to have no trouble building things?

idk

it’s not like the company’s own software ran or built on Solaris

back in the mid-1990s some PowerPC stuff was built using xlc on a fleet of RS/6000s running AIX, but that was all handled semi-transparently (it looked just like building locally) and eventually someone wrote a shim to run XCOFF binaries like xlc directly under MPW

champagne posting
Apr 5, 2006


i thought most if not all the infrastructure was hosted in google cloud (because bad business decisions happen at every level)

feedmegin
Jul 30, 2008

eschaton posted:

it’s not like the company’s own software ran or built on Solaris

back in the mid-1990s some PowerPC stuff was built using xlc on a fleet of RS/6000s running AIX, but that was all handled semi-transparently (it looked just like building locally) and eventually someone wrote a shim to run XCOFF binaries like xlc directly under MPW

Not https://en.m.wikipedia.org/wiki/Apple_Network_Server (or before that A/UX) ?

feedmegin
Jul 30, 2008

BobHoward posted:

this is so true that, with modern transistor densities, when there’s a call for a state machine that’s complex but also needs hard real-time guarantees, chip designers often throw in a whole embedded CPU core and have a software engineer implement the state machine instead. every modern gpu or cpu or cellphone chip is sprinkled with dozens of cortex-M0 class microcontrollers. these are often not even touted in marketing materials since they typically cannot run user supplied code. they’re just testament to the difficulty of using hardware design techniques to solve complicated problems.

Truth, and also literally my previous job. Makes things much more flexible than pure hardware too, since your M0 can potentially talk to and be configured by the host driver. And an M0 is a tiny fraction of the die area of a modern chip, like a tenth of a square millimetre or something.

eschaton
Mar 7, 2007


I’m not saying nothing Apple ever did was built on AIX or Solaris or HP-UX

after all there wasn’t just the Apple Network Server, there was also the Macintosh Application Environment, there was bring-up of A/UX which was probably done on Sun-2 or Sun-3 or HP 9000/200 or 300, there were Lisa and Macintosh which I think were first cross-assembled and cross-compiled on VAX/VMS

plus of course NeXT WebObjects and EOF ran on Solaris and HP-UX and Windows NT/2000/XP in addition to OPENSTEP/Mach, and NEXTSTEP itself was initially brought up on Sun-3 too

but in general, what normal users think of as Apple's products has been self-hosting after bring-up

Carthag Tuek
Oct 15, 2005

u can't develop iOS on an iPhone :smug:

TOPS-420
Feb 13, 2012

the original mac wasn’t self hosting for a while either, you needed a lisa to develop for it

eschaton
Mar 7, 2007

tools like MacForth were available for third parties to develop with pretty quickly

Lisa was also used to host Mac development within Apple for a while though, it’s true

the first code was written using Apple II and III as VAX terminals though, from what I know

Notorious b.s.d.
Jan 25, 2003


eschaton posted:

tools like MacForth were available for third parties to develop with pretty quickly

Lisa was also used to host Mac development within Apple for a while though, it’s true

the first code was written using Apple II and III as VAX terminals though, from what I know

writing code on an apple ii sounds like hell with the 40 column display and fuzzy text

i guess now we know the real reason the apple iii existed

The_Franz
Aug 8, 2003

Notorious b.s.d. posted:

writing code on an apple ii sounds like hell with the 40 column display and fuzzy text

i guess now we know the real reason the apple iii existed

don't forget that lowercase wasn't a thing on the apple ii either

eschaton
Mar 7, 2007

the III was a good system that rolled up all the stuff people were adding to the II into one package: memory expansion, a real time clock, lowercase, 80 column text, double high res graphics, RGB video out, analog sound…

most of that was stuff you could mod your II+ for: lowercase kits were common and standard, as were 80 column cards, and software supported them

and of course most of the III improvements went into the IIe and IIc after the III flopped, retrofitted for the II architecture (they had a bit more of a clean slate with the III; 80 column text, double high res graphics, and the overall memory bank switching design are the biggies)
