guidoanselmi
Feb 6, 2008

I thought my ideas were so clear. I wanted to make an honest post. No lies whatsoever.

I don't really have time and probably shouldn't effort post, but this is a problem I've been dabbling in for the past several years in my spare time. Namely: with enough information about future demand in some market X (e.g. some consumer market), how might industry change to meet that demand, how would that affect labor demand, and how does that then feed back into the beginning of the equation?
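
To make the feedback part concrete, here's the dumbest possible toy version (all the coefficients are made up, it's just to show the loop I'm talking about):

```python
# Toy sketch, purely illustrative numbers: iterate the
# demand -> production -> labor -> income -> demand loop until it settles.

def simulate_feedback(forecast_demand, rounds=50, tol=1e-6):
    demand = forecast_demand
    for _ in range(rounds):
        output = 0.95 * demand      # industry scales to meet most of forecast demand
        labor = 0.6 * output        # labor demand roughly proportional to output
        income = 1.4 * labor        # wages paid out to that labor
        # spending out of that income feeds back into demand next round
        new_demand = 0.5 * forecast_demand + 0.5 * income
        if abs(new_demand - demand) < tol:
            break
        demand = new_demand
    return demand, output, labor

print(simulate_feedback(100.0))
```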

There are a lot of major problems in assessing the relevant parameter space and what exactly is being optimized on, not to mention the validity of the models and predictions. There are major issues with dynamics and chaotic behavior that emerge if you want to model things with higher dimensionality, so you have to flatten the predictive space. There are various ways of doing that intelligently, and computation is definitely good enough for the sort of machine learning needed to flatten the spaces across some discrete boundaries. Either way, it takes time to validate these models.
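
By "flatten" I basically mean something like this (toy PCA on fake data, standing in for whatever learned/nonlinear reduction you'd actually use):

```python
# Rough sketch of "flattening": project a high-dimensional feature space
# down to a few components via PCA (plain numpy SVD). Data is random noise,
# just to show the mechanics.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 200))   # 500 samples, 200 "dimensions" (fake data)
X = X - X.mean(axis=0)            # center before decomposing
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 3                             # keep only a handful of free directions
X_flat = X @ Vt[:k].T             # the "flattened" predictive space, shape (500, 3)
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(X_flat.shape, f"variance kept: {explained:.2%}")
```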

That's all a little esoteric, but a fun example is predicting romantic relationships. You can develop an essentially non-dynamic model with free parameters to predict relationship outcomes and validate it for some small-N cases, which sets those free parameters. The problem is that the small N isn't wholly descriptive of a larger population: American relationships and Brazilian relationships may, for whatever reason, obey the same theory but with different parameters to account for cultural locality. You can learn from two different data sets, if the data exists, and optimize on that to get different sets of free parameters. That's a discrete boundary, which helps in a lot of ways, but then we need continuous functionals for the free parameters if we're going to weight learned romantic outcomes on personality traits that are continuous (e.g. extraversion, open-mindedness). That's a lot trickier mathematically, but tractable with enough data. The issue in general is that you actually have hundreds of dimensions that you're flattening down to several free parameters/functionals, and the framework to define these best is almost impossible to get a priori. I imagine this is the sort of thing Palantir is good at.
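
If that's too abstract, here's roughly the shape of it in code, with totally synthetic data and placeholder names, just to show the discrete-boundary parameters next to the continuous trait weighting:

```python
# Hedged sketch, not a real relationship model: one outcome model where some
# parameters differ by country (the discrete boundary) and others scale with
# a continuous personality trait (a crude stand-in for a functional).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
country = rng.integers(0, 2, n)        # 0 = "US", 1 = "Brazil" (discrete boundary)
extraversion = rng.normal(size=n)      # continuous personality trait
compatibility = rng.normal(size=n)     # some base predictor

# Synthetic "truth": same theory everywhere, different parameter per country,
# plus an effect that varies continuously with extraversion.
logit = (1.0 + 0.5 * country) * compatibility + 0.8 * extraversion * compatibility
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Features: base effect, country-specific shift, extraversion-weighted effect.
X = np.column_stack([compatibility,
                     country * compatibility,
                     extraversion * compatibility])
model = LogisticRegression().fit(X, y)
print(model.coef_)   # should land roughly near [1.0, 0.5, 0.8]
```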

So from that we have a lot of free parameters and hundreds of dimensions - moving to economic planning is a major leap given the complexity. But even if you have that utopian solution, reconciling it with existing economic and cultural systems - let alone existing power structures - is the hardest part...

spoon0042 posted:

With computers now...


And because no one has posted this:

Red Plenty is a fun read and Spufford has really entertaining prose. There are a few papers on the actual feasibility of making these calculations:

http://users.cms.caltech.edu/~adamw/papers/letter_sigexc.pdf
http://crookedtimber.org/2012/05/30/in-soviet-union-optimization-problem-solves-you/

There are a bunch more papers I can't seem to find. This has various essays and narratives: http://crookedtimber.org/wp-content/uploads/2012/07/RedPlenty.pdf
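
For a sense of what the "calculation" in those links actually is, here's a two-good toy of the planning problem as a linear program (made-up numbers; the real version has millions of goods and constraints, which is basically what the feasibility argument is about):

```python
# Tiny planning LP: pick outputs of two goods to maximize a plan value
# subject to resource limits. All numbers are invented for illustration.
from scipy.optimize import linprog

# maximize 3*x1 + 2*x2  ->  linprog minimizes, so negate the objective
c = [-3.0, -2.0]
A_ub = [[1.0, 1.0],    # steel:  x1 + x2   <= 100
        [2.0, 1.0],    # labor:  2*x1 + x2 <= 150
        [0.0, 1.0]]    # grain:  x2        <= 60
b_ub = [100.0, 150.0, 60.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimal output mix and plan value
```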

