|
OneEightHundred posted:Probably more of a math question than a programming question: I'm trying to implement some BigNumber-type boilerplate and I know that it's possible to do long division bit-by-bit using compares, but is there a way to do it with fewer iterations by using integer math? Particularly if the divisor is larger than you can stuff in a register, like say you had an arbitrary 5-byte number and needed to divide it by a 3-byte number, is it possible to do that with only 8 and 16 bit math? If you have a fast divisor, you can basically do base-2^wordsize long division. Book description. code. Scaevolus fucked around with this message at 23:21 on Feb 17, 2013 |
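The base-2^wordsize long division mentioned in the reply can be sketched for the easy case where the divisor fits in a single word. This is an illustrative Python sketch (the names and the 16-bit "word" are made up for the example, not from the post): each step divides a two-word quantity by a one-word divisor, which is exactly the primitive most CPUs provide.

```python
# Base-2**WORD long division of a multiword number by a single-word
# divisor. Digits are stored most-significant first.
WORD = 16  # pretend our "registers" are 16-bit
BASE = 1 << WORD

def divmod_by_word(digits, d):
    """Divide a big number (list of base-2**WORD digits, MSB first)
    by a single word d. Returns (quotient_digits, remainder)."""
    assert 0 < d < BASE
    quotient = []
    rem = 0
    for digit in digits:
        # rem < d <= BASE - 1, so acc always fits in two words
        acc = rem * BASE + digit
        quotient.append(acc // d)
        rem = acc % d
    return quotient, rem

# 0x12345678 in base 2**16 is [0x1234, 0x5678]
q, r = divmod_by_word([0x1234, 0x5678], 7)
```

When the divisor itself spans multiple words, this per-digit divide is no longer enough and you need the quotient-estimation trick discussed in the follow-up post.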
# ? Feb 17, 2013 23:12 |
|
|
|
I chased down some incorrect assumptions and it turns out that for this usage, dividing the top 2 digits of the current remainder by the top digit of the divisor yields the correct factor 74% of the time, yields 1 + the correct factor 24% of the time, and the remaining 2% I just fell back to a lovely binary search that tends to find the result after searching a quarter of the bit space on average. I guess that works.
OneEightHundred fucked around with this message at 02:11 on Feb 18, 2013 |
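The estimate-then-correct behavior described above can be reproduced with a small experiment. This is an illustrative sketch (not the poster's code; 8-bit "digits" and all names are assumptions): one long-division step guesses the quotient factor from the top two digits of the remainder and the top digit of the divisor. That estimate provably never undershoots, and Knuth's Algorithm D (TAOCP vol. 2, §4.3.1) shows that normalizing the divisor so its top digit is at least half the base bounds the overshoot by 2, which would remove the need for the binary-search fallback.

```python
import random

BASE = 1 << 8  # pretend 8-bit "digits"

random.seed(1)
off_by = {}  # histogram of (estimate - true factor)
for _ in range(10_000):
    v = random.randrange(BASE ** 2, BASE ** 3)   # 3-digit divisor
    q_true = random.randrange(BASE)              # the factor we want back
    u = q_true * v + random.randrange(v)         # remainder for this step
    u_top2 = u // BASE ** 2                      # top two digits of 4-digit u
    v_top = v // BASE ** 2                       # top digit of v
    q_hat = min(u_top2 // v_top, BASE - 1)       # the cheap estimate
    delta = q_hat - q_true
    off_by[delta] = off_by.get(delta, 0) + 1
```

Running this, the estimate is exact in the large majority of trials and never negative, so correcting downward from the estimate terminates quickly in the common case, matching the percentages reported above.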
# ? Feb 18, 2013 02:08 |
|
Where does one find out about conventions/events? Especially ones that involve the chance of free stuff?
|
# ? Feb 18, 2013 06:00 |
|
Is this the right place to ask CUDA questions? I've got nvcc installed on linux but it's not compiling c++ code for some reason. I had to install gcc 4.4 and g++ 4.4 and copy them to the nvcc directory to get nvcc to do anything at all, but now when I try to compile c++ it's giving me some poo poo about "fatal error: iostream: No such file or directory" when I include iostream (durrr). Do I have to gently caress with nvcc.profile? My google-fu has been exhausted. I just want to be able to do me some hello_world_cuda.cpp. edit: gently caress me this thing is available in the repositories? I'll give that a shot. edit: Lesson learned! Check if the repositories have it first. Tres Burritos fucked around with this message at 07:37 on Feb 18, 2013 |
# ? Feb 18, 2013 07:29 |
|
I'm currently in an Operating Systems class. Half of it is pretty easy, but the other half is becoming a nightmare. We have to write programs on an Apple 2E using 6502 Assembly, except we aren't given any instructions on how to write in 6502. Our book doesn't cover it, and scouring the internet only turns up guides that assume I've already written in 6502, ask me to load programs that I don't have (because they are just copies of the instruction manual that came with the Apple 2E), or are using a Commodore 64. So, where can I find a complete idiot's guide to writing in 6502 Assembly - one that understands I am emulating the Apple 2E and that I have never touched Assembly in my life?
|
# ? Feb 19, 2013 21:50 |
|
This is probably a good place to start as it covers the basics with some nice examples: http://skilldrick.github.com/easy6502/
|
# ? Feb 19, 2013 21:56 |
|
Doing my first contract job. In general, for contract work, is it unheard of to be able to use your own hardware and software (barring necessities), or is it common? I ask because I've thought of it at my old full-time job, and it seems especially relevant now: it would be nice to have my own beefy laptop, not have to deal with lovely work-supplied laptops (like my current one), not have to redownload and reimport all my settings, and just connect to the network, access the svn, etc. from my own machine.
|
# ? Feb 20, 2013 00:32 |
|
Ah, 6502, what a beautifully simple CPU.
|
# ? Feb 20, 2013 01:07 |
|
Mr. Crow posted:Doing my first contract job, in general for contract work, is it unheard of for you to be able to use your own hardware and software (barring necessities) for the work or is it something common?
|
# ? Feb 20, 2013 02:48 |
|
Plorkyeran posted:There's some exceptions, but in general the only time you don't use your own equipment as a contractor is if they're basically treating you as a FTE that's cheaper and easier to get rid of. This is good news! But for my wallet...
|
# ? Feb 20, 2013 14:28 |
|
Plorkyeran posted:There's some exceptions, but in general the only time you don't use your own equipment as a contractor is if they're basically treating you as a FTE that's cheaper and easier to get rid of. Mr. Crow posted:But for my wallet... If not, then The Gripper fucked around with this message at 15:01 on Feb 20, 2013 |
# ? Feb 20, 2013 14:59 |
|
Mostly I say because I've been wanting to get a Macbook for a while now to use as my 'coding/everything' machine and free up my desktop for ~video games~. Considering the cost of entry for a Macbook... Need to go to SH/SC and figure out the most cost-effective way to get one, whether to buy the cheapest and upgrade out-of-pocket (presumably voiding any warranty or AppleCare) or whatever.
|
# ? Feb 20, 2013 15:26 |
|
I'm working on a little project modding Skyrim and I'd like to make a pretty picture of the leveled lists. I already have stuff set up to get at all the data, I'm just not sure how to visualize it. The lists themselves are pretty straightforward: a series of entries which are either an item or another leveled list, a level to use that entry at, and a number for how many of that entry will appear. They end up being like a bunch of overlapping trees. I'm pretty sure I've seen interactive visualizers that let you zoom in and highlight which nodes are connected where, but for the life of me I can't find a tool to set up one of my own. Anyone know of such a thing?
|
# ? Feb 20, 2013 17:55 |
|
LtSmash posted:I'm working on a little project modding Skyrim and I'd like to make a pretty picture of the leveled lists. I already have stuff set up to get at all the data, I'm just not sure how to visualize it. The lists themselves are pretty straightforward: a series of entries which are either an item or another leveled list, a level to use that entry at, and a number for how many of that entry will appear. They end up being like a bunch of overlapping trees. I'm pretty sure I've seen interactive visualizers that let you zoom in and highlight which nodes are connected where, but for the life of me I can't find a tool to set up one of my own. Anyone know of such a thing? What exactly are you trying to do? Write a tool to edit .esps? TES5Edit does that. As to your specific question, what language are you working with? What's wrong with just a basic tree view? Maybe the Composite Pattern will help?
|
# ? Feb 20, 2013 18:03 |
|
Mr. Crow posted:What exactly are you trying to do? Write a tool to edit .esps? TES5Edit does that. I have a Skyproc patcher, written in Java, that takes modded weapons and armor, enchants them, and inserts them into the leveled lists, but I can easily dump the data for something else or a standalone program. I'd like a tool or library that makes pretty pictures of the leveled lists so I/other players can easily see visually where poo poo ends up and which 'top level' lists changed. I'd like to show the overlap between trees since some of the lists are reused a lot. I'm like 6 years out of date with this kind of stuff so I suspect there is an easy answer and I just suck at googling it.
|
# ? Feb 20, 2013 20:00 |
|
Sounds like the leveled lists constitute a DAG (I'm assuming the 'acyclic' part). It would probably be pretty easy to dump this information into a format that graphviz understands. You could model the edge direction as "includes", e.g. leveled list A includes leveled list B and item C, so A would have edges to B and C. If you invert the edge directions, this would mean "is affected by modifications", and you can find all root leveled lists that are affected by a certain modification with BFS or DFS.
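The graphviz dump and inverted-edge search suggested above can be sketched in a few lines of Python. The list names and the `includes` structure here are made up for illustration, not taken from the actual game data:

```python
from collections import deque

# Hypothetical inclusion graph: leveled list -> things it includes.
includes = {
    "LItemWeaponAny": ["LItemWeaponSword", "LItemWeaponBow"],
    "LItemWeaponSword": ["IronSword", "SteelSword"],
    "LItemWeaponBow": ["HuntingBow", "IronSword"],  # lists can share items
}

def to_dot(edges):
    """Render the inclusion graph in DOT format for graphviz's `dot`."""
    lines = ["digraph leveled_lists {"]
    for src, dsts in edges.items():
        for dst in dsts:
            lines.append(f'    "{src}" -> "{dst}";')
    lines.append("}")
    return "\n".join(lines)

def affected_by(edges, changed):
    """BFS over inverted edges: every list that (transitively) includes
    `changed`, i.e. everything a modification to `changed` ripples up to."""
    inverted = {}
    for src, dsts in edges.items():
        for dst in dsts:
            inverted.setdefault(dst, []).append(src)
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for parent in inverted.get(node, []):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen

# Modifying IronSword affects both sub-lists and the top-level list:
print(sorted(affected_by(includes, "IronSword")))
```

Feeding `to_dot(...)` output to `dot -Tsvg` gives a zoomable picture; interactive highlighting would need something on top, but the DOT file is the easy first step.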
|
# ? Feb 20, 2013 20:36 |
|
Lysidas posted:Sounds like the leveled lists constitute a DAG (I'm assuming the 'acyclic' part). It would probably be pretty easy to dump this information into a format that graphviz understands. That looks like it should do what I want. Thanks.
|
# ? Feb 20, 2013 21:49 |
|
Related to my previous question but specific to VS: besides the solution file, are there issues that would prevent me from using VS2012 with, say, VS2008 projects? Basically, what's the backwards compatibility like? And I guess in general, do you think it will be an issue (obviously there will be one-offs where a company is using some old and obscure software I'll need to use) if I'm using my own machine, just keeping the latest and greatest on there?
|
# ? Feb 21, 2013 16:24 |
|
Mr. Crow posted:Related to my previous question but specific to VS, besides the solution file, are there issues that would prevent me from using VS2012 with say VS2008? Basically what's the backwards compatibility like? And I guess in general do you think it will be an issue (obviously there will be one-offs where a company is using some old and obscure software I'll need to use) if I'm using my own machine, just keeping the latest and greatest on there? Not a problem at all. The only place it will make a difference is if you are doing stuff with SQL Server (Integration Services, Reporting Services, etc.) that uses the Visual Studio shell to host Business Intelligence Development Studio; there you'll have to keep the older Visual Studio or re-install the BIDS client.
|
# ? Feb 21, 2013 16:53 |
|
Now that the PS4 is out and the new Xbox is following shortly, how much time/money, generally speaking, will it take established game studios to port their existing engines to 64bit? Anyone in the industry here know if some game devs will just leave their first round of games at 32bit to meet deadlines and budget, forgoing the additional memory benefits of the new consoles?
|
# ? Feb 21, 2013 17:26 |
|
I'm pretty sure the consoles don't have 32-bit compatibility modes (since that adds a poo poo ton of work to writing the OS, to maintain compatibility with software that hasn't been written yet), so the developers don't have a choice on that front. It doesn't matter though, because both the PS3 and the Xbox360 were 64-bit, so the new consoles don't represent any change in that regard.
|
# ? Feb 21, 2013 17:40 |
|
Zhentar posted:I'm pretty sure the consoles don't have 32-bit compatibility modes (since that adds a poo poo ton of work to writing the OS, to maintain compatibility with software that hasn't been written yet), so the developers don't have a choice on that front. The lack of compatibility modes would be interesting. Also, I think the PS3/Xbox CPUs were 64bit/32bit capable but I'm not sure if the actual dev tools supported 64bit builds. Besides, tons of devs were porting their games to PC and AFAIK there are less than a handful of 64bit PC games and none of them are really all the popular.
|
# ? Feb 21, 2013 17:43 |
|
Shaocaholica posted:Now that the PS4 is out and new xbox following shortly, how much time/money, generally speaking, will it take established game studios to port their existing engines to 64bit? Mr. Crow posted:Related to my previous question but specific to VS, besides the solution file, are there issues that would prevent me from using VS2012 with say VS2008? Basically what's the backwards compatibility like? And I guess in general do you think it will be an issue (obviously there will be one-offs where a company is using some old and obscure software I'll need to use) if I'm using my own machine, just keeping the latest and greatest on there?
|
# ? Feb 21, 2013 17:44 |
|
uh hopping ISAs is so much bigger than 32/64 i'm not sure why the latter merits discussion
|
# ? Feb 21, 2013 17:51 |
|
Install Visual Studio versions in release order or you might run into goofy installer issues. Otherwise I've had no problems with having 2008, 2010 and 2012 installed and switching between them regularly.
|
# ? Feb 21, 2013 18:34 |
|
Shaocaholica posted:The lack of compatibility modes would be interesting. IIRC, while the PS3 and 360 use the 64-bit PPC instruction set, they only use 32-bit pointers, so you can still make most of the same assumptions that will break 64-bit PC builds.
|
# ? Feb 21, 2013 20:06 |
|
Shaocaholica posted:Besides, tons of devs were porting their games to PC and AFAIK there are less than a handful of 64bit PC games That's because there's not much value to it, not because it's too hard. You can't functionally depend on that much memory without cutting off 2/3rds of the market, so you're limited to doing things like extra caching (much of which the OS will do with unused RAM anyway) for small performance gains on top end systems, at the cost of added testing & support costs. It is likely that we'll start seeing more 64-bit PC games with the introduction of these new consoles, but only because it increases the value of designing something that actually takes advantage of that much memory, not because it's going to force them through some supposedly costly transition.
|
# ? Feb 21, 2013 20:20 |
|
Zhentar posted:That's because there's not much value to it, not because it's too hard. You can't functionally depend on that much memory without cutting off 2/3rds of the market, so you're limited to doing things like extra caching (much of which the OS will do with unused RAM anyway) for small performance gains on top end systems, at the cost of added testing & support costs. Targeting x86-64 gives you some benefits over x86 beyond a larger address space: you have twice as many general-purpose registers, the calling convention allows you to pass more parameters via registers, etc.
|
# ? Feb 21, 2013 20:43 |
|
But at the same time, the processor is crammed full of features to make those things go fast, whether you're using x86-64 or not. And again, if it actually needs those to run fast enough, then you're cutting off 1/3rd of the target market. edit: and of course, there are non-trivial costs associated with making all of your pointers twice as big; you need more memory and get less cache for the same data. Zhentar fucked around with this message at 21:16 on Feb 21, 2013 |
# ? Feb 21, 2013 21:09 |
|
Ironically, 64-bit adoption might be slowed by the fact that streaming in data is becoming an increasingly common practice and once you have a game that supports that at all, a large working set becomes a lot less advantageous. The only thing that's really going to force it for PC is Microsoft switching to only 64-bit versions of their OSes, which I'm kind of surprised that they haven't done already.
OneEightHundred fucked around with this message at 21:26 on Feb 21, 2013 |
# ? Feb 21, 2013 21:23 |
|
You are? They only just took out 16-bit mode from Vista. 32-bit Windows is going to be here to stay for a long time to come.
|
# ? Feb 21, 2013 21:31 |
|
Zhentar posted:You are? They only just took out 16-bit mode from Vista. 32-bit Windows is going to be here to stay for a long time coming. Not shipping a 32-bit OS doesn't mean removing the ability to run 32-bit applications. Just knowing that everyone running {windows released in the past X years} is capable of running a 64-bit app (instead of some of them not being able to, because whoever set up their system installed 32 bit Windows because they're a goddamned idiot) would do a lot to help 64-bit adoption.
|
# ? Feb 21, 2013 21:54 |
Or they could maybe just ship both 32 and 64 bit binaries and let the 64 bit ones support higher quality rendering or whatever. "Game X requires a 64 bit OS to provide the full experience."
|
|
# ? Feb 21, 2013 22:09 |
|
I imagine we'll start seeing exactly that once the console market justifies investing the time in developing the higher quality version, and a 32-bit Windows executable requires a dumbed-down version of what the console gets.
|
# ? Feb 21, 2013 22:19 |
|
Crytek and DICE (and probably others I'm not thinking of) seem to do OK on PC despite cutting out a big chunk of the market with their requirements. I don't think a AAA title that's 64-bit only is much of a stretch at all given how often ]-[ARDCORE PC GAMERs update their systems and that most people who have stuck with PC games are used to having to upgrade every couple years, anyway. The only problem I could see would be a few holdouts who only use 32-bit Windows in order to try to eke out a couple more frames by saving bus bandwidth or something, and they're going to be a tiny minority.
|
# ? Feb 21, 2013 22:41 |
|
OneEightHundred posted:Ironically, 64-bit adoption might be slowed by the fact that streaming in data is becoming an increasingly common practice and once you have a game that supports that at all, a large working set becomes a lot less advantageous. The only thing that's really going to force it for PC is Microsoft switching to only 64-bit versions of their OSes, which I'm kind of surprised that they haven't done already. I don't really think that's the case. With 512mb and streaming, you can only show '512mb' of the world at any given time. Closer to 300mb once you count all the needed data that isn't actually going to be turned into pixels. With 8gb and streaming, you now have that much more space to store a much denser, more detailed world. I think the bigger problem will be that if the assets of next-gen games get heavier since there is so much more memory, streaming them will also be slower, since optical and hard-drive-based interfaces haven't really gotten much faster since last gen. SSDs might help a lot but that's not part of the next-gen spec, so I wouldn't expect devs to target that kind of config even if people put them in themselves.
|
# ? Feb 21, 2013 22:46 |
|
Jabor posted:Not shipping a 32-bit OS doesn't mean removing the ability to run 32-bit applications. Microsoft loves this kind of inscrutably fine-grained product differentiation across their entire line. I could spin theories about intentionally (1) forcing OEMs/end-consumers to buy multiple licenses or (2) bloating the ecosystem by making any sort of Windows-based purchase or deployment more complex, but it's probably just because there are literally thousands of project and product managers in charge of Windows and it is completely impossible to get that kind of herd to agree that something is a bad idea, especially when that kind of pointless differentiation keeps several hundred of them employed.
|
# ? Feb 21, 2013 22:57 |
|
rjmccall posted:...it is completely impossible to get that kind of herd to agree that something is a bad idea, especially when that kind of pointless differentiation keeps several hundred of them employed. This explains a lot about the software industry.
|
# ? Feb 21, 2013 23:00 |
|
rjmccall posted:Microsoft loves this kind of inscrutably fine-grained product differentiation across their entire line. I could spin theories about intentionally (1) forcing OEMs/end-consumers to buy multiple licenses or (2) bloating the ecosystem by making any sort of Windows-based purchase or deployment more complex, but it's probably just because there are literally thousands of project and product managers in charge of Windows and it is completely impossible to get that kind of herd to agree that something is a bad idea, especially when that kind of pointless differentiation keeps several hundred of them employed. I always assumed it was born out of a determination to support old but correctly written software for as long as possible in order to keep customers coming back. If you're an IT manager and have to either A) order N Windows licenses at $W and spend H hours of employee time re-installing that old software at $0 or B) find new software (costing $?) that does the same thing on any platform (costing $?) and re-train your people to support that (H*?), you're probably going to pick the simpler one, A, because you know the values to insert in those variables ahead of time, so there's less risk and less to think about. It also upsets users less and probably saves other departments money that would be spent re-training people. I think this situation largely goes away with open source because you can spend a little money to make your open source software work on the new systems if it breaks, but I might be misunderstanding the situation there.
|
# ? Feb 21, 2013 23:24 |
|
|
|
It's really more of a determination to support old, shoddily-written software that just so happened to work most of the time.
|
# ? Feb 21, 2013 23:39 |