minidracula
Dec 22, 2007

boo woo boo

MrMoo posted:

I think Martin has burnt out and pretty much given up on both crossroads.io and nanomsg. Ømq is still plodding along.
Yeah, between the three of these, 0MQ seems to be the thing that, y'know, works.

I didn't really understand the Crossroads I/O thing except for "developer/project ownership drama ego blowup". I'm sure I'm oversimplifying. I didn't even know about nanomsg until tef mentioned it.


minidracula
Dec 22, 2007

boo woo boo
Post the metaprogrammiest code you got.

minidracula
Dec 22, 2007

boo woo boo

yaoi prophet posted:

what keyboard should i get yospos, are kinesises actually that much better than flat keyboards
Model M or go home.

minidracula
Dec 22, 2007

boo woo boo
This makes my experiences with Model 204 look like luxury in comparison.

Where's AD to regale us with MUMPS horrors?

minidracula
Dec 22, 2007

boo woo boo

Nomnom Cookie posted:

this is some bullshit. but maybe they'll come up with something worth stealing for a good language
Since it's Rustchat Q&A, can anyone tell me why they dropped typestates?

I don't really care, as I don't use Rust ATM, but I am curious.

Also seconding whoever it was that pleaded for a working Windows build/installer for all the right reasons (ITT or wherever I read it).

minidracula
Dec 22, 2007

boo woo boo

JewKiller 3000 posted:

i'm coming from the perspective that (cons 1 2) has type int list. if you're not then sorry about your broken mental model.

also i think it should be more like cons 1 [2] but i'm an ML programmer so
Hey what's up ML buddy. I like ML lots too, even though I'm still a die-hard Lisp slob; I guess I got religion late in life (PS: I didn't get much religion though). You can ease up on the bad Lisp programming hate though; I'm pretty sure no-one ITT actually writes Lisp that way.

(For realz though: for the longest time I never thought I'd be one of those people who cared real hard about type systems, and I wasn't; then one day I looked into my heart and found Pierce and it's been nothing but the Good News since).

minidracula
Dec 22, 2007

boo woo boo

JewKiller 3000 posted:

anyway, lisp is a neat idea that has influenced a lot of important poo poo, but without static types a la typed racket, it's a toy. also the main argument for lisp is macros, and those are almost impossible to write correctly. trace through macroexpand until you want to kill yourself. or just use a language with syntax
I'd just like to say for the record that you can have my macroexpand-1 (and unwind-protect (and call/cc in Scheme)) when you pry it from my cold dead hands. :colbert:

minidracula
Dec 22, 2007

boo woo boo

JewKiller 3000 posted:

if you haven't already, buy both of pierce's books. he's a loving genius, i can't even. i'm totally not on the level of people who care Real Hard, because those people have moved on to coq and agda. but ML is a sweet spot for me. it lets me express the types i want without having to prove any theorems

i also like robert harper at cmu, he makes some excellent blog posts under the name "existential type" that advocate for the ML way of thinking
Yeah, I have both of Pierce's tomes. And I've moved on to Coq (but not Agda yet... still moves too fast for me, too experimental, Haskell still hasn't owned me body and soul yet, and no tactics yet).

ML is still the shiz though. Real talk: Standard ML or OCaml? I kinda swing both ways, but I'm still relatively green with OCaml compared to SML (unless you count F#, which I sorta do... but still). I've stopped fighting against the OCaml tide a bit, but I haven't given up on SML either.

Bob Harper is the man, have you seen his new book? It's pretty boss. You can still get a "draft" PDF for free off his site. I giggled the other day when I noticed this "draft" PDF had been revised this June with errata and updates post-release.

minidracula
Dec 22, 2007

boo woo boo

JewKiller 3000 posted:

i am not qualified to argue about call/cc but oleg kiselyov certainly is: http://okmij.org/ftp/continuations/against-callcc.html
Yeah, Oleg is one of my icons; I aspire to be like Oleg someday. His arguments against call/cc are compelling. I was just rereading his article against call/cc as a primitive the other day, but I'm not entirely sold yet.

I do like his delimited continuations implementation for OCaml though, have been tinkering and futzing with that some.

Oleg isn't the first to argue against call/cc as a primitive; Felleisen has raised the issue in the past before as well.

minidracula
Dec 22, 2007

boo woo boo

unixbeard posted:

how many people here use ocaml professionally and dont work at jane st
I'm using it at work, and I don't work at Jane St. Does that count?

I dunno if I wanna say anything about "professionally" in YOSPOS...

minidracula
Dec 22, 2007

boo woo boo

unixbeard posted:

It seems like such a niche language, I have used it a bit but the only place I know that does anything serious with it was them. What sort of stuff do you use it for?
I am using it (currently) to build what's essentially a front-end to a compiler (kinda all I want to say right at the moment). I may not stick with it, but right now there's no problem. We'll see how it goes.

I should mention that although I am using OCaml (for this particular thing), I'm pretty sure no one else here is. You could safely dump this in the "research project" bucket at the moment. Though, that would apply to this entire effort, not just the use of OCaml.

minidracula
Dec 22, 2007

boo woo boo

hackbunny posted:

how does it feel to be so fuckin kewl
Feels good man.

minidracula
Dec 22, 2007

boo woo boo
OK, I skimmed through most of the HTTP chat for the past few pages until I got to this one and just had to post, furiously swiping my thumb up the screen of my phone, so please excuse even less reading than usual, but:

1. I think lots of things should use BEEP.

2. Use a real man's protocol: PCIe Gen 3.0 x16. DMA bitch.

If and/or how serious I am about either of these is left as an exercise to the reader.

minidracula
Dec 22, 2007

boo woo boo
Actual PL inquiry: has anyone here used Julia for anything (on Windows)? Does it run/install?

I'm hoping it's a better situation than trying the current stable Rust builds on Windows but I haven't tried yet.

Also, people interested in (simple, 2D) games programming should check out and play around with strlen's Lobster.

minidracula
Dec 22, 2007

boo woo boo

gucci void main posted:

imo just write your code so perfectly that there will never be an exception
I'm convinced there's a joke to be made involving exceptionalism and "check your privilege" applied to code, but I don't know what it is.

minidracula
Dec 22, 2007

boo woo boo

Notorious b.s.d. posted:

we had this discussion earlier, like 100 pages ago. i'm just gonna give the main points here
  • json schemas are non-standard.

  • json itself is nonstandard, parsers vary in strictness

  • interoperability matters. every language/platform has a parser generator library but i don't want to port my fuckin grammar to every platform's parser generator and hope i didn't create subtle bugs

serialisation and deserialisation is one of those things people like to pretend is easy
but it's not
I guess I'll go ahead and agree with all the above, but I still ended up specifying a dumbed-down subset of JSON for an interop log file thing for some hardware, because it was less like pulling teeth to do that than to end up with something I felt would be "worse" (i.e. some sort of CSV thing) for my needs. XML would have likely made my known and unknown counterparts freak out even more.

It's bad in at least a few ways (before we even get to potential JSON-badness), but it's what seemed like a compromise between what I wanted and what wouldn't unduly explode brains elsewhere. I already want to change a few things about it, but I'm not sure I'll be able to. Here's hoping. I'll find out soon enough if I can make it slightly less bad.
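A dumbed-down subset like that can be enforced mechanically. Here's a hedged Python sketch; the specific restrictions shown (top-level NaN/Infinity rejected, no floats, ASCII-only strings) are invented for illustration, not the actual subset from the post:

```python
import json

def check_subset(text):
    """Parse `text` and reject anything outside a hypothetical JSON subset."""
    data = json.loads(text, parse_constant=_reject_constant)
    _check_value(data)
    return data

def _reject_constant(name):
    # json.loads calls this only for NaN / Infinity / -Infinity tokens.
    raise ValueError("NaN/Infinity not allowed: %s" % name)

def _check_value(v):
    if isinstance(v, dict):
        for k, val in v.items():
            _check_string(k)
            _check_value(val)
    elif isinstance(v, list):
        for item in v:
            _check_value(item)
    elif isinstance(v, str):
        _check_string(v)
    elif isinstance(v, bool) or v is None or isinstance(v, int):
        pass  # allowed scalars in this made-up subset
    elif isinstance(v, float):
        raise ValueError("floats not allowed in this subset")
    else:
        raise ValueError("unsupported type: %r" % type(v))

def _check_string(s):
    # Hypothetical restriction: interop partners only guaranteed ASCII.
    if not all(ord(c) < 128 for c in s):
        raise ValueError("non-ASCII string: %r" % s)
```

The point is just that "subset of JSON" can mean a parser plus a small validation pass, rather than a whole new format.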

minidracula
Dec 22, 2007

boo woo boo

Otto Skorzeny posted:

bought an impreza outback yesterday, good tymes
I didn't know you moved to Seattle?!

minidracula
Dec 22, 2007

boo woo boo

Internaut! posted:

I have no doubt an abstract, first principles, category-theoretic approach to software development has its advantages

I'm saying the effort:reward ratio is way off for the vast majority of programming needs, so I wouldn't sweat "not getting it"

I mean I work for a capital markets group that earned our bank ~$400m profits last year so we have money to spend and the will to spend it on systems and talent, we've rewritten and continue to rewrite our entire trading core in functional scala/clojure for maximum safety/reliability/performance, and I would say the code/docs are pretty readable to the average senior dev

we might have improved the system a percentage point or two by doing it the author's way but the disadvantages are that no one but math PhDs could write the code, no one but those same math PhDs could read or modify the code, and the number one issue is that code written by math PhDs is almost always terrible
On the other hand, IBM locked up the batch processing and mainframe market (ignoring their more recent Johnny-come-lately business endeavors) in the 1950s and never looked back, and has made substantially more than $400 M/yr with much less investment in talent and systems off of just ongoing license fees for just that part of their global empire, so I guess your time spent learning whatever programming languages you know, and whatever techniques you have developed, and to whatever skill level you currently have was also wasted, needless effort?

I mean, you could have stuck with any one of System/360 assembly, Fortran, PL/I, or COBOL and 1960s coding styles and been fine if you define value in terms of how well the company executed in its financial goals based on programming and systems technology...

minidracula
Dec 22, 2007

boo woo boo

darthbob88 posted:

Has this article been posted here? Because it belongs here. https://www.usenix.org/system/files/1309_14-17_mickens.pdf

Marvellous.
Eh, I feel like it tries too hard for YOSPOS. Needs more drugs, Shaggar, and/or Time Cube.

I feel like this article might capture the authentic experience of what happens to you inside the Distributed Systems group of Microsoft Research though.

Also:

quote:

Excellence. Quality. Science. These are just a few of the words that have been applied to the illustrious research career of James Mickens. In the span of a few years, James Mickens has made deep, fundamental, and amazing contributions to various areas of computer science and life. Widely acknowledged as one of the greatest scholars of his generation, James Mickens ran out of storage space for his awards in 1992, and he subsequently purchased a large cave to act as a warehouse/fortress from which he can defend himself during the inevitable robot war that was prophesied by the documentary movie "The Matrix." In his spare time, James Mickens enjoys life, liberty, and the pursuit of happiness, often (but not always) in that order, and usually (almost always) while listening to Black Sabbath.

minidracula
Dec 22, 2007

boo woo boo

tef posted:

it's a crock of poo poo

... :words:

it's basically an entire community driven by CADT (ala jwz), each writing their own miniature version of ruby to solve all of the problems that the last one created
I still like Ruby, but I feel like my enjoyment of it is inversely proportional to my interaction with the bulk of the modern-day "Ruby community" and/or the bulk of modern-day software written in Ruby.

Almost certainly both, now that I think on it.

Anyway, for at least the past few years I've been writing more Python than Ruby anyway. So there's that, tef. :v:

minidracula
Dec 22, 2007

boo woo boo

tef posted:

p much every time i've written python i've defaulted to vendor package systems, life has been good
I'm actually using only a little vendor packaging, but outside of that virtualenv + pip has more or less worked for me so far. This is on both Windows and Linux.

When you say vendor package systems, do you mean Python vendor, or OS vendor? Or both? I guess I'm using both; ActiveState's PyPM on Windows, and apt-get on Ubuntu for minimal stuff. Then I just pip install stuff in virtualenvs. I'm still waiting to get seriously bitten.

minidracula
Dec 22, 2007

boo woo boo

MononcQc posted:

I don't know if I'd prefer reading the last few pages of this thread or getting actual pages from PagerDuty.
Actual pages from PagerDuty are approximately 7000% more likely to mean something to you. Yes, even considering the "noise" pages.

minidracula
Dec 22, 2007

boo woo boo

Shaggar posted:

well the thing is java always has the right library for you. and its a better language than most and it performs better than most. its just the best language all around w/ a close second place going to c#.
Guys, I can't get this Java code to properly synthesize and PnR on my Kintex-7 parts, halp.

Do I need to public static void main SerdesFactory or what???

minidracula
Dec 22, 2007

boo woo boo

Notorious b.s.d. posted:

lol Stockholm syndrome

your vertical is so tied to third rate proprietary tools vendors you have convinced yourself this is a positive good
Well, you've certainly secured my vote for Logical Leap of the Day. Good luck, I hope you make it to the finals!

Arcsech, I'm sorry, but I can't back you this round what with your refutation being at least relevant to my argument. Do try to YOSPOS harder next time.

minidracula
Dec 22, 2007

boo woo boo
Whoa, I was already basically writing this thread off, but then there was a bunch of good discussion on versioning and isolation of things that started off with Notorious b.s.d. saying something I strongly agreed with (color me shocked!):

Notorious b.s.d. posted:

having a standard format on day one is brilliant

automatically downloading code from GitHub is the goofiest thing
Then more joined in on the fun:

abraham linksys posted:

yeah that's pretty up there too

want to make a new version of your open source library w/ breaking changes? guess you should make a new repo!
Somewhere around here I was rooting for Notorious b.s.d.:

Notorious b.s.d. posted:

this would be very not-awesome. it perpetuates the problem. artifact versions are not necessarily code versions. conflating the two gets us gems and cpan and pip and all the other vile things scripting languages have used to distribute code

i promised the thread a big old effort-post about this poo poo, but when i got working on it, i realized i could probably do something better with the resulting document than post it on yospos
And then I got what I wanted, both barrels full. I was actually happy about all this. In fairness, maybe I thought all this chat was great because I skimmed most of it rather than read it in depth, but, you know, it's YOSPOS. Investing actual time is probably a lose-lose.

Alas, my high spirits were soon brought back to this mortal plane with comments like:

Notorious b.s.d. posted:

if you use windows you have much bigger problems than artifact versioning or release management

you can't fix stupid
and:

Notorious b.s.d. posted:

you would be surprised how many of those problems disappear when you get the gently caress out of a windows shop

it's not that windows causes problems, it's that firms who choose windows are too dumb to live
Oh well, it was a nice thought.

Still, somewhat more seriously (and more serious than I probably ought to be in YOSPOS), I mostly agree with what Notorious b.s.d. said about versioning and isolation of stuff in production, and that Go "feature" bugs me enough for me to basically never use it. But I only ever play/tinker with Go right now, so who cares.

In other news:

Gazpacho posted:

i know that, as I understand the history the public-facing web teams got to banish the old (obidos) code and rethink everything years ago, supply chain has never had that chance afaict
More than once, too! obidos begat gp (Gurupa), and I think there's at least one other newer rev behind retail after that too, if not more (or they've just decided to hide gp behind more layers).

hackbunny posted:

don't know what else to post, also everything I say may be inaccurate because it's been years since I've worked for ReactOS or on Windows code at all

... :words: ...
This was awesome, thanks for this. Post more of this!
Also: do I know you?

minidracula
Dec 22, 2007

boo woo boo

FamDav posted:

kill yourself

Notorious b.s.d. posted:

nice content-free meta post about your vote on the thread. i am so glad we could get a ten paragraph explanation of your choice to click "1"
I confess I don't understand YOSPOS, so, I guess that's on me. Anyway, I fail to see how either of the above are "content" (even after, for your post, eliding 15 blank lines of text you included.) Here's to senselessness!

But, speaking of content: TIL about "hypercomputing", which apparently is actually a thing. Or, at least a term. Definitions of what it is -- if it is even anything in the first place -- seem to differ. This being the PL YOSPOS thread, I wondered what programming languages might be like on such a computer, if it could exist.

I guess I was prompted to go down this line of thought after reading a bit (and some fiddling with) some PLs that try to tackle quantum computer programming (e.g. Quipper site, paper (PDF), .tgz) and some quantum circuit simulators (e.g. QuIDDPro site). But, qubit-model quantum computers don't satisfy the available definitions of a hypercomputer, since they can be modeled in classic Turing machines (albeit in PSPACE complexity).

minidracula fucked around with this message at 06:53 on Nov 26, 2013

minidracula
Dec 22, 2007

boo woo boo

Gazpacho posted:

gurupa had filtered down to us, the switchover deadline for internal sites was in 2009 but most supply chain software is console apps talking to services, both layered over an unbelievable number of unowned and undocumented platform libraries

i think it was two months after i got there that i asked one of the colleagues where i could find documentation or a tutorial for the console UI library that the whole division was using, he said he didn't know, as if he was completely mystified that anyone would ask, and i promptly had a fit
Sounds legit. I got there around mid-2004 and IIRC, gp was the new hotness everyone in retail was moving things over to, but obidos had a long tail to live out. I also remember the horror stories of the monolithic executable, and seem to recall something about some sort of home-grown network transparent far pointer-esque thing that existed... but I may be imagining that (or maybe someone was pulling my leg; so glad I didn't have to deal with any of that). I think this was still in the days of the HP Superdomes, shortly before Vogels got there.

minidracula
Dec 22, 2007

boo woo boo

FamDav posted:

this just gets more and more depressing.
Yeah, but, you know, $61 billion in 2012 revenue, so some people aren't that depressed. Sure, it's gross, but is there a company at that scale that isn't? Isn't that why we all hate programming?

minidracula
Dec 22, 2007

boo woo boo

coffeetable posted:

hypercomputation is a very vague term that CS guys use when they want to mess with the philosophy end of the discipline. it can mean any one of a dozen things, so if you're interested in it pick one author and run with their + their associate's writings rather than googlin' around. my favourite of the bunch is aaronson's stuff on computation near closed timelike curves, but that's just me
Yeah, I didn't even know it was, like, an actual term with any kind of meaning (debated or not) until this evening. Hell, I only saw it for the first time earlier today, in the context of a government RFP. I only looked it up out of a mildly nagging sense that it couldn't be a completely made-from-whole-cloth word (i.e. by the anonymous author(s) of the RFP) if it made it into that document.

Can't say I'm too surprised to find out some serious people in computability think it's bunk though. Right now I've skimmed over a couple of Davis's papers arguing against hypercomputation being anything real (other than, at best, a flashier name for un-computability; Davis also rejects the claim that there are any possible results in "hypercomputation" in the sense the peddlers of the term assert there are), and one of Andrew Hodges's articles rebutting an original hypercomputation paper.

Like you said, I think I need to pick a thread and see how deep the rabbit/k-hole goes.

coffeetable posted:

quantum computers are not hypercomputers in any sense i've seen. like you said they can at most solve problems in PSPACE, and the inclination in the field at the moment is that they're almost certainly in NP too. in fact, while no-one's expecting it i don't think anyone would be ~too~ surprised if it turned out that BQP = BPP and they're only as powerful as probabilistic computers.

if you're interested in the "practice" of this stuff, the best intro i've seen is kaye's An Introduction to Quantum Computing. if you wanna know more about the complexity theory side, then chap 10 of arora & barak's A Modern Approach, though you might need to read some of the other chapters first

suffice it to say that we're a long, long way from a general programming language for qcomps. mostly because it'll be a long time before we need one - as far as we can tell there are only a handful of problems that they make noticeably easier, so if qcomps ever make it out of research it'll be as lil' black boxes with
Nice! I didn't know about Kaye, so thanks for that! I'll give that book a spin. So far I've only read Quantum Computing: A Gentle Introduction by Rieffel and Polak. Arora & Barak's a great book; I'll be sure to re-read Chapter 10. Thanks for that pointer too!

Regarding PLs for quantum computers, yeah, I agree. So far all the actual language work isn't focused on what a language might look like to "program" a quantum computer, but rather how to model quantum computations in the qubit model (but to be simulated on traditional computers). There doesn't seem to be a whole lot separating things like Quipper from more quantum circuit simulator things like jQuantum other than approach and notation. Again, that seems fine for the reasons you point out.
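For a sense of why the qubit model is classically simulable (at exponential cost), here's a minimal, hypothetical statevector simulator in Python; the function names are made up, but the math is the standard amplitude bookkeeping:

```python
import math

# Minimal statevector simulator: the state of n qubits is a list of
# 2**n complex amplitudes, which is exactly why classical simulation
# of the qubit model blows up exponentially in memory.
def apply_single_qubit_gate(state, gate, target):
    """Apply a 2x2 gate to qubit `target` (0 = least significant bit)."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        if amp == 0:
            continue
        bit = (i >> target) & 1
        base = i & ~(1 << target)            # index with target bit cleared
        new[base] += gate[0][bit] * amp      # contribution to the |...0...> branch
        new[base | (1 << target)] += gate[1][bit] * amp  # and the |...1...> branch
    return new

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def probabilities(state):
    return [abs(a) ** 2 for a in state]

# One qubit starting in |0>, then H: measurement yields 0 or 1 with p = 0.5.
state = apply_single_qubit_gate([1 + 0j, 0j], H, target=0)
```

Tools like QuIDDPro and jQuantum are doing a (much cleverer, compressed) version of this bookkeeping.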

You have any opinions, thoughts, whatever on D-WAVE and their claims?

minidracula
Dec 22, 2007

boo woo boo

AWWNAW posted:

did u buy any of the books? cuz I'm trying to pick one out and dunno which one to get

so far i've just been reading the docs and online tutes
I didn't initially learn F# from books either, but I've bought almost every F# book out there that I know of (I do this for lots of languages). If you just wanted one to start with that covers the current latest and greatest version, I'd recommend Programming F# 3.0 by Chris Smith (O'Reilly, Amazon).

No Starch Press has an F# book in Early Access, where you buy the PDF ebook now, get access to DRM-free PDF chapters as they come out, and get the finished ebook bundle (PDF, ePub, MOBI) when it ships. Right now you can also get 30% off (until a week from January 9) with coupon code EARLYBIRD. I don't know anything about this book yet though, nor do I know who Dave Fancher (the author) is, beyond the author blurb on the No Starch page.

minidracula
Dec 22, 2007

boo woo boo

Malcolm XML posted:

no and kill yourself

unicode syntax is what makes agda bad since u have to use their emacs plugin
You totes don't. You can write Agda without Unicode symbols if you want. It's true the source files should be UTF-8, but you can avoid anything outside of ASCII if you really must. Also you could just use some other editor that supports Unicode, or XCompose on Unix/Linux.

Most use of Unicode symbols in Agda is for names, and for easy reading similar to mathematical notation conventions; e.g. I can write ℕ (\u2115) instead of Nat. Agda is permissive about names: except for reserved words (which can't be part of a name) and a few other characters, a name can be almost any whitespace-separated sequence of printable Unicode characters.

Zombywuf posted:

Agda is my favourite puzzle game.

I haven't ever actually made anything in it though.
Yeah, I don't do anything in Agda either, other than fool around. For anything "real", I gravitate more to Coq, or a HOL system. It seems like Agda is both younger, and more of a moving target right now.

You can always prove false over and over again though. But then, I'm also not huge into Haskell, despite tinkering with it for what seems like forever. Maybe Agda's connection to/implementation in Haskell has something to do with me not really getting deep into it. Almost 100% certain Zombywuf has written more real world Haskell than I have though.

minidracula fucked around with this message at 02:33 on Jan 15, 2014

minidracula
Dec 22, 2007

boo woo boo

pointsofdata posted:

can anyone explain how Mathematica works? it feels pretty functional and i think there's a type system there somewhere but it seems very flexible.
It's a term rewriting system largely implemented in C (last I knew), with a bunch of support libraries for various types of simplification and numerical methods, some open source (e.g. GMP).

You can write Mathematica code in a pretty functional style, and most of the official (and unofficial) documentation recommends doing so.
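The rewrite-rule model behind a system like Mathematica can be sketched in a few lines of Python: match patterns against a term tree and replace until nothing changes. The rule set and the tuple encoding here are invented for illustration (and this toy skips things real TRSs handle, like repeated-wildcard consistency):

```python
# Terms are nested tuples like ("Plus", x, y); leaves are ints or strings.
# A rule is (pattern, action); pattern strings starting with "_" are
# wildcards, and an action may return None to decline the match.
def match(pattern, term, bindings):
    if isinstance(pattern, str) and pattern.startswith("_"):
        bindings[pattern] = term
        return True
    if isinstance(pattern, tuple) and isinstance(term, tuple):
        return (len(pattern) == len(term) and
                all(match(p, t, bindings) for p, t in zip(pattern, term)))
    return pattern == term

def rewrite_once(term, rules):
    """One rewriting step: try rules at the root, then recurse into subterms."""
    for pattern, action in rules:
        bindings = {}
        if match(pattern, term, bindings):
            result = action(bindings)
            if result is not None:
                return result, True
    if isinstance(term, tuple):
        parts, changed = [], False
        for sub in term:
            new_sub, c = rewrite_once(sub, rules)
            parts.append(new_sub)
            changed = changed or c
        return tuple(parts), changed
    return term, False

def normalize(term, rules, limit=100):
    """Rewrite until a fixed point (no rule applies anywhere)."""
    for _ in range(limit):
        term, changed = rewrite_once(term, rules)
        if not changed:
            return term
    raise RuntimeError("no fixed point within limit")

# Example rules: x + 0 -> x, and constant-fold Plus over integers.
rules = [
    (("Plus", "_x", 0), lambda b: b["_x"]),
    (("Plus", "_x", "_y"),
     lambda b: b["_x"] + b["_y"]
     if isinstance(b["_x"], int) and isinstance(b["_y"], int) else None),
]
```

Evaluation in this model is just `normalize`: keep firing whichever rule matches until the term stops moving.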

If you're interested in TRS's and equational programming/equational programming languages, a modern one to check out is Pure, by Albert Gräf. It uses LLVM under the hood, has a decent C FFI, and is the successor to his earlier language Q.

I also salvaged some earlier specific work on TRS's and such from the ravages of Internet time with the help of one of the original authors over a year ago; this is a good reminder to me that I need to write a couple wiki pages and turn that stuff loose already.

minidracula
Dec 22, 2007

boo woo boo

Subjunctive posted:

word coming back from CppCon is that Apple's shutting down their clang work and redirecting all those people to Swift or the highway. (or other internal positions that aren't compiler work.) leaves google as the only major clang contributor at this point, I think.

congratulations on your language coup, rjmccall!

unrelated: what's the state of the art for building robust Windows drivers? C++ and unit tests and prayer? is any of their model-checking stuff used in production?
As far as Microsoft supported and provided tools go, /analyze (formerly PREfast for Drivers) is very much in use, along with SAL (2.0) annotations. SLAM is baked into SDV (Static Driver Verifier), which itself is baked into Visual Studio nowadays, like /analyze functionality, rather than being a standalone tool in the WDK. You can still get SLAM in the SDVRP, which is the academic/research version of the SDV, too.

minidracula
Dec 22, 2007

boo woo boo
What the hell is Kotlin?

Not-Scala?

minidracula
Dec 22, 2007

boo woo boo
What is Shaggar's new av from? Did you buy that Shaggar, or was it bestowed upon you by Avatars Unknown?

minidracula
Dec 22, 2007

boo woo boo

Subjunctive posted:

Anyone know of a programmable HDMI dingus? I'd like to be able to poke it via USB to make it simulate being absent, or respond with a provided EDID. (I know very little about HDMI, be gentle.)
Like, how programmable do you mean, man?

We have a credit card sized board with a Zynq 7020 and HDMI on-board, (and Micro-USB), though the HDMI core that was written for it... needs work. Let me leave it at that.

To use it, you'd need to be comfy with an HDL (Verilog, VHDL) and Xilinx tools (Vivado WebPACK would work, so that's free, but a ginormous install).

I bet there's something cheaper; how much are you looking to spend? If you find something that works for you, let me know what you used, I'd be curious.

minidracula
Dec 22, 2007

boo woo boo

Vanadium posted:

is there any non-lisp with a good macro system that i can rub the rust people's heads in
MetaOCaml owns bones in this arena (which is admittedly pretty small).

Maybe doesn't apply (ahem) because it's not in the core language? Whatever. I don't know if you care about that as a criterion.

minidracula
Dec 22, 2007

boo woo boo

JawnV6 posted:

being on big beefy processor hardware, chucking away determinism, and floating on an abstract pile of unknowns would be a drag for me

back in college i knew a phd who was feeding file systems into ACL2 for correctness proofs, back when they were trying to feed ACL2 into itself to make sure they could trust the results. seemed awful?
On the flip side, Centaur (VIA) uses ACL2 to verify (parts of) their x86 cores. Anna Slobodova, Sol Swords, Jared Davis, and team really pushed that forward. Davis just moved to Apple's new formal verification team. They've written a number of papers and given a number of presentations on the benefits they got out of it. Earlier work at AMD with ACL2 (formally verifying floating point units) has been around for a while too.

minidracula fucked around with this message at 04:33 on Aug 1, 2016

minidracula
Dec 22, 2007

boo woo boo

JawnV6 posted:

my friend, have you heard of the glory of unums?
Heh. I have, but I admit I haven't bought his CRC Press book yet. In any format.


minidracula
Dec 22, 2007

boo woo boo

JawnV6 posted:

8 years ago i worked on an embedded product where the testbench GUI was written in excel
'Sup HW buddy. 6-7 years ago (gently caress me, that long already?) I worked on a hardware project where the programmable HW used Excel as a 2D IDE for writing the parallel assembly that was also statically scheduled and timesliced; each row was another clock tick, each column a processing element. There was a relatively stupid number of processing elements; you had to zoom out, resize your columns, and/or use tiny font sizes to get all the columns to display width-wise on a stock 1920 wide screen.

The input to the actual command line assembler was a CSV file. Excel was what the developers of the programmable HW Si used to hand-write the assembly, and then export to CSV. Sadly, they didn't even really leverage anything out of the fact they were using Excel to make things easier for them, or for anyone else who had to deal with it, like simple formulas and VBA macros to perform calculations & expansions, a lint-like pass, or simple timing calculation checks, etc. It was just a table UI to be able to type CSV without having to do it directly.

Currently still champ for most bonkers low level development environment I've used for nominally "modern" HW...
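The shape of that flow — spreadsheet exported to CSV, rows as clock ticks, columns as processing elements — is easy to picture with a hypothetical Python sketch. The CSV layout, mnemonics, and the little lint pass here are all invented, not the actual toolchain's:

```python
import csv
import io

# Hypothetical grid: each row is one clock tick, each column one
# processing element (PE). A cell holds one instruction mnemonic;
# a blank cell means that PE idles (NOP) that tick.
def parse_schedule(csv_text):
    """Return {pe_index: [instruction per tick]} from a grid CSV."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    num_pes = max(len(r) for r in rows)
    schedule = {pe: [] for pe in range(num_pes)}
    for row in rows:
        for pe in range(num_pes):
            cell = row[pe].strip() if pe < len(row) else ""
            schedule[pe].append(cell or "NOP")
    return schedule

def lint_schedule(schedule):
    """The lint-like pass the original flow never had: flag all-idle PEs."""
    return [pe for pe, ops in schedule.items()
            if all(op == "NOP" for op in ops)]
```

Even something this small would have caught an entirely-unused column, which is the kind of check the original Excel-as-IDE setup left to eyeballs.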

  • Reply