|
refleks posted:I already know R and Python, dont need another pos

the trick is to forget r and python and do julia instead.
|
# ¿ Jan 27, 2022 10:46 |
|
The Management posted:I don’t know anything about this language so I’m gonna assume it’s bad like every other trendy language

except it is not very trendy: it's been around for a while, and if anything the hype has died down a bit, which suggests it might actually be good!
|
# ¿ Jan 27, 2022 17:41 |
|
i am a bit of a fanboy in general. it's not without its own messes: the gc is garbage, the compilation pauses are kind of stupid, and you can get fancy with the typing in two different directions (building towers of overly generic type nonsense, or writing code that just abuses dynamic typing badly). but it does the usual numpy-ish stuff well, does gpu really neatly, and has had a pretty good focus from the start, which means a lot of the more important libraries are really well done and very composable. e.g. the Dtable stuff is a good case for composability: gluing distribution-unaware implementations together with a distributed scheduler into a new thing, without changing those components at all: https://julialang.org/blog/2021/12/dtable-performance-assessment/
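the two typing directions look roughly like this; a hedged sketch, with made-up names (`Grid`, `sloppy_sum`) purely for illustration:

```julia
# direction one: towers of overly generic type nonsense
struct Grid{T<:Real,N,A<:AbstractArray{T,N}}
    data::A
end
width(g::Grid) = size(g.data, 1)

# direction two: just abusing the dynamic typing
function sloppy_sum(xs)      # xs could be anything; julia won't stop you
    total = 0                # type-unstable: starts as Int,
    for x in xs
        total += x           # may silently widen to Float64 mid-loop
    end
    total
end

g = Grid(rand(4, 4))
println(width(g))               # 4
println(sloppy_sum([1, 2.5]))   # 3.5
```

both compile and run fine; the point is that the language happily supports either extreme, and codebases tend to drift toward one of them.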
|
# ¿ Jan 27, 2022 17:54 |
|
ate poo poo on live tv posted:If your language of choice does stop the world garbage collection it should just delete itself and save everyone time.

i mean, almost every gc has a stop-the-world phase, but julia's gc is also bad in ways beyond having an excessive stop-the-world.
|
# ¿ Jan 28, 2022 09:51 |
|
akadajet posted:does Julia even have threads??

it has, if anything, too many parallelism constructs, including but not limited to threads.
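a minimal sketch of two of those constructs, plain threads plus tasks/channels (start julia with `-t N` or `JULIA_NUM_THREADS` set for the first part to actually run on multiple threads):

```julia
# plain threads: each iteration writes its own slot, so no races
results = zeros(Int, 8)
Threads.@threads for i in 1:8
    results[i] = i^2
end
println(sum(results))        # 204

# n:m tasks + channels: a producer task feeding a consumer
ch = Channel{Int}(32) do c
    foreach(i -> put!(c, i^2), 1:10)
end
squares = collect(ch)        # drains until the producer task finishes
println(sum(squares))        # 385
```

the channel version runs on green tasks multiplexed over the thread pool, which is where the n:m part comes from.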
|
# ¿ Jan 29, 2022 21:13 |
|
as well as n:m tasks/channels, mpi, and a bunch of its own distributed models, and that's only the stuff under the julia org itself. the excellent cuda support remains the killer application for me with julia. getting started is little more than installing the package and either broadcasting over gpu arrays or throwing a @cuda in front of a kernel call; there's more to it only as you find yourself needing more features, which makes it trivial to experiment your way through stuff.
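for flavor, a hedged sketch of both entry points (assumes the CUDA.jl package and an nvidia gpu, so it won't run without one; `axpy!` is a made-up kernel name for illustration):

```julia
using CUDA

# the low-effort path: move data to the gpu and broadcast as usual
a = CUDA.rand(1024)
b = 2 .* a .+ 1              # runs on the gpu, no kernel written by hand

# the @cuda path: launch a hand-written kernel
function axpy!(y, x, α)
    i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
    if i <= length(y)
        @inbounds y[i] += α * x[i]
    end
    return nothing
end

@cuda threads=256 blocks=cld(length(b), 256) axpy!(b, a, 3.0f0)
```

the nice part is that the broadcast path and the kernel path use the same arrays, so you can start lazy and only drop down to @cuda where it matters.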
|
# ¿ Jan 31, 2022 16:50 |