|
quote:You can use the functional API to achieve what you want. The below is equivalent to what you posted. I chose the space-delimited form for the second argument, but a sequence or mapping can be used instead.

Much appreciated!

Phobeste posted:The functional api for things like namedtuples or enums is an insane code smell and a stain on the language imo. I mean come on

Python's in an interesting spot re these data structures due to its evolved nature. The language has been active for a while now - while "There should be one-- and preferably only one --obvious way to do it." may have been true at one point, it no longer is. The language semantics remain elegant, but the standard library has a big surface area, which may be intimidating to new users and cause experienced users to struggle reading other people's code.

I've found my happy place, where dataclass + enum + typing are BFFs, but this may be alien to some people, as other patterns (including the other struct-likes, OOP inheritance/factory patterns, etc.) are to me. Our community has an idiomatic term, "Pythonic", but this doesn't feel useful, since there are so many patterns and practices.

I understand that this is nowhere on the roadmap and will likely never happen, but I'd love to see a Python 4 that includes what we have in 3.8 now, but has stripped the standard library of clutter, and has modern tooling, docs, and best practices included by default. I want it to feel like a designed language.

Dominoes fucked around with this message at 17:29 on Nov 21, 2019 |
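For reference, the two Enum styles the quote contrasts, plus the "dataclass + enum + typing" combination the post describes — a minimal sketch with made-up names, not the snippet from the quoted post:

```python
from dataclasses import dataclass
from enum import Enum

# Functional API, as in the quote: members given as a space-delimited string.
# Values are auto-assigned starting from 1.
Color = Enum("Color", "RED GREEN BLUE")

# The equivalent class syntax, which the complaint prefers.
class Shade(Enum):
    RED = 1
    GREEN = 2
    BLUE = 3

# The dataclass + enum + typing combination mentioned above.
@dataclass
class Pixel:
    x: int
    y: int
    color: Color

p = Pixel(2, 3, Color.RED)
```

Both `Color` and `Shade` behave identically at runtime; the functional form is terser but opaque to many readers and to some tooling.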
# ? Nov 21, 2019 17:21 |
|
I want Python 4 to remove the walrus operator; the rest I could take or leave
|
# ? Nov 21, 2019 17:34 |
|
I've read about it several times, but still don't know when/why I'd use it. I.e. it seems like the examples are specific to patterns I don't use.
|
# ? Nov 21, 2019 17:40 |
|
the only example that sort of makes sense is code:
replace := with a keyword phrase like 'which is' and the code becomes more readable as to wtf it is, but also doesn't jive well either code:
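The code blocks in this post were lost from the archive; the example usually cited for `:=` is reusing a regex match object without a throwaway statement. A hypothetical stand-in, not the original snippet:

```python
import re

text = "order id: 1234"

# Without the walrus operator: call, then test, then use -- two statements.
match = re.search(r"\d+", text)
if match:
    found = match.group()

# With the walrus operator (3.8+): bind and test in one expression.
if (m := re.search(r"\d+", text)) is not None:
    also_found = m.group()
```

Reading `:=` aloud as "which is" — "if m, which is the search result, is not None" — is roughly the keyword-phrase rewrite the post is imagining.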
mbt fucked around with this message at 17:48 on Nov 21, 2019 |
# ? Nov 21, 2019 17:46 |
|
Bundy posted:Ummm... No it isn't. Sorry, I forgot or plain wasn't aware of those attrs features. I really do prefer the simplicity of dataclasses, which also seem to be better supported by type checkers. (unless that's changed recently too?) Definitely agree that named tuples still suck. The class syntax is better, but... Just use attrs or dataclasses. Also, how named tuples were generated before 3.7 was cartoonishly bad.
|
# ? Nov 21, 2019 17:51 |
|
has anyone run into byzantine TLS errors when using httpx/asyncio/grpclib? They vanish if I switch to threadpools and requests.
|
# ? Nov 21, 2019 19:00 |
|
Malcolm XML posted:has anyone run into byzantine TLS errors when using httpx/asyncio/grpclib ? There's an environment variable mentioned in that link but it didn't seem to help me.
|
# ? Nov 21, 2019 23:17 |
|
KICK BAMA KICK posted:Oh wow I'm not 100% but this might be a weird thing I got once: long story short I was using a library that turned out was using grpc; once I implemented that some seemingly unrelated code started crashing later after starting some multiprocessing. The clue was some weird error messages that weren't coming from my code that mentioned grpc and I think some stuff about TLS errors, like you mentioned. Turned out to be grpc: it doesn't get along with fork, even though it wasn't being invoked in the subprocesses; took that library out and implemented it a different way and everything was fine. Not sure if your code forks at any point but that sounds really similar to my thing.

Unfortunately I'm not using grpcio, but an async-compatible replacement called grpclib. I just dumped it all for grpcio + requests and the issues vanished. Probably something to do with the hosed Python SSL layer. It got rewritten in 3.8, but I'm on 3.7.
|
# ? Nov 22, 2019 01:59 |
|
Asyncio is just a giant shitshow. Trio is actually the nicer replacement but like if I have to rewrite my app I'll just do it in go or something and get easy deployment for free
|
# ? Nov 22, 2019 02:02 |
|
Here's somebody deploying Python in the real world: https://www.zdnet.com/article/us-student-was-allegedly-building-a-custom-gentoo-linux-distro-for-isis/ posted:According to court documents, the suspect allegedly created a Python script to automate saving ISIS multimedia from official social media channels, so other members could re-post it on their own accounts, and help spread the terrorist group's propaganda.
|
# ? Nov 22, 2019 02:12 |
|
O'Reilly really shouldn't have published that Python for Terror book
|
# ? Nov 22, 2019 03:47 |
|
Python code:
|
# ? Nov 22, 2019 08:02 |
|
I really don't understand how to use asyncio, the dozen or so pages I've read about it are all dogshit. Is this basically just for external calls, like if you expect to have an expensive database query or some network request then you stash that as an async call and keep doing other stuff until you absolutely need those results? Or would there be some hypothetical advantage in doing 2 different series of numpy computations, on different and unrelated data, with the asyncio event loop?
|
# ? Nov 22, 2019 10:55 |
QuarkJets posted:I really don't understand how to use asyncio, the dozen or so pages I've read about it are all dogshit. Is this basically just for external calls, like if you expect to have an expensive database query or some network request then you stash that as an async call and keep doing other stuff until you absolutely need those results? Or would there be some hypothetical advantage in doing 2 different series of numpy computations, on different and unrelated data, with the asyncio event loop?

It's for I/O, not CPU/computation-heavy tasks, so no; for numpy you'd look at using multiple processes. The asyncio lib is quite shite tbh; the only credible thing it has going for it is that it's been in the stdlib long enough to get some support. Trio is now almost always the correct thing to reach for if your problem can be solved with async I/O, except for maybe twisted in some cases. I've used both a fair bit in anger, and even when I have to implement something that's not got a trio lib yet, it's still far more straightforward than asyncio. Callbacks and futures suck, as does trying to debug them.
|
|
# ? Nov 22, 2019 11:52 |
|
Bundy posted:It's for I/O not cpu/computation heavy tasks so no for numpy you'd look at using multiple processes. Thank you, Trio looks cool and is already immediately easier to understand the basics of. So would large file IO be a potential use-case? Say if you were reading 10 GB of poo poo into memory, presumably you're bottlenecked by disk access, so you could spawn that IO task and then spawn a computational task that's not reliant on those IO results, for instance?
|
# ? Nov 22, 2019 20:37 |
|
What makes multiple processes better for cpu/computation heavy tasks?
|
# ? Nov 22, 2019 20:45 |
|
QuarkJets posted:Thank you, Trio looks cool and is already immediately easier to understand the basics of

The main problem async libraries have to work around is that if the read/write is going to/from a cache, the IO happens basically instantaneously and thus doesn't block the async event loop. If the IO goes to disk, then it's going to be waiting a relative eternity, and thus blocking the async event loop. The problem with that is that there's no way to tell ahead of time if you're going to get a cache hit or a disk hit. So, everything goes into a thread pool and the async library fakes async access. Operating system support for async is spotty as well, so there would be a lot of special-casing for a library to support, for example, the non-blocking read support added in Linux sometime in the past year or two.

The problem with the thread pool solution is that it adds a massive amount of needless overhead for IO that comes from the cache.

All this to say that if you currently use threads to speed up your disk I/O, you're probably not going to get any improvement using (for example) trio's async interface to disk I/O, and you might even be slower because of the internal plumbing trio adds to manage this. If you don't currently use threads, you'll possibly get a speedup by using something like trio.

Thermopyle fucked around with this message at 21:20 on Nov 22, 2019 |
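The "everything goes into a thread pool" trick described above can be sketched with just the stdlib — this is the general shape of what trio/aiofiles do for file IO, not their actual internals:

```python
import asyncio
import os
import tempfile

async def read_file(path):
    # Hand the blocking read to a thread pool, so the event loop is free
    # to run other tasks while the disk (or page cache) responds.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, lambda: open(path, "rb").read())

async def main():
    # Write a tiny stand-in for the "10 GB" file.
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(b"10 GB of data, in miniature")
    try:
        # While this await is pending, other coroutines could run.
        return await read_file(path)
    finally:
        os.unlink(path)

data = asyncio.run(main())
```

The thread-pool hop is exactly the overhead the post warns about: for a cached read it costs more than the read itself.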
# ? Nov 22, 2019 21:10 |
|
Dr Subterfuge posted:What makes multiple processes better for cpu/computation heavy tasks? Asyncio doesn't let you do multiple things at once, it just lets you do something else while you would otherwise be waiting for IO. If you have a lot of IO to do then you can save a lot of wall clock time by overlapping the time you spend blocked on different IO operations. If you're not spending time waiting for IO there is no time for asyncio to reclaim and you won't see a benefit.
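The "overlapping time spent blocked on IO" point is easy to demonstrate: three waits run concurrently under `asyncio.gather`, so wall-clock time is roughly one wait, not three. `asyncio.sleep` stands in for a network or database wait here:

```python
import asyncio
import time

async def fake_io(delay):
    # asyncio.sleep yields to the event loop, like a real network wait would.
    await asyncio.sleep(delay)
    return delay

async def main():
    start = time.monotonic()
    # Serially these would take ~0.6 s; overlapped, ~0.2 s.
    results = await asyncio.gather(fake_io(0.2), fake_io(0.2), fake_io(0.2))
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
```

Replace `asyncio.sleep` with a CPU-bound loop and the benefit disappears: the coroutines would just run one after another on the single event-loop thread.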
|
# ? Nov 22, 2019 21:11 |
|
Thermopyle posted:If you don't currently use threads, you'll possibly get a speed up by using something like trio. That's where I sit. For very large disk reads (e.g. > 1 GB) it's probably safe to assume that you won't just be accessing a cache, right? And disk reads block the event loop?
|
# ? Nov 22, 2019 21:22 |
|
QuarkJets posted:I really don't understand how to use asyncio, the dozen or so pages I've read about it are all dogshit. Is this basically just for external calls, like if you expect to have an expensive database query or some network request then you stash that as an async call and keep doing other stuff until you absolutely need those results? Or would there be some hypothetical advantage in doing 2 different series of numpy computations, on different and unrelated data, with the asyncio event loop? Nippashish posted:Asyncio doesn't let you do multiple things at once, it just lets you do something else while you would otherwise be waiting for IO. If you have a lot of IO to do then you can save a lot of wall clock time by overlapping the time you spend blocked on different IO operations. If you're not spending time waiting for IO there is no time for asyncio to reclaim and you won't see a benefit. asyncio is a mishmash of parallel/concurrent programming styles. Yes, you can actually run parallel tasks via a threadpool or asyncio.gather; yes, the standard async/await syntax generates coroutines that are cooperatively scheduled in an event loop on a single thread. No, it's not mature or well supported beyond a few good libraries like asyncpg and maybe aiohttp.
|
# ? Nov 22, 2019 21:33 |
|
Nippashish posted:Asyncio doesn't let you do multiple things at once, it just lets you do something else while you would otherwise be waiting for IO. If you have a lot of IO to do then you can save a lot of wall clock time by overlapping the time you spend blocked on different IO operations. If you're not spending time waiting for IO there is no time for asyncio to reclaim and you won't see a benefit. Aren't threads just a different style of scheduling tasks for the processor? Threads could be sent to different cores, but it was my impression that isn't actually very common.
|
# ? Nov 22, 2019 21:59 |
|
Dr Subterfuge posted:Aren't threads just a different style of scheduling tasks for the processor? Threads could be sent to different cores, but it was my impression that isn't actually very common. In Python the GIL prevents threads from cooperating on computational tasks, so you're ultimately bound to having the effectiveness of one core even if a thread was for some reason running on a separate core You can get around the GIL by either launching more processes (each with their own GIL) or by calling code that's external to Python (I perceive Numba functions that get compiled by the LLVM with the GIL turned off as being "external")
|
# ? Nov 22, 2019 22:40 |
Dr Subterfuge posted:What makes multiple processes better for cpu/computation heavy tasks? If a task is asking questions/getting answers from a CPU, it's not waiting for IO, so async's pausing of tasks gains you nothing, and you're stuck in one thread on one CPU core. IO doesn't tend to hit the GIL, so multithreading on multicore systems is fine for that, but not for CPU-heavy work. Hence multiprocessing, because each new process gets a new GIL, but the overhead of a whole new process needs to be worth it.
|
|
# ? Nov 23, 2019 03:58 |
|
CarForumPoster posted:I want to update a Google Sheet row based on a bunch of HTML inputs and clicking a button using my Dash (Flask for Dashboards) app. CarForumPoster posted:is there a better way that I'm gonna feel dumb for not realizing in a minute? Why would you want to programmatically update a google sheet and not just use a database?
|
# ? Nov 23, 2019 07:27 |
|
amethystdragon posted:Why would you want to programmatically update a google sheet and not just use a database? I want a system that's redundant (e.g. the task can be performed in the app or on the GSheet itself) and can be used by people with no technical training; it's for a very small company and I only need a one-page table for this task*. *Technically it's a two-page table, as one of the sheets in the workbook is a sandbox for testing this app and some web scraping Python code we're working on. CarForumPoster fucked around with this message at 21:37 on Nov 23, 2019 |
# ? Nov 23, 2019 21:30 |
|
Shot in the dark: would it be possible to hack macros into Python? I want to make nicer syntax for matrices, like Julia's, but in Python, since Julia has some quirks*. Any ideas? (Probably not feasible.) I was thinking that in Rust, you can use macros to make arbitrary syntax like this. Could write a pre-processor that parses a pseudo-Python file with nice matrix/math syntax and re-writes it in real Python, which then goes to the interpreter, but that wouldn't work in the REPL. *I.e. it's slow in practice, plotting's a mess, and it has an awkward import/namespace system. I.e., write something like this: Python code:
Python code:
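The code blocks in this post were stripped from the archive. Short of real macros, one macro-free approximation is a tiny parser for a Julia-flavoured matrix string — a hypothetical helper, not the lost snippet, and in real use you'd wrap the result in `np.array(...)`:

```python
def mat(s):
    # Hypothetical helper: parse a Julia-style matrix literal,
    # rows separated by ';', entries by whitespace, into a nested list.
    # e.g. mat("1 2 3; 4 5 6") -> [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
    return [[float(x) for x in row.split()]
            for row in s.strip().split(";")]

m = mat("1 2 3; 4 5 6")
```

This works in the REPL too, which the pre-processor idea wouldn't; the cost is that the "syntax" lives inside a string, so typos surface at runtime rather than at parse time.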
Dominoes fucked around with this message at 07:20 on Nov 24, 2019 |
# ? Nov 24, 2019 06:28 |
|
So you only save 2 brackets per matrix row
|
# ? Nov 24, 2019 07:19 |
|
That code has twice the number of brackets as numbers, and nearly as many commas. I have IPython wired with an asquare function and use Python code:
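The `asquare` snippet itself was lost from the archive; a guess at its shape, cutting the bracket/comma noise by reshaping a flat sequence into a square nested list (wrap in `np.array` for real use):

```python
import math

def asquare(values):
    # Guess at the lost helper, not the original: reshape a flat
    # sequence into an n x n nested list.
    n = math.isqrt(len(values))  # Python 3.8+
    if n * n != len(values):
        raise ValueError("need a square number of values")
    return [list(values[i * n:(i + 1) * n]) for i in range(n)]

m = asquare([1, 2, 3, 4, 5, 6, 7, 8, 9])
```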
Dominoes fucked around with this message at 07:23 on Nov 24, 2019 |
# ? Nov 24, 2019 07:20 |
|
I'd just write a plugin, extension, macro for my IDE.
|
# ? Nov 24, 2019 16:50 |
|
Dominoes posted:Shot in the dark: Would it be possible to hack macros into Python? I want to make nicer syntax for matrices. Like Julia's, but in Python since Julia has some quirks* Any ideas? (Probably not feasible.) I was thinking that in Rust, you can use macros to make arbitrary syntax like this. https://github.com/lihaoyi/macropy
|
# ? Nov 24, 2019 16:51 |
|
itt, we reimplement LISP syntax, but worse. Have a look at Hy and how they do macros, perhaps? → http://docs.hylang.org/en/stable/index.html
|
# ? Nov 24, 2019 16:59 |
|
Can't you just write a function?
|
# ? Nov 24, 2019 21:15 |
|
If I have two threads, one really fast and one really slow, and synchronization between the two is important, which threading object makes the most sense? Or rather, what obvious traps should I not fall into? I use events now, but I haven't used events for long enough to know if I'd be better off with a lock or something
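Rough rule of thumb: a Lock protects shared data from simultaneous access; an Event signals "this state has happened" so the fast thread can block instead of polling. A minimal sketch of the Event pattern for a fast thread waiting on a slow one (invented names, not anyone's actual code):

```python
import threading

done = threading.Event()
results = []

def slow_worker():
    # Do the slow work, publish the result, then signal.
    results.append("slow result")
    done.set()

def fast_worker():
    # Block until the slow thread signals; the timeout avoids
    # hanging forever if the slow thread dies.
    done.wait(timeout=5)
    results.append("fast saw: " + results[0])

t1 = threading.Thread(target=slow_worker)
t2 = threading.Thread(target=fast_worker)
t1.start(); t2.start()
t1.join(); t2.join()
```

The classic trap is ordering: publish the data *before* calling `set()`, otherwise the waiter can wake up and read state that isn't there yet. If both threads mutate the same structure concurrently (rather than handing off), that's when you want a Lock, or better, a `queue.Queue`.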
|
# ? Nov 26, 2019 16:29 |
|
Hollow Talk posted:Have a look at Hy and how they do macros, perhaps? → http://docs.hylang.org/en/stable/index.html
|
# ? Nov 26, 2019 16:34 |
|
Is there a way to create pdf reports without a specific package that does that? At work I am limited only to the packages that come with anaconda, and I can’t see anything that will work. Everything I find on google is just “install reportlab” and it’s frustrating because my IT department won’t let me
|
# ? Dec 1, 2019 21:44 |
|
Deadite posted:Is there a way to create pdf reports without a specific package that does that? At work I am limited only to the packages that come with anaconda, and I can’t see anything that will work. Point out that reportlab is part of the official anaconda repo and that they're being petty tyrants about this. And then create a new Anaconda environment in your home area and install whatever you want in it
|
# ? Dec 1, 2019 21:56 |
|
I don't have a good answer, but you could create the report in a spreadsheet format (there are a few libs, like xlsxwriter), then use Excel/LibreOffice to make a PDF from it. Or a document format and do the same, depending on your content.
|
# ? Dec 1, 2019 22:03 |
|
Deadite posted:Is there a way to create pdf reports without a specific package that does that? At work I am limited only to the packages that come with anaconda, and I can’t see anything that will work. You can download the source for it, run the setup.py to create an egg, and pip install that to your environment for use. Any other hosted locations would also need to do the same. Otherwise "my IT department won’t let me" is kinda vague. There are plenty of non-standard ways to use that code I'm sure... Are you trying to pip install, install through requirements.txt, download the source code, etc...?
|
# ? Dec 2, 2019 03:26 |
|
|
I’m pretty new to Python, and I didn’t think to just download the code and import it that way. I’m used to programming in SAS, so having to find packages to accomplish tasks is hard to get the hang of. I keep thinking there must be a way to do everything in vanilla Python, and that’s the wrong way to think about creating programs, it seems
|
# ? Dec 2, 2019 04:27 |