|
Eela6 posted:They used 'callable' as a synonym for in-place, so I assumed they meant 'mutable', given the example. Yes, I know what you assumed, I'm pointing out that there's more than one way to read it so the poster doesn't get confused.
|
# ? Feb 9, 2018 20:52 |
|
|
# ? May 16, 2024 11:55 |
Thermopyle posted:Yes, I know what you assumed, I'm pointing out that there's more than one way to read it so the poster doesn't get confused. Thanks!
|
|
# ? Feb 9, 2018 20:53 |
|
vikingstrike posted:Anybody had any issues with PyCharm skipping over breakpoints when debugging? My Google searches have failed me, and it's getting super annoying because I can't figure out how to replicate the issues. Only when I've accidentally Run the code instead of Debugging it.
|
# ? Feb 9, 2018 21:52 |
|
Slimchandi posted:When I was learning Python basics, it took me a long time to realise that when using a list comprehension, you can use any name in the for part. As well as what people have said, print takes varargs anyway, so you can unpack an iterable like my example a couple of posts ago Python code:
it might make more sense to use a generator comprehension instead though, since you don't really need the list Python code:
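The quoted snippets didn't survive this archive; a minimal sketch of both ideas (variable names invented), assuming the point was printing each element of an iterable:

```python
name = 'hello'

# print takes varargs, so you can unpack any iterable straight into it
# instead of writing a comprehension just for the side effect:
print(*name, sep='\n')

# and when you do transform elements, a generator expression avoids
# building a throwaway list:
lines = '\n'.join(c.upper() for c in name)
print(lines)
```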
|
# ? Feb 10, 2018 02:21 |
|
In this case, code:
|
# ? Feb 10, 2018 10:05 |
|
For clarity, I was referring to functions with side effects or return None rather than just callables. The given example was exactly what I intended. It seems much more convenient than a for loop; is it just non-conventional or A Bad Thing?
|
# ? Feb 10, 2018 19:50 |
|
Slimchandi posted:For clarity, I was referring to functions with side effects or return None rather than just callables. The given example was exactly what I intended. Remember, it violates PEP8, but you can even do [code]for counter in iterable: function_with_side_effects()[/code] without the line break.
|
# ? Feb 10, 2018 20:07 |
|
Cingulate posted:In this case, Doh yes
|
# ? Feb 10, 2018 20:23 |
|
I was a bit surprised code:
|
# ? Feb 10, 2018 20:26 |
|
I guess there isn't really a use for it since you can do *name, especially with the Python philosophy of having one way to do a thing. It would be more convenient if the separator was supplied from elsewhere, so you always get consistent behaviour, but hey

Slimchandi posted:For clarity, I was referring to functions with side effects or return None rather than just callables. The given example was exactly what I intended.

It's convenient in the sense that a 2-line (or 3 with a conditional) for loop can be put on 1 line instead, or even more complicated nesting. It gets to a point where that might be less clear about what you're doing though.

The problem is that list comprehensions generate a list; that's their whole purpose. When someone reads your code and they see a list comprehension, they see you generating data, they don't see the side effects. You can argue that the clues are all there, because a) you're not keeping the list and b) it's calling print() and everyone knows what that does, but that's not always necessarily true. The actual purpose of your code is implicit instead of explicit. (You're also actually generating a list full of Nones, which is grody garbage at best and might actually cause memory issues at worst.)

Unfortunately Python doesn't seem to have any equivalent to forEach, where you'd have something like for_each(function, iterable) and it would explicitly call the function with each element, returning nothing. There's a consume recipe in the itertools docs, but that means it's not actually in the standard library, for reasons, and you'd have to put it in your own code somewhere. And that means you need to add your own iterable of function calls, like consume(print(c) for c in name) or consume(map(print, name)). Which is better, in that it's more explicit, but I think the plain for loop is more pythonic.

If you do it a lot though, no reason you can't write your own utility function to make it super explicit and concise!
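A sketch of what that for_each / consume pair could look like — consume here is the recipe from the itertools documentation, and for_each is just the hypothetical helper the post describes:

```python
from collections import deque
from itertools import islice

def consume(iterator, n=None):
    """Advance an iterator n steps ahead, or exhaust it entirely if n is None
    (the 'consume' recipe from the itertools docs)."""
    if n is None:
        # a zero-length deque iterates everything while storing nothing
        deque(iterator, maxlen=0)
    else:
        # advance to the empty slice starting at position n
        next(islice(iterator, n, n), None)

def for_each(function, iterable):
    """Explicitly call function on each element, returning nothing."""
    for element in iterable:
        function(element)

# usage: the side effect is the whole point, and the name says so
collected = []
for_each(collected.append, 'abc')
```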
|
# ? Feb 10, 2018 21:05 |
|
map sounds like what you're talking about?
|
# ? Feb 10, 2018 21:22 |
|
Dr Subterfuge posted:map sounds like what you're talking about? baka kaba posted:consume(map(print, name)). Buuuut why not just code:
|
# ? Feb 10, 2018 21:35 |
|
^^^ that's what I'm saying, in most cases that's probably the best way to do it. Yeah, I used map in my last example - it returns an iterable though, so you still have to consume it by iterating over it somehow. Personally I don't really like the semantics of that anyway - map is for transforming elements from one set of data into another set of data, so it produces some values. In this case you're still ignoring the results and relying on the side effects of the mapping function - like here's one way to consume that example I gave: Python code:
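The snippet itself is gone from the archive; the usual way to drain an iterator purely for its side effects (a guess at what was posted) is the zero-length-deque trick:

```python
from collections import deque

name = 'hey'
# map is lazy, so nothing prints until something iterates it;
# deque(..., maxlen=0) iterates the whole thing while storing nothing
deque(map(print, name), maxlen=0)
```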
|
# ? Feb 10, 2018 21:36 |
Hey I'm coming to Python from JS, this is just a hobby, I started that Think Python book and it is a lot of trying to make myself remember 8th grade geometry in turtle graphics form, does it get less tedious?
|
|
# ? Feb 12, 2018 03:43 |
|
Cingulate posted:baka is, in fact, literally talking about map: Trying to python from my phone was a bad idea.
|
# ? Feb 12, 2018 03:56 |
|
I know a few of you guys run your python linux dev environments in vms due to being on windows machines out of preference or necessity... Anyone have any experience with using PyCharm with Docker / Windows Subsystem for Linux? My current place of work runs windows machines only but our app is switching over to Docker containers on Linux and I'm looking for a way to develop on Linux without actually having to dual boot if possible.
|
# ? Feb 12, 2018 16:44 |
|
NtotheTC posted:I know a few of you guys run your python linux dev environments in vms due to being on windows machines out of preference or necessity... I'm lazy so I just run a bunch of VMs in VirtualBox and tell Windows PyCharm to remote control the binaries there with my code via remote interpreters. Works well enough for me.
|
# ? Feb 12, 2018 17:05 |
|
NtotheTC posted:I know a few of you guys run your python linux dev environments in vms due to being on windows machines out of preference or necessity... Yeah, you just configure remote interpreters. There's plenty of Googleable advice on the subject.
|
# ? Feb 12, 2018 17:25 |
|
.
|
# ? Feb 12, 2018 17:38 |
|
Noob stuff i guess: If I have a nested dictionary and a bunch of strings with nested keys of the form "['Dic1']['Dic1.2']['innerkey']", how can I use the strings to get at the value at innerkey? In other words, Outerdic['Dic1']['Dic1.2']['innerkey']. Obviously Outerdic["['Dic1']['Dic1.2']['innerkey']"] is not it.
|
# ? Feb 13, 2018 16:56 |
|
If your dictionary is declared like so: Python code:
Python code:
e: someone will probably give you a better answer unpacked robinhood fucked around with this message at 18:22 on Feb 13, 2018 |
# ? Feb 13, 2018 17:20 |
|
unpacked robinhood posted:If your dictionnary is declared like so : Yes I know, and that’s why I’m asking how I can use a string and a dictionary to the same effect. I need to unpack the string somehow and pass it to the dictionary if you get what I mean.
|
# ? Feb 13, 2018 18:00 |
|
sofokles posted:Noob stuff i guess: Not sure I completely understand what you're asking here, but I think this is what you're looking for? You have a dictionary: Python code:
Python code:
Thermopyle fucked around with this message at 18:51 on Feb 13, 2018 |
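The code blocks themselves were lost from the archive, but the follow-up post confirms the approach was reduce plus operator.getitem, with a regex to split the key string; a sketch along those lines (dictionary contents invented):

```python
import re
from functools import reduce
from operator import getitem

outerdic = {'Dic1': {'Dic1.2': {'innerkey': 'value'}}}
path = "['Dic1']['Dic1.2']['innerkey']"

# pull each single-quoted key out of the string
keys = re.findall(r"'([^']*)'", path)

# walk the nesting one key at a time: getitem(getitem(getitem(d, k1), k2), k3)
value = reduce(getitem, keys, outerdic)
```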
# ? Feb 13, 2018 18:36 |
|
Thermopyle posted:Not sure I completely understand what you're asking here, but I think this is what you're looking for? That's what I'm after - thank you! Reduce and operator.getitem did the trick. The re pattern won't work as there could be some weird symbols in some of the keys, but I can get the string to a list by other means. But for the sake of interest, as I'm no good at regex, what would a pattern be for capturing all instances in single quotes starting from the left, so that it would capture "'this',notthis'this'....notthis..'this', andnotthis" ? I changed the do_things args a little: Python code:
|
# ? Feb 13, 2018 20:57 |
|
sofokles posted:Thats what im after Why? Why did you want to do that?
|
# ? Feb 13, 2018 20:59 |
|
sofokles posted:The re pattern wont work as there could be some weird symbols in some of the keys, but i can get the string to list by other means. But for the sake of interest, as Im no good at regex, what would a pattern be for capturing all instances in single quotes starting from left so that it would capture "'this',notthis'this'....notthis..'this', andnotthis" ? Pulled from my collection of regexs that I thought might be useful at some point: code:
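The posted regex is missing from the archive; one pattern that handles the example given — a non-greedy match between single quotes — is a reasonable guess at what it was:

```python
import re

s = "'this',notthis'this'....notthis..'this', andnotthis"
# .*? is non-greedy, so each match stops at the next closing quote
matches = re.findall(r"'(.*?)'", s)
```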
|
# ? Feb 13, 2018 21:02 |
|
That's some sweet reduce action
|
# ? Feb 13, 2018 21:06 |
|
Cingulate posted:Why? Why did you want to do that? As I said, I'm a noob. Using VBA a lot, I instinctively grab a dict. Also my approach is probably not totally utilitarian. I get some XML files from a supplier that I need to clean up and transform into columnar data. They change in shape and form, and exactly where, or how deep in, or under what tag the data I'm after sits is unpredictable. So I flatten it as much as possible (can't flatten completely - would give duplicate keys). Then I iterate over that and create a reverse lookup dictionary to provide the paths to the "inner" keys. Next the plan was/is to iterate over the first-level keys-values, and for each, iterate over an ordered list of columns, get the data values for each keypath that corresponds to a specific column, and spit the resulting rows out to Excel. Thermopyle posted:Pulled from my collection of regexs that I thought might be useful at some point: Thanks - trying it out. sofokles fucked around with this message at 21:55 on Feb 13, 2018 |
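The flatten-then-reverse-lookup idea described there can be sketched like this (all names invented; real XML would need extra handling for repeated tags and lists):

```python
def flatten(d, parent=()):
    """Flatten a nested dict into {key-path-tuple: value}."""
    flat = {}
    for key, value in d.items():
        path = parent + (key,)
        if isinstance(value, dict):
            flat.update(flatten(value, path))   # recurse into nested dicts
        else:
            flat[path] = value                  # leaf: record the full path
    return flat

nested = {'a': {'b': 1, 'c': {'d': 2}}}
flat = flatten(nested)

# reverse lookup: innermost key -> full path to it
reverse = {path[-1]: path for path in flat}
```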
# ? Feb 13, 2018 21:28 |
|
Is there a good article somewhere that goes through some of the idiosyncrasies of Popen? I'm sure anybody here that has used it many times in different circumstances has found little quirks based on things like the OS, how the application handles pipes, what happens with arguments, shell=True, etc. I'm not talking "hurr look at the subprocess/psutil documentation hurr." I'm talking about contextual, system caveats that plague using Popen and friends for spawning and monitoring other processes and their output.
|
# ? Feb 14, 2018 19:13 |
|
Rocko Bonaparte posted:Is there a good article somewhere that goes through some of the idiosyncrasies of Popen? I'm sure anybody here that has used it many times in different circumstances has found little quirks based on things like the OS, how the application handles pipes, what happens with arguments, shell=True, etc. I don't know of any such article, and while I'm sure one exists I will instead offer some recommendations from my own limited experience:

A) shell=True will actually launch a new shell, which you want to avoid if you can; it's almost always going to be possible to use the sequence form of Popen instead of just passing in a string and shell=True, so just do that. Basically you can build your entire command with its arguments and flags as a list or a tuple and then pass that in as the first argument of Popen, bada-bing bada-boom.

B) the stdout and stderr arguments basically work like you'd expect, and if you want to read from them from within your Python session then there's the helpful subprocess.PIPE object. If you don't set stdout or stderr to anything then they'll just do whatever your stdout / stderr normally do (e.g. print to the terminal).

C) if your Popen object goes out of scope while executing, such as when you start a long-running Popen inside of a function and then that function suddenly exits, the child process isn't actually killed, but you lose your handle to it (and recent Pythons will complain with a ResourceWarning); be sure to do something like call Popen.wait() if you actually want to block until the process completes.

D) I only code in *nix so who the gently caress knows what a Windows environment is going to do, but the same advice is probably all still true.
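Points A and B together look like this in practice — sequence form, no shell, pipes captured (using sys.executable just to have a portable child process):

```python
import subprocess
import sys

# sequence form: arguments go straight to the program, no shell in between
proc = subprocess.Popen([sys.executable, '-c', 'print("hello")'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)

# communicate() reads both pipes and waits for the process to exit,
# which also covers point C: we block until completion
out, err = proc.communicate()
```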
|
# ? Feb 15, 2018 06:55 |
|
Only things I would add to that are:
- you can usually just use subprocess.run as a convenience function (everything QuarkJets said still holds)
- if you are dealing with pipes of text instead of bytes you may need text=True. Older guides/posts may suggest universal_newlines=True; these are the same.
I would also emphasize QuarkJets' point A. I would say you basically never need shell=True, but avoiding it just requires some understanding of what the shell does. Anything the shell is doing you can do more explicitly and securely in Python. For example, for expanding wildcards you can use Path.glob or glob.glob, os.environ for environment variables, etc.
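Those additions in one small sketch — subprocess.run with text=True, plus doing the shell's usual work explicitly:

```python
import glob
import os
import subprocess
import sys

# run() wraps Popen for the common "start it, wait, grab the output" case;
# text=True decodes stdout/stderr from bytes to str
result = subprocess.run([sys.executable, '-c', 'print("done")'],
                        capture_output=True, text=True)

# things you'd otherwise reach for shell=True to get:
py_files = glob.glob('*.py')          # wildcard expansion
home = os.environ.get('HOME', '')     # environment variable lookup
```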
|
# ? Feb 15, 2018 16:49 |
|
I'm a very long-time Python user, but most of my experience has been with the 2.x line. I've used a few versions of 3.x, but mostly kept my code to be compatible with 2.x. However! I now have a job-interview coming up at a place that uses 3.6. I was able to pass this place's take-home coding interview just fine (or at least, fine enough to get a callback to the on-site, heh) but I'm worried about getting asked to whiteboard something related to some more recent 3.x feature that I'm less familiar with at the on-site interview. Which feature? Who knows, well not me, and that's what I'm worried about! Thus, my question: is there something I can read that has an overview of important 3.x features? Are there any important 3.x-only libraries I should know about? Thank you in advance for your advice, fellow goons.
|
# ? Feb 17, 2018 02:52 |
|
German Joey posted:I'm a very long-time Python user, but most of my experience has been with the 2.x line. I've used a few versions of 3.x, but mostly kept my code to be compatible with 2.x. However! I now have a job-interview coming up at a place that uses 3.6. A number of the features added to 3.x were backported to 2.7, which is good. But there are cheat sheets for the remainder. This python.org article is a starting point: https://docs.python.org/3/howto/pyporting.html Especially this section: https://docs.python.org/3/howto/pyporting.html#learn-the-differences-between-python-2-3 It links to a number of documents and cheat sheets, and here's another: http://sebastianraschka.com/Articles/2014_python_2_3_key_diff.html What does your new employer do?
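As a quick refresher to go with those cheat sheets, a few 3.x-only things that come up constantly in 3.6-era code (my pick, not from the linked articles):

```python
# f-strings (3.6+): expressions interpolated directly into string literals
name, version = 'Python', 3.6
greeting = f'{name} {version}'

# extended iterable unpacking (3.0+)
head, *rest = [1, 2, 3, 4]

# true division by default: / gives a float, // floors
half = 7 / 2
floored = 7 // 2

# function annotations, widely used with type checkers
def shout(text: str) -> str:
    return text.upper()
```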
|
# ? Feb 17, 2018 03:18 |
|
I want to make multiple plots so that I can save and style them individually, and then I wanna be able to pass them into one master plot where I’d arrange them into some logical order, and then save that final one too. How would I do that? Something like, code:
Boris Galerkin fucked around with this message at 13:32 on Feb 21, 2018 |
# ? Feb 21, 2018 13:28 |
|
Boris Galerkin posted:I want to make multiple plots so that I can save and style them indivudually, and then I wanna be able to pass them into one master plot where I’d arrange them into some logical order, and then save that final one too. How would I do that? code:
|
# ? Feb 21, 2018 14:26 |
|
Using RQ what is the best way to create multiple workers? I'm not so familiar with systemd, is there a simple way to configure this? I kind of wish it was just built-in the way you can specify number of workers when launching gunicorn, for example. This seems like it's probably the best of the simple options, that I can find; would you follow it? https://serverfault.com/questions/730239/start-n-processes-with-one-systemd-service-file/878398#878398
|
# ? Feb 21, 2018 18:37 |
|
No that's not what I meant. Say I create my first figure, code:
code:
code:
e: In the end, I want 3 saved figures. figs 1 and 2 are full sized pictures of my sin and cos plots, and fig3 has those same plots stretched to fit into the layout I defined in fig3. It's my understanding that a figure object is just a container for axes objects, and an axes object is a container for curves, labels, markers, etc. So I feel like creating a figure object from existing axes objects should be a thing? Boris Galerkin fucked around with this message at 18:47 on Feb 21, 2018 |
# ? Feb 21, 2018 18:41 |
|
You can try this stackexchange answer: https://stackoverflow.com/questions/6309472/matplotlib-can-i-create-axessubplot-objects-then-add-them-to-a-figure-instance/46906599#46906599 But my suggestion would be to write a function that creates your axis, and give it an axis parameter. Then you call it once for its own figure, and another time for the joint figure. Much less awkward.
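A sketch of that suggestion (assuming matplotlib and numpy are available; function and file names invented): each plot is a function taking an Axes, called once for its own figure and once for the joint one.

```python
import matplotlib
matplotlib.use('Agg')  # headless backend so this runs without a display
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(0, 2 * np.pi, 100)

def plot_sin(ax):
    ax.plot(x, np.sin(x))
    ax.set_title('sin(x)')

def plot_cos(ax):
    ax.plot(x, np.cos(x))
    ax.set_title('cos(x)')

# standalone, full-sized figures
fig1, ax1 = plt.subplots()
plot_sin(ax1)
fig1.savefig('fig1.png')

fig2, ax2 = plt.subplots()
plot_cos(ax2)
fig2.savefig('fig2.png')

# the combined figure reuses the exact same plotting functions
fig3, (axa, axb) = plt.subplots(1, 2)
plot_sin(axa)
plot_cos(axb)
fig3.savefig('fig3.png')
```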
|
# ? Feb 21, 2018 20:57 |
|
mr_package posted:Using RQ what is the best way to create multiple workers? I'm not so familiar with systemd, is there a simple way to configure this? I kind of wish it was just built-in the way you can specify number of workers when launching gunicorn, for example. Probably more of a Linux question than a Python question. I'd try the Linux thread.
|
# ? Feb 21, 2018 21:19 |
|
|
|
Yeah I just figured if there's any RQ-specific gotchas or advice there might be people with experience on it here-- RQ was recommended to me several pages back, and the docs straight up say if you want concurrent work you need to spawn multiple workers, so this is a problem anyone who's deployed it will already have solved. It's weird that there's no detailed info about this either in the docs or even in google searches; it would be pointless in most cases to only run a single worker. But sometimes that just means the simple solution is the one to use, and worker@01 / worker@02 / etc. seems to be a common enough systemd paradigm I'll just try that.
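For reference, a hedged sketch of that templated-unit approach — paths, service name, and queue name are all invented here; `rq worker` is RQ's own CLI:

```ini
# /etc/systemd/system/rq-worker@.service  (hypothetical path and names)
[Unit]
Description=RQ worker %i
After=network.target redis.service

[Service]
Type=simple
WorkingDirectory=/srv/myapp
ExecStart=/srv/myapp/venv/bin/rq worker default
Restart=always

[Install]
WantedBy=multi-user.target
```

Then something like `systemctl enable --now rq-worker@{1..4}` would start four independent workers, each pulling jobs from the same queue.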
|
# ? Feb 21, 2018 21:53 |