NinpoEspiritoSanto
Oct 22, 2013




Back at a proper keyboard now. The Guide

teen phone cutie
Jun 18, 2012

last year i rewrote something awful from scratch because i hate myself
Thanks guys. I'll take a look tomorrow and report back

Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell

You can also use caddy instead of nginx and it automatically takes care of letsencrypt certificates and does a lot of stuff for you automatically.

teen phone cutie
Jun 18, 2012

last year i rewrote something awful from scratch because i hate myself
I ended up trying to add an nginx image with this tutorial

https://medium.com/@pentacent/nginx-and-lets-encrypt-with-docker-in-less-than-5-minutes-b4b8a60d3a71

but ended up getting 502s for api.example.com

And now I have this outstanding SO question:

https://stackoverflow.com/questions/54281016/502-bad-gateway-with-nginx-reverse-proxy-for-docker-flask-app

e: thinking it might just be a CentOS thing. Gonna try another OS

yep loving centos

teen phone cutie fucked around with this message at 03:51 on Jan 21, 2019

mr_package
Jun 13, 2000
I'm parsing a file line-by-line, modifying some of them based on regex pattern matching. Basic pattern is this; forgive the $ variables, trying to simplify so consider this pseudocode/Python hybrid:
code:
with source_file.open() as f:
    for line in f:
        if $line_matches_regex_pattern:
            line = $line_modified_with_value_from_dict
        yield line
Main question I have is whether other Python programmers agree using generator here is appropriate. I'm doing it so that someone can call this function and make more changes, if desired, or just write each line to a new file. Before I had a monolithic function that was doing all of this and it just made sense to break it down. I'm using yield here because the 'for line in f' pattern reads the file line by line (and not entirely into memory) which is a general pattern I'm trying to use where possible. Technically the files I'm working with today are small but they may not be in future, so trying to learn this the 'correct' way. Is it?

Grump posted:

yep loving centos
I might have fixed this with an SE Linux group / permissions change but so long ago I can't say 100%. Also I wasn't using Docker. But may be worth checking into; I definitely ran into SE Linux issues on CentOS other times as well.

edit: indent fix

mr_package fucked around with this message at 07:03 on Jan 21, 2019

bob dobbs is dead
Oct 8, 2017

I love peeps
Nap Ghost

mr_package posted:

I'm parsing a file line-by-line, modifying some of them based on regex pattern matching. Basic pattern is this; forgive the $ variables, trying to simplify so consider this pseudocode/Python hybrid:
code:
with source_file.open() as f:
    for line in f:
        if $line_matches_regex_pattern:
            line = $line_modified_with_value_from_dict
    yield line
Main question I have is whether other Python programmers agree using generator here is appropriate. I'm doing it so that someone can call this function and make more changes, if desired, or just write each line to a new file. Before I had a monolithic function that was doing all of this and it just made sense to break it down. I'm using yield here because the 'for line in f' pattern reads the file line by line (and not entirely into memory) which is a general pattern I'm trying to use where possible. Technically the files I'm working with today are small but they may not be in future, so trying to learn this the 'correct' way. Is it?

I might have fixed this with an SE Linux group / permissions change but so long ago I can't say 100%. Also I wasn't using Docker. But may be worth checking into; I definitely ran into SE Linux issues on CentOS other times as well.

indent is hosed
iterating the file with for is lazy in the first place, so doing a big ol' complicated dealie with for is still O(1) memory. I would just make a mutate_line_if_line_matches_regex function and apply it inside the for loop (or with map, or a list comp, or a generator comp). then you can just apply more pure functions

if the dealio you're doing is not a pure function, that's when you pull out them generator function dealios and that yield poo poo. so you would pull that out for the "do I/O" bit

(seriously consider moving heaven and earth to make what you're doing a pure function, cuz almost every pure function is testable as poo poo)
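
To make that concrete, here's a minimal sketch of the split being described: one pure line-transforming function plus a thin generator that owns the I/O. The regex, replacement dict, and names are made up purely for illustration:

code:
import re

# hypothetical "key = value" pattern, for illustration only
SETTING_RE = re.compile(r"^(?P<key>\w+)\s*=")

def mutate_line_if_line_matches_regex(line, replacements):
    """Pure function: text in, text out, no I/O."""
    match = SETTING_RE.match(line)
    if match and match.group("key") in replacements:
        return "{} = {}\n".format(match.group("key"), replacements[match.group("key")])
    return line

def modified_lines(source_file, replacements):
    """Generator that owns the I/O bit and applies the pure function lazily."""
    with source_file.open() as f:
        for line in f:
            yield mutate_line_if_line_matches_regex(line, replacements)
The pure function can be unit tested without ever touching the filesystem, which is the point of the split.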

bob dobbs is dead fucked around with this message at 05:49 on Jan 21, 2019

unpacked robinhood
Feb 18, 2013

by Fluffdaddy

Boris Galerkin posted:

Make sure you read some stuff on securing your new server as well. Nobody is going to specifically target you, but they will throw out wide nets and see who they can get. Last thing you want is someone using your server in their bitcoin mining operation.

Yep, I have a dumb thing running on AWS and the logs are mostly people trying to reach default admin panels for a variety of frameworks, plus a few attempts at setting up Monero miners.

mr_package
Jun 13, 2000
Ok I think I get it. I rewrote my function ("parse_line") to accept text as input and return text as output, using a dictionary as source of values for string replacement when there's a regex match. Moved the for loop to the main body of the program where it can feed this function input and then write the output line by line:
code:
with source_file_path.open() as source, dest_file_path.open("w") as dest:
    for line in source:
        dest.write(parse_line(line, dict_of_values))
Is this more in line with what you are talking about?

bob dobbs is dead
Oct 8, 2017

I love peeps
Nap Ghost

mr_package posted:

Ok I think I get it. I rewrote my function ("parse_line") to accept text as input and return text as output, using a dictionary as source of values for string replacement when there's a regex match. Moved the for loop to the main body of the program where it can feed this function input and then write the output line by line:
code:
with source_file_path.open() as source, dest_file_path.open("w") as dest:
    for line in source:
        dest.write(parse_line(line, dict_of_values))
Is this more in line with what you are talking about?

if there's exactly one dest_file, sure

teen phone cutie
Jun 18, 2012

last year i rewrote something awful from scratch because i hate myself

cinci zoo sniper posted:

Assuming you configured restart behaviour and volumes correctly in compose.yml, the app will persist through Docker and server restarts (provided that you made Docker a service), or through crashes if those happen (obviously depends on how hard and how much crashed). Without restart configuration it will stay dead; without volumes you will lose all data accumulated by the app(s) - while that is far from always necessary, you mention a MySQL container, so you probably don't want that database to lose all its data if anything happens.

Had a quick question about data persistence in Docker.

Currently in my docker compose file I have this

code:
  db:
    image: mariadb
    ports:
      - "32000:3306"
    environment:
      MYSQL_DATABASE: drinks
      MYSQL_RANDOM_ROOT_PASSWORD: "yes"
      MYSQL_USER: username
      MYSQL_PASSWORD: password
    volumes:
      - ./data/mariadb:/var/lib/mysql/data
And on my host system, I see that a data/mariadb dir is being created but there's nothing in it. And running docker-compose down destroys everything.

Am I doing something wrong? My database is being instantiated in my Flask app like so:

code:
from sqlalchemy_utils import create_database, database_exists  # from the sqlalchemy-utils package

url = 'mysql+pymysql://user:password@db:3306/drinks'

if not database_exists(url):
    create_database(url)

cinci zoo sniper
Mar 15, 2013




Are you running docker-compose down or docker-compose down -v? What does “destroys everything” mean? Does Docker user have write access on the volume?

unpacked robinhood
Feb 18, 2013

by Fluffdaddy
I'd like to "live-update" a geographical map with rectangular overlays, as soon as background processes bring up fresh data.
So far I have an ugly blocking thing in matplotlib/cartopy that requires manually closing the window to resume processing, until updated content pops up in a new window.
Can I make a non-blocking display in matplotlib, or is there something more suitable out there? I've seen a few SO posts on the issue but the solutions looked super clunky and confusing.

accipter
Sep 12, 2003

unpacked robinhood posted:

I'd like to "live-update" a geographical map with rectangular overlays, as soon as background processes bring up fresh data.
So far I have an ugly blocking thing in matplotlib/cartopy that requires manually closing the window to resume processing, until updated content pops up in a new window.
Can I make a non-blocking display in matplotlib, or is there something more suitable out there? I've seen a few SO posts on the issue but the solutions looked super clunky and confusing.

You might want to consider Folium. I think it should work for you.
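
If it helps, a minimal Folium sketch with a rectangular overlay looks roughly like this (coordinates are placeholders). Folium just writes a Leaflet map to an HTML file, so "live" updating still means regenerating/refreshing that file, or going the Leaflet.js route suggested below:

code:
import folium

# placeholder coordinates
m = folium.Map(location=[40.0, -3.7], zoom_start=6)
folium.Rectangle(
    bounds=[(39.5, -4.5), (40.5, -3.0)],  # south-west and north-east corners
    color="red",
    fill=True,
).add_to(m)
m.save("map.html")  # regenerate this file whenever fresh data arrives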

teen phone cutie
Jun 18, 2012

last year i rewrote something awful from scratch because i hate myself

cinci zoo sniper posted:

Are you running docker-compose down or docker-compose down -v? What does “destroys everything” mean? Does Docker user have write access on the volume?

It looks like I was just mounting the volume from the wrong source

I needed

volumes:
- ./data/mariadb:/var/lib/mysql

instead of

volumes:
- ./data/mariadb:/var/lib/mysql/data

And with that I can call this pet project API complete (?) and can start work on the front-end. Thanks for the help folks I'm sure I'll be in here with more questions soon.

Also Python is v fun to write

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

mr_package posted:

I'm parsing a file line-by-line, modifying some of them based on regex pattern matching. Basic pattern is this; forgive the $ variables, trying to simplify so consider this pseudocode/Python hybrid:
code:
with source_file.open() as f:
    for line in f:
        if $line_matches_regex_pattern:
            line = $line_modified_with_value_from_dict
        yield line
Main question I have is whether other Python programmers agree using generator here is appropriate. I'm doing it so that someone can call this function and make more changes, if desired, or just write each line to a new file. Before I had a monolithic function that was doing all of this and it just made sense to break it down. I'm using yield here because the 'for line in f' pattern reads the file line by line (and not entirely into memory) which is a general pattern I'm trying to use where possible. Technically the files I'm working with today are small but they may not be in future, so trying to learn this the 'correct' way. Is it?

I might have fixed this with an SE Linux group / permissions change but so long ago I can't say 100%. Also I wasn't using Docker. But may be worth checking into; I definitely ran into SE Linux issues on CentOS other times as well.

edit: indent fix

Use sed

cinci zoo sniper
Mar 15, 2013




Grump posted:

It looks like I was just mounting the volume from the wrong source

I needed

volumes:
- ./data/mariadb:/var/lib/mysql

instead of

volumes:
- ./data/mariadb:/var/lib/mysql/data

And with that I can call this pet project API complete (?) and can start work on the front-end. Thanks for the help folks I'm sure I'll be in here with more questions soon.

Also Python is v fun to write

Godspeed!

Master_Odin
Apr 15, 2010

My spear never misses its mark...

ladies

unpacked robinhood posted:

I'd like to "live-update" a geographical map with rectangular overlays, as soon as background processes bring up fresh data.
So far I have an ugly blocking thing in matplotlib/cartopy that requires manually closing the window to resume processing, until updated content pops up in a new window.
Can I make a non-blocking display in matplotlib, or is there something more suitable out there? I've seen a few SO posts on the issue but the solutions looked super clunky and confusing.

Alternatively, your best bet is just to use Leaflet.js (or just D3) with some JS around it that makes ajax/websocket calls to your Python backend to get the data and then updates the Leaflet.js map, all on the frontend.

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!
Another attrs question:

Is there a way to set a method that needs to be called every time an attribute is set? Something like,

code:
@attr
class Foo:
    bar = attrib(default=None)
    baz = attrib(default=None)

    def _post_setter_method(self):
        value_i_just_set = ...
        self.value_name = 2 * value_i_just_set

a = Foo()
a.bar = 10
a.baz = 5
print(a)  # a.bar = 20, a.baz = 10

cinci zoo sniper
Mar 15, 2013




Boris Galerkin posted:

Another attrs question:

Is there a way to set a method that needs to be called every time an attribute is set? Something like,

code:
@attr
class Foo:
    bar = attrib(default=None)
    baz = attrib(default=None)

    def _post_setter_method(self):
        value_i_just_set = ...
        self.value_name = 2 * value_i_just_set

a = Foo()
a.bar = 10
a.baz = 5
print(a)  # a.bar = 20, a.baz = 10

Write a custom setter. https://docs.python.org/3/library/functions.html#property
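
For reference, this is the shape of the built-in property approach that link describes, shown on a plain class with the names from the example above; as the follow-up post notes, you would need one of these per attribute:

code:
class Foo:
    def __init__(self, bar=None):
        self._bar = bar

    @property
    def bar(self):
        return self._bar

    @bar.setter
    def bar(self, value):
        # runs on every assignment to .bar
        self._bar = 2 * value

a = Foo()
a.bar = 10
print(a.bar)  # 20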

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!

Wouldn’t I need to define a setter method for every attribute I declare then? I was hoping attrs provided a hook for a generic one that gets called for all attributes I define with attrib.

E: yeah, it looks like what I want isn’t possible so I’ll just rethink this entirely. Thanks.

Boris Galerkin fucked around with this message at 08:25 on Jan 25, 2019

cinci zoo sniper
Mar 15, 2013




Boris Galerkin posted:

Wouldn’t I need to define a setter method for every attribute I declare then? I was hoping attrs provided a hook for a generic one that gets called for all attributes I define with attrib.

E: yeah, it looks like what I want isn’t possible so I’ll just rethink this entirely. Thanks.

I mean, your concept is okay for code golf or an academic exercise, but if I had to solve this problem for the abstract case of N attributes then I would look at writing a custom property or something.

If this is a practical problem then yes, please do rethink it. :v:

unpacked robinhood
Feb 18, 2013

by Fluffdaddy

accipter posted:

You might want to consider Folium. I think it should work for you.

Master_Odin posted:

Alternatively, your best bet is just to use Leaflet.js (or just D3) with some JS around it that makes ajax calls/websocket to your python backend to get the data and then updates the Leaflet.js, all on the frontend.

Thanks, I played a bit with folium, which is super easy to use, but it seems like I can't avoid using Leaflet.js directly. There's a realtime module that explicitly does what I need.

The module needs to poll a URL; any pointers on how to make that as painless as possible?
Usually I'd end up having the URL point to a Flask method, but I'm not sure if there's a simpler way.
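
For what it's worth, the Flask route for a polled URL can stay very small. A minimal sketch that returns GeoJSON for the realtime layer to fetch; get_latest_rectangles() is a hypothetical stand-in for whatever the background processes produce:

code:
from flask import Flask, jsonify

app = Flask(__name__)

def get_latest_rectangles():
    # stand-in: return GeoJSON features describing the current rectangles
    return []

@app.route("/rectangles")
def rectangles():
    return jsonify({"type": "FeatureCollection", "features": get_latest_rectangles()})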

CarForumPoster
Jun 26, 2013

⚡POWER⚡
Question: What is the proper way to break out these email addresses into separate rows?

I get a response from an API when I lookup an email address for someone formatted like this:

code:
[Email(valid_since=datetime.datetime(2013, 5, 14, 0, 0), last_seen=datetime.datetime(2018, 8, 23, 0, 0), type_='personal', email_provider=True, address='greatemailname@gmail.com', address_md5='11137bc04acd3df7974979429e9ed15c'), Email(valid_since=datetime.datetime(2008, 4, 9, 0, 0), last_seen=datetime.datetime(2017, 12, 1, 0, 0), type_='personal', email_provider=True, address='greatemailname6@yahoo.com', address_md5='1114e2178a3ddaa46405811db44b58db')]
I stick this into its own cell in a dataframe which has the format: UserName | APIRespEmailString

I will eventually send this person an email at both addresses and will need to break this into one row per email, like:

UserName1 | greatemailname@gmail.com
UserName1 | greatemailname6@yahoo.com
UserName2 | Email1
UserName2 | Email2
UserName2 | Email3
Username3 | Email1

DarthRoblox
Nov 25, 2007
*rolls ankle* *gains 15lbs* *apologizes to TFLC* *rolls ankle*...
That's an odd API response - it looks like it's just returning a list of the raw Email objects that match your query, when it should really be serialized into JSON or XML or whatever.

Normally you'd just parse the response, but since it's not a standard format a workable option is probably just a regex match against the email field with something like this:
code:
address=(\S+)
and then returning a list of the matches, then iterating through those combined with the username to generate the list of emails to send.

Comedy option would be to define an Email class with the correct attributes and then eval() the string to generate a list of Email objects and work with them normally. (don't actually do this)
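
A minimal sketch of that regex route, assuming the response really does arrive as that repr-style string; the pattern is tightened a little from the one above so it captures only the address, and the column names come from the post being answered:

code:
import re
import pandas as pd

EMAIL_RE = re.compile(r"address='([^']+)'")

def expand_emails(df):
    """One output row per (UserName, email address) pair."""
    rows = [
        (row.UserName, address)
        for row in df.itertuples(index=False)
        # str() so this also works if the cell holds objects rather than text
        for address in EMAIL_RE.findall(str(row.APIRespEmailString))
    ]
    return pd.DataFrame(rows, columns=["UserName", "Email"])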

NinpoEspiritoSanto
Oct 22, 2013




Is that the raw response? The API returns a list with datetime objects in it among other things? Are you sure you're not looking at it after parsing?

Data Graham
Dec 28, 2009

📈📊🍪😋



Style question. Which is better:

code:
    def retrieve(self, request, *args, **kwargs):
        account = get_object_or_404(Account, pk=kwargs['account_id'])
        obj = get_object_or_404(self.queryset, pk=kwargs['pk'], account=account)
Or

code:
    def retrieve(self, request, account_id=None, pk=None):
        account = get_object_or_404(Account, pk=account_id)
        obj = get_object_or_404(self.queryset, pk=pk, account=account)
?

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!

Data Graham posted:

Style question. Which is better:

code:
    def retrieve(self, request, *args, **kwargs):
        account = get_object_or_404(Account, pk=kwargs['account_id'])
        obj = get_object_or_404(self.queryset, pk=kwargs['pk'], account=account)
Or

code:
    def retrieve(self, request, account_id=None, pk=None):
        account = get_object_or_404(Account, pk=account_id)
        obj = get_object_or_404(self.queryset, pk=pk, account=account)
?

I like the second because my autocomplete will tell me there’s an argument I can/need to put in.

cinci zoo sniper
Mar 15, 2013




Same, I pretty much always go for the second option (but then again I don’t do anything intensely complex from tech side).

bob dobbs is dead
Oct 8, 2017

I love peeps
Nap Ghost

Data Graham posted:

Style question. Which is better:

code:
    def retrieve(self, request, *args, **kwargs):
        account = get_object_or_404(Account, pk=kwargs['account_id'])
        obj = get_object_or_404(self.queryset, pk=kwargs['pk'], account=account)
Or

code:
    def retrieve(self, request, account_id=None, pk=None):
        account = get_object_or_404(Account, pk=account_id)
        obj = get_object_or_404(self.queryset, pk=pk, account=account)
?

think of em as different bets

the first one is a bet that you're gonna add like 23 parameters so you're gonna have to do variadic arg and keyword dict proper
the second one is a bet that you're gonna add fewer than 4 or 5 params and they're gonna be used seldom

but adding 23 params is no bueno in the first place

Nippashish
Nov 2, 2005

Let me see you dance!
IMO you should only use *args or **kwargs over explicitly named arguments if you're going to make use of the fact that they give you containers (i.e. iterating over them, forwarding them to other functions, etc). If all you're going to do with the kwargs dict is access elements by name then you might as well have the names in the function signature.
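
For contrast, a quick sketch of the forwarding case where *args/**kwargs genuinely earn their keep - a wrapper that cannot know the signature of the thing it wraps (names are arbitrary):

code:
import functools
import time

def timed(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # the wrapper has no idea what func's parameters are; it just forwards them
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.6f}s")
        return result
    return wrapper

@timed
def add(a, b):
    return a + b

add(2, 3)  # prints the timing line and returns 5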

NinpoEspiritoSanto
Oct 22, 2013




Agree with the other posts and besides, explicit is better than implicit.

Data Graham
Dec 28, 2009

📈📊🍪😋



Cool, thanks for the responses.

Bundy posted:

Agree with the other posts and besides, explicit is better than implicit.

Nice. I'll go this route then, since this gives me a good bite-sized rationale in case anyone wants me to defend it.

QuarkJets
Sep 8, 2008

I agree with all of the responses

I only use **kwargs and *args in cases where I'm passing those parameters through to something else

Dr Subterfuge
Aug 31, 2005

TIME TO ROC N' ROLL

Data Graham posted:

Nice. I'll go this route then, since this gives me a good bite-sized rationale in case anyone wants me to defend it.

That rationale comes packaged with some others, if you're curious.

SurgicalOntologist
Jun 17, 2004

Even beyond avoiding *args and **kwargs generally, the biggest issue with the first version is that you have keyword arguments that aren't required in the function signature, but are assumed to be there.

In this case, you know what *args will be (empty), and you know what all the keys of **kwargs will be, so you don't get any benefit from either.

Data Graham
Dec 28, 2009

📈📊🍪😋



Dr Subterfuge posted:

That rationale comes packaged with some others, if you're curious.

Oh sure, I've got most of them committed to memory, I just hadn't connected that one to the case in question.

For context though, this is me subclassing DRF ViewSets and their class methods, which in the base implementation use the *args/**kwargs notation, so I was sort of following that pattern just out of inertia. I wanted to use the named params, but I was hoping there wasn't maybe like some unspoken rule about keeping consistency with a library you're building on top of, or something like that.

Data Graham fucked around with this message at 04:05 on Jan 27, 2019

dougdrums
Feb 25, 2005
CLIENT REQUESTED ELECTRONIC FUNDING RECEIPT (FUNDS NOW)
Ya I've used *args for variadic functions. Now I'm kinda wondering if variadic function signatures are overkill for any language that has a dynamic list built-in. The extra brackets aren't going to hurt. It only makes sense to me in python for ensuring a variable is at least a list or dict without explicit typing. The typing module and annotations can do that now though. I feel like I'm missing some other magic here or something. It makes sense in C for functions that implement dynamic structures, because you need those functions to do the actual allocation anyways.

I regret my every use of **kwargs. And of course you should split things up once the number of arguments gets unruly.

E: I guess you can use it to compose arguments but fuuuck.

bob dobbs is dead
Oct 8, 2017

I love peeps
Nap Ghost

dougdrums posted:

Ya I've used *args for variadic functions. Now I'm kinda wondering if variadic function signatures are overkill for any language that has a dynamic list built-in. The extra brackets aren't going to hurt. It only makes sense to me in python for ensuring a variable is at least a list or dict without explicit typing. The typing module and annotations can do that now though. I feel like I'm missing some other magic here or something. It makes sense in C for functions that implement dynamic structures, because you need those functions to do the actual allocation anyways.

I regret my every use of **kwargs. And of course you should split things up once the number of arguments gets unruly.

E: I guess you can use it to compose arguments but fuuuck.

You gotta then also avoid the parameter default list fuckup

dougdrums
Feb 25, 2005
CLIENT REQUESTED ELECTRONIC FUNDING RECEIPT (FUNDS NOW)
Ugh yeah, but I just always assume None is [] since they're both falsey. If None were iterable it'd be ok. Then you might have an optional list args situation, but those should be split into separate function definitions anyways. Of course type annotations would make it explicit.
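
For anyone following along, the "parameter default list fuckup" is the classic mutable-default-argument pitfall, and the None sentinel under discussion looks like this (names are arbitrary):

code:
def broken(item, bucket=[]):
    # the [] is evaluated once, at definition time, so every call that
    # omits bucket appends to the same shared list
    bucket.append(item)
    return bucket

def fixed(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(broken(1), broken(2))  # [1, 2] [1, 2] -- shared state
print(fixed(1), fixed(2))    # [1] [2]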

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!
What is the easiest/proper way of securing an API endpoint with a token that only one person (me) will ever use?

My (naive) idea is to generate a random token and put it in a read-only text/config file that my script can read; then on the user end I just set an Authorization header with that token, and in my code I compare the values and bail if they don't match.

I have no idea what I'm doing though, so I feel like someone is going to tell me that this is a bad idea.
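
That general shape works fine for a single trusted client. A minimal sketch, assuming a Flask endpoint and a token file path that are purely illustrative; the one detail worth keeping is the constant-time comparison:

code:
import hmac
from pathlib import Path

from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# e.g. generated once with: python -c "import secrets; print(secrets.token_hex(32))"
API_TOKEN = Path("/etc/myapp/api_token").read_text().strip()

@app.route("/endpoint")
def endpoint():
    supplied = request.headers.get("Authorization", "")
    # hmac.compare_digest avoids leaking information via comparison timing
    if not hmac.compare_digest(supplied.encode(), API_TOKEN.encode()):
        abort(401)
    return jsonify(ok=True)
And serve it over HTTPS, otherwise the token goes over the wire in the clear.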
