susan b buffering
Nov 14, 2016

a few DRUNK BONERS posted:

Does anyone have any ideas / examples of how to design a project with different databases for different users? The idea is you would log in with username / password / group code and then be in a different database based on the group code (with no shared auth, each database has its own auth table). Database routers have no access to requests. There's some bullshit online about using thread local variables which seems dumb and fragile. Possibly this isn't something that Django can handle with a basic single server setup but maybe there's something I'm missing.

Is this the thread local variable solution you’re referring to? Because it doesn’t look terribly unreasonable for the constraints you laid out.

I don’t think there’s really gonna be a “clean” solution to this problem because the level of segregation you’re talking about is usually achieved by having a separate instance of the application for each database, which might be worth exploring even if you are constrained to a single server.
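
For reference, the thread-local approach those write-ups describe generally boils down to something like this (a rough sketch only; the "group_db" session key, the middleware, and the alias names are all made up, and it carries exactly the fragility you're worried about):

Python code:
import threading

_local = threading.local()


class GroupCodeMiddleware:
    """Stash the database alias for the current request in a thread-local."""

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # e.g. stored in the session at login time from the group code
        _local.db_alias = request.session.get("group_db", "default")
        try:
            return self.get_response(request)
        finally:
            _local.db_alias = None


class GroupCodeRouter:
    """Route ORM reads/writes to whatever alias the middleware stashed."""

    def db_for_read(self, model, **hints):
        return getattr(_local, "db_alias", None)

    db_for_write = db_for_read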


a few DRUNK BONERS
Mar 25, 2016

skull mask mcgee posted:

Is this the thread local variable solution you’re referring to? Because it doesn’t look terribly unreasonable for the constraints you laid out.

I don’t think there’s really gonna be a “clean” solution to this problem because the level of segregation you’re talking about is usually achieved by having a separate instance of the application for each database, which might be worth exploring even if you are constrained to a single server.

Basically I don't trust doing that kind of thing with threads when you're deployed on some cloud VM. I think you're right that there should be separate instances.

Hadlock
Nov 9, 2004

If I have base.py and it says allowed_hosts=foo.com and I run it on a server/container with env allowed_hosts=bar.net, what is the expected hierarchy?

lazerwolf
Dec 22, 2009

Orange and Black

Hadlock posted:

If I have base.py and it says allowed_hosts=foo.com and I run it on a server/container with env allowed_hosts=bar.net, what is the expected hierarchy?

Unless you explicitly pull the env variable into base.py, you get what you typed in the file; Django doesn't read the environment on its own.

code:
import os

ALLOWED_HOSTS = os.environ.get("ALLOWED_HOSTS_FROM_ENV", "foo.com").split(",")

Dominoes
Sep 20, 2007

Hey - I'm having a struggle with preventing people from making superusers from the default Admin User page. About 80% of the time my custom form without the superuser tick works, but sometimes the original form works and lets staff members make someone a superuser (e.g. from mashing F5, or getting lucky). Relevant code:

Python code:
admin.site.unregister(User)

@register(User)
class CustomUserAdmin(UserAdmin):
    def get_form(self, request, *args, **kwargs):
        form = super(CustomUserAdmin, self).get_form(request, *args, **kwargs)

        self.fieldsets = (
            (None, {"fields": ("username", "password", "is_active")}),
            (_("Personal info"), {"fields": ("email",)}),
            (_("Permissions"), {"fields": ("is_staff", "groups")}),
            (_("Important dates"), {"fields": ("last_login", "date_joined")}),
        )

        return form
Any idea what's going on, i.e. why it sometimes works and sometimes doesn't?

Is there a more "hard" way to prevent people from creating superusers at a deeper level than hiding fields? Using the latest Django. Thank you.

a few DRUNK BONERS
Mar 25, 2016

`self.fieldsets` is a class attribute (not an instance attribute) and modifying it is not thread safe. I think what you want is to override get_fieldsets instead.
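
A sketch of what that looks like (imports spelled out for clarity; the field layout is the one from your post):

Python code:
from django.contrib import admin
from django.contrib.auth.admin import UserAdmin
from django.contrib.auth.models import User
from django.utils.translation import gettext_lazy as _

admin.site.unregister(User)


@admin.register(User)
class CustomUserAdmin(UserAdmin):
    def get_fieldsets(self, request, obj=None):
        if obj is None:
            # keep the stock "add user" form (username + password1/password2)
            return super().get_fieldsets(request, obj)
        # computed per request, so there's no shared class state to race on
        return (
            (None, {"fields": ("username", "password", "is_active")}),
            (_("Personal info"), {"fields": ("email",)}),
            (_("Permissions"), {"fields": ("is_staff", "groups")}),
            (_("Important dates"), {"fields": ("last_login", "date_joined")}),
        )
For a "harder" stop than hiding the field, you could also force obj.is_superuser = False in save_model for non-superuser requests, but the get_fieldsets change should close the form-level hole.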

Dominoes
Sep 20, 2007

Thank you. That worked!

Hed
Mar 31, 2004

Fun Shoe
Does anyone here run Django serverless (Lambda, Azure Functions) in production? I have a couple of Zappa sites that are basically landing pages but for anything more involved I still host Django on EC2, with an RDS database if the site is sufficiently complex.

With Zappa looking unhealthy I was curious if anyone is looking at anything else, like the Serverless Framework, or if it's just not a great fit for Django.

For a more involved site with lots of async task queues spinning off work I'm looking for recommendations of what people like between moving to something like ECS, or going full bore with lambdas calling lambdas, etc.

CarForumPoster
Jun 26, 2013

⚡POWER⚡

Hed posted:

Does anyone here run Django serverless (Lambda, Azure Functions) in production? I have a couple of Zappa sites that are basically landing pages but for anything more involved I still host Django on EC2, with an RDS database if the site is sufficiently complex.

With Zappa looking unhealthy I was curious if anyone is looking at anything else, like the Serverless Framework, or if it's just not a great fit for Django.

For a more involved site with lots of async task queues spinning off work I'm looking for recommendations of what people like between moving to something like ECS, or going full bore with lambdas calling lambdas, etc.

I don’t have any complicated ones, but yeah, deploying with Zappa is the easiest route. AWS SAM is good too.

For anything with a series of Lambdas calling Lambdas, I use Lambda + Step Functions deployed with SAM.

Data Graham
Dec 28, 2009

📈📊🍪😋



Someone please tell me if there's a better way to do this.

I have a non-abstract base model like this:

Python code:
class BaseWidget(models.Model):
    widget_type = models.CharField(max_length=20)
    thing = models.ForeignKey('things.Thing', null=True, blank=True, on_delete=models.SET_NULL)
    customer = models.ForeignKey('users.Customer', null=True, blank=True, on_delete=models.SET_NULL)
    timestamp = models.DateTimeField(auto_now_add=True)
    ...
And then two subclasses like:

Python code:
class WidgetA(BaseWidget):
    field_a = models.IntegerField()

class WidgetB(BaseWidget):
    field_b = models.IntegerField()
In other words, WidgetA and WidgetB are identical (and have a ton of fields in common) except for a couple of extra fields on each one.

So, suppose I have a WidgetA and I want to convert it to a WidgetB.

What's the best way of doing this? I basically want something like

Python code:
widget_a_values = some_dict_of_BaseWidget_field_values_from_widget_a
widget_b = WidgetB(**widget_a_values)
But that dict of widget_a values has to come from some method or serializer class, which I would have to write custom anyway. The upshot being that it doesn't save me any more code or field-by-field explicitness than just doing:

Python code:
widget_b = WidgetB(
    widget_type=widget_a.widget_type,
    thing=widget_a.thing,
    customer=widget_a.customer,
    timestamp=widget_a.timestamp,
    ....
)
And so on and on and on, which is how I've got it currently.

Is there some shortcut for "pass all the common fields from widget_a to a new WidgetB object"? Maybe something to do with the model _meta API?

D34THROW
Jan 29, 2012

RETAIL RETAIL LISTEN TO ME BITCH ABOUT RETAIL
:rant:
What about vars(widget_a)? That returns a dict of the instance attributes.

Python code:
>>> class Foo:
...     def __init__(self):
...         self.bar = 1
...         self.baz = 'baz'
...         self.qux = 1.5
...
>>> foo = Foo()
>>> vars(foo)
{'bar': 1, 'baz': 'baz', 'qux': 1.5}
EDIT: There's also widget_a.__dict__, which is actually the same dict that vars() returns.


EDIT 2: Rereading, you're trying to avoid having to do that. :doh:

D34THROW fucked around with this message at 19:26 on Mar 18, 2022

Data Graham
Dec 28, 2009

📈📊🍪😋



That's actually very close to what I want, thanks! The only issue is that there are a few meta-fields that I have to pop out or WidgetB will barf.

This is what I did right after posting:

Python code:
base_widget_fields = {}
for field in widget_a.basewidget._meta.get_fields():
    if field.name not in ['id', 'widget_a', 'widget_b']:
        base_widget_fields[field.name] = getattr(widget_a, field.name)
So using _meta I have to strip out a few of the fields I don't want anyway, but that just means it's about the same amount of effort as vars. I can handle an exclusion list as opposed to an inclusion list.

Now I just wonder which is more pythonic and less brittle, _meta or vars ...

Data Graham
Dec 28, 2009

📈📊🍪😋



Oh, interesting. Turns out that if you make a dict from vars(widget_a) and then use pop('_state'), suddenly your widget_a object has no _state and then it barfs if you try to delete() it.

I could do a deepcopy of the dict and then pop it out of there, but that's more weird opaque boilerplate and still requires the explicit popping, so it doesn't blow my skirt up from a readability standpoint.

So I think the moral of the story here is that there is evil magic in the model classes and if they're going to give me _meta, I might as well just use that; turns out it may actually be the most straightforward/single-step way. But thanks again for reminding me about vars, I'd forgotten entirely that it existed...
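
For posterity, the _meta version can be squashed into a little helper so the exclusion logic lives in one place (helper name is mine; concrete_fields already skips reverse relations, so only the pk needs excluding):

Python code:
def copy_base_fields(src, base_model):
    """Collect base_model's concrete, non-pk field values off src."""
    return {
        f.name: getattr(src, f.name)
        for f in base_model._meta.concrete_fields
        if not f.primary_key
    }


# usage sketch; note that auto_now_add fields like timestamp get reset on save anyway
# widget_b = WidgetB(**copy_base_fields(widget_a, BaseWidget))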

Hed
Mar 31, 2004

Fun Shoe
Sounds like you’ve got it sorted but yeah I was going to reply that _meta is at least the blessed way since they’ve declared it stable.

Dominoes
Sep 20, 2007

Data Graham posted:

Someone please tell me if there's a better way to do this.

Python code:
class Category(models.TextChoices):
    A = "A"
    B = "B"


class Widget(models.Model):
    widget_type = models.CharField(max_length=20)
    category = models.CharField(max_length=1, choices=Category.choices)
    thing = models.ForeignKey('things.Thing', null=True, blank=True, on_delete=models.SET_NULL)
    customer = models.ForeignKey('users.Customer', null=True, blank=True, on_delete=models.SET_NULL)
    timestamp = models.DateTimeField(auto_now_add=True)
    field = models.IntegerField()

Hed
Mar 31, 2004

Fun Shoe
Just doing my check-in to see if anyone's got a good guide to dockerizing Django to run on Fargate or k8s ... I've used cookiecutter-django or some random script I found but they are really opinionated with packages and how to run them.

Hadlock
Nov 9, 2004

Cross-posting from the devops thread, as I got no response in over a week

Hadlock posted:

Is there an established plan of action for running Django migrations on logical postgres replicas

Migrations on physical/classic replication is a snap but there's a ton of moving parts to account for with logical

I think the answer is "no, don't use logical replication that way" but curious if anything has changed recently

It's worth noting that logical replication is a newer type of replication that lets you replicate just a single table, or even specific columns of a table, as opposed to normal (physical) replication.

I'm sure the Venn diagram of Django developers that give two fucks about logical replication on postgres is vanishingly small but gonna give it another try anyways

worms butthole guy
Jan 29, 2021

by Fluffdaddy
I've been using Django more and more, but I haven't deployed anything yet. Something that has me confused is that I created an ImageField in a model, but upon reading the documentation I saw something that said Django can't actually serve media? So does that mean any image I add to the SQLite db won't actually be grabbable by an API?

Thanks and sorry for the dumb question.

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
Django is designed so that static assets (JavaScript, CSS, images) can be served by a CDN, i.e. a separate host from your webserver, so that it's not bogged down with small, easily-cacheable requests. However, not everyone has a CDN, so a lot of people use a plugin like whitenoise to serve these assets from the same web server as the Django app itself.

For media uploaded by users, you need to build a view to actually serve those, i.e. something that will parse a URL to determine which file to serve, pull the data from wherever it's stored, and send it to the user. Django has a built-in view for this that's convenient for development but isn't suitable for production for various reasons.

But you could write your own. E.g. if you have a model "MyImage", you might write a view with URL pattern "/view_image/<int:model_id>/", and the view might look something like:

code:
import mimetypes
from pathlib import Path

from django.conf import settings
from django.http import FileResponse, HttpRequest
from django.shortcuts import get_object_or_404


def view_image(request: HttpRequest, model_id: int) -> FileResponse:
    model = get_object_or_404(MyImage, pk=model_id)
    fullpath = Path(settings.MEDIA_ROOT) / model.file_name   # still more pseudo-code than actual code
    content_type, encoding = mimetypes.guess_type(str(fullpath))
    content_type = content_type or 'application/octet-stream'
    response = FileResponse(fullpath.open('rb'), content_type=content_type)
    if encoding:
        response.headers["Content-Encoding"] = encoding
    return response
That's really basic, there's probably a better way to do it; almost certainly there's some Django plugin or utility that can do a better job, handling cache headers, Last-Modified headers, etc.
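
And for completeness, hooking that view up is just the usual URL pattern (module paths and names here are placeholders):

Python code:
# urls.py
from django.urls import path

from .views import view_image

urlpatterns = [
    path("view_image/<int:model_id>/", view_image, name="view_image"),
]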

worms butthole guy
Jan 29, 2021

by Fluffdaddy
Perfect, thank you for the help. I'm not going to push this little app live and it'll live on my home network, so a CDN might be a bit excessive, but I will definitely use that built-in view.

Thank you!

Data Graham
Dec 28, 2009

📈📊🍪😋



Also the most common (imo) use case is that you would set up Apache or Nginx to serve static files as well as media files directly via a traditional old-school directory alias, because you don't want to be passing requests for statics into your app server and making Django spin up a whole process for every single one. Nginx is built to serve up statics super fast and low-overhead, and you only want it to reverse-proxy requests to routes that actually need processing (i.e. Django routes).

Of course that also means you can't do any meaningful access protection or authorization on those statics, so if you care about who can read those files, either you have to make obfuscated paths for any sensitive media to prevent the URLs from being guessable, or (if security by obscurity isn't your bag) bite the bullet and route those requests through an actual Django view like in the above post, so you can do an ownership check.

worms butthole guy
Jan 29, 2021

by Fluffdaddy
So what I'm doing is just using Django as an API to get one piece of text and an image to use for a background, and all of this information (including the images) is added only by the admin using Django Admin; so I don't think I need to worry about that stuff, right? I'm not worried if someone gets in, and there's nothing private, although I guess if I ever want to publicly expose the API then I would need to worry. I guess I could offload the images to a CDN and then just hold onto the link as a URL in the API, which would probably make stuff easier. I'm guessing there's no free CDNs for like ~200 images tho lol.

I guess my sub-question is: where can I host this API on the web if I wanted to, for "free" or cheap? It has maybe 200 objects it'll expose on the API, if that makes sense.

Thanks for all the help.

Data Graham
Dec 28, 2009

📈📊🍪😋



Yeah if I were you I wouldn't be thinking about CDNs; just store the images on your server and set up Apache/Nginx to say "/media is an alias for /path/to/my/media_root/" (where /media is what you have MEDIA_URL set to and /path/to/my/media_root/ is MEDIA_ROOT). That way your Django admin will put the files in there and users will be able to access them directly by their URL as provided by the FileField.url value.
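
In settings terms that pairing is just this (values are placeholders; they have to match whatever alias you configure in Apache/Nginx):

Python code:
MEDIA_URL = "/media/"
MEDIA_ROOT = "/path/to/my/media_root/"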

sb hermit
Dec 13, 2016





worms butthole guy posted:

So what I'm doing is just using Django as an API to get one piece of text and an image to use for a background, and all of this information (including the images) is added only by the admin using Django Admin; so I don't think I need to worry about that stuff, right? I'm not worried if someone gets in, and there's nothing private, although I guess if I ever want to publicly expose the API then I would need to worry. I guess I could offload the images to a CDN and then just hold onto the link as a URL in the API, which would probably make stuff easier. I'm guessing there's no free CDNs for like ~200 images tho lol.

I guess my sub-question is: where can I host this API on the web if I wanted to, for "free" or cheap? It has maybe 200 objects it'll expose on the API, if that makes sense.

Thanks for all the help.

For hosting your api, try the web hosting thread:

https://forums.somethingawful.com/showthread.php?threadid=3289126

sb hermit
Dec 13, 2016






To be specific, you'll probably want virtual private hosting, unless you can find a cheap or free provider that just does Django or something.

You can also try hosting it at home and using something like noip.com but that's not very secure.

worms butthole guy
Jan 29, 2021

by Fluffdaddy
Sweet thank you both!

Hadlock
Nov 9, 2004

For something that simple, you could host your Django API endpoint on a lambda and just pay $0.0005/request or whatever, have it serve up an S3 link to the image with a one time access token in the response

If you want to code golf this you could probably drop the database requirement

https://dev.to/vaddimart/deploy-django-app-on-aws-lambda-using-serverless-part-1-1i90
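
The "S3 link with an access token" part is just a presigned URL, which is a few lines with boto3 (bucket and key are made up, and the link is time-limited rather than strictly one-time):

Python code:
import boto3

s3 = boto3.client("s3")
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-image-bucket", "Key": "backgrounds/current.png"},
    ExpiresIn=3600,  # seconds the link stays valid
)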

Hadlock fucked around with this message at 02:26 on May 5, 2022

worms butthole guy
Jan 29, 2021

by Fluffdaddy
I ended up using PythonAnywhere; didn't know you could Lambda it like that though. Maybe I'll move it to that.

Data Graham
Dec 28, 2009

📈📊🍪😋



Is there any better way to do this?

Python code:
class User(models.Model):

    location = PointField(srid=4326, geography=True, blank=True, null=True)

    def distance_from(self, from_user):
        if not from_user:
            raise Exception('User must be provided.')

        annotated_users = User.objects.annotate(distance=Distance('location', self.location))
        annotated_from_user = annotated_users.filter(username=from_user.username).values('distance').first()
        if annotated_from_user:
            return annotated_from_user['distance']
        return None
i.e. what I'm trying to do is "I already have a User (and a self which is also a User), I want to annotate this one object with a Distance function so I can calculate the distance between them"

Is the only way really to do a whole queryset/annotate/filter on the id of the object I already have? Feel like this should be more of a one-step kind of a thing but I can't brain my way into it.



e: Here's how I'm a do it

Python code:
class UserManager(BaseUserManager):

    def with_distance_from(self, from_user):
        return self.get_queryset().annotate(distance=Distance('location', from_user.location))


class User(models.Model):

    location = PointField(srid=4326, geography=True, blank=True, null=True)

    def distance_from(self, from_user):
        if not from_user:
            raise Exception('User must be provided.')

        annotated_users = User.objects.with_distance_from(self)
        annotated_from_user = annotated_users.filter(username=from_user.username).values('distance').first()
        if annotated_from_user:
            return annotated_from_user['distance']
        return None


In [6]: User.objects.with_distance_from(cool_shirtman)[2].distance
Out[6]: Distance(m=672842.6144303734)
Still annoying that I have to re-filter based on an attribute of the object I already have, but at least the annotation stuff is centralized and reusable now.


e2: better yet

Python code:
    def distance_from(self, from_user):
        if not from_user:
            raise Exception('User must be provided.')

        annotated_users = User.objects.with_distance_from(self)
        return annotated_users.get(id=from_user.id).distance
fu unnecessarily paranoid error handling

Data Graham fucked around with this message at 16:04 on Jun 25, 2022

Sleepy Robot
Mar 24, 2006
instant constitutional scholar, just add astonomist
I'm creating an app that involves the User saving "favorite places" pins from Google Maps to the db.

So when creating the User model, it seems it's technically possible to extend AbstractUser (which already provides the auth fields) and add favorite_places = models.ManyToManyField() to it.

Experienced developers seem to recommend creating a separate model, something like Profile, and adding the additional field(s) there, and then creating a OneToOne relationship between Profile and User.

Any thoughts on this? To me as a novice, having 1 User model that can take care of everything seems more attractive than having a User model for auth + a Profile model for additional info and creating a 1-to-1 relationship between them. Why is it advised to do this?

Data Graham
Dec 28, 2009

📈📊🍪😋



It's just easier to tack on a Profile model after the fact, if you never built your app with a custom User model to begin with. If you just used the built-in one, and your app is now in production, you get yourself into a situation where migrating to a brand-new custom User model via migrations is a huge pain.

If I'm starting out from a blank sheet, I always like to build my own "User(AbstractUser): pass" as step 1, before even "initial commit".
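
i.e. something roughly like this before the first migration ever runs (the "accounts" app label is a placeholder):

Python code:
# accounts/models.py
from django.contrib.auth.models import AbstractUser


class User(AbstractUser):
    # start empty; things like favorite_places can be added here later
    # without the painful mid-project User swap
    pass


# settings.py
AUTH_USER_MODEL = "accounts.User"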

Hed
Mar 31, 2004

Fun Shoe
I have a Django site behind an AWS ALB, and the auth and redirection it provides is great.

However, my old nemesis the "Invalid HTTP_HOST header" error comes up when the ALB does its health checks. For regular requests the Host header is of course set, but the health checks just go for it, and I end up with "Invalid HTTP_HOST header: '172.31.29.18:5000'. You may need to add '172.31.29.18' to ALLOWED_HOSTS."

I don't see a way to customize the behavior of the ALB health check, is the only way around this to change my Django config for prod to scrape out the internal IP from the AWS HOSTNAME variable (currently HOSTNAME=ip-172-31-29-18.ec2.internal) and stuff it into the ALLOWED_HOSTS ?

I feel like there should be a lot of people with this problem but I'm clearly not encountering it in my searches.

Hed
Mar 31, 2004

Fun Shoe

Hed posted:

I have a Django site behind an AWS ALB, and the auth and redirection it provides is great.

However, my old nemesis the "Invalid HTTP_HOST header" error comes up when the ALB does its health checks. For regular requests the Host header is of course set, but the health checks just go for it, and I end up with "Invalid HTTP_HOST header: '172.31.29.18:5000'. You may need to add '172.31.29.18' to ALLOWED_HOSTS."

I don't see a way to customize the behavior of the ALB health check, is the only way around this to change my Django config for prod to scrape out the internal IP from the AWS HOSTNAME variable (currently HOSTNAME=ip-172-31-29-18.ec2.internal) and stuff it into the ALLOWED_HOSTS ?

I feel like there should be a lot of people with this problem but I'm clearly not encountering it in my searches.

If anyone cares, I solved this by writing some Django middleware and putting it higher in the settings.MIDDLEWARE stack than the built-in SecurityMiddleware, so that it short-circuits the response before the "Host:" header gets checked:

Python code:
from django.http import HttpRequest, HttpResponse


class HealthCheckMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        # One-time configuration and initialization.
        self.healthy_response = HttpResponse("OK")

    def __call__(self, request: HttpRequest):
        # Code to be executed for each request before
        # the view (and later middleware) are called.
        if request.META["PATH_INFO"] == "/healthcheck/":
            return self.healthy_response
        else:
            response = self.get_response(request)
            return response
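
In settings it just has to sit above SecurityMiddleware (the dotted path to the middleware module is whatever matches your project layout):

Python code:
MIDDLEWARE = [
    "myproject.middleware.HealthCheckMiddleware",  # placeholder dotted path
    "django.middleware.security.SecurityMiddleware",
    "django.contrib.sessions.middleware.SessionMiddleware",
    "django.middleware.common.CommonMiddleware",
    "django.middleware.csrf.CsrfViewMiddleware",
    "django.contrib.auth.middleware.AuthenticationMiddleware",
    "django.contrib.messages.middleware.MessageMiddleware",
    "django.middleware.clickjacking.XFrameOptionsMiddleware",
]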

CarForumPoster
Jun 26, 2013

⚡POWER⚡

Hed posted:

If anyone cares, I solved this by writing some Django middleware and putting it higher in the settings.MIDDLEWARE stack than the built-in SecurityMiddleware, so that it short-circuits the response before the "Host:" header gets checked:

Python code:
class HealthCheckMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        # One-time configuration and initialization.
        self.healthy_response = HttpResponse("OK")

    def __call__(self, request: HttpRequest):
        # Code to be executed for each request before
        # the view (and later middleware) are called.
        if request.META["PATH_INFO"] == "/healthcheck/":
            return self.healthy_response
        else:
            response = self.get_response(request)
            return response


This actually is helpful. I had forgotten that we hit this same problem two years ago and had to fix it, until you posted your solution. Unfortunately, I still don't recall how we fixed it.

galenanorth
May 19, 2016

I'll be using Django for web scraping to collect data for a business-to-business data service, mainly for business locations. Someone might be using or viewing parts of the PostgreSQL database directly. I want to make it easier to see which columns go with which table during joins or complex queries by using a custom naming convention.

The default table name is appname_modelname, and Django doesn't convert the CamelCase model name convention into snake_case. Using the db_table option, the default appname_modelname for the table becomes model_name.

Using the db_column option, the default field_name for the column becomes table_name_column_name. The id column becomes table_name_id. The purpose behind this convention is that during a join or complex query, or when different models have the same field names (e.g. first_name), it will make it easier to see which column goes with which table. Particularly, the foreign key will stand out. If there were a product table (using a hypothetical unrelated to the project), its column headers would be product_id, product_description, vendor_id, product_purchase_date, product_price.

Is this going to cause difficulty down the line? Removing the appname_ prefix might lead to a higher chance of naming conflicts, but leaving it would make writing SQL queries take slightly longer. Looking at my PostgreSQL textbook, which uses app_name.table_name, it doesn't look like they'll be that much longer, and there are always aliases.

Edit: Also, for naming indexes, I'll use {tablename}_{columnname(s)}_idx. This is PostgreSQL's system for default index naming, and it's what PostgreSQL uses automatically behind the scenes when it creates constraints. Django shortens index names to 30 characters for compatibility with Oracle, but I'd like the names to be easier to read and more predictable, so that if I accidentally break something and have to go into the DBMS, the SQL environment will feel familiar.

galenanorth fucked around with this message at 21:45 on Mar 12, 2023

duck monster
Dec 15, 2004

galenanorth posted:

I'll be using Django for web scraping to collect data for a business-to-business data service, mainly for business locations. Someone might be using or viewing parts of the PostgreSQL database directly. I want to make it easier to see which columns go with which table during joins or complex queries by using a custom naming convention.

The default table name is appname_modelname, and Django doesn't convert the CamelCase model name convention into snake_case. Using the db_table option, the default appname_modelname for the table becomes model_name.

Using the db_column option, the default field_name for the column becomes table_name_column_name. The id column becomes table_name_id. The purpose behind this convention is that during a join or complex query, or when different models have the same field names (e.g. first_name), it will make it easier to see which column goes with which table. Particularly, the foreign key will stand out. If there were a product table (using a hypothetical unrelated to the project), its column headers would be product_id, product_description, vendor_id, product_purchase_date, product_price.

Is this going to cause difficulty down the line? Removing the appname_ prefix might lead to a higher chance of naming conflicts, but leaving it would make writing SQL queries take slightly longer. Looking at my PostgreSQL textbook, which uses app_name.table_name, it doesn't look like they'll be that much longer, and there are always aliases.

You can override these conventions pretty straightforwardly. Still, I *highly* advise against letting clients have direct access to Postgres. (But I get that it happens; we have to do this at work for a couple of clients, and let's say I'm not a fan, but it's not up to me, and these clients are multi-billion $$$$ multinationals.)
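
For reference, the overrides galenanorth is describing look like this on a model (using the hypothetical product table from the post above; the vendors app is equally made up):

Python code:
from django.db import models


class Product(models.Model):
    id = models.BigAutoField(primary_key=True, db_column="product_id")
    description = models.TextField(db_column="product_description")
    purchase_date = models.DateField(db_column="product_purchase_date")
    price = models.DecimalField(max_digits=10, decimal_places=2, db_column="product_price")
    vendor = models.ForeignKey("vendors.Vendor", on_delete=models.PROTECT, db_column="vendor_id")

    class Meta:
        db_table = "product"
        indexes = [
            models.Index(fields=["purchase_date"], name="product_purchase_date_idx"),
        ]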

death cob for cutie
Dec 30, 2006

dwarves won't delve no more
too much splatting down on Zot:4
I'm helping to take over Votefinder, a Django project for keeping track of voting in games of Mafia on the forums. The previous maintainers of the project were not always Django/Python folks, and in doing an emergency scramble to re-establish the server while it was down I hit a big stumbling block because I didn't realize a previous maintainer of the project changed some assumptions about how Django projects are structured. I'm a bit leery of changing how things are organized in something using a big framework like Django, not necessarily for fear of weird behavior (you'd probably notice quickly enough) but for fear that it'll make the project harder to get into if things don't look and feel like a "normal" Django project does.

Another goon just put in a PR to get some better environment variable support via dotenv, and one of the things they did is move everything that was in settings.py to an __init__.py file in a folder called settings. I know that the way Python works this is more or less functionally equivalent and is not a huge change, but like I said before, I'm leery of anything that causes serious deviation from "normal" Django structure. Am I right to be a little gun-shy about changes like this, or is this common enough in Django that it shouldn't be too alarming, even if it's not how django-admin startproject wants to arrange things?

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
I'd be a little nervous about that, because Django implements some funky magic with its app-discovery / import system, and lots of places import the settings.py file so anything non-standard there might be tricky to debug.

It also smells bad to me that __init__.py is being used for anything except making it easier for other modules to import whatever "public functions/classes/types" exist in the __init__.py's directory. __init__.py is the last place I expect to see code, and it always irks me whenever I find significant code and constants defined there.

I might be wrong on this, maybe there's some popular alternative way of managing Django settings and this PR is just following best practices. But I'd feel more comfortable if they could justify that decision by pointing to some docs that explain the alternative and why it's better.

death cob for cutie
Dec 30, 2006

dwarves won't delve no more
too much splatting down on Zot:4

minato posted:

It also smells bad to me that __init__.py is being used for anything except making it easier for other modules to import whatever "public functions/classes/types" exist in the __init__.py's directory. __init__.py is the last place I expect to see code, and it always irks me whenever I find significant code and constants defined there.

I think this is a better way of expressing my feelings - all the settings for the DB type, logging, debug being on and off, etc. are in settings.py. Technically the contents of __init__.py are just declaring these based on reading them out from environment variables, so there's no logic per se - but then it feels like a change that's just made to make a change.


lazerwolf
Dec 22, 2009

Orange and Black
I will organize my settings files in a folder called settings with a pattern like:

code:
settings
- __init__.py  # this is empty
- base.py
- local.py
- staging.py
- production.py
base.py contains general settings shared across all environments; then I separate out environment-specific settings into each of the other files.
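
Each environment file then pulls in base and overrides what it needs, and you point DJANGO_SETTINGS_MODULE at the right one (project name is a placeholder):

Python code:
# settings/production.py
from .base import *  # noqa: F401,F403

DEBUG = False
ALLOWED_HOSTS = ["example.com"]

# run with e.g. DJANGO_SETTINGS_MODULE=myproject.settings.production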
