Hi Vinay,
On 02/02/2016 10:56 AM, 'Vinay Sajip' via Django users wrote:
> On Tuesday, February 2, 2016 at 5:08:40 PM UTC, Carl Meyer wrote:
>
> You can't (well, you might be able to by poking around in the internals
> of the django.db.connections object, but I'd strongly advise against
> that). The proper (and thread-safe) way to achieve the equivalent is to
>
> Well, for any given run of the application (on which the example I
> posted is based), only one database is ever used. The idea of multiple
> databases was to allow us to select which one is used for any particular
> Python process run, and we expected to be able to run management
> commands which would (based on a settings configuration) determine both
> which database to use and e.g. how to populate parts of it ready for
> deployment to particular customers.
For this scenario, I would use a different settings file for each
"customer" (each of which can "inherit" the same common settings via the
`from common_settings import *` trick), with a different default
database defined in each. In fact, is there any reason for one process
to have access to another process's database at all, in your scenario? I
don't think you even have a use case for Django's multi-db support here
at all, just a use case for multiple settings-file variants. Then you
can use the --settings option with management commands to choose which
settings to use (or the DJANGO_SETTINGS_MODULE env var where that's
convenient); you don't need to define your own custom option.
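Concretely, the layout could look something like this (module and
database names here are placeholders, not anything from your project):

```python
# common_settings.py -- shared configuration for all customers
DEBUG = False
INSTALLED_APPS = [
    "django.contrib.contenttypes",
    # ... the rest of your shared apps ...
]

# customer_a_settings.py -- one per-customer variant
from common_settings import *  # noqa: F401,F403

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "customer_a",
    }
}
```

Then each process picks its configuration at startup, e.g.
`python manage.py migrate --settings=customer_a_settings` or
`DJANGO_SETTINGS_MODULE=customer_a_settings python manage.py runserver`.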
Django's multi-db support is designed for what you're calling "run-time
routing," where a single process needs to access multiple different
databases at different times. That's not your situation; I'm not sure
you even need or want multi-db.
> Oddly, setting
> django.db.utils.DEFAULT_DB_ALIAS = alias does the trick. Yes, I know
> that's a no-no, but it seems to me that for this kind of case where you
> don't need run-time routing, something like django-dynamic-db-router
> shouldn't really be needed, useful though it might be for the run-time
> routing case. Everything else works off the settings module, and the
> fact that Django caches the default database to use such that a
> management command can't change it seems like a design flaw.
The "design flaw" you are observing here is not specific to databases,
but to Django settings generally. Django settings are not designed to be
modified at runtime; they define a static configuration for a given
process. Some settings can in practice safely be modified on the fly
(though they still probably shouldn't be); others cannot, because their
value is cached for efficiency or other reasons. In general, modifying
settings at runtime is not a supported technique. Nor is it a
_necessary_ technique in your case, when you could instead use the right
settings for each process from the beginning.
> I suppose I
> was expecting transaction.atomic(using=XXX) to not just open a
> transaction on that database, but also make it the default for the scope
> of the block (except if explicitly overridden by using() for particular
> models in code called from that block). The current behaviour seems to
> violate the principle of least surprise (this my first encounter with
> multiple databases).
I think Django's design in this case is superior to the one you propose,
because it keeps two concepts orthogonal that don't need to be linked
(transactions and multi-db-routing) and avoids implicit global or
thread-local state changes in operations that don't obviously imply such
changes, making the overall system more flexible and predictable.
I suppose we'd need a more scientific survey to establish which behavior
is more surprising to more people :-)
> Is there really no case for an in_database()
> context manager in Django itself?
Perhaps. In my experience the more typical uses for multi-db are
amenable to other types of routing (e.g. models X, Y are always routed
to database A, model Z is always routed to database B, or more complex
schemes), and dynamically-scoped routing (such as that provided by
`in_database`) isn't as commonly needed.
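For what it's worth, the dynamic-scoping pattern behind an
`in_database()`-style helper is small enough to sketch in plain Python:
a thread-local holds the "current" alias, and a router consults it. (The
names below are illustrative; this is the pattern, not Django API or the
actual django-dynamic-db-router code.)

```python
import threading
from contextlib import contextmanager

# Thread-local storage for the dynamically-scoped database alias.
_local = threading.local()

@contextmanager
def in_database(alias):
    # Save and restore the previous alias so the scopes nest correctly.
    previous = getattr(_local, "alias", None)
    _local.alias = alias
    try:
        yield
    finally:
        _local.alias = previous

def current_alias(default="default"):
    # A real router's db_for_read/db_for_write would return this value.
    return getattr(_local, "alias", None) or default

with in_database("reporting"):
    assert current_alias() == "reporting"
assert current_alias() == "default"  # scope restored on exit
```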
Django provides the database router abstraction, which is a very
flexible system for implementing whatever kind of multi-db routing you
need. The burden of proof is heavy on any particular routing scheme to
demonstrate that it is so frequently needed that it should be bundled
with Django itself, rather than being a separate reusable package for
those who need it.
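As a sketch of that flexibility (the app and alias names here are
invented, but db_for_read, db_for_write, and allow_migrate are the
actual router hooks Django calls), a static routing scheme of the kind I
described above might look like:

```python
# Route everything in a hypothetical "analytics" app to its own
# database; return None to fall through to the next router / "default".
class AnalyticsRouter:
    def db_for_read(self, model, **hints):
        if model._meta.app_label == "analytics":
            return "analytics"
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label == "analytics":
            return "analytics"
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Keep each app's tables out of the other database.
        if app_label == "analytics":
            return db == "analytics"
        return db == "default"

# settings.py:
# DATABASE_ROUTERS = ["path.to.AnalyticsRouter"]
```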
Carl
--
You received this message because you are subscribed to the Google Groups "Django users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to django-users+unsubscribe@googlegroups.com.
To post to this group, send email to django-users@googlegroups.com.
Visit this group at https://groups.google.com/group/django-users.
To view this discussion on the web visit https://groups.google.com/d/msgid/django-users/56B0F3F4.2040601%40oddbird.net.
For more options, visit https://groups.google.com/d/optout.