Thanks for all your replies and helpful suggestions. To answer some of your questions:
1. Where is the data coming from? - It is textual data in a spec/spreadsheet: metadata (a name plus one or more values) describing attributes of scanned documents. There is what I would call a "base set" of data, meaning what we can think of now based on a review of a representative set of documents. However, as new documents are imported into the application, other types of metadata with multiple values may have to be created on the fly (hence the need for the admin forms to add metadata in the future). The import function I need is a one-off function. We need it as we develop the models and test them against various documents. While developing the app, it is sometimes easier to delete the database and rebuild it than to back out certain migrations, so we are looking for a simple way to populate the metadata during development, and then one more time when we go into production. Currently we have 24 metadata names, and each one can have from one to 20 values.
2. The manage.py loaddata command is an appealing option. However, it would take further effort to convert the spreadsheet data into one of the fixture formats it supports. I think a CSV file reader is more suitable for our purposes; see the sketch after this list.
3. I will have to think about the validation concepts. These are simple name-value pairs, so it is not clear what I would be validating against, other than detecting duplicates.
4. I am also looking into spreadsheet -> CSV file -> MySQL LOAD DATA as perhaps the easiest way to complete this project. The spreadsheet is the lowest-effort way to create and update the metadata, and from there the path into the database is pretty automatic.
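Here is roughly what I have in mind for the CSV reader, as a custom management command. This is only a minimal sketch: the app name (yourapp), the models (MetadataName, MetadataValue, with MetadataValue holding a foreign key called "name"), and the headerless two-column CSV layout (name,value per row, one row per value) are all assumptions to adapt to the real schema.

    # yourapp/management/commands/load_metadata.py  (names are placeholders)
    import csv

    from django.core.management.base import BaseCommand

    from yourapp.models import MetadataName, MetadataValue  # assumed models


    class Command(BaseCommand):
        help = "One-off import of metadata name/value pairs from a CSV file."

        def add_arguments(self, parser):
            parser.add_argument("csv_path", help="CSV with one name,value pair per row")

        def handle(self, *args, **options):
            with open(options["csv_path"], newline="") as f:
                for name_text, value_text in csv.reader(f):
                    # get_or_create gives us duplicate detection for free:
                    # re-running the command skips rows that already exist.
                    name, _ = MetadataName.objects.get_or_create(name=name_text.strip())
                    MetadataValue.objects.get_or_create(name=name, value=value_text.strip())

Running `python manage.py load_metadata metadata.csv` would then repopulate the tables after each database rebuild during development, and the same command can run once against production.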
I am open to other suggestions!
Thanks!
Mark
On Thu, Sep 7, 2017 at 10:29 PM, <callsamleung@gmail.com> wrote:
hi, thinking about the problem, do you mean initial data? See https://docs.djangoproject.com/en/1.11/howto/initial-data/

The Django-way suggestions:
1. Django management commands to load/dump data: https://docs.djangoproject.com/en/1.11/ref/django-admin/#django-admin-dumpdata
2. Write a data migration: https://docs.djangoproject.com/en/1.11/topics/migrations/#data-migrations
On Friday, September 8, 2017 at 12:57:58 PM UTC+8, mark wrote:

I have several classes in models.py, and I would like to load some data into these models (or tables in the db), as well as add more data over time through the admin panel. I googled for some ideas on how to pre-populate the db tables, and found references to RunSQL (https://docs.djangoproject.com/en/1.11/ref/migration-operations/#django.db.migrations.operations.RunSQL) for executing SQL commands during a migration (http://blog.endpoint.com/2016/09/executing-custom-sql-in-django-migration.html). Is this the "correct" way to accomplish my goal of populating the data tables with some data, or is there another way that more closely ties the data to the models?

Thanks!
Mark
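For reference, the data-migration route suggested in the reply above would look roughly like this. It is a sketch under the same assumptions as before (app yourapp, model MetadataName, a placeholder row), placed in an empty migration created with `python manage.py makemigrations yourapp --empty`:

    from django.db import migrations


    def load_metadata(apps, schema_editor):
        # Use the historical model via apps.get_model(), not a direct
        # import, so the migration keeps working as models.py evolves.
        MetadataName = apps.get_model("yourapp", "MetadataName")
        MetadataName.objects.get_or_create(name="document_type")  # placeholder row


    class Migration(migrations.Migration):

        dependencies = [
            ("yourapp", "0001_initial"),
        ]

        operations = [
            # A no-op reverse function keeps the migration reversible.
            migrations.RunPython(load_metadata, migrations.RunPython.noop),
        ]

Unlike RunSQL, this keeps the seed data expressed in terms of the models rather than raw SQL, which is closer to what the original question asked for.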