> Hi, this is probably not your case, but in case it is, here is my story:
> Creating a script to import CSV files is the best solution as long as
> there are only a few of them, but in my case I needed to import
> nearly 40 VERY BIG CSV files, each one mapping to a database table, and I
> needed to do it quickly. I thought the best way was to use MySQL's
> "LOAD DATA LOCAL INFILE" functionality, since it works very fast and I
> could write a single function to import all the files. The problem was
> that my CSV files were pretty big: my database server was eating huge
> amounts of memory and crashing my site, so I ended up slicing each file
> into smaller chunks.
> Again, this is a very specific need, but in case you find yourself in
> such a situation, here's my base code, which you can extend ;)
>
> https://gist.github.com/1dc28cd496d52ad67b29
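
[A minimal Python sketch of the approach described above: slice each big CSV
into smaller files, then feed each slice to MySQL's LOAD DATA LOCAL INFILE.
This is not the code from the gist; the helper names (split_csv, _write_chunk,
load_chunks), the chunk size, and the use of the MySQLdb/mysqlclient driver are
all assumptions for illustration.]

import os
import MySQLdb  # assumes the MySQLdb/mysqlclient driver is installed

CHUNK_LINES = 50000  # rows per slice; tune to what your server's memory allows


def _write_chunk(path, index, rows):
    # Write one slice next to the original file, e.g. data.csv.part003.
    chunk_path = "%s.part%03d" % (path, index)
    with open(chunk_path, "w") as out:
        out.writelines(rows)
    return chunk_path


def split_csv(path, chunk_lines=CHUNK_LINES):
    # Slice a big CSV into files of at most chunk_lines rows each.
    chunks = []
    with open(path) as src:
        src.readline()  # skip the header; assumes the first line is a header row
        rows = []
        for line in src:
            rows.append(line)
            if len(rows) >= chunk_lines:
                chunks.append(_write_chunk(path, len(chunks), rows))
                rows = []
        if rows:
            chunks.append(_write_chunk(path, len(chunks), rows))
    return chunks


def load_chunks(db_kwargs, table, csv_path):
    # Import every slice of csv_path into `table` with LOAD DATA LOCAL INFILE.
    # The table name cannot be parameterized, so only pass trusted values here.
    conn = MySQLdb.connect(local_infile=1, **db_kwargs)
    cursor = conn.cursor()
    sql = (
        "LOAD DATA LOCAL INFILE %s INTO TABLE " + table +
        " FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'"
        " LINES TERMINATED BY '\\n'"
    )
    for chunk in split_csv(csv_path):
        cursor.execute(sql, (chunk,))
        conn.commit()
        os.remove(chunk)  # clean up the slice once it has been loaded
    cursor.close()
    conn.close()

Usage would look something like
load_chunks({"host": "localhost", "user": "me", "passwd": "secret", "db": "mydb"},
"my_table", "/path/to/big_file.csv"), and it only works if the MySQL server is
configured to allow LOCAL INFILE loads.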
Dear Anler, thank you for sharing your experience and your code. That's
very kind of you. I'll study it and get back to you with any questions.
Cheers, Fabio.
--
Fabio Natali