[section_title title=Overview]

I have been working on a Django application that I had to deploy on a server that only had MySQL. The database has evolved over time, and in preparation for the future we finally installed PostgreSQL on the server. I am not going to go over why I decided to go with PostgreSQL, because this article is focused on the actual migration. Below is some additional background along with the requirements.

  1. The application does not use any MySQL-specific functionality, such as enumerations.
  2. I only want to migrate the data (i.e. not the structure), because Django is pretty good at defining the structure and the corresponding constraints, and migrating the structure would lose some of that in translation.
  3. I will rebuild the migrated database on a separate machine and then copy it back to the final server.
  4. Note that by default Django also populates some special tables when you run syncdb (e.g. auth_permission). I am going to remove those rows and load the ones from the source database so I don’t affect any foreign key references.
  5. Along the way I will be cleaning up some data (e.g. django_session); a small SQL sketch follows this list.
  6. I will use EnterpriseDB’s MySQL to Postgres Migration Wizard (v1.1) to move the data from MySQL to PostgreSQL.

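For the cleanup mentioned in item 5, a minimal SQL sketch might look like the following, assuming the default Django table names:

```sql
-- Run against the *local* MySQL copy before migrating (default Django table names;
-- adjust if your project names them differently).
DELETE FROM django_admin_log;  -- admin history we have chosen not to carry over
DELETE FROM django_session;    -- stale sessions are not worth migrating
```
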
To accomplish this, here is a summary of the steps (some kung fu required) that we will be performing; rough command and SQL sketches of the scriptable parts follow the list.

  1. Dump the MySQL database and restore it in a local MySQL instance.
  2. Remove any unnecessary data. This is where we will clean out the django_session and django_admin_log tables.
  3. Create the PostgreSQL database and do a full migration using the EnterpriseDB migration tool. Note that the migration tool does not create any sequences, and many of the constraints are not propagated either.
  4. Once the migration is complete, you will have a separate schema in the PostgreSQL database.
  5. Dump only the data (i.e. no structure) with complete inserts.
  6. Run the django-admin utility (syncdb) to create the table structure in the ‘public’ schema of the PostgreSQL database.
  7. Note that syncdb will also populate the auth_permission and django_content_type tables, which are also in the dump that we created. So, we are going to delete the newly created rows and stick with the ones in the dump.
  8. Edit the dump file…
    1. Rename the schema in the dump so the data loads into the ‘public’ schema.
    2. Wrap the inserts in a transaction and defer the constraint checks (sketched below).
  9. Source the edited file into the ‘public’ schema (the default).
  10. Drop the unnecessary schema.
  11. Update the sequences, since sourcing the data does not advance them (see the SQL sketch below).
  12. Export the database so we can reload it on the final, destination server.
  13. Load it on the remote server.
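
To make the flow concrete, here is a rough shell sketch of the command-line side of these steps. The EnterpriseDB wizard run in step 3 is interactive, so it is only noted in a comment; the database, schema, and user names are placeholders, and this assumes a Django version that still ships syncdb.

```sh
# Step 1: dump production MySQL and restore it into a local instance
mysqldump -u myuser -p mydb > mydb_mysql.sql
mysqladmin -u root -p create mydb_local
mysql -u root -p mydb_local < mydb_mysql.sql

# Step 3: create the target PostgreSQL database; the EnterpriseDB wizard is
# then run interactively and lands the migrated data in its own schema
# (assumed to be called mydb_local below)
createdb -U postgres mydb_pg

# Step 5: dump only the data from the migrated schema, with complete inserts
pg_dump -U postgres -a --column-inserts -n mydb_local mydb_pg > data.sql

# Step 6: let Django create the table structure in the 'public' schema
python manage.py syncdb --noinput

# Step 7: drop the rows syncdb just populated so the dumped ones load cleanly
psql -U postgres -d mydb_pg -c "DELETE FROM auth_permission;"
psql -U postgres -d mydb_pg -c "DELETE FROM django_content_type;"

# Step 9: source the edited dump (see the SQL sketch below) into 'public'
psql -U postgres -d mydb_pg -f data_edited.sql

# Step 10: drop the schema left behind by the migration tool
psql -U postgres -d mydb_pg -c "DROP SCHEMA mydb_local CASCADE;"

# Steps 12/13: export the finished database and load it on the destination server
pg_dump -U postgres -Fc mydb_pg > mydb_pg.dump
# ...copy mydb_pg.dump to the destination, then:
createdb -U postgres mydb_pg
pg_restore -U postgres -d mydb_pg mydb_pg.dump
```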

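The dump-file edits from step 8 and the sequence fix-up from step 11 might be framed roughly like this. It is only a sketch: it assumes pg_dump set a search_path rather than schema-qualifying every INSERT, that Django created its foreign keys as DEFERRABLE (which its PostgreSQL backend does), and auth_user is just one example table.

```sql
-- Framing added by hand around the pg_dump output (step 8)
BEGIN;
SET CONSTRAINTS ALL DEFERRED;   -- relies on Django's DEFERRABLE FK constraints
SET search_path TO public;      -- instead of the schema the wizard created

-- ... the INSERT statements produced by pg_dump go here unchanged ...

COMMIT;

-- Step 11: advance each sequence past the loaded data; auth_user shown as an example
SELECT setval(pg_get_serial_sequence('auth_user', 'id'),
              (SELECT COALESCE(MAX(id), 1) FROM auth_user));
```
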
Whew! Only after all these steps will we meet our objectives and have a well-defined PostgreSQL database with the existing data from MySQL 🙂

Note that along the way you might encounter some issues, because MySQL isn’t very strict about checking its constraints. For instance, if earlier in development you started out by allowing a column to be null, but later decided that it shouldn’t be, then unless you remembered to convert the existing nulls appropriately, the earlier rows will still have nulls in that column, which PostgreSQL obviously isn’t going to like. The local MySQL instance gives you a chance to correct these issues without impacting any running applications.
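
For example, if a column that Django now declares NOT NULL still carries old nulls, a quick check and fix in the local MySQL copy might look like this (myapp_mymodel and some_column are placeholders for whatever table and column apply in your project):

```sql
-- Run against the local MySQL copy before migrating
SELECT COUNT(*) FROM myapp_mymodel WHERE some_column IS NULL;
UPDATE myapp_mymodel SET some_column = '' WHERE some_column IS NULL;
```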