Migrate Postgres DBs across cloud providers with pgloader

Pierre Averous
3 min read · Oct 11, 2020


…or how I migrated from AWS to Azure!

So a couple of months ago, I received an email from AWS titled "Your AWS Free Tier Period is Expiring". My first thought was "Oh dang, I just set everything up on AWS!", so I started looking into the pricing for a paid AWS account. I did not use many services on AWS, only a t2.micro RDS instance and a t2.micro EC2 instance. These would have cost me about 25€/month starting March 2020. That is not too expensive, but being a student at the time, I did not have any revenue to pay for it. I looked at my options and found two alternatives that required a bit of work: Google Cloud's free tier, or Azure's free tier.

The thing is, I had already used up my 12 months on Google Cloud, so I just went with Azure!

After creating my free tier account, I started migrating.

Migrating my EC2 instance

The first thing I migrated was my EC2 instance. I created a VM on Azure and moved my code over. At that point, I still had no deployment process, so I simply created an SSH key for the VM, added it to my GitLab account, pulled my repo, and ran docker-compose up. I then added a rule to allow incoming HTTP traffic to the VM, and that was it: EC2 migration done!
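In practice, those steps boil down to a handful of commands. Here is a rough sketch of what I mean; the key path, key comment, and repository URL are all placeholders, not my actual setup:

```shell
# Generate a dedicated SSH deploy key on the Azure VM
# (path and comment here are examples):
ssh-keygen -t ed25519 -f ./azure_deploy_key -N "" -C "azure-vm-deploy-key"

# Print the public key, to paste into GitLab (Settings > SSH Keys):
cat ./azure_deploy_key.pub

# Then, once the key is registered on GitLab, pull the repo and run the app:
#   git clone git@gitlab.com:<you>/<your-repo>.git
#   cd <your-repo> && docker-compose up -d
```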

Migrating my RDS instance

Now for the RDS instance. This part is the most interesting, as I did not want to lose any of my data.

The first step was to recreate the databases I had on AWS, this time on Azure Database for PostgreSQL. For each database, I created an empty database and a role:

CREATE DATABASE my_database;
CREATE ROLE db_owner NOLOGIN;
GRANT db_owner TO admin;
ALTER DATABASE my_database OWNER TO db_owner;
CREATE USER my_user WITH PASSWORD 'p455w0rd';
GRANT db_owner TO my_user;
GRANT ALL PRIVILEGES ON DATABASE my_database TO db_owner;
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO db_owner;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO db_owner;

Once that was done, I had to migrate the data. I used a great tool called pgloader to do this. I ran it in a Docker container like so:

docker run -it dimitri/pgloader:ccl.latest

Once in the docker container, I created a file called db.load and filled it with my pgloader config:

LOAD DATABASE
FROM pgsql://username:password@psql_rds_host:port/database
INTO pgsql://new_username:new_password@psql_azure_host:port/database;
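That minimal file was all I needed, but pgloader load files also accept options. If I had wanted pgloader to recreate the schema and reset sequences explicitly, the same file could have looked something like this (the WITH options are from pgloader's documentation, not something I used in this migration):

```
LOAD DATABASE
FROM pgsql://username:password@psql_rds_host:port/database
INTO pgsql://new_username:new_password@psql_azure_host:port/database
WITH include drop, create tables, create indexes, reset sequences;
```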

Finally, I just had to run pgloader db.load to start the migration!
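Before switching anything over, a quick sanity check I'd suggest is comparing row counts on both sides; the table name below is just an example, not one from my actual schema:

```sql
-- Run against both the old RDS database and the new Azure one;
-- the counts should match once the migration finishes:
SELECT count(*) FROM my_table;
```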

NB: In order to handle some special characters in my password and username, I used the PGUSER and PGPASSWORD environment variables. See the pgloader documentation for details.
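For instance, this is the kind of thing I mean (the values below are placeholders, not my real credentials):

```shell
# libpq-style environment variables keep credentials with special
# characters out of the connection URI:
export PGUSER='my_user'
export PGPASSWORD='p@55w0rd!'

# pgloader then picks the credentials up from the environment:
#   pgloader db.load
echo "Connecting as: $PGUSER"
```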

And there, done! All that was left to do was update the environment variables across all my apps (Heroku and GitLab CI, mostly) to point at the new environment.

There, see ya later! 💪


Pierre Averous

DevOps enthusiast, curious about ever evolving tech. Currently working at https://padok.fr/.