Google Photos to Immich Migration
Some notes on how I migrated from Google Photos to Immich.
Today I finished migrating from Google Photos to a self-hosted Immich instance running on unRAID. It feels awesome.
Being bothered by the Google Photos storage limit was actually the primary reason I started exploring self-hosting. I had learned about Immich a while ago, but didn't really consider self-hosting it until a friend showed me how easy (and how cool) it is to own all your assets. I still remember being impressed when I saw his storage reading 10T in the Immich UI.
Immich is really nice: an intuitive web UI, solid mobile support, active development, and advanced features. After a quick initial tryout, I decided to invest in it. The only catch: you need to migrate your Google Photos data over.
The high-level process is roughly:
- Download your Google Photos takeout.
- Process all photos to backfill exif data.
- Upload to Immich.
Download your Google Photos takeout
This is a pretty standard process. Go to your Google account's Takeout page, select Google Photos only, and create the takeout. Then wait for Google to package it, come back, and download the files.
The takeout will be split into multiple tarballs. If you have lots of photos like me, choose the largest size per file (10GB) to reduce the number of download clicks. Unfortunately, there's no easy way to script the downloads if you have lots of takeout files.
Process all photos to backfill exif data
Good, you've got all your data back. Before uploading to Immich, one more thing: backfill the exif data!
When a photo is uploaded to Google Photos, Google may modify the photo and strip the exif info from it. Exif includes GPS information, which is extremely useful; for example, you might want to know where that beautiful vista point picture from two years ago was taken.
When I realized this, I came to understand on another level what "vendor lock-in" means for a consumer product. The takeout data itself has been crafted in a less-than-standard way.
Luckily, in the takeout each photo comes with a `<file>.json` metadata sidecar that encodes the same information. An OSS tool, google-photos-exif, will help you write it back. You can find some discussion here.
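For reference, a (trimmed) sidecar looks roughly like this; the field names and values below are illustrative and may differ between exports:

```json
{
  "title": "IMG_1234.jpg",
  "photoTakenTime": {
    "timestamp": "1650000000"
  },
  "geoData": {
    "latitude": 37.8324,
    "longitude": -122.4795,
    "altitude": 12.0
  }
}
```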
So uncompress all your takeout tarballs and use the tool to backfill the exif data.
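A minimal sketch of that step, assuming the tarballs are named like `takeout-*.tgz` and google-photos-exif is run from a local clone (the directory names are placeholders; `--inputDir`/`--outputDir`/`--errorDir` are the options from the tool's README):

```bash
# Extract every takeout tarball into a single directory
mkdir -p takeout
for f in takeout-*.tgz; do
  tar -xzf "$f" -C takeout/
done

# Run google-photos-exif (from its cloned repo) to write the JSON
# sidecar metadata back into the photos' exif
yarn start \
  --inputDir ./takeout \
  --outputDir ./processed \
  --errorDir ./errors
```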
Upload to Immich
The Immich CLI has an `immich upload` command to batch-upload all photos from the filesystem to the server.
- Get an API key from your Immich web UI.
- `immich login <immich-url>` with the API key.
- `immich upload <dir> -a`; the `-a` flag creates an album named after the target directory.
If you are running Immich on unRAID as a Docker container, you can exec into the container's terminal; the CLI binary is already installed there.
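Putting it together, the upload step can look roughly like this (the container name, URL, API key, and path are placeholders; adjust them to your setup):

```bash
# On the unRAID host: exec into the Immich server container,
# where the immich CLI is already available
docker exec -it immich_server bash

# Inside the container: authenticate with the API key, then batch-upload.
# -a creates an album named after the target directory.
immich login https://immich.example.com <api-key>
immich upload /path/to/processed -a
```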
Boring Details
- Some ad-hoc albums shared via link without a title would get created as `Untitled<number>`. I special-cased those by just uploading the photos without creating an album.
- The unRAID host doesn't ship with many Linux utilities. You can install some via NerdTools, but I found even that limited. So I created a development container with all the tools I like and mounted the Google Photos share into it as a host path (see the sketch after this list); same for the Immich container. This way you can do the exif processing in the development container and the uploading in the Immich container.
- Lots of bash scripting, but thank god it's 2024 and we have ChatGPT.
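Here's a rough sketch of that development container setup (the image, container name, and share path are placeholders; pick whatever image carries your preferred tools):

```bash
# On the unRAID host: start a long-lived toolbox container with the
# Google Photos takeout share mounted in as a host path
docker run -d --name photo-toolbox \
  -v /mnt/user/google-photos-takeout:/data/takeout \
  ubuntu:24.04 sleep infinity

# Hop into it whenever you need to run the exif processing
docker exec -it photo-toolbox bash
```

The same host-path mount can be added to the Immich container through its unRAID template, so the processed photos are visible there for uploading.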
The process takes some time, but at the end of the day, it feels good to do some ops for yourself and regain data sovereignty :)