Export sites from command line

Hi, I would like to know how to export sites from the command line. I have looked at the exported zip files, and it looks like the export performs these actions:

  1. Take mysqldump of the database to path-to-site/app/sql/local.sql
  2. Back up site configuration to path-to-site/local-site.json
  3. Zip the site folder
  4. Delete path-to-site/local-site.json
  5. Offer the zip file for download

Between steps 2 and 4, it may not actually create and then delete the file on disk; it might instead add the contents to the zip file directly.

Here are my queries:

  1. At a high level, are these steps correct?
  2. If the steps are correct, I can probably create a script to export the site from the command line. But I am wondering: is there already some existing way, or pre-existing scripts/commands, that I can use to mimic this export functionality?

Thanks.

Hi @rahul_local!

I’m not sure on the exact steps for the native export, but I did some testing, and this one-line script seemed to work for me:

```
backup_dir="2025-12-05_backup" && mkdir -p "$backup_dir/files" && cp -r wp-content "$backup_dir/files" && ( cd "$backup_dir" && wp db export ) && zip -r "${backup_dir}.zip" "$backup_dir"
```

You can go about this in different ways, too. A bit more manual, but here is a breakdown of individual steps that will also work:

  • Make your backup folder (ex: `mkdir 2025-12-05_backup`)

  • Make a “files” directory in that folder (ex: `mkdir 2025-12-05_backup/files`)

  • Copy wp-content into it: `cp -r wp-content 2025-12-05_backup/files`

  • Go into the folder with `cd 2025-12-05_backup` and then create your db backup with `wp db export`

  • Go back to the root and zip it up: `zip -r 2025-12-05_backup.zip 2025-12-05_backup`

The only caveat: if you have other files hanging out in the site root that you also want to capture, you may need to make sure those get included as well.

This doesn’t seem to work; wp throws an error:
```
Error: This does not seem to be a WordPress installation.
The used path is: /tmp/2025-12-05_backup/
Pass --path=`path/to/wordpress` or run `wp core download`.
```

If I pass the path to wp, I get this error:

```
Warning: Failed to get current character set of the posts table. Reason: ERROR 1698 (28000): Access denied for user 'root'@'localhost'

mariadb-dump: Got error: 1698: "Access denied for user 'root'@'localhost'" when trying to connect
```

Did you actually run the export that produced the zip file you checked? Did it work? It would be good to know exactly what the export does under the hood so that it can be replicated.

I made a couple of exports on a couple of sites in my testing, and they worked just fine. I was able to re-import them into Local as well. But we can continue trying to help here and narrow down what’s needed.

Can you tell me some more about your setup and the site you are testing on?

  • What is your OS/OS version?

  • What version of Local are you on?

  • Is your site a single, preferred configuration site? Or does it have any kind of custom configuration?

OS: Debian Trixie

Version of Local: 9.2.9+6887

No multisite setup, custom configuration, or any other complex configuration.

My use case is straightforward: I am experimenting with a lot of plugins and don’t want to have to remember to take backups manually. So I want to set up a cron job to do it.
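For reference, the cron side is the easy part. The entry could be as simple as this, where the script name and paths are placeholders for whatever backup script you end up writing:

```
# Hypothetical crontab entry: run the backup nightly at 02:00 and log output.
0 2 * * * /home/rahul/bin/local-export.sh >> /home/rahul/local-export.log 2>&1
```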

I also figured out the permission-denied issue: I was running wp-cli directly without setting up the environment. These threads helped fix the issue: WP-CLI support with MySQL sockets in Local 5 - #4 by henscu and Open site shell not working (the latter issue should really be filed as a bug).

What would be ideal is a script that takes just two arguments: 1. the Local site name/path, and 2. the destination zip file path. It should then perform the export and create the zip file. It should also be robust enough that if we delete the existing site and restore one from the backup, it still works (because the site name/path would not have changed).
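A hedged sketch of that two-argument interface, under stated assumptions: the `app/public` and `app/sql` layout mirrors what a Local site looks like on disk, and the site’s shell environment (as set up in the threads linked above) has already been sourced so that `wp` can reach the database. `local_export` is a made-up name, not an existing tool.

```shell
# local_export <site-path> <dest-zip>  (dest-zip should be an absolute path,
# since the function changes directory before zipping)
local_export() {
  site_path=$1
  dest_zip=$2
  # Dump the database where the native export appears to put it.
  ( cd "$site_path/app/public" && wp db export "$site_path/app/sql/local.sql" ) || return 1
  # Zip the whole site folder by name so paths inside the archive stay relative.
  ( cd "$(dirname "$site_path")" && zip -qr "$dest_zip" "$(basename "$site_path")" )
}
```

It would be called as, e.g., `local_export "$HOME/Local Sites/my-site" /tmp/my-site.zip`; this is untested against a real Local site.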

Hi @rahul_local!

Circling back here. I checked with one of our Local Developers, who confirmed that the steps you provided, as far as what the export job is doing, are correct.

Regarding your question about whether there’s a way to do this from the command line: we don’t have any other instructions or native tools to supply for this. However, I’ve changed your post into a Feature Request in case this is something others are interested in, so we can track interest and potentially work on future development of the idea.

That said, you might still be able to create a script right now that works like this:

  1. Configure the environment of the shell that the cron job runs in so that it correctly connects to the site. This would roughly match the commands within <userDataFolder>/ssh-entry/<siteId>.sh, where userDataFolder on Linux is ~/.config/Local

  2. Export DB using mysqldump

  3. Save the site metadata (what you mention as step 2): you might be able to use a tool like jq to pull the siteId key from <userDataFolder>/sites.json and save those JSON values to the file

  4. Zip up the site

You may not need to delete the local-site.json file, but it may not hurt to delete it either.

Fair warning, this is not something we have tested but simply wanted to provide some additional details that may be useful.
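The jq step above might look something like this. The structure of sites.json shown here (an object keyed by siteId, with made-up keys) is a guess for demonstration only; inspect your own ~/.config/Local/sites.json and adjust the filter accordingly:

```shell
set -e
tmp=$(mktemp -d)
# Stand-in for <userDataFolder>/sites.json with one hypothetical site entry.
cat > "$tmp/sites.json" <<'EOF'
{ "abc123": { "name": "my-site", "path": "~/Local Sites/my-site" } }
EOF
# Pull out just that site's entry and save it as the metadata file.
jq '."abc123"' "$tmp/sites.json" > "$tmp/local-site.json"
grep '"my-site"' "$tmp/local-site.json"
```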
