Migrating data from Kopano Groupware

Attention

This feature is not enabled by default and requires configuration by the system administrator.

A migration tool is available to import data from a Kopano Groupware installation. The import can be performed from trusted sources identified by their IP address.

Requirements

To use the migration tool, the following conditions must be met:

  • the target users must exist
    • manually created or synchronised via ad-connect
    • but they must be set to inactive to ensure the mailbox is not in use while importing
  • the source users must either be fully qualified user principal names (i.e. have a domain)
    • or the mapping configuration on import must provide the extra information.
  • read access to the Kopano Groupware database (MySQL or MariaDB).
  • read access to the Kopano Groupware attachment storage directory (either directly or via network mount)
  • access to the Exchange4all server importer endpoint (source IP must be whitelisted).
  • the kgdump tool supports Kopano Groupware database schema revision 63 or higher.
    • This corresponds to the first public release of Kopano Groupware (8.0) and also includes Zarafa Groupware 7.1.
  • the migration tool does not support partial or incremental migrations; repeated migrations will lead to duplicate items.
Tip

It is recommended that you have a fast dedicated host on which to run both the dump and the migration. This gives the best possible performance when migrating large datasets and avoids consuming CPU resources on either the source or target system. A suitable migration system should have at least 8 CPUs and 16 GiB of RAM. Disk space requirements depend on the amount of data being transferred; if dumping with concurrency 1, the maximum amount of temporary disk space needed is the dump size of the largest single folder.

Server side configuration

To add an IP address to the import whitelist, add it to /storage/config.yaml. The format of the data is as follows:

e4a:
    adminrpc:
        importer:
            ip_white_lists:
              - 198.51.100.5
              - 203.0.113.19
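After editing the file, it can be worth sanity-checking that the address is actually present before starting a migration. The following is a minimal sketch using plain grep; it runs against a temporary copy of the example configuration above, so the path and content are stand-ins for your real /storage/config.yaml.

```shell
# Sanity check: is a given source IP listed in the importer whitelist?
# A temporary copy of the example config keeps the snippet
# self-contained; point CONFIG at /storage/config.yaml in practice.
CONFIG=$(mktemp)
cat > "${CONFIG}" <<'EOF'
e4a:
    adminrpc:
        importer:
            ip_white_lists:
              - 198.51.100.5
              - 203.0.113.19
EOF
IP=198.51.100.5
FOUND=no
if grep -qF -- "- ${IP}" "${CONFIG}"; then
    FOUND=yes
fi
echo "whitelisted(${IP}): ${FOUND}"
rm -f "${CONFIG}"
```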

Downloading the migration tool

The latest version of the tool is bundled with the container and can be downloaded with a browser by going to https://your-server/download/service/kg-migration/e4a-kg-migration-latest-linux-amd64.tar.gz. The archive also contains additional documentation and example configurations.

Examples for running the import tools

The migration tool consists of two components, kgdump and migrate. The kgdump tool connects to the Kopano database and attachment storage and generates an export stream, while the migrate tool opens this stream, maps it to the corresponding users and forwards the data to the new server over a TLS connection. Since the dump and import can take a long time, it is recommended to run them inside a tool like screen or tmux, which takes care of logging and keeps the migration running even if the terminal session is closed.

Dump and directly pipe into import

This example assumes that the migration tools are running on the same host and that this host can reach the source database and the target importer endpoint.

screen -L -Logfile kg-e4a-migrate-$(date -Iseconds).log -dmS kg-e4a-migrate sh -c "../bin/kgdump dump --stdout | ../bin/migrate import --stdin"

Dump and directly pipe into remote import through SSH

This is the recommended method if for some reason it is not possible to run dump and import on a machine that can access both source and target. For this to work, the migrate tool and its configuration YAML file must be available on the SSH_TARGET host, and the DEBUG_LEVEL, IMPORTER_CONFIG and IMPORTER_IMPORT_URL environment variables referenced in the command must be set in the calling shell.

SSH_TARGET=target.url
screen -L -Logfile kg-e4a-migrate-$(date -Iseconds).log -dmS kg-e4a-migrate sh -c "../bin/kgdump dump --stdout | ssh ${SSH_TARGET} env DEBUG_LEVEL=${DEBUG_LEVEL} IMPORTER_CONFIG=${IMPORTER_CONFIG} IMPORTER_IMPORT_URL=${IMPORTER_IMPORT_URL} ../bin/migrate import --stdin"

Dump into file, transfer file, import from file

This is not recommended for large datasets as it will take twice as long and require a lot of disk space to store the dumped data. In some cases it may be possible to transfer the dumped data using a removable disk (plug into source server, dump, unplug, plug into target server).

DUMPFILE=kgdump-yourserver-$(date -Iseconds).d1
../bin/kgdump dump --output="${DUMPFILE}"

Then transfer the resulting file to the target system and run the importer locally:

../bin/migrate import --input="${DUMPFILE}"
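When the dump file is moved between hosts (over the network or on a removable disk), it is worth verifying that it arrived intact. This is plain sha256sum usage, not a feature of kgdump or migrate; a small throwaway file stands in for the real dump so the snippet is self-contained.

```shell
# Verify a transferred dump file with a checksum before importing.
# A throwaway file stands in for the real dump here.
DUMPFILE=$(mktemp)
printf 'example dump payload' > "${DUMPFILE}"

# On the source host, before the transfer:
sha256sum "${DUMPFILE}" > "${DUMPFILE}.sha256"

# On the target host, after transferring both files:
STATUS=fail
sha256sum -c "${DUMPFILE}.sha256" && STATUS=ok
rm -f "${DUMPFILE}" "${DUMPFILE}.sha256"
echo "transfer check: ${STATUS}"
```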

Errata

Known issues and improvements:

  • #39 Export filters can be confusing: when exporting from Kopano Groupware, stores can be selected by user ID, while when importing the store ID is used for all filtering and selection.
  • #46 Migration of "rules" is not implemented.
  • #47 Migration of "ACLs" and "permissions" is not implemented.
  • #48 Migration of "Out of Office settings" is not implemented.
  • #49 Migration of "Email Signatures" is not implemented.
  • #64, #67, #69 References to users from the global addressbook are converted to SMTP Email address references.
  • #66 Some old Meeting requests in the "Sent" folder cannot be opened.
  • #71 The only implemented way to import attachment data is by including them in the export directly.
  • #72 The importer is not limited by the user's mailbox quota, which means an imported mailbox might end up over quota; an admin then needs to increase the quota to make such a mailbox fully usable.
  • #73 After importing, the "size" field in the log output does not include attachment sizes, so the resulting mailbox size might be considerably larger because of its attachments.
  • #75 The imported message size is taken as-is from the source system and can change slightly whenever an imported message is modified, as that recomputes the message size.
  • #479 Memory consumption while importing is proportional to the total number of attachments. Millions of attachments in the imported data can require multiple gigabytes of RAM until the import is finished.
  • #480 The importer has a message class white list and logs "message class not supported yet" in the server's log when a message is skipped.
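As a practical follow-up to #480, skipped messages can be counted by searching the server log for the quoted text. Only the "message class not supported yet" string is taken from the errata; the log path and surrounding line format below are illustrative assumptions, and a sample log is created so the snippet is self-contained.

```shell
# Count messages skipped by the importer's message class white list
# (#480). In practice, grep your server's actual log file instead of
# this generated sample.
LOG=$(mktemp)
cat > "${LOG}" <<'EOF'
level=info msg="imported message" user=alice
level=warning msg="message class not supported yet" class=IPM.Example
level=warning msg="message class not supported yet" class=IPM.Other
EOF
SKIPPED=$(grep -c "message class not supported yet" "${LOG}")
echo "skipped messages: ${SKIPPED}"
rm -f "${LOG}"
```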