We have been running KairosDB and Cassandra with Docker Compose. Backup and restore take too long to complete (a few days). The Cassandra data is approximately 42 GB; the resulting compressed backup file is 25 GB.

Backup command:

./kairosdb.sh export | gzip > file_name.gz
To what end are your backups? Are you using Cassandra or Scylla as the backend?

The KairosDB "backup" is really just an export of the data: it has to read every data point and then write it back out, which is not the most efficient way of getting data out of C*. There are tools that can export the tables directly from C*, which is more efficient but may be less flexible. I believe Scylla provides the best tools for this purpose. Depending on your scenario, we can dig into one of these options.
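For Cassandra itself, a snapshot-based backup is usually far faster than a full export, because `nodetool snapshot` only creates hard links to the existing SSTables. A dry-run sketch of that approach (the `run` wrapper only prints each command; the container name `cassandra`, the keyspace `kairosdb`, and the snapshot tag are assumptions to adapt to your compose setup):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Dry-run wrapper: prints each command instead of executing it.
# Replace the echo with "$@" to run the commands for real.
run() { echo "would run: $*"; }

CONTAINER=cassandra   # assumption: service/container name from docker-compose
KEYSPACE=kairosdb     # assumption: default KairosDB keyspace name

# 1. Flush memtables so all recent writes are on disk as SSTables.
run docker exec "$CONTAINER" nodetool flush "$KEYSPACE"

# 2. Take a named snapshot (hard links inside the data dir, near-instant).
run docker exec "$CONTAINER" nodetool snapshot -t kairos_backup "$KEYSPACE"

# 3. Archive the snapshot directories and copy them out of the container.
run docker exec "$CONTAINER" bash -c \
  "tar czf /tmp/kairos_backup.tar.gz /var/lib/cassandra/data/$KEYSPACE/*/snapshots/kairos_backup"
run docker cp "$CONTAINER":/tmp/kairos_backup.tar.gz .

# 4. Clean up the snapshot once the archive is safely off the host.
run docker exec "$CONTAINER" nodetool clearsnapshot -t kairos_backup
```

Unlike the KairosDB export, this copies the data files byte-for-byte instead of re-serializing every data point, so the runtime scales with disk throughput rather than row count.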
I am using Cassandra, and the services run in Docker.

I transferred the complete data folder from one server to another; it worked some of the time, but not always. After the files were transferred, Cassandra started restarting repeatedly.

I haven't been able to find a proper solution for backup and restore. Please help us solve this.
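Copying the raw data folder can work, but only if the node is cleanly stopped first and the target's configuration matches; copying a live directory captures inconsistent SSTables and commit logs, which would explain the restart loop. A dry-run sketch of a safer transfer (the `run` wrapper only prints each command; the container name, data-volume path, and target host are placeholder assumptions):

```shell
#!/usr/bin/env bash
set -euo pipefail

run() { echo "would run: $*"; }  # dry-run wrapper; swap echo for "$@" to execute

CONTAINER=cassandra                      # assumption: container name
DATA_DIR=/path/to/cassandra-data-volume  # assumption: host bind mount of /var/lib/cassandra
TARGET=user@target-host                  # assumption: destination server

# 1. Drain: flush all memtables and stop accepting writes, so the
#    on-disk state is consistent before the copy.
run docker exec "$CONTAINER" nodetool drain

# 2. Stop the container completely; never copy a live data directory.
run docker compose stop "$CONTAINER"

# 3. Copy the whole data volume (data, commitlog, saved_caches).
run rsync -a --delete "$DATA_DIR"/ "$TARGET":"$DATA_DIR"/

# 4. Start Cassandra on the target. Its cassandra.yaml must keep the
#    same cluster_name and num_tokens as the source, and listen/seed
#    addresses must point at the new host, or the node will fail to start.
run ssh "$TARGET" "docker compose up -d $CONTAINER"
```

The drain step is the part most often skipped: without it, unflushed writes live only in the commit log, and the copied node may refuse to replay them cleanly on foreign hardware.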
Hi Team,
Restore command:

gzip -dc /var/log/kairosdb/file_name.gz | bin/kairosdb.sh import
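As an alternative to replaying every data point through `kairosdb.sh import`, Cassandra's `sstableloader` can stream snapshot SSTables directly into a live target cluster. A dry-run sketch (the `run` wrapper only prints each command; the target host address and the unpacked snapshot path are assumptions, and the keyspace schema must already exist on the target, e.g. from starting KairosDB once against the empty cluster):

```shell
#!/usr/bin/env bash
set -euo pipefail

run() { echo "would run: $*"; }  # dry-run wrapper; swap echo for "$@" to execute

TARGET_HOST=10.0.0.2           # assumption: a reachable node of the target cluster
SNAPSHOT_ROOT=./kairos_backup  # assumption: unpacked snapshot archive

# sstableloader streams each table's SSTables to the correct replicas.
# The last two path components of each directory passed to it must be
# <keyspace>/<table>, which is how the snapshot layout is arranged here.
for table_dir in "$SNAPSHOT_ROOT"/kairosdb/*/; do
  run sstableloader -d "$TARGET_HOST" "$table_dir"
done
```

Because the loader streams files rather than issuing per-row writes, a multi-gigabyte restore typically finishes in hours rather than days, and it also works when the target cluster has a different node count than the source.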
Could you please suggest alternative approaches, if possible?