backup

MySQL Dump Splitter

Stephen
After trying to import massive DB dumps and having it take forever, I came up with a script that chops a MySQL dump up at the table level. This lets me import the dumps table by table, allowing for a multi-threaded import rather than a single-threaded one. Here is the script (the excerpt cuts off inside the read loop, so the loop body below is a plausible reconstruction, flagged in the comments, rather than the original code):

#!/usr/bin/php
<?php
$start = time();
echo "MySQL Dump Split to Tables \r\n";
set_time_limit(600);

if (!isset($argv[1])) {
    echo "Please provide a dump file as an argument \r\n";
    echo "If the 2nd argument is gzip it will compress the sql dumps of the tables \r\n";
    exit(1);
}

$gzipoutput = false;
if (isset($argv[2]) && $argv[2] == "gzip") {
    $gzipoutput = true;
}

// Only sniff the file type once we know an argument was actually given
$filetype = mime_content_type($argv[1]);
if ($filetype == "text/plain") {
    $handle = @fopen($argv[1], "r");
} else if ($filetype == "application/x-gzip") {
    $handle = @gzopen($argv[1], "r");
} else {
    echo "Please provide a sql or gzip compressed sql file \r\n";
    exit(1);
}

$header = true;      // still reading the dump preamble
$headerlines = "";   // preamble (SET statements etc.), repeated in each table file
$out = null;
if ($handle) {
    while (!feof($handle)) {
        // Reconstructed from here on: the original excerpt is truncated
        $line = fgets($handle);
        if ($line === false) {
            break;
        }
        // mysqldump marks each table with: -- Table structure for table `name`
        if (preg_match('/^-- Table structure for table `(.+)`/', $line, $m)) {
            $header = false;
            if ($out) {
                $gzipoutput ? gzclose($out) : fclose($out);
            }
            $out = $gzipoutput ? gzopen($m[1] . ".sql.gz", "w") : fopen($m[1] . ".sql", "w");
            $gzipoutput ? gzwrite($out, $headerlines) : fwrite($out, $headerlines);
        }
        if ($header) {
            $headerlines .= $line;
        } else if ($out) {
            $gzipoutput ? gzwrite($out, $line) : fwrite($out, $line);
        }
    }
    if ($out) {
        $gzipoutput ? gzclose($out) : fclose($out);
    }
    fclose($handle);
}
echo "Done in " . (time() - $start) . " seconds \r\n";
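To show how I use it (a rough sketch; "mydb" and the file names are placeholders): split the dump first, then kick off one mysql client per table file so several tables load at once. That is the multi-threaded import.

./splitter dump.sql.gz gzip

# import every table in parallel, one mysql client each
for f in *.sql.gz; do
    gunzip -c "$f" | mysql mydb &
done
wait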

MySQL Dump Splitter / Benchmarks

Stephen
Just a quick post to share some interesting findings on the SQL splitter script I wrote.

Sizes:
1.2G dump.sql.gz (gzip compressed)
6.4G dump.sql (decompressed)

Time taken in seconds:
./splitter dump.sql gzip -- 22
./splitter dump.sql.gz gzip -- 29
./splitter dump.sql -- 43

The limiting factor for the gzip-to-gzip run looks to be the CPU, as PHP only runs on one core, whereas the uncompressed-to-uncompressed run was stuck in IO wait.
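Some rough arithmetic on those numbers: the uncompressed-to-uncompressed run pushes 6.4G in 43 seconds, about 150 MB/s of reads plus roughly the same again in writes, which is plausibly disk-bound. The gzip-output run reads the same 6.4G in 22 seconds (about 290 MB/s) but only writes around a fifth as much data, since the dump compresses roughly 5:1 (6.4G down to 1.2G).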

DD + LZO = superfast

Stephen
I recently purchased a new laptop (an HP Envy 14) that came pre-installed with Windows. After leaving the install on the drive for some time, I finally needed the space. I shrank the partition as small as it could go, but alas, I have an SSD and every GB is expensive and needed. I thought to myself, what's the best way to back this up? dd + nc was the way to go, I figured.
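Going by the title, the trick is to drop LZO compression into the middle of that pipe. A rough sketch of the kind of pipeline this implies (assuming lzop is installed on both ends; /dev/sda2, port 9000 and the host name are placeholders, and netcat flag syntax varies between versions):

# on the receiving machine: listen and store the compressed image
nc -l -p 9000 > envy14-windows.img.lzo

# on the laptop: read the raw partition, compress it cheaply with lzop,
# and stream it over the network
dd if=/dev/sda2 bs=4M | lzop -c | nc backup-host 9000

LZO trades compression ratio for speed, so a single core can usually keep up with both the disk and the network, which is presumably where the "superfast" comes from.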