An email from the Google Drive Team dropped into my inbox earlier this week to let me know that from November any WhatsApp backups won’t count against my storage quota on Google Drive. It’s welcome news especially for heavy WhatsApp users with the basic 15 GB Drive allowance.
Unaware that WhatsApp did backups? It’s worth checking out as it’s one of the app’s best features. It’s especially useful when moving to a new phone or needing to do a factory reset, as all the chats, photos and videos get restored to the new device. It’s also very straightforward to set up and, once done, the backups happen in the background on a regular basis.
Here are the steps. On Android, to get WhatsApp configured for backups, hit the three dots in the top right and tap on “Settings”, then “Chats”.
On the “Chats” screen, it’s “Chat backup” and the “Chat backup” dialog is where all the not-very-hard work is done.
Choose how often the backup needs to happen, the account to use, whether to include videos and so on. I recommend daily backups, over WiFi only, with videos included. Once configured, the green “BACK UP” button can be used to immediately send the chats to Google Drive. On Google Drive, the WhatsApp chats are stored in the “Backups” section – it’s blanked out here because the mobile number is included in the name of the backup file.
When it comes to restoring a backup, it has to be done the first time WhatsApp is run after installation, and WhatsApp will display a message about it. It’s not possible to restore an old backup after using the app for a while.
Mat Honan’s story (as covered by Todd in the latest podcast) showed me that the strongest password in the world is worth nothing if it can be reset by a straightforward social engineering-based attack. I’m sure Apple and Amazon will be looking hard at their policies and procedures but for the individual, there’s also much to learn from the episode.
i) Two-factor authentication. There’s no doubt that this is a good thing and I enabled it on my Gmail account last night. Turning it on is easy, but it’s a pain in the ass for the first few hours as you re-login to all your Google-based services. With several regularly used PCs, email clients and umpteen mobile devices, it takes a bit of time to get them all set up correctly. Touch wood, now that I’ve been through the re-login process, things are largely back to normal.
ii) Backup, backup, backup. For at least part of the story, Mat is entirely to blame. If there’s only one copy of any piece of data, it might as well not exist. Never mind hackers; theft, damage and accidental deletion make it all too easy to lose data, especially with mobile devices. Disk space is cheap, so even if you have just one PC, have a working set of folders, a backup set of folders and also make copies on a regular basis to a USB drive, which you disconnect from your PC when not in use and preferably store somewhere else.
iii) It’s your data. Convenient as “the cloud” is, remember it’s your data and your responsibility to keep it safe. If you push information directly to the cloud, don’t forget to include this information in your backup routine. Google has tools to download data from its services. Or don’t bother with someone else’s cloud and build your own, using a PogoPlug or similar.
iv) Download email using POP3. I use web-based Gmail and IMAP-enabled apps to manage my email and if email is deleted from Gmail…poof, it’s all gone. By using a POP3 email client like Thunderbird, you can have a copy on your PC as well.
v) Spread the load. Convenient as it might be to have all your eggs in one basket, either with Apple or Google, consider spreading your digital assets across different services, e.g. email on Gmail, work files on Dropbox, personal files on Box, photos on Flickr. If someone does compromise one of your accounts, all is not lost in one go. But don’t use the same password across all the systems.
vi) Remote kill-switch. The ability to kill mobile devices remotely is very handy if they are stolen but there’s a risk that the kill-switch can get into the wrong hands as in this case. However, the benefits probably outweigh the risks in that you are far more likely to lose your device than be hacked, so it’s perhaps better to focus on minimising the fall-out from both physical loss and a remote wipe.
There’s certainly plenty of food for thought there and even if you only take on one or two of the suggestions above, you’ll make yourself much harder to attack while lessening the impact.
Having a good backup system, both locally and offsite, is important for anyone with a computer. Once you have decided to use a cloud backup, the first problem you face is the overwhelming number of options. There are two broad categories of backup: manual and automatic. Services like Dropbox or Google Drive are what I call manual backups, in that they require you to physically drop a file or folder into them. An automatic backup system is just that: you choose the files/folders you want to back up and the system backs up those files automatically, either at a specific time or interval. There are a few things I look for in a backup system: first, is it easy to use; second, when I recover a file, do I get back what I put in; third, the cost; and finally, is it trust no one (TNO) compliant. The idea behind TNO is that you and only you have access to your content, including your password and keys.
After trying Backblaze, Carbonite and Jungle Disk, the solution I have found is called Arq. Arq falls into the second category of backups in that the backup happens automatically once you have set it up.
I first heard about Arq on Security Now Episode #351: Back To The Cloud. Arq is a Mac-only backup solution, although there is an app available to view the files on iOS. Arq runs on Amazon S3 and does require you to sign up for Amazon Web Services.
Once you sign up it will give you an access key ID and secret access key, and you also have to provide a password. Make sure you keep a copy of all of these; neither Arq nor Amazon can recover them for you (I use 1Password for this purpose). Although this can be inconvenient, it is what makes Arq TNO compliant. There is a 30-day trial, during which you pay only the Amazon S3 fees. After 30 days, if you decide to continue using it, there is a $29.00 one-time licensing fee. Amazon S3 fees are 12.5 cents/GB, or 9.3 cents/GB for Reduced Redundancy Storage. They also bill you for outgoing transfers: these are free up to 1 GB/month, then $0.12 per GB from 1 GB/month to 10 GB/month, and so on, with the price per GB going down the more GBs you use. This is one of the things I like about Arq: you are only paying for what you use instead of a flat fee. As part of the sign-up process, Arq will ask how much you want to budget for backup, in $5.00 increments. You put in the dollar amount you want to spend and it will tell you how much that will back up. If you are about to go over your budgeted amount, Arq will automatically delete the oldest files. Arq does versioned backups similar to Time Machine, so it will always keep at least two versions of a backup.
You can choose which files/folders you want to back up and you can exclude specific files by name. You can back up from a network-attached storage drive, and it doesn’t delete backups from network storage devices even if you remove those devices from your network. If you can see it in the Finder, it will back it up. In fact, when I first started the backup process I noticed it was backing up my Dropbox folder, which I quickly unchecked. It does not care what type of file you are backing up. Arq can back up automatically every hour (at a specific time during that hour), on a schedule once a day, or manually, only backing up when you tell it to. You can control the transfer rate: maximum, automatic (which throttles the speed if you are transferring something else over the Internet) or a fixed rate at a specific KB per second. If you want, you can get a Growl notification when a backup is completed. Plus you can have Arq start up on login, show in the menu bar and prevent your computer from sleeping when backing up.
To restore a file or folder, you simply highlight it and then either click restore, which restores it to an Arq folder, or drag and drop the folder/file onto a Finder window. I did a test restore on an image and it worked great; the image and all the metadata restored perfectly.
I have only been using Arq for a day now but so far I really like it. It was easy to set up, I like the fact it is TNO compliant and I like the cost. If you want to share the files with someone this is not the solution you are looking for. However if you are on a Mac and are looking for a good, secure backup solution I do recommend trying Arq.
Apple’s Time Machine has been a lifesaver for many people, especially when they’ve accidentally deleted an important file. However, it doesn’t protect against fire, flood or theft when everything is lost. Enter Dolly Drive, a Time Machine-compatible cloud-based backup service.
Available as a subscription service based on data usage, Dolly Drive appears as another Time Machine target in OS X and, once set up, will store revisions and changes to the cloud, giving the security of off-site backup.
Included as part of the deal, subscribers are sent a hard drive via courier, which they use to seed their Dolly Drive for the first time and then return. This avoids a lengthy upload over broadband when the service is first started and the whole disk is copied.
Prices start at $5 per month for 50 GB, but a more representative subscription is $10 per month for 250 GB. As a bonus, 5 GB is added free each month.
Interview by Andy McCaskey and Courtney Wallin of SDR News and RV News Net.
The Japanese Government is taking contingency planning to a whole new level by proposing to build a backup city for Tokyo. Situated 300 miles to the west of Tokyo on an old airport, it would have capacity for 50,000 residents and space for 200,000 workers.
In the event of a disaster, the capital would move to this new city so that Japan could continue to function. The city is currently codenamed IRTBBC, or Integrated Resort, Tourism, Business and Backup City, and will include buildings for all the key government functions, such as parliament and ministries, but will also have general office space and entertainment complexes. It’s also planned to have the world’s tallest skyscraper at 652 m.
Hajime Ishii, a member of the ruling Democratic Party, said, “The idea is being able to have a back-up, a spare battery for the functions of the nation, isn’t this a really good idea?”
Err, yes, but wouldn’t it be more cost-effective to find a way to re-purpose existing structures in another city in the time of emergency, much like cruise ships used to be re-purposed as hospital ships?
For several years, I’ve had an original Buffalo LinkStation NAS as my main fileserver. Being on 24×7, it’s gone through several fans and at least one hard disk, but it’s now time to retire it in favour of a LinkStation Duo which will give me more space, RAID capabilities and faster transfer speeds.
Naturally, as my main fileserver, it’s backed up. However, when I copy the files to the new Duo, how do I know that they’ve all copied correctly and none have been missed? There are hundreds of thousands of files and checking each one by hand would be impractical.
Linux has lots of tools that tell me how much disk space is used, such as du, df and filelight, but they don’t always report back consistently between filesystems. Mostly for reasons of speed, they report the total size of the file blocks used to store a file and as block sizes can vary between filesystems, the total number of blocks used for a set of files will be different. For example, I have two folder sets that I know are identical and du -s on one reports 210245632 and on the other 209778684.
Fortunately, there’s an extra command line flag that will change the behaviour to take longer and sum the actual bytes. In this case, du -sb will return 214813009920 bytes on both filesystems. On the whole, I can be reasonably confident that if the total number of bytes used is the same between two filesystems, then all the files have copied correctly.
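To see the difference in miniature, here’s a quick sketch using two throwaway folders (the folder and file names are made up for illustration, not my fileserver’s real paths):

```shell
# Two demo folders with byte-identical contents
mkdir -p demo_old demo_new
printf 'hello' > demo_old/a.txt
printf 'hello' > demo_new/a.txt

# du -s sums allocated blocks, which depends on the filesystem's
# block size; du -sb sums the actual file bytes, so identical
# contents always produce identical totals
du -sb demo_old demo_new
```

On GNU coreutils, -b is shorthand for --apparent-size --block-size=1, which is why the totals become filesystem-independent.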
But what if the total number of bytes doesn’t match? How can I find the missing or truncated file? After thinking and tinkering, what I wanted was a list of files with each filesize from the old and new filesystems, and then to compare the two. And here’s how you do it (each section here goes on one line).
find /home/old_folder -type f -printf '%s %p\n' |
sed 's/\/home\/old_folder\///' | sort > old.txt
find /home/new_folder -type f -printf '%s %p\n' |
sed 's/\/home\/new_folder\///' | sort > new.txt
diff -wy --suppress-common-lines old.txt new.txt
If you aren’t used to Linux, this can look a bit scary, but it’s not really. The first two lines create the text files listing all the files and the third line compares the two. The first two lines are much the same in that they run the same commands but on different filesystems. There are three sections: find to list all the files, sed to chop the directory path off, and sort to get all the files in some sort of order. Here are some explanations.
find – finds files
/some/folder – where to start finding files
-type f – only interested in files (not directories or links)
-printf '%s %p\n' – only print the filesize (%s) and the full pathname (%p) on each line (\n)
sed – processes text
's/x/y/' – replace x with y. In our instance, it’s replacing the leading folder path with nothing. It looks worse than it is because the slashes in the path need to be escaped by a preceding backslash, so you get these \/s everywhere.
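As an aside (not part of the commands above), sed accepts almost any character as the delimiter after the s, which sidesteps the escaping problem entirely:

```shell
# Using | as the sed delimiter means the slashes in the path
# no longer need a backslash in front of them
echo '1234 /home/old_folder/docs/a.txt' | sed 's|/home/old_folder/||'
# prints: 1234 docs/a.txt
```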
sort – sorts text
> file.txt – copy everything into a text file.
One of the clever things about Unix-like operating systems is that you can pass information from one command to another using a pipe. That’s represented by the | symbol, so the find command gets the information on files and files sizes, passes it to sed to tidy up which then in turn passes it on to sort before sending it to a file.
After running this set of commands on the old and new filesystems, all that needs to be done is to compare the files. Let’s look at the third and final command.
diff – compares files line-by-line
-w – ignore whitespace (spaces and tabs)
-y – compare files side-by-side
--suppress-common-lines – ignore lines that are the same
old.txt new.txt – the two files to compare
So what might the output of the diff be? If all the files copied correctly, you’d get absolutely nothing. Other possibilities are that the file partially copied or didn’t copy at all. Here’s what the output might be like.
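As a sketch, here are two tiny hypothetical file lists reproducing both failure cases (a truncated file and a missing file), and the diff they produce:

```shell
# old.txt: epson3.txt was 598 bytes and onstream.txt existed
# new.txt: epson3.txt was truncated to 500 bytes and onstream.txt
#          didn't copy at all
printf '598 docs/epson3.txt\n1024 docs/onstream.txt\n' > old.txt
printf '500 docs/epson3.txt\n' > new.txt

# diff exits non-zero when it finds differences, which is expected here
diff -wy --suppress-common-lines old.txt new.txt || true
```

The changed file shows up with a | between the two versions, while the file that exists only in old.txt is marked with a < arrow.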
The numbers at the beginning of each entry are the number of bytes, so the first line shows that epson3.txt is only 500 bytes long in the new file but 598 in the old. The second line shows that onstream.txt is present in the old file but not in the new, as the arrow points to the old file.
To close the story, did I find that I had lost any files? Yes, I did. I discovered a couple of small files that hadn’t copied at all because of non-standard characters in their filenames. The filenames were acceptable to Windows but not Linux and I’d used my Linux PC to do the copying. Fortunately, the files were saved and all the scripting was worth it.
In this computer day and age, you want to make sure pictures, documents and more are backed up on a daily basis, especially with hard drives failing and notebooks being stolen every day. Even if you get your stolen computer back, the thieves might have done damage like wiping the hard drive or dropping the machine.
I back up my machines to different sources. I have 2 home backups and one cloud backup. The cloud backup can also be a great way to share pictures through an iPhone, Android or iPad application. Nonetheless, if my computer drive dies, if there is fire or water damage, if someone steals my computer, I don’t lose the data.
Advantages of Cloud-Based Backup
Off-site data retention
Access to data from multiple computers or mobile devices
Software that starts backing up files when the computer is idle
Strong encryption on backups
Cloud-based services run their own backups – your data gets backed up by them, too!
Prices are low – there are some services that are free up to a certain level.
So here are five cloud backup solutions you can employ now:
MyPCBackup
MyPCBackup was ranked #1 by Top 10 Online Backups. With unlimited backup, you can make sure all your pictures, documents and more are safe. They have an option to sync multiple computers through a folder on the machine. They also offer a money-back option and full customer support.
Mac users won’t be able to use this program, which is a major downfall given that the “1 PC, 1 Mac” per household ratio is growing. There is also no mobile application for viewing your backup files.
Carbonite Online Backup
One of the more popular online backup systems is Carbonite. This unlimited backup service lets you view your files from any computer, or through an iPhone, Blackberry or Android application. Carbonite also has an accidental deletion option, which means if you delete a file that was backed up, you have 30 days to restore it.
Carbonite also works on both PC and Mac systems. Therefore, you could back up all your items from the old computer, switch the program to Mac and download the files. There is full customer support if needed.
Mozy Online Backup
Mozy online backup also supports PC or Mac. You can access your data from an iPhone or Android (no Blackberry) application. They also offer on-site backup, which means you can set up a USB hard drive and Mozy will back up to that as well. They provide full support and also offer the ability to restore a file up to 30 days after it’s deleted.
Mozy does not offer unlimited backup, though: $5.99/month for 50 GB and $9.99 for 125 GB. For a standard computer, you might not fill that space – unless you take tons of pictures or create video like I do.
Dropbox
You might not think of Dropbox as an online backup. However, with their open API, there are a lot of things that Dropbox can become. You can use it to sync with other computers, so it’s a great collaborative tool.
Keep in mind, it doesn’t tout itself as an online backup. You do get some great options, like short-term backup recovery.
They do have a free service for the first 2 GB, then pricing options after.
iBackup
iBackup is a backup solution not only for PCs and Macs, but also servers. You can also back up MySQL, Linux boxes, Exchange, Oracle and iPhone. Therefore, it’s a full corporate backup solution. It gets pricey fast, but if you need 1.5 terabytes of data backed up, this is a great solution.
They also have a “30 version” backup – If you change a document 29 times, then realize you need to start from the beginning, you can go into your account and pull up the first version. That can help if someone messes up a document and you don’t realize it right away.
There are other online backup options, too. This includes items like PogoPlug or Drobo, where the files are stored at your home site. Elephantdrive, Livedrive, KeepIT are some other online backup tools.
Since it’s your data, you should research each program and see if it’s right for you. But if you have important data and need to back it up, then these options might help you with that solution.
There are also some great software solutions if you want to back up your computer to another computer. If you are on a Mac, then Time Machine is a feature you can set up, handling everything from backing up folders to making a full image of your computer to restore at a moment’s notice.
Most important – Back it up now, because you never know if you’ll get a chance to back it up later.