I'm writing this post on a "vintage" MacPro, model 1,1. I bought it in January 2007, so it's already 4 years old. How horrible! :)
There are ways to make your desktop Mac (MacPro) and laptop (MacBookPro) run faster without buying new hardware. In fact, I recommend buying an OLD Mac from a reliable source (refurbished units on the Apple Store, Apple resellers, display equipment, eBay, geek friends, etc.) and refurbishing it for maximum performance; it will usually cost you less. I still have no reason to buy a new MacPro, as I never max out the CPU (check with Activity Monitor), and everything I do with the tips below runs fast enough for me. Should I one day need a faster MacPro, I could just unplug my disks and replug them ;) although I would need to replace the extra memory, as the new machine would run it on a faster bus clock.
These are the tricks I've used so far, and I recommend most of them ;). I've made extensive use of the resources on macperformanceguide.com (MPG later in this post), and I thank the author of that site for them.
1) Use the latest version of Mac OS X for best performance. Old PowerPC Macs (not recommended anymore) can only run OS X Leopard; Intel Macs should use Snow Leopard. Keep them current with Software Update. I thoroughly recommend tweaking the system afterwards with Onyx, a utility that lets you play with several settings. MPG also has great tips on improving speed.
2) Add the maximum possible memory: 4GB on a MacBookAir, 8GB on a MacBookPro, and 12 to 16GB minimum on a MacPro. Prices have fallen drastically; get it from MacSales.com or Macway.fr. There's a little-known mechanism in OS X: the first time you launch a program, the system loads it from the hard drive, which takes a few seconds. The next time you launch it, the system finds it first in memory (the disk cache), so it starts much faster. So if, like me, you seldom reboot your computer (you just close and open the lid), launching a program will usually be very fast. This is why a lot of memory makes sense. You can monitor how much memory you are using with Activity Monitor. However, if you launch too many programs, you'll saturate your memory: on a laptop it will start "swapping" back to disk, you lose the benefit, and you'll need the SSD boost described below. On a desktop, just add more memory.
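You can also check swapping from the Terminal instead of Activity Monitor. A small sketch (the `pages_to_mb` helper is my own, not a built-in):

```shell
# On a Mac, vm_stat prints paging activity; a steadily growing "Pageouts"
# number means RAM is saturated and the system is swapping to disk:
#   vm_stat
#   sysctl vm.swapusage    # shows total/used swap space directly

# vm_stat counts in 4096-byte pages; this helper (my own convenience
# function, not part of OS X) converts a page count to megabytes:
pages_to_mb() {
  echo $(( $1 * 4096 / 1024 / 1024 ))
}

pages_to_mb 262144   # 262144 pages is 1024 MB, i.e. 1GB paged out
```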
3) Add an SSD boot disk. Most tests show the boot speed of a computer with and without an SSD. I don't care about those 20 seconds gained, as I seldom reboot. However, as mentioned above, if you saturate the 8GB of memory on your laptop, you'll start swapping. Swapping on an SSD is so much faster that you will barely feel it. And the initial launch of your apps will be much faster too. Overall, an SSD is a much better experience. Check MPG for discussions of SSD quality.
On a MacBookPro you can replace the DVD drive with an SSD and thus run 2 disks in it. On a MacPro you can add an SSD in the second optical bay. Ideally you want to separate your boot drive from your data drive, and also move your home directory from the boot drive to the data drive. The SSD should come first, as your boot drive.
4) Defragment your drive regularly. I use Drive Genius 3 (I've had problems with other programs). Although OS X is advertised as not having fragmentation issues, I found that the system only optimizes files below 20MB. Since my photo and video files tend to be bigger than 20MB, I do get fragmentation. I wrote a post a while ago on how to defragment your disk.
5) Upgrade your components:
- I use 2x24" display screens from DELL (model 2405FPW) on my MacPro. They were at the time BETTER and CHEAPER than the displays from Apple. MPG has recommendations for current screens. Mine still work great, hence you might want to buy old screens from eBay. You might need a calibration tool such as Spyder (I'm getting one soon) for accurate colours, particularly, if, like me, you deal a lot with photography.
- I also upgraded my internal video card (to an Nvidia GeForce 8800GT) because I was getting glitches after upgrading to Snow Leopard. I have no need yet for a top-notch video card (I don't play games or do video rendering).
- Finally, I added eSATA (SATA II) controller cards to my MacPro, and only use external disks with this technology: at 3Gb/s, it's faster than the 800Mb/s of FireWire 800, the 480Mb/s of USB2, or the 400Mb/s of FireWire 400. One caveat though: if you hook up an external disk via an internal SATA cable and then eject or turn off the disk, you'll have to reboot your system. This is why I also have a PCI eSATA card, so that I can occasionally turn off a disk (although even then it's not a great idea).
- MPG has info on how to upgrade your CPU on some newer MacPro models.
As usual, don't forget to fully back up your Mac before doing anything. I recommend the free utility SuperDuper for this.
Here are my current configs :
As more and more of our lives become digital, we've come to rely on digital files to store and represent all that we love: music (digitized from old CDs or bought online from iTunes), video (ripped from legally bought DVDs, or bought online on iTunes), pictures (taken with your great DSLR or mobile phone), and soon (e-)books (for the Kindle or iPad), etc.
Haven't you wondered whether the book you just bought will still be readable by your great-grandchildren (I have books from the mid-19th century!), or whether the picture you just took will hang, digitally framed, on a wall of theirs (I have pix from the end of the 19th century in my living room)? (The pix here shows my great-great-grandparents crossing to Chile on one of their trips, over 100 years ago.)
I've been bothered by the problem of data durability for quite some time (if you browse this blog back a few years, it's a persistent theme). It's actually three separate problems:
- data integrity: this is what we usually call data backup. How do you maintain the integrity of your files on your current system?
- data persistence: this goes further. What happens when you change computers (or devices)? When technology changes? When generations change? See, I started collecting my data in the early 80s on 5.25" diskettes (remember those?), then moved it to 3.5" diskettes (I still have a drive that reads them. Do you?). Then I burned it to CDs (which have a limited life span!), then moved it to DVDs. Currently all my data lives on external hard disks. What's next? The only answer is to keep moving my data onto new technologies before the previous one becomes obsolete. I do worry, though, about keeping a. the data, b. the software, and c. the devices needed to access it all. One step I'm taking here is to start printing books with my pictures.
- data repository unicity: the two points above assumed you keep all your data in one place, but that is not so. You also produce very interesting and/or important data elsewhere, on social sites and networks: your blog, Twitter, Facebook, Gmail, etc. I might come back to this point in another post, but online solutions such as Backupify are spot-on and very useful.
So back to data integrity. The key criteria for a backup strategy should be :
- reliability vs. availability: do I want my data AVAILABLE (i.e. accessible) ALL the time (such as my posts on my Facebook page, and I don't even know how that one is handled), or my data SAFE (a reliable, hence usable, copy)?
The best solution for 100% (or close) up-time is mirroring 2 or more disks (RAID 1), so that you get an exact extra copy on the fly. But it has drawbacks: a. it's expensive at large volumes and may take a performance toll on your system; b. if you save a corrupted file (happens all the time), you're saving several copies of a corrupted file. Not what you want.
Reliability means you are guaranteed a recent enough copy of your data, or even several copies scattered over time, so that you can roll back to an uncorrupted version. Restoring an old copy might take time (fetching an archive tape from somewhere, rebuilding a directory from incremental backups). => In my case, my personal data (videos, pictures, etc.) doesn't change much once stored: I usually only add new data and seldom rework old data (modifying old pix, for example). Hence I don't really need incremental backup for my data, only for my system and some office documents.
- usability, or should I say simplicity: most backup strategies fail because they become too complicated for home use. You can impose procedures in a corporation, but at home they are hard to follow. Therefore the most automated system, with automatic alerts, is probably the best way to go.
I also don't like software that produces backups that are not immediately readable by the operating system (i.e. that require restoring a proprietary archive). Copies of my files that I can read just by plugging the external drive into another machine are great; it should be as simple as inserting a data DVD. This last point becomes crucial with large data volumes: I'm currently working with more than 2TB, which exceeds the largest single disk on the market. TimeMachine will only "back up" to ONE disk, then start deleting old copies; I don't want that. I could create a larger volume by combining several external disks (JBOD-style concatenation, or RAID 0 striping), but that is a problem in itself: if one disk in the pool fails, all the data is lost. And if I want to re-read the data elsewhere, I need to set up all the disks just to read one file. Hence that option is out.
=> Therefore I need a solution that lets me back up (sync?) specific directories to specific disks. Ideally, a program would just work like this: here are all the files I want to back up (from different disks, partitions, RAID arrays, directories, etc.) to this pool of disks, and the program slices the job up automatically without asking for more. The catch is that I would need to maintain a "master" disk with the logs and the index of all files. Restoring a file (smaller than 2TB, of course, which might become a problem since my picture library for just last year is already almost 0.5TB) would then require at worst 2 disks: the index disk and the data disk.
All backup strategies should include a combination of the following :
- on-site and off-site backups separated by a reasonable distance (think natural disaster like the earthquake in Chile, fire, theft, etc.): we NEVER do this, but we should, as these days our data is worth far more than the computer itself. Backing up ONLINE is an alternative for remote off-site storage, but for me it would require a very large volume (TBs, not GBs) and a very, very fast line (fiber), unless I want a year-long backup. So I will put my most precious data (pix, family videos, document archives from my work) on an external disk and store it elsewhere, just in case, on a monthly basis at best. This is called a time-stamped archive. There's no point archiving data you can get back some other way.
=> off-site archive is key
- a grandfather/father/son strategy, more usually known as monthly / weekly / daily backups (or week / day / hour, etc.). The idea is to rotate between different copies of a disk, because sometimes your data gets corrupted and you need to go back in time. I would keep one full backup on-site, next to my computer. However, when you make a new backup you're overwriting the current one, so while the backup is being rebuilt there is no valid backup at all...
=> hence you need at least 2 rotating backup sets.
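A minimal sketch of such a 2-set rotation, alternating by week number (the volume names are examples; adjust to your own disks):

```shell
# Alternate between two backup sets so a failed backup never destroys
# the only existing copy: even weeks go to SetA, odd weeks to SetB.
week=$(( $(date +%s) / 604800 ))   # whole weeks elapsed since the Unix epoch
if [ $(( week % 2 )) -eq 0 ]; then
  dest="/Volumes/BackupSetA"
else
  dest="/Volumes/BackupSetB"
fi
echo "This week's backup goes to $dest"
```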
There is an incredible number of backup programs out there for the Macintosh. Probably the only way to choose the right one is to FIRST decide on selection criteria, then narrow down the list by:
- supported features
- price
I'll test out a few programs in a second post, then in a final post I'll describe the whole final setup and proper procedures.
Here are the first error reports from my disk tests (see posts in the past few days):
1) power supplies: two of my external disks (LaCie) were showing time-outs when I tried copying data off them. I tried tons of things. Basically, the power supplies of both were kaput (these were my 2 oldest external disks). I replaced them (at a HEFTY price) and got my data back with no problem whatsoever. So always check the power supply of your disks and try that as a cure first. There's no way to know without a spare GOOD one around from another disk (keep in mind I had 2 that didn't work). After I contacted LaCie support, they sent me a spare... but I had already bought 2 new ones. So contacting support might help. Now one of the disks is showing difficulties again, but I lost no data; I guess that disk's lifetime is over.
2) S.M.A.R.T. status: I didn't manage to recover the data on another of my disks, although I tried everything (new power supply, Data Rescue). It seems the disk is physically damaged. There is a standard, called S.M.A.R.T., that can alert you to this. The problem is that you can't read the status when the disk is attached with an external FireWire (400 or 800), USB, or eSATA cable. I "hacked" my MacPro by routing 2 unused internal SATA ports to an external port (in the USA get the adapter from MacSales, in France from Macway). By powering up the disk first (always, with this setup), then the computer, you get to read the status as if it were an internal disk. And guess what? S.M.A.R.T. status = FAILED!!! There was indeed a problem; a full surface scan showed it too. LaCie online support is willing to repair or exchange the disk as long as I pay for one-way shipping. Probably a good choice, as it will cost me only 10% of the price of a new disk. Still, it sucks, as the disk is only a few months old.
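Once the disk is visible on an internal SATA port, reading the status from the Terminal is a one-liner (disk identifiers vary per machine; `disk1` below is just an example):

```shell
# List attached disks and their identifiers (disk0, disk1, ...)
diskutil list

# Show the S.M.A.R.T. status line for a given disk (example identifier).
# A healthy disk reports "Verified"; a dying one reports "Failing".
diskutil info disk1 | grep "SMART Status"
```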
3) bad blocks: most disks have a few bad blocks. The only way to deal with them is to zero out the disk completely with a full erase (Disk Utility): whenever the drive can't write to a block, that block gets mapped out of the file system. It's a must-do step for every new disk.
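The same zeroing can be scripted with diskutil from the Terminal. This is DESTRUCTIVE, and `disk2` is only an example identifier; double-check yours with `diskutil list` first:

```shell
# Write zeros over the ENTIRE disk, forcing the drive to remap any bad
# blocks it finds along the way. This destroys all data on disk2!
diskutil zeroDisk disk2

# Then reformat it for use (journaled HFS+, example volume name)
diskutil eraseDisk JHFS+ "NewDisk" disk2
```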
4) driver updates: finally, I've been trying to use my LaCie eSATA cards to drive my external disks, and I've been having regular kernel panics. I contacted support for an updated driver. There is one (2009) that is not listed among the drivers for the PCIe card (2008 only). The bad news is that it is still unstable and crashes my MacPro whenever I do something fancy, such as a RAID 0 stripe across external disks. I've nailed it down to the card and its driver, since connecting the same disks via FireWire only, or via my internal SATA extender (see point 2 above), doesn't produce the problem. The only way out seems to be a good eSATA card from someone else (Firmtek or Sonnet), hopefully one that doesn't need any driver (plug & play).
Bottom line: test your disks extensively when you FIRST get them (zero them out with Disk Utility, then surface-scan them with Drive Genius to check everything is OK) to prevent any data disaster in the future. And set up a slave backup disk (also tested) for your master disk; TimeMachine is good enough for this.
I've been having problems with my external hard drives for a while. A few days ago I described the new setup I'm putting in place. I still have to build an external RAID 0 set for backup, using TimeMachine. In the excitement over my new Hitachi disks, I didn't INVEST the time to test them thoroughly, but I'm doing that now with my backup disks.
Data safety is not an easy problem, and I wish there were real solutions out there, including full online hosting for my disks when fiber gets to my place. Any joyful entrepreneur up to the task ?
Screenshots of work in progress :