Wikipedia stores the complete revision history of each page in snapshot form, not in differential form. While this allows quick diffs between arbitrary revisions, it makes the storage requirements quadratic in the number of revisions: since an article's text tends to grow as it is edited, storing a full snapshot of every revision costs roughly the sum of all those growing sizes. And, in practice, the Wikipedia XML dump files are absolutely gigantic: 14 gigabytes uncompressed for the 100,000-article Turkish wiki; 290 frigging gigabytes for the 600,000-article French wiki; the English wiki, with its 2,000,000 articles, is 133 gigabytes bzip2-compressed, and I don't know how big it really is uncompressed.
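To make the quadratic claim concrete, here is a back-of-the-envelope sketch; this is my own toy model, not anything Wikipedia actually runs, and the per-edit growth figure is made up. If an article grows by a fixed number of bytes per edit, full snapshots cost a triangular (quadratic) sum of bytes, while diffs cost a linear one:

```ocaml
(* Toy model: revision i is about i * delta bytes, so storing every
   revision as a full snapshot costs delta * n * (n + 1) / 2 bytes
   (quadratic in n), while storing diffs costs about delta * n bytes. *)

let snapshot_bytes ~delta ~revisions =
  delta * revisions * (revisions + 1) / 2

let diff_bytes ~delta ~revisions =
  delta * revisions

let () =
  (* delta = 50 bytes per edit is an arbitrary, illustrative figure;
     2,393 is the revision count of the article discussed below. *)
  let delta = 50 and revisions = 2_393 in
  Printf.printf "snapshots: %d bytes, diffs: %d bytes\n"
    (snapshot_bytes ~delta ~revisions)
    (diff_bytes ~delta ~revisions)
```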
Now you might think that distributing the articles of each Wikipedia over a number of different servers would take care of it; after all, the individual articles aren't that big. But while each snapshot may not be too big, some articles have many, many snapshots.
I will give you some precise figures from the latest dump of the French Wikipedia, taken on February 15th, 2008.
The Napoléon Ier article is ~120,000 bytes long and has 2,393 revisions. The XML for those 2,393 revisions in snapshot form, gzipped with compression level 2, is 80,801,744 bytes; uncompressed it is about 200,000,000 bytes. In differential form, all that information measures a meager 5,310,701 bytes as a marshalled OCaml object; gzipped, that file takes 1,370,456 bytes. Basically, Wikipedia has a ×40 overhead and is wasting 97% of its uncompressed storage.
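For the curious, here is the arithmetic behind that last sentence as a tiny OCaml snippet; the sizes are the ones quoted above, only the division and the rounding are mine:

```ocaml
(* Sanity-check of the overhead figures: uncompressed snapshot XML
   versus the marshalled differential representation. *)
let () =
  let snapshot_uncompressed = 200_000_000.   (* approximate, see above *)
  and diff_marshalled = 5_310_701. in
  Printf.printf "overhead: x%.0f, wasted: %.0f%%\n"
    (snapshot_uncompressed /. diff_marshalled)
    (100. *. (1. -. diff_marshalled /. snapshot_uncompressed))
```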