
cb911

macrumors 601
Original poster
Mar 12, 2002
BrisVegas, Australia
I was just backing up my DB yesterday and saw a guide that said to use zip or gzip if you've got a large database. So - what qualifies as a large DB? A few hundred MB? A GB? :eek:

On the subject of DBs - how large would the DB for a forum like MR be? I'm guessing it'd be up in the tens of GBs, possibly?
 
The question is almost as meaningless as "how big is big?"

I suspect that the Social Security Administration of the United States has a database of every social security number ever allocated, to whom it was allocated, that person's name, all their addresses over time, all their employers over time, all the soc sec payments made by those employers over time, current status, and so on. And therefore of every company that has ever had an employee: who all their employees were each year, when they started, what their salary was each year, and what soc sec taxes were paid on their behalf each year. And of course every social security check ever cut, its serial number, to whom it was sent, at what address and on what date, and whether it was cashed, deposited, or reported lost or stolen.

That one probably isn't very small, although it doesn't contain many 'blobs'. A 'blob' — binary large object — can be an image such as a JPEG, a sound file such as an MP3, or a movie such as an MPEG. It can even be another computer file or set of files, and therefore another entire database contained within a single field of a single record of the current database. I do not know if Sony Records has a database containing the track name, album name, performer, date, serial number, and the actual audio track itself, but they very well might. And for all I know, Paramount has one of every one of their film titles, properly digitized, with scripted routines to auto-burn that content to DVD at any time or stream it for "Pay Per View".
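To make 'blob' a bit more concrete, here's roughly what storing one looks like in MySQL. This is just a sketch - the table, column, and file names are all invented, and LOAD_FILE() needs the FILE privilege plus a file the server itself can read:

mysql -u someuser -p music_db -e "
  CREATE TABLE tracks (
    id INT AUTO_INCREMENT PRIMARY KEY,
    track_name VARCHAR(255),
    album_name VARCHAR(255),
    audio LONGBLOB
  );
  INSERT INTO tracks (track_name, album_name, audio)
  VALUES ('Some Track', 'Some Album', LOAD_FILE('/tmp/some_track.mp3'));"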

It isn't farfetched to posit databases of several petabytes.
 
Well, it's really up to you - you can zip it if you want to save some space, but it takes time to zip and unzip. It's really your choice. I would say if it's for long-term storage and it's over 100 MB, then zip it.
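For example, if it's a MySQL database you can pipe the dump straight through gzip so the uncompressed file never even hits the disk (the user and database names below are just placeholders):

mysqldump -u someuser -p my_forum_db | gzip > my_forum_db.sql.gz

# to restore it later
gunzip < my_forum_db.sql.gz | mysql -u someuser -p my_forum_db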
 
ahunter3 said:
It isn't farfetched to posit databases of several petabytes.

:eek:

Well, my DB was only around 77 MB, but it took a bit of time to display in the browser. Next time I will zip it and just download it.

Interesting about those 'blobs' as well. I've never heard that term before.
 
I tend to gzip all my database dumps regardless of size, but for the larger ones (100 MB+) I use bzip2, which I find compresses them about twice as well as gzip (although it takes ages). Not sure if there's a GUI for it, but you can use it in Terminal.
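Something like this, if you haven't used them before (dump.sql is just an example filename; -9 is maximum compression, and both tools replace the original file with the compressed one):

# with gzip (replaces dump.sql with dump.sql.gz)
gzip -9 dump.sql

# or with bzip2 instead (produces dump.sql.bz2 - smaller, but much slower)
bzip2 -9 dump.sql

# to get the original back later
gunzip dump.sql.gz
bunzip2 dump.sql.bz2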

I'm pretty sure the MR DB is 'only' a few GB - under 5 GB, I think. I've been working with a database that, in my experience at least, is huge: 58.3 GB. It's a list of every vehicle registered in the UK, along with lots of other details. On my PC it takes about 6 minutes to run a query that searches on a non-indexed column.
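An index on the column you search by makes a huge difference, although building one on a table that size takes a while. Roughly, assuming a MySQL-style database (the table, column names, and example plate are invented for illustration):

mysql -u someuser -p vehicles_db -e "
  CREATE INDEX idx_registration ON vehicles (registration);"

# after that, queries like this can use the index instead of scanning the whole table
mysql -u someuser -p vehicles_db -e "
  SELECT * FROM vehicles WHERE registration = 'AB51ABC';"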

I remember reading something in relation to the Oracle database software: a small DB is one under 200 GB, medium is 200 GB to 1 TB, and over 1 TB is large.

World's largest commercial database: 100 TB.
 
cb911 said:
Interesting about those 'blobs' as well. I've never heard that term before.

blobs
 