January 8, 2002
11:53 AM   Subscribe

Florida company announces a breakthrough in compression. They say they have managed to compress random data losslessly, 100 times over. Cold fusion or huge discovery? Guess we'll have to wait and see.
posted by zeoslap (35 comments total)
here is their press release
posted by machaus at 12:00 PM on January 8, 2002

Scam. Or more charitably, self-delusion.
posted by jaek at 12:02 PM on January 8, 2002

From the article:
...his company's technique challenged the foundations of digital communications[sic] theory spelled out by AT&T's Dr. Claude Shannon
Sounds like the specious claims of pseudoscience to me on the basis of this alone. Violate "Shannon's Law", do not pass Go, straight to jail.
posted by majick at 12:03 PM on January 8, 2002

I am especially impressed by the terminology; their opening Flash animation mentions "connecting the world through multi-dimensional mathematics". Amazing.
posted by demannu at 12:05 PM on January 8, 2002

This could really prolong the life of dialup! Great news (if proven true) to those of us who live in the sticks.

Is it just me or does the phrase "practically random information sequences" kind of disturb you? I don't know of any way to pick a truly random number, but that sounds like a bit of a setup.
posted by revbrian at 12:06 PM on January 8, 2002

The question is, how much processing power does it require and how efficient is this algorithm?
posted by SpecialK at 12:08 PM on January 8, 2002

Total, steaming offal.
posted by jdc at 12:18 PM on January 8, 2002

PS: Publish Details or Shut The Hell Up
posted by jdc at 12:19 PM on January 8, 2002

I also wonder how fast this algorithm could be, but more broadly, I doubt it exists at all, particularly for smaller data strings.

Compression technologies rely on the fact that most data we transmit is ordered in some way: there are common patterns within the data. Computers can exploit those patterns, because the receiving end can recreate a pattern without being sent every detail bit by bit, so the message can be transmitted in fewer bits than it represents.
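To make that concrete, here's a minimal run-length-encoding sketch in Python (an illustration of the general idea only, nothing to do with ZeoSync's claimed method): it wins big on repetitive data and actually loses on data with no runs at all.

```python
def rle_encode(data: bytes) -> bytes:
    """Encode as (count, byte) pairs; runs longer than 255 are split."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

ordered = b"A" * 50 + b"B" * 50   # highly patterned: two long runs
print(len(rle_encode(ordered)))    # 4 bytes: two (count, byte) pairs
mixed = bytes(range(100))          # no repeated runs at all
print(len(rle_encode(mixed)))      # 200 bytes: the encoding *doubles* it
```

Same algorithm both times; the outcome depends entirely on whether the input has structure to exploit.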

But the best compression method depends largely on the type of data being transmitted. JPEG, MPEG, and MP3 all rely on humans' inability to detect tiny differences in sound and pictures, on the brain's ability to fill in gaps, and on their media's tendency to repeat things over and over. But those algorithms aren't lossless.

But compressing random data? It's not possible (not that you'd want to). Over the long term, you'd end up with at least as much overhead as you saved by compressing recurring patterns. There's no redundancy in random data, and thus there are no patterns to take advantage of.
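A quick sanity check with Python's zlib (a real DEFLATE implementation) makes the point empirically: feed it patternless bytes and the "compressed" result comes out slightly larger than the input.

```python
import random
import zlib

random.seed(42)  # any seed works; the point holds for any patternless input
noise = bytes(random.randrange(256) for _ in range(100_000))

packed = zlib.compress(noise, level=9)  # best effort, still futile
print(len(noise), len(packed))  # output is a little *larger* than the input
```

DEFLATE falls back to storing incompressible blocks verbatim, so all you gain is header and checksum overhead.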

The press release is misleading at best.

[by the time I typed this: what skallas said]
posted by daveadams at 12:24 PM on January 8, 2002

From their site:

This three-dimensional world limitation can however be resolved in higher dimensional space. In higher, multi-dimensional projective theory, it is possible to create string nodes that describe significant components of simultaneously identically yet different mathematical entities. Within this space it is possible and is not a theoretical impossibility to create a point that is simultaneously a square and also a cube. In our example all three substantially exist as unique entities yet are linked together. This simultaneous yet differentiated occurrence is the foundation of ZeoSync's Relational Differentiation Encoding™ (RDE™) technology. This proprietary methodology is capable of intentionally introducing a multi-dimensional patterning so that the nodes of a target binary string simultaneously and/or substantially occupy the space of a Low Kolmogorov Complexity construct. The difference between these occurrences is so small that we will have for all intents and purposes successfully encoded lossley universal compression. The limitation to this Pigeonhole Principle circumvention is that the multi-dimensional space can never be super saturated, and that all of the pigeons can not be simultaneously present at which point our multi-dimensional circumvention of the pigeonhole problem breaks down.

I am a geek. I have a mechanical engineering background. I have taken post-calculus collegiate math. I also have a degree in English.

Damned if I can tell what they're saying.

Unfortunately, I suspect that may be intentional.
posted by NortonDC at 12:25 PM on January 8, 2002

<old_joke>Unfortunately, they're still working on a way to uncompress it.</old_joke>
posted by RavinDave at 12:29 PM on January 8, 2002

I don't know of any way to pick a truly random number...
There are lots of ways. I'm particularly fond of using the quantum uncertainty of radioactive decay intervals when I need a (literally) high-entropy chunk of bits for something.
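For bits that are merely cryptographically unpredictable rather than quantum-sourced, the everyday stand-in is the operating system's entropy pool; in Python, for instance:

```python
import os
import secrets

# The OS mixes hardware noise (interrupt timings, device jitter, etc.)
# into an entropy pool; these calls draw from it. Not radioactive decay,
# but the same spirit: unpredictability from physical processes.
key = os.urandom(16)           # 16 bytes suitable for cryptographic use
pick = secrets.randbelow(100)  # an unpredictable integer in [0, 100)
print(key.hex(), pick)
```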
posted by majick at 12:32 PM on January 8, 2002


"This press release may contain forward-looking statements. Investors are cautioned that such forward-looking statements involve risks and uncertainties, including, without limitation, financing, completion of technology development, product demand, competition, and other risks and uncertainties."
posted by moss at 12:32 PM on January 8, 2002

Bah, it's obsolete already compared to the power of CENTRA.
posted by holloway at 12:46 PM on January 8, 2002

"ZeoSync said it had applied for patent protection for a technology it calls Zero Space Tuner".

This sounds suspiciously like Star Trek technobabble. I'm waiting until they perfect the Flux Capacitor and the Gravity Drive before I invest.
posted by phatboy at 1:11 PM on January 8, 2002

I wonder if their technique involves removing all the ™ characters from press releases? That would compress theirs quite a bit!
posted by bschoate at 1:11 PM on January 8, 2002

NortonDC...at first i thought you had posted a quote from that Time Cube guy.
posted by th3ph17 at 1:16 PM on January 8, 2002

It works. Here is the meaning of life, the universe, and everything run through ZeoSync's Relational Differentiation Encoding™ (RDE™) technology: 42
posted by O Boingo at 1:17 PM on January 8, 2002

Their claims are impossible. See the comp.compression FAQ for a proof.

Also, check the Data Compression Library for more about it. (It's a wonderful place to learn about compression! (which is a fascinating topic!))
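The FAQ's proof is just the pigeonhole principle, and the counting fits in a few lines of Python:

```python
# The comp.compression FAQ's counting argument, made concrete.
# There are 2**n distinct n-bit strings, but only 2**n - 1 strings
# shorter than n bits, so no lossless scheme can shrink every input:
# some two inputs would have to map to the same output.
n = 16
inputs = 2 ** n                          # 65536 possible n-bit inputs
shorter = sum(2 ** k for k in range(n))  # strings of length 0 through n-1
print(inputs, shorter)                   # 65536 vs 65535: one pigeon too many
```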
posted by sonofsamiam at 1:21 PM on January 8, 2002

Well, the MeFi consensus is clearly dubious about ZeoSync's claims. If they turn out to be on the level, it will be a sharp blow to MeFi's reputation as a clearinghouse for the scientific and technical world(s).
posted by BentPenguin at 1:30 PM on January 8, 2002

O Boingo, that's the answer, but what's the question?
posted by David Dark at 1:55 PM on January 8, 2002

<another_old_joke> DEL *.* = 100% Compression </another_old_joke>
posted by pheideaux at 1:57 PM on January 8, 2002

majick - I was referring more to my limited understanding of codesmithing.
posted by revbrian at 2:09 PM on January 8, 2002

Damned if I can tell what they're saying.

Same here. It was either written by a businessman with no actual theoretical background or by a researcher who's way too close to the problem to know what needs explaining and what doesn't. Either way, it's appalling English: simultaneously identically yet different mathematical entities?

On the plus side, at least I found out about the Pigeonhole Principle and Kolmogrov Complexity today.
posted by MUD at 2:24 PM on January 8, 2002

Here is a related product.
posted by jeblis at 2:50 PM on January 8, 2002

MUD: it's KolmogOrov -- and I was just looking at a great site on the topic of descriptive complexity a few days ago, but now I can't find it. Oh well ..
posted by sylloge at 3:15 PM on January 8, 2002

Wow, that's amazing! And imagine if they ran the compression algorithm several more times on the same data: it would be 10,000 times smaller yet. Ha.
posted by Real9 at 3:23 PM on January 8, 2002


"This press release . . . other risks and uncertainties."

What's so funny about a Safe Harbor statement?
posted by yerfatma at 5:07 PM on January 8, 2002

Ha! Good one, Real9. Enough times and they would end up with a file that was just two bits: a one and a zero.
posted by pheideaux at 7:13 PM on January 8, 2002

I still remember a company that claimed it could compress any data file by a factor of 64.

Figure it out :-)
posted by Orik at 8:12 PM on January 8, 2002


I wonder if this will prove as chimerical as Adam Clark's video compression claims...
posted by dhartung at 12:35 AM on January 9, 2002

I am not a geek, do not have a college degree, and never progressed beyond high-school calculus (and I was either stoned or sleeping through most of that).

But isn't there a Law out there somewhere that states that any perfectly compressed signal is indistinguishable from white noise?

But I'm confusing cryptography and data compression now, aren't I?

(This is what happens when Neal Stephenson novels fall into the hands of the Great Unwashed.)
posted by BitterOldPunk at 9:20 AM on January 9, 2002

BitterOldPunk: don't know about that law, but any science, sufficiently advanced, is indistinguishable from magic (what happens when an Asimov novel falls into the hands of the hoi polloi).
posted by RichLyon at 11:41 AM on January 9, 2002

RichLyon, I think you mean Clarke, not Asimov. I love petty fact-checking. I wonder what Clarke's First & Third Laws are? Hmmm.....
posted by BitterOldPunk at 8:52 AM on January 10, 2002


This thread has been archived and is closed to new comments