Compression Consulting Schindler offers its data compression expertise to you. We help you choose a method that fits your needs, or develop a new solution if none exists. We also offer free software: our fast cross-platform compressor szip (freeware, executables for various systems) and, for programmers, the range coder, one of our entropy coders bundled with a fast and simple probability model (GNU GPL source code, other licenses on request). You might also want to look at our Huffman coding hints page.
As Michael also has long experience as a configuration manager and configuration-management system administrator, assistance in that area is offered as well. Previous projects include the migration of the Philips Speech Processing development in Vienna from Microsoft SourceSafe to Rational ClearCase. Hands-on experience in administering and planning VSS, CVS, and Subversion, in addition to ClearCase, is also available.
Since you came here, you have a desire to make your data smaller. Reasons may include cutting storage or data-transfer costs, or limits on bandwidth or storage capacity. We can help by choosing a compression method suitable for your needs or, if none exists, by developing one. We will also assist you in evaluating others' compression proposals and inform you of potential problems.
We can also point out places where we think lossy compression is possible. We have had a case where a compression that is, technically speaking, lossy was still regarded as lossless by the client: the compression could not distinguish 1,2,3 from 3,1,2, but the client did not care about the sequence anyway. The difference in compressed size was 2:1.
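The trick behind that case can be sketched in a few lines. This is our own toy illustration (the record values and counts are invented): if the consumer does not care about the order inside a record, sorting each record into a canonical form loses nothing that matters, yet can shrink the compressed output dramatically.

```python
import random
import zlib

random.seed(0)

# hypothetical records: each is a random permutation of the same values,
# and the consumer does not care about the order within a record
records = [random.sample([1, 2, 3], 3) for _ in range(1000)]

# canonicalize: sort each record - a "lossy" step that loses only the order
canonical = [sorted(r) for r in records]

raw = bytes(v for r in records for v in r)
canon = bytes(v for r in canonical for v in r)

compressed_raw = zlib.compress(raw, 9)
compressed_canon = zlib.compress(canon, 9)

# nothing the consumer cares about was lost: each record is the same multiset
assert all(sorted(a) == b for a, b in zip(records, canonical))
# the canonical form compresses far better: all records now look alike
assert len(compressed_canon) < len(compressed_raw)
```

With the randomness removed, every record becomes the same byte pattern, which a general-purpose compressor exploits immediately; the information content per record drops from log2(3!) bits to zero.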
Please contact us for configuration management assistance. That area depends strongly on the people involved and their working style. Generally speaking, it makes no sense to implement new processes that will not be followed; in most cases it is far better to implement something that is actually used and allows users to make errors than a foolproof system that everyone bypasses by emailing versions or sharing unversioned files.
We will not reinvent the wheel; if an existing solution meets your compression needs, we will make the contact or show you your options. Such applications include image, audio, and video, where a lot of research has been done. It may also be that your data matches existing standard algorithms or modifications thereof; in that case we can tell you what performance to expect and how to modify your data or the algorithm for better results. We can help your people write a good compressor, or we can provide code; the choice is up to you. We try to focus on unusual data such as high-energy-physics particle tracks, multidimensional raster data, or whatever you have. Our strength is that we try to understand the process that generates the data, and then use that knowledge to design the compression.
Trying to understand the data sometimes leads to unusual solutions: in one job, simply changing the readout direction of a 2D matrix reduced the compressed data volume by 20%; in another, we found that the number of detectors could be halved without loss of accuracy, reducing not only the data volume but also device costs (several hundred thousand channels fewer).
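The readout-direction effect is easy to measure yourself. Below is a toy sketch with synthetic data (the matrix and its statistics are entirely invented; which direction wins, and by how much, depends on how your real data is correlated): the same matrix is serialized row-major and column-major and both streams are compressed.

```python
import zlib

# synthetic 2D data: each column holds one constant value (invented example)
ROWS, COLS = 256, 256
value = [(37 * j) % 256 for j in range(COLS)]
matrix = [[value[j] for j in range(COLS)] for i in range(ROWS)]

# two readout directions of the same matrix
row_major = bytes(matrix[i][j] for i in range(ROWS) for j in range(COLS))
col_major = bytes(matrix[i][j] for j in range(COLS) for i in range(ROWS))

c_row = zlib.compress(row_major, 9)
c_col = zlib.compress(col_major, 9)

# both orders carry identical information, but they need not compress equally
assert sorted(row_major) == sorted(col_major)
assert len(c_row) < len(row_major) and len(c_col) < len(col_major)
# compare len(c_row) and len(c_col); pick the readout that suits your data
```

The point is not which order wins here, but that serialization order is a free design parameter worth measuring before settling on a format.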
If you need a good freeware cross-platform compression program, consider szip. If your platform is not available, please contact email@example.com. Performance tests can be found in the monthly Archive Comparison Test and for the Canterbury Corpus.
Our freeware (GNU GPL) range coder source code is available, bundled with a fast probability model. Range encoding is similar to arithmetic coding, just faster, at the cost of slightly larger files (usually less than 0.01%).
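For readers curious how a range coder differs from a bit-oriented arithmetic coder, here is a minimal sketch in the carry-less, byte-at-a-time renormalization style. It is our own illustration, not the GPL source; the static order-0 byte model and the demo message are invented for the example.

```python
# TOP/BOT decide when a whole top byte of "low" is settled and can be emitted.
TOP, BOT, MASK = 1 << 24, 1 << 16, 0xFFFFFFFF

class RangeEncoder:
    def __init__(self):
        self.low, self.range, self.out = 0, MASK, bytearray()

    def encode(self, cum, freq, tot):
        """Narrow [low, low+range) to the symbol's slice; tot must be <= 2**16."""
        r = self.range // tot
        self.low = (self.low + cum * r) & MASK
        self.range = freq * r
        while True:
            if (self.low ^ (self.low + self.range)) < TOP:
                pass                      # top byte settled -> ship it
            elif self.range < BOT:
                # carry-less trick: shrink range up to the next 2**16 boundary
                self.range = (-self.low) & (BOT - 1)
            else:
                break
            self.out.append((self.low >> 24) & 0xFF)
            self.low = (self.low << 8) & MASK
            self.range <<= 8

    def finish(self):
        for _ in range(4):                # flush the remaining state bytes
            self.out.append((self.low >> 24) & 0xFF)
            self.low = (self.low << 8) & MASK
        return bytes(self.out)

class RangeDecoder:
    def __init__(self, data):
        self.data, self.pos = data, 4
        self.low, self.range = 0, MASK
        self.code = int.from_bytes(data[:4], "big")

    def get_freq(self, tot):
        """Return the cumulative-frequency value the encoder must have seen."""
        self.range //= tot
        return min((self.code - self.low) // self.range, tot - 1)

    def decode(self, cum, freq):
        """Mirror the encoder's update for the symbol found via get_freq()."""
        self.low = (self.low + cum * self.range) & MASK
        self.range *= freq
        while True:
            if (self.low ^ (self.low + self.range)) < TOP:
                pass
            elif self.range < BOT:
                self.range = (-self.low) & (BOT - 1)
            else:
                break
            nxt = self.data[self.pos] if self.pos < len(self.data) else 0
            self.pos += 1
            self.code = ((self.code << 8) | nxt) & MASK
            self.low = (self.low << 8) & MASK
            self.range <<= 8

# demo: a static byte model built from the message itself (invented example)
msg = b"abracadabra, range coding in a nutshell"
freq = [0] * 256
for b in msg:
    freq[b] += 1
cum = [0] * 257
for i in range(256):
    cum[i + 1] = cum[i] + freq[i]
tot = cum[256]                            # total count; must stay <= 2**16

enc = RangeEncoder()
for b in msg:
    enc.encode(cum[b], freq[b], tot)
packed = enc.finish()

dec = RangeDecoder(packed)
decoded = bytearray()
for _ in range(len(msg)):
    v = dec.get_freq(tot)
    s = 0
    while cum[s + 1] <= v:                # linear symbol search, for clarity
        s += 1
    dec.decode(cum[s], freq[s])
    decoded.append(s)

assert bytes(decoded) == msg
```

Note what makes it fast relative to classic arithmetic coding: renormalization moves a byte at a time instead of a bit at a time, and the boundary truncation avoids carry propagation entirely, which is where the "slightly larger files" come from.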
You can brush up your knowledge of Huffman coding on our Huffman coding hints page. The recipes on that page will help you write a fast Huffman coder. Elsewhere you might have to pay for such information or dig through the literature; we give it away for free. Our job is to choose the right compression for your needs, not to hide implementation tricks.
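As a taste of the flavor of such recipes, here is a small sketch of canonical Huffman construction (our own illustration with an invented example, not the contents of the hints page): code lengths come from the usual heap merge, and the bit patterns are then assigned canonically, sorted by (length, symbol), so a decoder needs to store only the lengths.

```python
import heapq
from collections import Counter

def huffman_lengths(freqs):
    """Code length per symbol via the classic merge of the two rarest trees."""
    heap = [(f, i, (s,)) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    lengths = dict.fromkeys(freqs, 0)
    tie = len(heap)                       # unique tiebreaker for the heap
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)
        f2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:           # every leaf in the merged tree sinks
            lengths[s] += 1
        tie += 1
        heapq.heappush(heap, (f1 + f2, tie, syms1 + syms2))
    return lengths

def canonical_codes(lengths):
    """Assign canonical codes: sort by (length, symbol), count up and shift."""
    codes, code, prev = {}, 0, 0
    for s in sorted(lengths, key=lambda s: (lengths[s], s)):
        code <<= lengths[s] - prev
        codes[s] = format(code, "0%db" % lengths[s])
        code += 1
        prev = lengths[s]
    return codes

lengths = huffman_lengths(Counter("abracadabra"))
codes = canonical_codes(lengths)

# a Huffman code is prefix-free ...
for a in codes:
    for b in codes:
        assert a == b or not codes[b].startswith(codes[a])
# ... and complete: the Kraft sum of 2**-length is exactly 1
assert sum(2.0 ** -l for l in lengths.values()) == 1.0
```

The canonical numbering is what enables the fast table-driven decoders: consecutive codes of equal length are consecutive integers, so a decoder can find the symbol with a compare and a subtract instead of walking a tree.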
If you are looking for a compression algorithm that is easy to implement in hardware and should outperform existing chips in both compression and throughput, please contact us at firstname.lastname@example.org; together we can make an excellent product. More information is also available at your local patent office; the method was filed under PCT/EP97/06356 for several countries (granted as US patent 6,199,064).
Compare the design of a data compressor with getting a car: if you just need a low-end toy solution, you can build it yourself. If you need an average solution, you go ahead and buy a car; likewise, in compression you would use a standard library. Even there an expert can help, or you may end up with a two-seater sports car to transport a family. If you have unusual needs (a "car" to transport spacecraft, a racing car, ...), you go to an expert. You should do the same in compression.
That is a knotty question. You get some savings, and you have still saved if you give us 90% of them; however, we will not take 90% of your savings. Generally speaking, the price must be fixed for each project individually, once we see what is coming toward us. If you demand the impossible, we do the same. If your problem can be solved by a reference (for example, to The Data Compression Library or some existing code), we will most likely do it for free. If the problem requires us to read 150 pages of technical documentation before we know what is going on, you cannot expect a free service; the same is true if you want an elaborate written statement of our opinion. We will always inform you before we start to charge for our services, so feel free to send questions to email@example.com. Be sure to include a brief problem description, with parameters such as required throughput or limited resources.
Sure, but some are no longer on the web. These, however, should work:
Two articles for the CERN LHC ALICE detector (these links are nonlocal; they go to CERN).
My former university institute can be visited here, and I am still mentioned in the old staff phone directory.
You may want to know that a UK company distributes its data as a compressed database burnt on CD for direct access. The result was a quarter of the size and ten times the speed of the normal database files. This meant user satisfaction (just one CD instead of two, and much faster) and reduced CD production costs for them. For security reasons, however, they do not want their name on the web, so please ask us who they are and whom to contact. They also have an online demo of their product on their website.
Other references no longer exist since the shutdown of the old university institute server:
Go to Michael Schindler's old university page and look at some papers. The CERN internal paper has a lot of spelling errors (due to H. Baker), but the technical content is fine; none of the authors is a native English speaker, and it is for internal use only. A more elaborate version was presented at Osaka (HTML) (PostScript (CERN)) (PostScript (old university page)).
There are also some docs from an old project (no longer available); be warned, they are technical, and you are probably not familiar with that high-energy-physics experiment.
You may test our freeware compressor szip, which demonstrates one of our patented technologies (US patent 6,199,064; provisional protection in Europe was published on Oct 27, 1999 under No. 0951753; others pending). A description is available here, where you will also find the DCC97 paper about it.
Or you may visit Intelligent Compression Technologies where Michael helped to improve their universal coder.
A method to nearly double the speed of multisymbol arithmetic coding was presented at the Data Compression Conference in 1998; see the range coder home page.
Furthermore, if you need a range (arithmetic) coder operating at the speed of a Huffman coder: we have one (unpublished, not the freeware), provided your data distribution does not change quickly.
Compression Consulting Schindler is located in Vienna, Austria. This is handy if you are located in the EEC and need a development partner. Our customers are typically international: ICT is located in the USA, CERN in Switzerland, LNF in Italy, and Kreutzfeldt Electronic Publishing in Germany, while UPS operates globally.
(c) Michael Schindler, 2000-2004.
If you locate a spelling error, click here.
szip and the >data</// logo are trademarks or registered trademarks
of Michael Schindler.
All other trademarks or registered trademarks are held by their owners.