Open jpeg for jpeg2000

category: general [glöplog]
http://www.tele.ucl.ac.be/PROJECTS/OPENJPEG/

Cheers,
h
added on the 2004-05-14 13:14:34 by hitchhikr
Correct me if I'm wrong, but every wavelet codec I have seen, including JPEG2k, is awfully slow at decompression. Ever wondered why Winnerdemo by Metalvotze loads so slowly and is so big? Right, wavelet misuse.

An interesting alternative is VQ compression. Rates between 2 and 3 BPP (for all planes together), very slow compression (a few minutes per pic), but real-time decompression. I'm not aware of any available source for it.

Another interesting alternative is FIASCO, which, like JPEG, does dictionary-assisted matching on small blocks, but with the difference that it writes back to the dictionary, and the root dictionary contains far more than just cosine forms. Memory hungry, fairly fast compression and decompression, also interesting for video. Rates down to 0.5 bit per pixel in each plane. Higher quality than JPEG 2000 and *much* higher speed. There is an open-source implementation, but it's unmaintained and hard to find.
added on the 2004-05-15 22:00:54 by eye
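For the curious: VQ decompression can run in real time because it is nothing but a table lookup - the file carries a codebook of small pixel blocks plus one index per block of the image. A minimal decode sketch; the function name and the 256-entry, 4x4 RGB layout are made up for illustration and don't match any particular codec's file format:

#include <stdint.h>
#include <string.h>

// Decode a VQ-compressed image: every 4x4 RGB block of the output is
// one of 256 codebook entries, selected by a single byte index.
void vq_decode(const uint8_t codebook[][4 * 4 * 3], // 256 entries, 4x4 RGB each
               const uint8_t *indices,              // one index per 4x4 block
               uint8_t *out, int width, int height) // width, height: multiples of 4
{
    int bw = width / 4;
    for (int by = 0; by < height / 4; ++by)
        for (int bx = 0; bx < bw; ++bx) {
            const uint8_t *entry = codebook[indices[by * bw + bx]];
            for (int row = 0; row < 4; ++row) // copy one 4-pixel row per memcpy
                memcpy(out + ((by * 4 + row) * width + bx * 4) * 3,
                       entry + row * 4 * 3, 4 * 3);
        }
}

The expensive half is encoding: building a good codebook is a k-means/LBG-style search, which is where the minutes per picture go.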
mmm what's the advantage of JPEG2K over PNG for demomakers?
added on the 2004-05-16 01:00:59 by Zest
Jpeg2000 compresses better, and flexibly: it supports both lossy and lossless compression. Besides, it makes a perfect excuse for long loading times. :>>>>> Otherwise it would be a perfect replacement for both PNG and JPEG, being superior to both of them. I would say that PNG is overused at the moment, making demo packages a bit larger than they should be.

BTW, Winnerdemo doesn't use JPEG2000 - it uses the Haar wavelet, which is vastly less space-efficient, but slow nonetheless. :)

I know another J2K implementation here: http://www.j2000.org/
I'll benchmark its speed against the one from hitchhikr's post when I have time - that one seems to be derived from this and newer. A comparison with other format codec libs would be cool.

And please: all this talk of long loading times reminds me way too much of the fact that even good demos load tons of numerical data from text files - and take a lot of time doing it. Please, don't.
added on the 2004-05-16 01:29:33 by eye
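To put some weight behind that complaint: parsing ASCII numbers goes through a strtod-class conversion for every single value, while a binary dump of the same array loads with one fread. A small sketch of both paths - the functions are hypothetical, and raw float dumps are of course not portable across endianness or float formats:

#include <stdio.h>

// Slow path: one formatted conversion per value.
size_t load_text(const char *path, float *v, size_t max)
{
    FILE *f = fopen(path, "r");
    size_t n = 0;
    while (f && n < max && fscanf(f, "%f", &v[n]) == 1)
        ++n;
    if (f) fclose(f);
    return n;
}

// Fast path: the same values stored as raw bytes load in a single call.
size_t load_binary(const char *path, float *v, size_t max)
{
    FILE *f = fopen(path, "rb");
    size_t n = f ? fread(v, sizeof(float), max, f) : 0;
    if (f) fclose(f);
    return n;
}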
I've just checked the sources briefly, but I believe there's room for improvement in both size and speed for these. OpenJPEG is based on libj2k, which can be found on j2000.org, indeed.

I'll check all that when I have some time.

h.
added on the 2004-05-16 01:51:21 by hitchhikr
I'm talking about the size and speed of the coder/decoder implementations, of course.
added on the 2004-05-16 01:53:18 by hitchhikr
Quote:

I would say that PNG is overused by the moment, making the demo packages a bit larger than they should be.


I disagree. PNG is most likely underused, judging from the amount of JPEG fringing seen around text and other "PNG friendly" images.

(And PNG compresses a lot better than Photoshop - which is presumably what most people use - would have one believe.)
Eye: sure that winnerdemo didn't use "Jasperlib" or similar then? Jasperlib is painfully slow at decompressing jpeg2k images.. I know prophecy changed to some other lib, can't remember which one >)

PNG isn't really good, just better than old JPEG because of the alpha support; lossless is kind of unimportant usually.
added on the 2004-05-16 13:03:56 by Hatikvah
arneweisse: Winnerdemo uses this here: http://ainc.openskynet.de/Wavelet.htm
added on the 2004-05-16 13:18:02 by eye
whee. haar & rle is pathetic :) especially if you make it slow :) you should at least do daubechies & spiht.
png has really good uses for textures that require lossless compression (text overlays, credits...) - it is too ugly to see jpeg artifacts in those, and cranking up jpeg so that sharp edges stay sharp usually makes them bigger than an equivalent png.
jpeg2k links are neat, i'll check them out. for relais the std. jpeg just produced much too big files with not enough quality. of course this only becomes an issue if you have a whole lot of texture data to compress.
added on the 2004-05-16 14:59:43 by shiva
Vector Quantization is useful both for image and for mesh compression, and can even be decoded on the fly. For me, it's especially interesting because the Dreamcast uses VQ as a compressed texture format. :) It doesn't give ugly artifacts, but it is in fact too large for textures like those in Relais.

I suggest you see http://www.gamasutra.com/features/20010416/ivanov_pfv.htm

I see that Relais uses many textures with similarities, both within one texture - which is by itself very advantageous for FIASCO - and between the bump, diffuse and specular maps. If you compressed many of them into one FIASCO stream, you would use the dictionary even better.

Read http://www.linuxjournal.com/article.php?sid=4367

To hunt down the source, try the FreeBSD or Debian archives, or I might have it somewhere.
added on the 2004-05-16 16:15:18 by eye
eye: yep, that's the first thing that hit me about "winnerdemo" - the slow wavelet decompression. if you ask me, that's what ruined it.

to sum up: great demo, apart from the slow wavelet decompression.

(yeah, right :)
added on the 2004-05-16 16:18:17 by gloom
I agree with arneweisse... jasperlib is fuckin slow for JPEG2000 images!! I'm using the CxImage library (which uses jasperlib) to load all the famous image formats like...

JPG, JP2 (JPEG2000), PNG, BMP, GIF, TIFF, TGA,
PCX, ICO, WBMP, WMF, ZLIB, MNG, JBIG, PNM,
PPM or PGM...bla

...but loading a 1024x1024 JPEG2000 image takes 9 seconds on my AMD Athlon XP 1900+. Definitely too much for loading tons of textures =[

Is the library from www.j2000.org faster and easier to use?
added on the 2004-05-16 18:50:38 by 0x$$
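For anyone wanting to reproduce or compare such timings, a minimal harness around CxImage. The Load(filename, type) call and the CXIMAGE_FORMAT_JP2 constant are how I remember CxImage's API - check the headers of your version - and "texture.jp2" is just a placeholder:

#include <cstdio>
#include <ctime>
#include "ximage.h" // CxImage's main header; the include path depends on your setup

int main()
{
    CxImage img;
    std::clock_t t0 = std::clock();
    bool ok = img.Load("texture.jp2", CXIMAGE_FORMAT_JP2);
    std::clock_t t1 = std::clock();
    std::printf("loaded=%d, %.2f s\n", (int)ok,
                double(t1 - t0) / CLOCKS_PER_SEC);
    return ok ? 0 : 1;
}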
haar isn't so bad if you first interpolate odd samples half a unit left ;)
added on the 2004-05-16 20:29:22 by 216
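Some context for that remark: in lifting form, plain Haar predicts each odd sample from its left neighbour only, while the reversible 5/3 (LeGall) filter of lossless JPEG2000 predicts it from both neighbours - the same spirit as the half-sample shift 216 describes. A sketch of one forward 5/3 pass; the function name is mine, and the code relies on arithmetic >> acting as floor division, as it does on common compilers:

// One level of the reversible 5/3 lifting wavelet used by lossless
// JPEG2000. In place: even slots end up holding the lowpass band,
// odd slots the highpass residuals. len must be even and >= 2.
// (Plain Haar would simply be: x[i] -= x[i-1]; then x[i-1] += x[i]/2.)
void dwt53_forward(int *x, int len)
{
    // Predict: each odd sample becomes the residual against the mean
    // of its two even neighbours (mirrored at the right border).
    for (int i = 1; i < len; i += 2) {
        int right = (i + 1 < len) ? x[i + 1] : x[i - 1];
        x[i] -= (x[i - 1] + right) >> 1;
    }
    // Update: each even sample becomes a lowpass value (mirrored at
    // the left border).
    for (int i = 0; i < len; i += 2) {
        int left  = (i > 0) ? x[i - 1] : x[i + 1];
        int right = (i + 1 < len) ? x[i + 1] : x[i - 1];
        x[i] += (left + right + 2) >> 2;
    }
}

The inverse just runs the two loops in the opposite order with the signs flipped.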
As thom mentioned, PNG support/compression in Photoshop until version 8 (CS) was dire, producing files twice the size of those saved by other programs. I use IrfanView exclusively for saving out PNGs, and then I use pngout (with custom block sizes for each image to maximise the compression) to compress even further for 8-bit images, and pngcrush for 24-bit images.

I have seen some examples where JPEG2000 will compress slightly better than PNG in lossless mode, but I've also seen JPEG2000 come out 4 times as large on images with large areas of a single colour.

That said, it bugs me that browser support for newer, better formats like JPEG2000 is so slow to arrive. Some browsers were very quick to adopt PNG, like AWeb, Voyager and Opera, but IE is only just catching up, and still doesn't support alpha transparency.

On a side note, JPEG2000 does compress faster than PNG :) But it is 2-3 times as slow as PNG to decompress.

I work a lot with compression formats, both for archiving data and for images, and JPEG2000 is definitely very, very nice.

I've been working with wavelet compression since about 1998, using LuraWave from algoVision LuraTech (the guys behind some of the JPEG2000 wavelet algorithms - theirs is probably the fastest implementation too), and it does have a lot of benefits for archival purposes. LuraDocument is very impressive too, able to separate text from background images etc etc.

Of course there are other formats coming along - JBIG2 for example, which can achieve 100:1 compression ratios, but that will mainly be used for document archiving. You can get a version of Acrobat with a beta implementation of it.

Oh, and PNG is way underused and it baffles me why people still use GIFs.

Lossy compression? Just say no! But I guess it's okay for some things. Still, ugh, people who re-compress jpeg images with jpeg again just bug me to hell and back - fecking ugly artifacts ruining images.
added on the 2004-05-16 21:28:03 by Intrinsic
Take
JPEG2000 for photo/texture like images
&
PNG for images with large single color areas
&
be happy ! =p
added on the 2004-05-17 15:02:38 by 0x$$
...but don't use Jasper for JPEG2000 ;]
added on the 2004-05-17 15:05:36 by 0x$$
Er... and why not: DXTn (or S3TC or whatever) for all textures that don't look horrible with it; and use whatever for the remaining (very few) ones (png/jpg)... This way you don't need to decompress at all, and save video RAM/bandwidth...
added on the 2004-05-17 15:09:06 by NeARAZ
because dxt has a fixed (roughly) 4:1 compression ratio (==very poor) and also is horrible in quality.
added on the 2004-05-17 15:18:16 by shiva
But it saves VRAM/bandwidth, and the quality depends a lot on how good your compressor is. Generally, like 95% of your textures can be DXTn and no one will ever notice (even your artists :)).
DXT1 has an 8:1 ratio; and even 4:1 is fairly good (and can be further compressed with a simple zip). Anyway, the bandwidth savings are a big plus...
added on the 2004-05-17 15:30:12 by NeARAZ
the 1:8 is a lie :) 15bpp*4*4 / 64bits < 4. no doubt dxt is nice on lower-end gpus when the quality loss is acceptable, but the compression is not nearly enough for demo data (size limit!). You need something more like 1:20 at least.
And yes, sometimes you will have to re-encode textures to dxt during upload if your graphics board has little memory or low texture bandwidth. This again depends on the format, however. For normal maps, for example (and gradients, where the 15bpp of dxt are too little), you will want paletted textures instead.
But that's only a problem for demos with their size limits ofc. Games usually store textures in the GPU-native format (even including the mip levels).
added on the 2004-05-17 15:56:06 by shiva
the 1:8 IS true if you're compressing 32-bit textures containing binary alpha. The quality (with a good compressor) is usually very high (a lot higher than jpeg at the mentioned 1:20), and the bandwidth saving _is_ a _very_ important factor on _every_ card, not just "boards with little memory". Normal maps can be compressed with a little trick - go to the ATI devsite and download the corresponding paper. BTW, DXT1 is interpolated at 8bpp/component everywhere except on the gf1/2/3.

It's the sizelimit that should be raised.
added on the 2004-05-17 16:18:25 by reptile
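To make the arithmetic of the last few posts concrete: a DXT1 block stores 16 pixels in 64 bits - two RGB565 endpoints (32 bits) plus one 2-bit palette index per pixel (32 bits). That's where both figures come from: 8:1 against 32bpp source data, just under 4:1 against 15bpp. A minimal decoder sketch for the opaque (c0 > c1) mode; the function names are mine, and the c0 <= c1 three-colour/transparent mode is left out:

#include <stdint.h>

// Expand an RGB565 endpoint to 8 bits per channel.
static void rgb565(uint16_t c, int rgb[3])
{
    rgb[0] = ((c >> 11) & 31) * 255 / 31;
    rgb[1] = ((c >> 5)  & 63) * 255 / 63;
    rgb[2] = ( c        & 31) * 255 / 31;
}

// Decode one 64-bit DXT1 block (opaque mode) into a 4x4 RGB tile.
void dxt1_block(const uint8_t blk[8], uint8_t out[4 * 4 * 3])
{
    uint16_t c0 = (uint16_t)(blk[0] | (blk[1] << 8));
    uint16_t c1 = (uint16_t)(blk[2] | (blk[3] << 8));
    int pal[4][3];
    rgb565(c0, pal[0]);
    rgb565(c1, pal[1]);
    for (int k = 0; k < 3; ++k) { // the two interpolated palette colours
        pal[2][k] = (2 * pal[0][k] + pal[1][k]) / 3;
        pal[3][k] = (pal[0][k] + 2 * pal[1][k]) / 3;
    }
    for (int i = 0; i < 16; ++i) { // one 2-bit index per pixel, LSB first
        int idx = (blk[4 + i / 4] >> ((i % 4) * 2)) & 3;
        for (int k = 0; k < 3; ++k)
            out[i * 3 + k] = (uint8_t)pal[idx][k];
    }
}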
I don't think the sizelimit should be raised. If it were, scene.org would need a new harddisk after each demoparty. :/ And so would I.
added on the 2004-05-17 21:56:23 by eye
i think the sizelimit should be lowered to 5mb.
And the speed limit to a 486. Why should we have only size-restricted competitions and not speed-restricted ones???
added on the 2004-05-18 11:46:26 by Optimus
