Color depth in lossless formats


Color depth in lossless formats

Niels Ott
Hi there,

I recently got a used camera with 14MPix and now I feel like
re-thinking my archiving options for edited images. So far, I had a
6MPix Pentax, from whose PEF files I created PNG files. But with the
14MPix of the K-7, the PNGs range from approx. 69 to 76MB per image.
Ouch! So I checked out the other formats that digiKam supports.

Example (compression rates depend on particular image, of course):

PEF/RAW:  14.7MB, "16 bpp", "not calibrated"

TIFF:     72.6MB, "16 bpp", "RGB"
PNG:      70.9MB, "16 bpp", "RGB"
PGF:      62.6MB, "48 bpp", "RGB"
JPEG2000: 31.5MB, "0 bpp",  "unknown"

Now I'm wondering: what's up with the bpp values? Shouldn't they all be
the same, namely 48 bpp (3x16 bit per pixel in RGB mode)?

Also, I'm stunned by JPEG2000's apparently superior lossless performance
(and flabbergasted by the tremendously slow implementation) - or is
there simply something going wrong here?

My digiKam version is the one shipped with Ubuntu 12.04.5, namely 2.5.0.

I know that disk space is cheap nowadays, but I'm also doing audio
recording and video stuff, so I fill up disks quickly, and the JPEG2000
compression level seems awesome to me. But how can I check that it
isn't fooling me, and how safe is it that this will still work in the
future? JPEG2000 is considered dead by many people.
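
One check I could imagine (assuming ImageMagick is installed; the file
names here are made up): decode both files to raw pixels and compare
checksums, so the container format doesn't matter.

```shell
# Decode both files to raw PPM and compare checksums; identical sums
# mean the JPEG2000 copy is bit-for-bit lossless. Assumes ImageMagick's
# "convert" is available; the file names are made up.
check_lossless() {
    a=$(convert "$1" ppm:- | md5sum | cut -d' ' -f1)
    b=$(convert "$2" ppm:- | md5sum | cut -d' ' -f1)
    [ "$a" = "$b" ] && echo "lossless" || echo "MISMATCH"
}
# Usage: check_lossless edited.png edited.jp2
```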

Thanks in advance for your ideas.

Best

   Niels
_______________________________________________
Digikam-users mailing list
[hidden email]
https://mail.kde.org/mailman/listinfo/digikam-users

Re: Color depth in lossless formats

Robert Susmilch
I too wrestled with formats and space. I finally just went with an open standard that will (hopefully) be well known for a long time.

I gave up on digiKam doing lossless conversions, since it never worked and I couldn't get it to process my whole collection of JPEGs, so I wrote a script that converts all my jpgs. I then wrote a script that downloads new jpgs from my camera, loads them into a holding folder, and sorts them into YEAR/MONTH/DAY folders, since that is also outside digiKam's scope. Finally, my backup script searches for all jpgs in my photos folder, converts and auto-orients them into PNG files with a fingerprint appended, and then optimizes them.

I think you should try optipng; since you're on Ubuntu, it should be in the repos.
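
The sorting step is roughly like this (not my actual script, just a
sketch with made-up paths, keyed off the file modification time where
the real thing might read the EXIF date, e.g. with exiftool):

```shell
#!/bin/sh
# Sort freshly downloaded jpgs from a holding folder into
# YEAR/MONTH/DAY folders based on each file's modification time.
incoming="$HOME/Pictures/incoming"   # holding folder (example path)
library="$HOME/Pictures/by-date"     # sorted library (example path)

for f in "$incoming"/*.jpg; do
    [ -e "$f" ] || continue              # glob matched nothing
    d=$(date -r "$f" +%Y/%m/%d)          # e.g. 2014/08/25
    mkdir -p "$library/$d"
    mv "$f" "$library/$d/"
done
```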

"OptiPNG is a PNG optimizer that recompresses image files to a smaller size, without losing any information. This program also converts external formats (BMP, GIF, PNM and TIFF) to optimized PNG, and performs PNG integrity checks and corrections."

I've seen savings of up to 20% in my testing, and it adds up.
Here's an example of a 5000 DPI slide scan, scanned to jpg and tiff, then
converted to PNG with "mogrify -auto-orient -format png -quality 100 filename.ext":

50M Aug 25 12:41 Scan-140825-0001.jpeg.png
18M Aug 25 12:40 Scan-140825-0001.jpg
167M Aug 25 12:40 Scan-140825-0001.tif
161M Aug 25 12:41 Scan-140825-0001.tiff.png

Then optimizing it

optipng -nx -preserve -clobber -zc9 -zm9 -zs0-3 -f0-5 Scan-140825-0001.tiff.png
** Processing: Scan-140825-0001.tiff.png
6542x4455 pixels, 3x16 bits/pixel, RGB
Input IDAT size = 168099168 bytes
Input file size = 168162013 bytes

Trying:
  zc = 9  zm = 9  zs = 0  f = 0        IDAT size = 168099168
  zc = 9  zm = 9  zs = 0  f = 1        IDAT size = 147256818
  zc = 9  zm = 9  zs = 1  f = 1        IDAT size = 146369673
  zc = 1  zm = 9  zs = 2  f = 1        IDAT size = 146369185
  zc = 9  zm = 9  zs = 0  f = 3        IDAT size = 144707250
  zc = 9  zm = 9  zs = 1  f = 3        IDAT size = 143759134
  zc = 1  zm = 9  zs = 2  f = 3        IDAT size = 143758332
                              
Selecting parameters:
  zc = 1  zm = 9  zs = 2  f = 3        IDAT size = 143758332

Output IDAT size = 143758332 bytes (24340836 bytes decrease)
Output file size = 143759629 bytes (24402384 bytes = 14.51% decrease)

New size,
138M Aug 25 12:41 Scan-140825-0001.tiff.png

And now for the JPG converted to PNG

optipng -nx -preserve -clobber -zc9 -zm9 -zs0-3 -f0-5 Scan-140825-0001.jpeg.png
** Processing: Scan-140825-0001.jpeg.png
6542x4455 pixels, 3x8 bits/pixel, RGB
Input IDAT size = 51874187 bytes
Input file size = 51894705 bytes

Trying:
  zc = 9  zm = 9  zs = 0  f = 0        IDAT size = 51874187
  zc = 9  zm = 9  zs = 0  f = 1        IDAT size = 36850006
  zc = 9  zm = 9  zs = 1  f = 1        IDAT size = 36757149
  zc = 9  zm = 9  zs = 0  f = 2        IDAT size = 35389940
  zc = 9  zm = 9  zs = 1  f = 2        IDAT size = 35207170
                              
Selecting parameters:
  zc = 9  zm = 9  zs = 1  f = 2        IDAT size = 35207170

Output IDAT size = 35207170 bytes (16667017 bytes decrease)
Output file size = 35208692 bytes (16686013 bytes = 32.15% decrease)

And new size,

34M Aug 25 12:41 Scan-140825-0001.jpeg.png

Got 32% savings on that one!

BTW, I chose PNG over TIFF because TIFF has too many standards and, at least according to digiKam, won't hold a lot of the metadata for the images.

Also, use identify from either GraphicsMagick or ImageMagick to see that it's just a reporting inconsistency: you can write the color depth as either 3x16 or 48 bit, or even 64 bit if you have 4 channels!
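
Something like this (the file name is a placeholder; GraphicsMagick's
"gm identify" takes the same format string):

```shell
# identify reports depth per channel; bits per pixel is that depth
# times the number of channels, so "16 bpp" and "48 bpp" can describe
# the very same file. For a 16-bit RGB PNG,
#   identify -format '%z bits per channel, %r\n' file.png
# prints something like "16 bits per channel, DirectClass sRGB".
depth=16; channels=3
echo "$((depth * channels)) bpp"       # prints "48 bpp"
```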


Re: Color depth in lossless formats

Niels Ott
Hi Robert,

thanks for the hint about "identify", which showed me that all my files
have the right bit depth, namely 16 bit per channel. It doesn't handle
PGF, though. All files in my comparison were saved with digiKam:

identify: no decode delegate for this image format `file.pgf' @
error/constitute.c/ReadImage/532.

What is the point of converting from JPEG to PNG? After all, once the
damage is done, you can leave the files as they are - except for when
you've done some editing. As for myself, I simply never ever want to
go to lossy formats except for the very final step, such as putting stuff
on web pages. (Same for audio, but I also know people who record straight
to MP3, oh boy.)

OptiPNG took around 25 minutes on the file I mentioned and reported a
0.70% decrease in file size. So in terms of space and time, JPEG2000 in
lossless mode would still be much better.

Best
Niels





--
Niels Ott
Bassist and such at Delta B
http://www.delta-b.net

Re: Color depth in lossless formats

Robert Susmilch
Namely, to go from my camera to lossless as soon as possible, so that rotating etc. doesn't degrade my images. My wife has access to all the family pictures, of course, and I want high-quality images in my backup before she attacks them. My camera doesn't do raw, but I would like the best quality I can possibly have. If you do the full monty of optipng, which I believe is -o7, it tries over 1000 permutations and is slow. My quad-core 3.4 GHz machine takes a few minutes per file, and that's why I put it in the overnight backup script. It only uses one core, though, so I piped it so that it works on 8 files simultaneously.
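
Roughly like this (a sketch, not my actual backup script; the path and
job count are placeholders):

```shell
# Run optipng over every PNG under the photo folder, 8 jobs at a time,
# since a single optipng process only uses one core.
# -print0/-0 keep filenames with spaces safe; -r skips the run when
# nothing matches; -n1 hands one file to each parallel job.
find "$HOME/Pictures" -name '*.png' -print0 2>/dev/null |
    xargs -0 -r -n1 -P8 optipng -o7 -preserve
```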

I debated file size for a while, as you did, then just decided to bite the bullet and go with what will be around, format-wise, for a long time. I also experimented with lossless JPEG compression, which did wonders for file size, but it just added another layer to the system, since the files are then in a compressed format that isn't directly viewable.



--
Sent from my Android phone with K-9 Mail. Please excuse my brevity.

Re: Color depth in lossless formats

Milan Knížek
In reply to this post by Niels Ott
Niels Ott wrote on Mon, 25. 08. 2014 at 10:51 +0200:
> Hi there,
>
> Also, I'm stunned by JPEG2000's apparently superior lossless performance
> (and flabbergasted by the tremendously slow implementation) - or is
> there simply something going wrong here?
>
Things might have changed since I last looked at the JPEG2000
implementation: back then, it was not capable of storing a colour
profile (ICC). Also check whether it can store XMP (well, you could
work around that with a side-car file, even though that is not so
convenient).
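
The side-car workaround is easy enough to script; a sketch with a
made-up file name (and exiftool, if installed, can show whether any
XMP is embedded at all):

```shell
# Whether a tool wrote XMP into the .jp2 can be checked with e.g.
#   exiftool -XMP:all archive.jp2    # empty output = no embedded XMP
# If nothing is embedded, the usual workaround is a side-car file
# named after the image:
img="archive.jp2"
sidecar="${img%.*}.xmp"              # strips the extension
echo "$sidecar"                      # prints "archive.xmp"
```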

It is a pity they kept the wavelet compression proprietary; they just
dug a grave for it.

I personally use plain JPEG at full resolution with some not-so-high
quality settings - enough for tagging and web publishing - and keep the
original RAW file with the XMP from darktable (which I use to develop
RAWs).

Only when an image is intended for print on a sufficiently capable
printer do I use 16-bit TIFF.

Milan


Re: Color depth in lossless formats

Niels Ott
On 26.08.2014 at 22:06, Milan Knížek wrote:
> Things might have change since I looked at JPEG2000 implementation:
> then, it was not capable of storing colour profile (ICC).

At least with my version of digiKam, there is no color profile stored,
so on re-opening with the image editor, digiKam asks which profile to use.

> Also check if
> it can store XMP (well, you could workaround that by a side-car file,
> even that it is not so convenient).

Wikipedia says JPEG2000 has space for arbitrary metadata stored as
XML. Furthermore, XMP seems to be based on RDF, which in turn often uses
XML for actual data storage. Hence it should be possible, but it's most
likely not implemented.

> It is a pity they kept the wavelet compression proprietary; they just
> dug a grave for it.

Apparently yes.

I'm still undecided. For me it is not important that JPEG2000 isn't
supported by many programs - except for Gimp, whose additional JPEG2000
plugin I still need to try. I use Gimp when I edit images for putting
them on the web (as normal JPEG).

So to me, there are two major points about a lossless format:

a) Can it be opened by the (few) tools I need in a
   convenient way (slow? hard to use?)

b) Can I convert it to something else in a couple of years/
   will the format still be supported?

It's hard for me to judge b). Thinking back, PNG was so great when
it was new, but it took years for browsers to support it, even though it
was explicitly made for the web. But then PNG is open, while JPEG2000
isn't exactly.

> Only when it is intended for print on sufficiently capable printer, I
> use 16-bit TIFF.

For many images, I don't know where they'll end up. I put some on the
web, but I also give some to labs to get real prints, so I'd rather not
lose anything in between. My favorite photo lab, btw, accepts 16-bit
PNG files on CD, so I actually never go lossy.

Best
Niels
