
How to view large PNG images after upload?

PostPosted: Wed May 07, 2014 8:22 pm
by jamesan
Hello,
I'm learning with the demo server and a server I've set up locally. In both cases I can't seem to view a 22k x 22k PNG file I've uploaded. Is there something I need to do to construct image pyramids?

Here is the path to the data on demo.openmicroscopy.org, the image can be public:
janderson_2611/2014-05/07/18-42-15.740/1400_TEM_Leveled.png

The error reported by the viewer is:
"Error instantiating pixel buffer: /repositories/demo/PixelsDir-166/166928_pyramid

The message is cut off after that.

There are out-of-memory errors for my local server. I've upped the limits to those suggested in the docs. Is there a formula for how to scale the memory limits?

Thanks,
James

Re: How to view large PNG images after upload?

PostPosted: Thu May 08, 2014 10:46 am
by jmoore
Hi James,

jamesan wrote: I can't seem to view a 22k x 22k PNG file I've uploaded. Is there something I need to do to construct image pyramids? ... There are out-of-memory errors for my local server. I've upped the limits to those suggested in the docs.


We were seeing the out-of-memory exceptions, too. When the server doesn't have enough memory configured, a 0-size pyramid file can be generated, which may be what you're seeing. Try:
Code: Select all
bin/omero admin fixpyramids --dry-run /OMERO

and then delete any or all of the offending files (omitting --dry-run performs the deletion).


Is there a formula for how to scale the memory limits?

Not at the moment, but we'll try to come up with one.

Cheers,
~Josh

P.S. the pyramid for image 166979 is now generated; the other is running as we speak.

Re: How to view large PNG images after upload?

PostPosted: Thu May 08, 2014 4:12 pm
by jamesan
Thanks.

I uploaded the second image after making this post because the first image disappeared from the UI after a few attempts to display it. I'm not sure if that is a bug or if we were working on the image at the same moment.

Running that command repaired the image and it is visible.

Re: How to view large PNG images after upload?

PostPosted: Wed May 14, 2014 7:31 pm
by jamesan
Before I try bumping the limits further, is the underlying implementation trying to load the entire image into memory? I've tried uploading a PNG downsampled by two, with dimensions 89731 x 89926, and the out-of-memory errors have returned.

Would a better approach be to write a Python script that streams the image to the Omero server as it is generated?

Re: How to view large PNG images after upload?

PostPosted: Thu May 15, 2014 12:27 am
by mlinkert
In effect, yes, the whole image does need to be read into memory. This is an artifact of how the PNG format works - reading a specific tile from the image requires decompressing and decoding everything before that tile.
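
To give a rough sense of scale (a back-of-the-envelope sketch only; the bit depth and channel count of your PNGs are assumptions, since they aren't stated in this thread), the decoded size of the second upload alone is far beyond a default Java heap:
Code: Select all
# Rough estimate of the decoded (in-memory) size of an 89731 x 89926 image.
# 8-bit samples and the channel counts below are assumptions, not values from the thread.
width, height = 89731, 89926
for channels in (1, 3):  # greyscale vs. RGB
    gib = width * height * channels / 1024**3
    print(f"{channels} channel(s): ~{gib:.1f} GiB decoded")
# -> 1 channel(s): ~7.5 GiB
# -> 3 channel(s): ~22.5 GiB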

We may be able to improve this situation, and there is now a ticket on our issue tracking system:

http://trac.openmicroscopy.org.uk/ome/ticket/12274

If you would like to be notified of updates to that ticket, please let us know.

Re: How to view large PNG images after upload?

PostPosted: Mon May 19, 2014 6:17 pm
by jamesan
Please keep me notified.

Is there any existing format that I could use which avoids the issue? I am using Python's Pillow library, so I am not tied to PNG for the final export. I hate to ask for resources to be spent on what is probably an edge case; I was only testing the most direct route to a goal.

Is there a way to remove the limits entirely? If there is no guarantee that Omero can run in a reduced memory space, my preference would be for Omero to default to an unlimited-memory configuration on initial installation. The server I have it running on has 64GB of RAM.

Re: How to view large PNG images after upload?

PostPosted: Wed May 21, 2014 7:41 am
by jmoore
jamesan wrote:Please keep me notified.


I added you to the ticket CC.

Is there any existing format that I could use which avoids the issue? I am using Python's Pillow library, so I am not tied to PNG for the final export. I hate to ask for resources to be spent on what is probably an edge case; I was only testing the most direct route to a goal.


If you can get Pillow to write out tiled TIFFs, then that would be the most efficient.
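
If Pillow itself turns out not to write tiled TIFFs, one possible workaround is to hand the pixel data to the third-party tifffile package and request 256x256 tiles. This is only a sketch under that assumption; tifffile isn't something we've verified in your pipeline, and the array here is placeholder data:
Code: Select all
import numpy as np
import tifffile

# 'array' stands in for the pixel data your pipeline already produces,
# e.g. np.asarray(pillow_image) for an image that still fits in memory.
array = np.zeros((4096, 4096), dtype=np.uint8)  # placeholder data

# tile=(256, 256) requests an internally tiled TIFF instead of one large strip.
tifffile.imwrite("tiled_output.tif", array, tile=(256, 256))

A file written this way should report TileWidth/TileLength rather than strips, which is the property that matters for reading small sub-regions efficiently.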

Is there a way to remove the limits entirely?


Not likely, or at least not in a reasonable time frame. Some formats simply aren't conducive to reading plane sections.

If there is no guarantee that Omero can run in a reduced memory space, my preference would be for Omero to default to an unlimited-memory configuration on initial installation. The server I have it running on has 64GB of RAM.


Unfortunately, that's not possible with Java either: an explicit upper limit must be set. We have debated back and forth on how high to set the initial upper limit. At the moment it's quite low, with the intent that anyone can get started and that a "hardening" configuration is then applied to put the server into production. Do you think a single command to do this "hardening" would meet your needs/expectations?

Cheers,
~Josh

Re: How to view large PNG images after upload?

PostPosted: Thu May 22, 2014 5:25 pm
by jamesan
jmoore wrote:I added you to the ticket CC.


Thank you

jmoore wrote:If you can get Pillow to write out tiled TIFFs, then that would be the most efficient.


I already export tiles for my viewer. I'll check the Omero docs to see what it expects.

jmoore wrote:
jamesan wrote:Is there a way to remove the limits entirely?

Not likely, or at least not in a reasonable time frame. Some formats simply aren't conducive to reading plane sections.


Originally I was referring to the Java limits. I agree PNG isn't a good choice to scale to these large sizes. Is Omero able to read something like an uncompressed TIFF file without loading the entire file? I generally like having a single-file blob of all the data if the file is only a few GB.

jmoore wrote:
jamesan wrote:If there is no guarantee that Omero can run in a reduced memory space, my preference would be for Omero to default to an unlimited-memory configuration on initial installation. The server I have it running on has 64GB of RAM.


Unfortunately, that's not possible with Java either: an explicit upper limit must be set. We have debated back and forth on how high to set the initial upper limit. At the moment it's quite low, with the intent that anyone can get started and that a "hardening" configuration is then applied to put the server into production. Do you think a single command to do this "hardening" would meet your needs/expectations?


I was worried it was a Java thing. I haven't used Java in my career, aside from watching it cause memory-management problems in Matlab.

I'd suggest that the "hardening" command run automatically during setup and configure Java to take full advantage of system resources. The server install is involved enough that few are doing it for fun. A developer has the skills to turn down the memory limits for testing. An IT person runs Omero in a VM or on a dedicated machine and wants it to use all of the resources provided. The extra config layer is just more work, and shipping non-production settings causes errors and adds extra steps to setup.

Perhaps the sophisticated user can use a command like
Code: Select all
omero admin config memory=50%
to tell Omero to configure Java to use at most 50% of the system memory. I can only think of a few rare cases where I'd want to do that; if I were the dev, I wouldn't bother until asked.

Re: How to view large PNG images after upload?

PostPosted: Thu May 22, 2014 6:17 pm
by jmoore
James,

jamesan wrote:I'd suggest that the "hardening" command run automatically during setup and configure Java to take full advantage of system resources. The server install is involved enough that few are doing it for fun. A developer has the skills to turn down the memory limits for testing. An IT person runs Omero in a VM or on a dedicated machine and wants it to use all of the resources provided. The extra config layer is just more work, and shipping non-production settings causes errors and adds extra steps to setup.

Perhaps the sophisticated user can use a command like
Code: Select all
omero admin config memory=50%
to tell Omero to configure Java to use at most 50% of the system memory. I can only think of a few rare cases where I'd want to do that; if I were the dev, I wouldn't bother until asked.


Duly noted: https://trac.openmicroscopy.org.uk/ome/ticket/12311 - the idea of using some high-ish percentage as the default may work well. Thanks. ~J.
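
For what it's worth, the percentage idea could look something like this. This is purely a hypothetical illustration of the ticket, not an existing OMERO feature, and the sysconf calls assume a Linux host:
Code: Select all
import os

def heap_from_percent(percent):
    # Total physical memory via POSIX sysconf (Linux-specific names).
    total_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    heap_mb = int(total_bytes * percent / 100 / 1024**2)
    return f"-Xmx{heap_mb}m"

print(heap_from_percent(50))  # roughly "-Xmx32768m" on a 64 GB machine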

Re: How to view large PNG images after upload?

PostPosted: Thu May 22, 2014 7:14 pm
by mlinkert
I already export tiles for my viewer. I'll check the Omero docs to see what it expects.


The only requirement is for a single TIFF file that stores the image(s) internally as a set of tiles, rather than storing the entire image in one big block - this is part of the TIFF standard, and a common feature for software that writes TIFF files. I believe that Pillow can do this, but am not positive; if you aren't sure, we can verify given either a file or the output of 'tiffinfo file.tiff' (tiffinfo being part of libtiff: http://www.libtiff.org/tools.html).
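
If you'd like to check locally before sending anything, here is a minimal sketch as an alternative to running tiffinfo, assuming a reasonably recent Pillow (which exposes the TIFF tags via tag_v2):
Code: Select all
from PIL import Image

# Tiled TIFFs carry TileWidth (tag 322) and TileLength (tag 323);
# strip-based files carry RowsPerStrip (tag 278) instead.
with Image.open("file.tiff") as im:
    tile_width = im.tag_v2.get(322)
    tile_length = im.tag_v2.get(323)
    if tile_width and tile_length:
        print(f"tiled: {tile_width} x {tile_length}")
    else:
        print("not tiled (strip-based)")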

Originally I was referring to the Java limits. I agree PNG isn't a good choice to scale to these large sizes. Is Omero able to read something like an uncompressed TIFF file without loading the entire file? I generally like having a single file blob of all the data if the file is only a few GB.


Yes, but again, it's really best if the image is internally broken up into tiles as that makes it much faster and easier to request small pieces of the image at a time.