
Export Big TIFF support for files over 4GB

General user discussion about using the OMERO platform to its fullest.
Please note: these are historical discussions about OMERO; please ask new questions at https://forum.image.sc/tags/omero

There are workflow guides for various OMERO functions on our help site - http://help.openmicroscopy.org

You should find answers to any basic questions about using the clients there.

Export Big TIFF support for files over 4GB

Postby a.herbert » Thu Aug 18, 2011 9:55 am

Hi,

I am trying to find out more information about the support for Big TIFF export within OMERO Insight. I found this ticket, which seems to indicate that work is in progress but not yet mature:

http://trac.openmicroscopy.org/ome/ticket/4799#

We use time-lapse microscope images that regularly surpass 4GB in size. At present we can upload the images to OMERO but cannot export them as OME-TIFF. We have to use the archive option to be able to extract the data in its original form. If I use 'Export as OME-TIFF', the Activities manager takes a long time to process the task. It then says the image is exported, but the file size on disk is 0 bytes.

If I log on to the server I can see the file being generated in a tmp directory under:

$OMERO_TMPDIR/omero/tmp/omero_omero/

I can monitor this file size using the watch command. However, when the file hits 4GB the export fails and the file is reset to size 0.

The same thing happens when I use the Gateway API to create an Exporter and use the generateTiff() method:

Code:
image_id = 3  # an image whose file is over 4GB
e = conn.createExporter()
e.addImage(image_id)

try:
    # Server-side generation of the OME-TIFF
    length = e.generateTiff()

    out = open('/tmp/test.ome.tif', 'wb')

    # Download the generated file in 1MB chunks
    read = 0
    while True:
        buf = e.read(read, 1000000)
        out.write(buf)
        if len(buf) < 1000000:
            break
        read += len(buf)

    out.close()
finally:
    e.close()


I've tried this a few times and I get various errors:

  • The file grows to several GB before an ::Ice::CommunicatorDestroyedException is thrown. The file is then reset to 0 bytes.
  • After some processing time an ::omero::ApiUsageException is thrown with the message: Only one image supported for TIFF, not 0. The file nevertheless continues to be generated on the server side until it reaches 4GB, and it is not reset to 0 bytes. This file is truncated: I cannot read the entire file into ImageJ using Bio-Formats (I get errors and blank planes).

What do you suggest for storing our large files? Archiving them roughly doubles the disk space used per file, which is a costly solution. In addition, any scripts that process the images and create new, equivalently sized images cannot be exported, since they have no corresponding original file; they only have the pixel data. They would therefore have to be split into smaller images, exported, and then joined up outside of OMERO.

Thanks,

Alex

PS. Apologies for cross-posting. I tried this in the Developers forum a week ago and had no luck so I thought I'd try this forum instead.
a.herbert
 
Posts: 53
Joined: Tue Jan 11, 2011 1:35 pm

Re: Export Big TIFF support for files over 4GB

Postby jmoore » Fri Aug 19, 2011 6:13 am

Hi Alex,

sorry for having missed the other post. Could you possibly send us your Insight log (Help -> Show log file) as well as your server log, var/log/Blitz-0.log? We'll see if we can pinpoint which components need fixing. This may be as straightforward as detecting the size and setting the BigTIFF flag. (Ticket 4799 deals with writing JPEG2000-compressed BigTIFF files, as opposed to straight BigTIFFs.)
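
Roughly, the kind of check I mean, as an untested sketch against the Bio-Formats TiffWriter (the class, method, and parameter names below other than setBigTiff are illustrative, and the real export code path may differ):

Code:
import loci.formats.out.TiffWriter;

// Sketch: enable BigTIFF when the uncompressed pixel data cannot be
// addressed by classic TIFF's 32-bit file offsets.
public class BigTiffCheck {

    // 2^32 - 1, the largest offset a classic TIFF can address
    private static final long MAX_CLASSIC_TIFF = 4294967295L;

    public static TiffWriter createWriter(long sizeX, long sizeY,
            long bytesPerPixel, long planeCount) {
        long totalBytes = sizeX * sizeY * bytesPerPixel * planeCount;
        TiffWriter writer = new TiffWriter();
        if (totalBytes > MAX_CLASSIC_TIFF) {
            writer.setBigTiff(true); // switch to 64-bit offsets
        }
        return writer;
    }
}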

Thanks,
~Josh.
jmoore
Site Admin
 
Posts: 1591
Joined: Fri May 22, 2009 1:29 pm
Location: Germany

Re: Export Big TIFF support for files over 4GB

Postby ajpatterson » Fri Aug 19, 2011 9:38 am

Hi Alex,

One other thought, which is a bit of a long shot: are you using a FAT32 file system? FAT32 has a 4GB limit on the size of any individual file. This would apply both to the end destination and to the temp folder the server is using to build the file.

Ta,

Andrew
--
Andrew Patterson
Software Developer, Open Microscopy Environment
Wellcome Trust Centre for Gene Regulation and Expression
University of Dundee
ajpatterson
 
Posts: 50
Joined: Fri May 01, 2009 11:18 am

Re: Export Big TIFF support for files over 4GB

Postby a.herbert » Fri Aug 19, 2011 10:01 am

Hi Josh,

Thanks for the reply. I moved my current Blitz-0.log to Blitz-0.log.1 and restarted OMERO.

The file I am trying to export is a DeltaVision file named 130711_R3D.dv. Its archived byte size is 5075496960 (4 gigabytes, 744 megabytes) and the raw pixel byte size is 5075107840, so the file is definitely over 4GB.

I started Insight, opened the file for viewing, verified that I could see data in all the time frames and then selected 'Export as OME-TIFF'.

I watched the file in the server tmp directory: it grew to 4GB, then was reset to zero bytes. The Activities manager in Insight then said that the image had been exported. It was saved to my chosen output directory as a zero-byte file.

Please find the two log files attached. I tailed the Insight log to include only the current day's activity.

I am using OMERO 4.3.0. I'm happy to try different versions on my local install to test this for you.

@Andrew,

Hi Andrew,

Good idea about the FAT32 file system, but the server and my local machine are both running ext3 on Linux. I can export my archived file, so I know the system can input/output these big files. It is just the TIFFs that are limited to 4GB.

The reason we want this to work is that our scientists previously used FAT32-formatted USB drives to move data between Linux and Windows machines. This is no longer viable, so we are using OMERO. Our Linux machines are limited to an old OS with no NTFS write support due to software dependencies, and installing an NTFS write module into the old kernel is one option I really don't want to try just yet.

Regards,

Alex
Attachments
BigFile_export_logs.tgz
(122.53 KiB) Downloaded 261 times
a.herbert
 
Posts: 53
Joined: Tue Jan 11, 2011 1:35 pm

Re: Export Big TIFF support for files over 4GB

Postby jmoore » Fri Aug 19, 2011 10:16 am

Hi Alex,

thanks for the logs. You'll have seen that I've filed ticket 6520, which we'll start looking into. In the meantime, you may want to upgrade to 4.3.1 if you haven't tried it already.

Cheers,
~Josh.
jmoore
Site Admin
 
Posts: 1591
Joined: Fri May 22, 2009 1:29 pm
Location: Germany

Re: Export Big TIFF support for files over 4GB

Postby a.herbert » Fri Aug 19, 2011 12:07 pm

Hi Josh,

Thanks for adding me to the ticket.

Regards,

Alex
a.herbert
 
Posts: 53
Joined: Tue Jan 11, 2011 1:35 pm

Re: Export Big TIFF support for files over 4GB

Postby a.herbert » Wed Sep 14, 2011 2:29 pm

Hi,

I have obtained the latest version of the OMERO source code (4.3.2-DEV) and have been testing the export of Big-TIFF images.

The recent server-side code changes now allow me to export my large images. I can see the ome.tiff file being generated by the server in the OMERO_TMP directory, and it successfully exceeds 4GB.

However, once the file has been generated on the server side, I get an exception within Insight as it tries to download the file:

Code:
Data Retrieval Failure: org.openmicroscopy.shoola.env.data.DSAccessException: Cannot export the image as an OME-TIFF
   at org.openmicroscopy.shoola.env.data.OMEROGateway.exportImageAsOMETiff(OMEROGateway.java:7005)
   at org.openmicroscopy.shoola.env.data.OmeroImageServiceImpl.exportImageAsOMETiff(OmeroImageServiceImpl.java:1458)
   at org.openmicroscopy.shoola.env.data.views.calls.ExportLoader$1.doCall(ExportLoader.java:76)
   at org.openmicroscopy.shoola.env.data.views.BatchCall.doStep(BatchCall.java:144)
   at org.openmicroscopy.shoola.util.concur.tasks.CompositeTask.doStep(CompositeTask.java:226)
   at org.openmicroscopy.shoola.env.data.views.CompositeBatchCall.doStep(CompositeBatchCall.java:126)
   at org.openmicroscopy.shoola.util.concur.tasks.ExecCommand.exec(ExecCommand.java:165)
   at org.openmicroscopy.shoola.util.concur.tasks.ExecCommand.run(ExecCommand.java:274)
   at org.openmicroscopy.shoola.util.concur.tasks.AsyncProcessor$Runner.run(AsyncProcessor.java:91)
   at java.lang.Thread.run(Thread.java:636)
Caused by: org.openmicroscopy.shoola.env.data.DSAccessException: Cannot export the image as an OME-TIFF
   at org.openmicroscopy.shoola.env.data.OMEROGateway.exportImageAsOMETiff(OMEROGateway.java:6992)
   ... 9 more
Caused by: omero.InternalException
    serverStackTrace = "java.lang.NegativeArraySizeException
                           at ome.services.blitz.impl.ExporterI.read(ExporterI.java:477)
                           at ome.services.blitz.impl.ExporterI.read_async(ExporterI.java:229)
                           at omero.api._ExporterTie.read_async(_ExporterTie.java:85)
                           at omero.api._ExporterDisp.___read(_ExporterDisp.java:198)
                           at omero.api._ExporterDisp.__dispatch(_ExporterDisp.java:280)
                           at IceInternal.Incoming.invoke(Incoming.java:159)
                           at Ice.ConnectionI.invokeAll(ConnectionI.java:2037)
                           at Ice.ConnectionI.message(ConnectionI.java:972)
                           at IceInternal.ThreadPool.run(ThreadPool.java:577)
                           at IceInternal.ThreadPool.access$100(ThreadPool.java:12)
                           at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:971)
                        "
    serverExceptionClass = "java.lang.NegativeArraySizeException"
    message = ""
   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   at java.lang.reflect.Constructor.newInstance(Constructor.java:532)
   at java.lang.Class.newInstance0(Class.java:372)
   at java.lang.Class.newInstance(Class.java:325)
   at IceInternal.BasicStream$DynamicUserExceptionFactory.createAndThrow(BasicStream.java:2243)
   at IceInternal.BasicStream.throwException(BasicStream.java:1632)
   at IceInternal.Outgoing.throwUserException(Outgoing.java:442)
   at omero.api._ExporterDelM.read(_ExporterDelM.java:177)
   at omero.api.ExporterPrxHelper.read(ExporterPrxHelper.java:253)
   at omero.api.ExporterPrxHelper.read(ExporterPrxHelper.java:225)
   at org.openmicroscopy.shoola.env.data.OMEROGateway.exportImageAsOMETiff(OMEROGateway.java:6986)
   ... 9 more


I have looked at the code and the bug is due to the use of ints to perform position arithmetic on the file. Since the file size exceeds the maximum value of an int, the arithmetic overflows and produces a negative size that is eventually passed to the server's ExporterI.read(position, size) method.
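
To make the overflow concrete, here is a minimal standalone demonstration; the file size is the one from my DeltaVision file, but the variable names and the 1MB chunk size are mine rather than the actual Insight code:

Code:
// Shows how int position arithmetic wraps negative once the file
// position passes Integer.MAX_VALUE (2147483647) bytes.
public class OverflowDemo {
    public static void main(String[] args) {
        long size = 5075496960L; // my file's byte size
        int INC = 1000000;       // 1MB read chunks for illustration

        int offset = 2147000000; // the position after ~2147 chunks
        offset += INC;           // wraps around to -2146967296

        // The length computed for the final read() is then negative,
        // which is what triggers the NegativeArraySizeException on
        // the server when it tries to allocate the buffer.
        int badLength = (int) (size - offset);
        System.out.println(offset);    // -2146967296
        System.out.println(badLength); // -1367470336
    }
}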

I have fixed this using the following code within org.openmicroscopy.shoola.env.data.OMEROGateway.exportImageAsOMETiff(...):

Code:
...
// Use long arithmetic for the file size and position so that the
// offset cannot overflow for files larger than Integer.MAX_VALUE.
long size = store.generateTiff();
long offset = 0;
try {
    // Read the file in INC-byte chunks; offset stays a long throughout.
    for (offset = 0; (offset + INC) < size;) {
        stream.write(store.read(offset, INC));
        offset += INC;
    }
} finally {
    // The final chunk is smaller than INC, so (size - offset) now
    // safely fits in an int.
    stream.write(store.read(offset, (int) (size - offset)));
    stream.close();
}
...


The change is a simple switch to using longs for the position arithmetic, so that the final size passed to the read() method in the finally block is calculated correctly.
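
For comparison, the same numbers work out correctly once everything is a long; a quick standalone check with my file's size (again, the variable names are mine):

Code:
// With long arithmetic the same numbers produce a small, valid
// final read length instead of a negative one.
public class LongArithmeticCheck {
    public static void main(String[] args) {
        long size = 5075496960L; // my file's byte size
        long INC = 1000000L;     // 1MB chunks for illustration
        long offset = 0;
        while (offset + INC < size) {
            offset += INC;       // stops at 5075000000
        }
        int lastLength = (int) (size - offset);
        System.out.println(lastLength); // 496960, fits easily in an int
    }
}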

Regards,

Alex
a.herbert
 
Posts: 53
Joined: Tue Jan 11, 2011 1:35 pm

Re: Export Big TIFF support for files over 4GB

Postby a.herbert » Wed Sep 14, 2011 3:12 pm

Hi,

Further testing has shown that the same code is used in ome.ij.data.Gateway.exportImageAsOMETiff(...), so the fix suggested above should be applied in that class as well:

Code:
// Same long-based position arithmetic as in OMEROGateway; on any
// failure the partial file is deleted rather than left truncated.
long size = store.generateTiff();
long offset = 0;
try {
    try {
        for (offset = 0; (offset + INC) < size;) {
            stream.write(store.read(offset, INC));
            offset += INC;
        }
    } finally {
        // Final chunk: (size - offset) now fits in an int
        stream.write(store.read(offset, (int) (size - offset)));
        stream.close();
    }
} catch (Exception e) {
    if (stream != null) stream.close();
    if (f != null) f.delete();
}


Regards,

Alex
a.herbert
 
Posts: 53
Joined: Tue Jan 11, 2011 1:35 pm

