FAQ -- Frequently Asked Questions about HDF
Dec 1, 1993

Contents
------------------------------------------------------------------------
 1. What is HDF?
 2. What is the latest official release of HDF?
 3. What does the latest official release of HDF contain?
 4. What are the new features included in the latest version of HDF?
 5. What is in the HDF library?
 6. How does the 'integration of netCDF with HDF' affect application
    programmers?
 7. What are HDF command-line utilities?
 8. How many platforms does HDF run on?
 9. Which NCSA tools can I use to view HDF objects?
10. Is there any commercial visualization software that accepts HDF
    files?
11. Are there any conversion programs available for me to convert
    non-HDF image files into HDF files or vice versa?
12. Where can I get source code and information relevant to HDF and
    HDF utilities?
13. What are the files in HDF/HDF3.3r2 on ftp.ncsa.uiuc.edu?
14. How do I get source code for the HDF library and utilities?
15. How do I install the base HDF library on my computer?
16. How do I install the netCDF/HDF (multi-file) library on my
    computer?
17. How do I compile my application programs that call HDF functions?
18. What documentation for HDF is available on the NCSA ftp server?
19. How can I obtain machine readable copies of the documentation?
20. How do I get hard copies of HDF documentation?
21. Can new versions of HDF read HDF files written using older
    versions of the HDF library?
22. Can application programs that work with old versions of the HDF
    library always be compiled with new versions of HDF?
23. Does HDF support data compression?
24. Is there a mailing list for HDF discussions and questions?
25. How can a user share with other users his port of HDF to a machine
    which is not currently supported by HDF?
26. How do I contribute my software to the HDF user community?
27. How do I make a bug report?
------------------------------------------------------------------------

Q1: What is HDF?

A1:
. HDF stands for Hierarchical Data Format. It is a multi-object file
  format for the transfer of graphical and numerical data between
  machines.

. HDF is a versatile file format. It supports six different data
  models. Each data model defines a specific type of data and provides
  a convenient interface for reading, writing, and organizing a unique
  set of data elements.

. HDF is a self-describing format, allowing an application to
  interpret the structure and contents of a file without any outside
  information.

. HDF is a flexible file format. With HDF, you can group sets of
  related objects together and then access them as a group or as
  individual objects. There are pre-defined sets for raster images and
  floating point multidimensional arrays. Users can also create their
  own grouping structures using an HDF feature called vgroups.

. HDF is an extensible file format. It can easily accommodate new data
  models, regardless of whether they are added by the HDF development
  team or by HDF users.

. HDF is a portable file format. HDF files can be shared across
  platforms. An HDF file created on one computer, say a Cray
  supercomputer, can be read on another system, say an IBM PC, without
  modification.

. HDF is available in the public domain.

Q2: What is the latest official release of HDF?

A2: The latest official release is HDF3.3r2.

Q3: What does the latest official release of HDF contain?

A3: HDF3.3r2 contains the HDF library, HDF command-line utilities, and
test programs.

Q4: What are the new features included in the latest version of HDF?

A4: HDF 3.3 supports:

. reading and writing JPEG compressed 24-bit and 8-bit raster images
. reading and writing hyperslabs from an SDS
. reading and writing Little-Endian (PC Native) mode files on all
  platforms
. integration of netCDF with HDF
. a new multi-file SDS interface which supports "time-series" data as
  well as a more general metadata abstraction

Q5: What is in the HDF library?
A5: HDF currently supports six data structure types: 8-bit raster
images, 24-bit raster images, color palettes, multidimensional arrays
(the scientific data sets and netCDF model), text entries, and vdatas
(binary tables).

The HDF library contains two parts: the base library and the
multi-file library. HDF library functions can be called from C or
FORTRAN user application programs.

The base library contains a general purpose interface and six
application level interfaces, one for each data structure type. Each
application level interface is specifically designed to read, write,
and manipulate one data structure type. The general purpose interface
contains functions for file I/O, error handling, memory management,
and physical storage.

The multi-file part integrates the netCDF model with HDF scientific
data sets, and supports simultaneous access to multiple files and
multiple objects. This part is referred to as the netCDF/HDF library
in the rest of this FAQ.

Q6: How does the 'integration of netCDF with HDF' affect application
programmers?

A6: The netCDF/HDF library in HDF 3.3 was designed to be completely
transparent to the programmer. HDF 3.3 supports a new "multi-file" SDS
interface and the complete netCDF interface as defined by Unidata
netCDF Release 2.3.2. Using either interface, you are able to read
XDR-based netCDF files, HDF-based netCDF files, and pre-HDF3.3 HDF
files. The library figures out what type of file is being accessed and
handles it appropriately. Any of the above types of files may be
modified. However, the library will only create new files based on HDF
(you cannot create new XDR-based netCDF files).

Q7: What are HDF command-line utilities?

A7: HDF command-line utilities are application programs that can be
executed by entering them at the command level, just like other UNIX
commands. They provide capabilities for doing things with HDF files
for which you would normally have to write your own program.
For example, the utility r8tohdf is a program that takes a raw raster
image from a file and stores it in an HDF file in a raster image set.

Q8: How many platforms does HDF run on?

A8: HDF 3.3 Release 2 has been tested on the following machines:

    Platform            base library    HDF/netCDF
    --------            ------------    ----------
    Sun4                     X              X
    IBM/RS6000               X              X
    SGI                      X              X
    Convex *                 X              X
    Cray Y-MP                X              X
    NeXT                     X              X
    HP                       X              X
    VMS                      X              X
    DecStation               X              X
    Mac                      X              **
    IBM PC - MSDOS           ***            ****
    IBM PC - Windows         ***            ****
    DEC Alpha                X
    Fujitsu VP               X

*    When compiling the mfhdf section of the library on a Convex3 you
     will need to set the environment variable 'MACHINE' to 'c3'
     before running the configure script.
**   The netCDF interface works but you currently are unable to read
     old netCDF (XDR) files. In addition, the FORTRAN interface and
     the ncdump utility have yet to be finished.
***  There is no FORTRAN support for either PC version of HDF 3.3r2.
**** The netCDF half of the HDF/netCDF merger is not working
     correctly, but the multi-file SD interface is working correctly.

Q9: Which NCSA tools can I use to view HDF objects?

A9: NCSA has developed a suite of software tools for scientific
visualization that are based on HDF. For the Mac, NCSA DataScope, NCSA
Image, and NCSA Collage can be used to view HDF files. On the PC, NCSA
Audible Image (the PC version of Collage) uses HDF. X-based
workstations can use XImage, NCSA XDataSlice, and NCSA Polyview. NCSA
Collage, a cross-platform collaborative tool, displays and manipulates
HDF files. NCSA Mosaic, a networked information browser and
World-Wide-Web client, can display the contents of HDF files.

Q10: Is there any commercial or public domain visualization software
that accepts HDF files?

A10: Commercial software -- Spyglass, Wavefront, PCI, PV-Wave, IDL,
AVS, Data Explorer, and IRIS Explorer. Public domain software --
FREEFORM, GRASS, and the NCSA visualization tools.

Q11: Are there any conversion programs available to convert non-HDF
image files into HDF files or vice versa?
A11: The HDF utilities r8tohdf, paltohdf, hdftor8, hdftopal, hdf24to8,
and ristosds convert between raw raster images, palettes, and HDF
files. The NCSA tool Reformat/XReformat can convert GIF, TIFF, FITS, X
Window dump, Sun raster, and 8-bit raw image files into HDF. The SDSC
Image Tools, developed at the San Diego Supercomputer Center, handle
image manipulation and file format conversion for more than 20 file
formats.

Q12: Where can I get source code and information relevant to HDF and
HDF utilities?

A12: HDF and the HDF utilities are public domain software. They are on
the NCSA anonymous ftp server, in the subdirectory HDF/. The contents
of HDF/ are:

    README         describes the files and subdirectories in ftp/HDF/
    FAQ            this file, frequently asked questions about HDF
    prev_releases/ releases previous to HDF3.3r2, including: HDF3.1r5,
                   the Macintosh version of HDF3.1r3, HDF3.2r2,
                   HDFVset2.1, HDF3.2r3, HDF3.2r4, HDF3.3Beta, and
                   HDF3.3r1
    HDF3.3r2/      HDF 3.3 release 2 (the latest release)
    contrib/       contributions from HDF users outside and inside
                   NCSA
    examples/      examples of HDF programs--good for testing, too
    newsletters/   HDF newsletters
    tarexamples/   compressed tar files of the examples
    HDFVset/       README -- where to get old/new versions of HDFVset

Q13: What are the files in HDF/HDF3.3r2 on ftp.ncsa.uiuc.edu?

A13: There are two files and six subdirectories in HDF/HDF3.3r2/. The
README file describes the structure of that directory. The ABOUT_3.3r2
file describes the new features provided in HDF3.3r2. The six
subdirectories are:

1. unpacked -- contains all source code.

2. patches -- bug fixes to the source. The patch files can be used to
patch the source code in the subdirectory unpacked/.

3. tar -- contains a compressed UNIX tar archive of the code in
unpacked/. Users should check the patches/ directory to find out which
files have been changed and decide whether or not the patches are
needed for their applications.

4.
hqx -- contains a Macintosh BinHexed archive of the code in unpacked/.
As with tar/, users should check the patches/ directory to find out
which files have been changed and decide whether or not the patches
are needed for their applications.

5. zip -- contains an IBM-PC compressed archive of the code in
unpacked/. Again, users should check the patches/ directory before
deciding whether the patches are needed.

6. doc -- documentation for HDF and Vsets.

Each subdirectory has a README file explaining what files are in that
directory and how to use those files.

Q14: How do I get source code for the HDF library and utilities?

A14: You may obtain HDF via FTP, an archive server, or US mail.

FTP server: If you are connected to the Internet (NSFNET, ARPANET,
MILNET, etc.) you may download HDF source code at no charge from the
anonymous ftp server at NCSA. The Internet address of the server is:

    ftp.ncsa.uiuc.edu or 141.142.20.50

Log in by entering anonymous for the name, and enter your local e-mail
address (login@host) for the password. After logging in, change
directory to HDF/HDF3.3r2/. If you want packed source code, change
directory to tar/ or hqx/ or zip/ (see A13 above). Files in those
directories need to be transferred using binary mode. If you want
unpacked source code, change directory to unpacked/ and transfer to
your host all the files in unpacked/ and in its subdirectories. If you
have any questions regarding this procedure, or about whether you are
connected to the Internet, consult your local system administrator or
network expert.

Archive server: E-mail a request to:

    archive-server@ncsa.uiuc.edu

Include in the subject or message line the word "help". Then send
another e-mail request to the same address with the word "index" in
the subject or message line.
The information you receive from the help and index commands will give
you further instructions on obtaining NCSA software. Refer to Chapter
1 of "HDF Calling Interfaces and Utilities" for details.

US mail: A tape or CD-ROM archive of HDF is also available for
purchase through the NCSA Technical Resources Catalog. To obtain a
catalog, contact:

    NCSA Documentation Orders
    152 Computing Applications Building
    605 East Springfield Avenue
    Champaign, IL 61820
    (217) 244-4130

Q15: How do I install the HDF base library on my computer?

A15: The HDF 3.3 base library can be built with a single command from
the top level directory, where the subdirectories src/, util/, and
test/ reside. The file Makefile.template is a generic, machine
independent Makefile which you can modify if there is no Makefile
already built for your machine. For convenience, there are also
machine customized makefiles. For example, MAKE.IBM6000 is a Makefile
suitable for compiling HDF on an IBM RS/6000. Assuming you are on an
IBM RS/6000, copy MAKE.IBM6000 to Makefile and use one of the
following commands to install different targets:

    cp MAKE.IBM6000 Makefile
    make allnofortran --- builds the HDF library with only the C
                          interfaces, the utilities, and the C test
                          programs.
    make all          --- builds the HDF library with the C and
                          FORTRAN interfaces, the utilities, and the
                          C and FORTRAN test programs.

Refer to the file INSTALL.TOP in the top level directory for making
other targets.

Q16: How do I install the netCDF/HDF (multi-file) library on my
computer?

A16: The HDF 3.3 netCDF/HDF library has an automatically configuring
Makefile system. Modify the file named 'CUSTOMIZE' in the mfhdf
(multi-file hdf) directory and run the script named 'configure'. It
will set up all the Makefiles correctly. Refer to the README in mfhdf/
for instructions. You need to build the HDF base library before
installing the netCDF/HDF library. See Q15 above for how to build the
HDF base library.
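Putting A15 and A16 together, a complete build on an IBM RS/6000 might
look like the following sketch. The top-level directory name is
illustrative (it depends on where you unpacked the sources), and your
site may need different CUSTOMIZE settings, so consult INSTALL.TOP and
mfhdf/README for your platform:

```shell
# Build the base library, utilities, and test programs first (A15).
cd HDF3.3r2                  # top of the unpacked source tree (name may vary)
cp MAKE.IBM6000 Makefile     # machine-customized makefile for the RS/6000
make all                     # C and FORTRAN interfaces, utilities, tests

# Then build the netCDF/HDF (multi-file) library (A16).
cd mfhdf
vi CUSTOMIZE                 # adjust compiler and path settings for your site
./configure                  # sets up the Makefiles
make
```

On a platform without a pre-built makefile, start from Makefile.template
instead of MAKE.IBM6000 and edit it for your machine before running make.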
Q17: How do I compile application programs that call HDF functions?

A17: To use HDF routines in your C program, you must have the line
'#include "hdf.h"' (or '#include "mfhdf.h"' if you use the netCDF/HDF
library) near the beginning of your code. Applications that need
netCDF or multi-file SDS functionality should link with both 'libdf.a'
and 'libnetcdf.a'. Applications that use neither of these interfaces
can just link with the 'libdf.a' library for the base level of HDF
functionality.

If you are on a Sun SPARC, and the include files are in the directory
'incdir', the base library file 'libdf.a' is in 'libdir', and the
netCDF/HDF library file 'libnetcdf.a' is in 'mflibdir', use one of the
following commands to compile a C program 'myprog.c':

    cc -DSUN -DHDF -Iincdir myprog.c mflibdir/libnetcdf.a \
       libdir/libdf.a -o myprog
or
    cc -DSUN -Iincdir myprog.c -o myprog -L mflibdir -lnetcdf \
       -L libdir -ldf

The 'mflibdir/libnetcdf.a' or '-L mflibdir -lnetcdf' need not be
included if you are not using the multi-file interface.

For FORTRAN programs, if your FORTRAN compiler accepts 'include'
statements, you may include constant.i, dffunc.i, and netcdf.inc in
your program. Otherwise, you need to declare in your program all the
constants used and functions called by the program. To compile a
FORTRAN program 'myprogf.f' use:

    f77 -o myprogf myprogf.f mflibdir/libnetcdf.a libdir/libdf.a

Again, the 'mflibdir/libnetcdf.a' need not be included if you are not
using the multi-file interface.

Q18: What documentation for HDF is available on the ftp server?

A18: Current HDF documentation:

"HDF Calling Interfaces and Utilities" was designed for users who use
HDF to store or manipulate their data. It describes the type of data
each interface deals with and all the routines contained in each
interface. The HDF command-line utilities are also described in this
manual. It has been updated through HDF 3.2.
"HDF Specification and Developer's Guide" was designed for those who
need detailed information about HDF, such as developers of HDF
application program interfaces. It describes the basic structure,
components, and software layers of HDF; specifies the supported HDF
tags; and discusses the portability of HDF. It has been updated
through HDF 3.2.

Currently, users may use "HDF Vset 2.0", "vset2.1.extra.doc", and the
Vset section of the ABOUT_3.3r2 file of HDF3.3 as temporary references
for HDF Vset. "HDF Vset 2.0" was designed for users who use HDF Vset
to store and manipulate their data. It describes the organization of
data within vdatas, vgroups, and vsets; the routines that manipulate
vdatas and vgroups; and the vset utilities that manipulate vsets.
"Vset2.1.extra.doc" describes routines added in Vset2.1 and not
covered by "HDF Vset 2.0". The ABOUT_3.3r2 file of HDF3.3r2 describes
the new features in HDF3.3.

The next generation of HDF documentation is planned to have the
following volumes:

    Getting Started    -- Overview of HDF with simple examples. Its
                          draft is complete and is available on the
                          ftp server, titled "Getting Started With
                          NCSA HDF".
    User's Guide       -- Full coverage of all HDF routines and
                          command-line utilities
    Reference Manual   -- An alphabetical listing of all HDF routines
                          and command-line utilities
    HDF Specifications -- Same as the current HDF Specifications

Q19: How can I obtain machine readable copies of the documentation?

A19: On the NCSA anonymous ftp server users can find the drafts of
"HDF Specifications", "HDF Calling Interfaces and Utilities", and
"Getting Started with NCSA HDF", as well as the documentation for HDF
Vset. They are in different subdirectories within the directory
Documentation/ on the ftp server. There is a README file in each
subdirectory which briefly explains what is contained in that
subdirectory and gives related information.
The following list shows which subdirectory each document resides in
on the NCSA ftp server:

    Document                       Subdirectory
    --------                       ------------
    HDF Specifications (Draft)     Documentation/HDF.Specs/
    HDF Calling Interfaces and
      Utilities (Draft)            Documentation/HDF3.2/
    HDF Vset 2.0                   Documentation/HDF.Vset2.1/
    vset2.1.extra.doc              Documentation/HDF.Vset2.1/
    Getting Started with
      NCSA HDF (Draft)             Documentation/HDF_getting_started/
    ABOUT_3.3r2                    HDF/HDF3.3r2/

"HDF Specifications", "HDF Calling Interfaces", and "HDF Vset 2.0"
were written using Microsoft Word. In each subdirectory the *.sit.hqx
file is the MS Word version of the documentation, stuffed using
Stuffit 1.5. The *.asc.tar file is the ASCII text version of the MS
Word file. It is missing all figures and formatting, so it is not very
readable. "Getting Started" was created using FrameMaker and is
available only in PostScript form. ABOUT_3.3r2 and vset2.1.extra.doc
are ASCII text files.

Q20: How do I get hard copies of HDF documentation?

A20: NCSA accepts orders for hard copies of HDF Specs, HDF Calling
Interfaces, HDF Vset 2.0, and Getting Started. Interested users should
contact the office of NCSA Documentation Orders at:

    (217) 244-4130
    docorder@ncsa.uiuc.edu (Internet)
    docorder@ncsagate (BITNET)
or
    Attention: Documentation Orders
    NCSA
    University of Illinois at Urbana-Champaign
    605 Springfield Ave.
    Champaign, IL 61820

Before the new documentation is officially published, NCSA supplies
drafts of the documentation only. Don't be surprised if you see the
big sign "DRAFT" on the top page when you receive the new
documentation! ABOUT_3.3r2 is contained in the HDF3.3r2 release. Users
having difficulty getting these files should contact
hdfhelp@ncsa.uiuc.edu.

Q21: Can new versions of HDF read HDF files written using older
versions of the HDF library?

A21: Our goal is to make HDF backward compatible in the sense that HDF
files can always be read by new versions of HDF.
We have succeeded in doing so up to HDF3.3r2, and will continue to
follow this principle as much as possible. The table below lists the
backward compatibility of HDF. Note that the Vdata and Vgroup
interfaces have been merged into HDF since HDF3.2. Before then, they
were in a separate library named Vset.

Written by    | read by HDF3.1 | read by HDF3.2 | read by HDF3.3
---------------------------------------------------------------------
HDF3.1        |                |                |
 -RIS8        | YES            | YES            | YES (except JPEG)
 -RIS24       | YES            | YES            | YES (except JPEG)
 -PALETTE     | YES            | YES            | YES
 -ANNOTATION  | YES            | YES            | YES
 -SDS         | Float32 only   | Float32 only   | Float32 only
Vset 2.1      |                |                |
 -Vdata       | YES            | YES            | YES
 -Vgroup      | YES            | YES            | YES
---------------------------------------------------------------------
HDF3.2        |                |                |
 -RIS8        | YES            | YES            | YES (except JPEG)
 -RIS24       | YES            | YES            | YES (except JPEG)
 -PALETTE     | YES            | YES            | YES
 -ANNOTATION  | YES            | YES            | YES
 -SDS         | YES            | YES            | YES (except general
              |                |                |      attributes)
 -Vdata       | YES            | YES            | YES
 -Vgroup      | YES            | YES            | YES
---------------------------------------------------------------------
HDF3.3        |                |                |
 -RIS8        | YES            | YES            | YES
 -RIS24       | YES            | YES            | YES
 -PALETTE     | YES            | YES            | YES
 -ANNOTATION  | YES            | YES            | YES
 -SDS (*)     | YES            | YES            | YES
 -Vdata       | YES            | YES            | YES
 -Vgroup      | YES            | YES            | YES
---------------------------------------------------------------------

(*) The HDF 3.3 interface supports two types of SDS objects. Previous
libraries are able to read "old-style" SDS objects. See the HDF 3.3
documentation for more information.

However, old HDF libraries are NOT always able to read HDF files
written by newer versions of the HDF library. For example, HDF3.1
cannot read 16-bit integer SDSs because HDF3.1 did not support this
data type.

Q22: Can my application programs which work with old versions of the
HDF library always be compiled with new versions of HDF?

A22: As HDF evolves, some functions have to be changed or removed.
For example, in HDF3.2 some formal parameters that were passed by
value in HDF3.1 had to be passed by reference in order to support new
number types. When this happens, old application programs need to be
modified so that they can work with the new library. Our policy is as
follows:

. Keep existing functions unchanged as much as possible.
. Create new functions when necessary to accommodate new features.
. If a new function covers the feature of an existing old function,
  the old function should still be callable by old application
  programs.
. Should an old function be phased out, users will be forewarned and
  encouraged to switch to the new function.
. An old function will be removed from the library only if it is in
  conflict with the implementation of new features.

Q23: Does HDF support data compression?

A23: HDF3.3 supports RLE (Run Length Encoding), IMCOMP, and JPEG
compression for raster images. We plan to support compression for all
number types in the future, but no definite date has been set for this
support.

Q24: Is there a mailing list for HDF discussions and questions?

A24: hdfnews is a mailing list for the HDF group and HDF users to
share ideas and information. The HDF group will announce new releases
and events via newsletters through hdfnews. Users are welcome to post
their comments, criticisms, and suggestions on hdfnews as well. To
join hdfnews please send your e-mail address to hdfhelp@ncsa.uiuc.edu.

With the help and support of the USENET coordinator and users, we have
set up the newsgroup sci.data.formats on the network. It has been an
excellent forum for discussion of various data file formats.

Q25: How can a user share with other users HDF software that he or she
has written, such as HDF utilities and ports to machines that NCSA
does not support?

A25: Several users have ported HDF to their machines or developed
their own utility programs to convert data between HDF and other file
formats.
They have contributed their programs for the benefit of the HDF user
community. Those programs are on the NCSA ftp server in the
subdirectory:

    HDF/contrib/

Q26: How do I contribute my software to the HDF user community?

A26: Contact hdfhelp@ncsa.uiuc.edu indicating that you would like to
contribute your software to the HDF user community. We will set up a
directory for you to send the contribution package to. For other
users' convenience, your contribution package should include the
software itself, a Makefile if possible, a man page, test programs,
and input data files for testing. A README file is also required. It
should briefly describe the purpose, function, and limitations of the
software; which platforms and operating systems it runs on; how to
compile, install, and test it; and whom to contact for comments,
suggestions, or bug reports.

Q27: How do I make a bug report?

A27: All bug reports, comments, suggestions, and questions should go
to hdfhelp@ncsa.uiuc.edu. Attached below is a bug report template. It
is very helpful for us in locating and fixing a bug if all the
information requested in the template is supplied by the reporter.

------------------ Template for bug report ------------------------

To: hdfhelp@ncsa.uiuc.edu
Subject: [area]: [synopsis]  [replace with actual AREA and SYNOPSIS]

VERSION:
    HDF3.3 Beta 2, patch00

USER:
    [Name, phone number and address of the person reporting the bug
    (e-mail address if possible)]

AREA:
    [Area of the HDF source tree affected, e.g., src, util, test,
    toplevel. If there are bugs in more than one AREA, please use a
    separate bug report for each AREA.]

SYNOPSIS:
    [Brief description of the problem and where it is located]

MACHINE / OPERATING SYSTEM:
    [e.g. Sparc/SunOS 4.1.3, HP9000/730-HPUX9.01 ...]

COMPILER:
    [e.g. native cc, native ANSI cc, gcc 2.4.5, MPW, ...]

DESCRIPTION:
    [Detailed description of the problem]

REPEAT BUG BY:
    [What you did to get the error; include a test program or session
    transcript if at all possible.
    If you include a program, make sure it depends only on libraries
    in the HDF distribution, not on any vendor or third-party
    libraries. Please be specific; if we can't reproduce it, we can't
    fix it. Tell us exactly what we should see when the program is
    run.]

SAMPLE FIX:
    [If available, please send context diffs (diff -c)]

[PLEASE make your Subject (SYNOPSIS) line as descriptive as possible.]
[Remove all the explanatory text in brackets before mailing.]
[Send to hdfhelp@ncsa.uiuc.edu or to:

    HDF, SDG at NCSA
    605 E. Springfield Ave.
    Champaign, IL 61820 ]

------------------ End of Bug Report Template ----------------------