HDF5 for Mac


Install

With Anaconda or Miniconda:
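For example, from a terminal with conda on the PATH:

    $ conda install h5py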


If there are wheels for your platform (mac, linux, windows on x86) and you do not need MPI, you can install h5py via pip:
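For example:

    $ pip install h5py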

With Enthought Canopy, use the GUI package manager or:
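Possibly something like the following, assuming Canopy's enpkg command-line tool is available (check the Canopy documentation for the exact command):

    $ enpkg h5py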

To install from source see Installation.

Core concepts

An HDF5 file is a container for two kinds of objects: datasets, which are array-like collections of data, and groups, which are folder-like containers that hold datasets and other groups. The most fundamental thing to remember when using h5py is:

Groups work like dictionaries, and datasets work like NumPy arrays

Suppose someone has sent you an HDF5 file, mytestfile.hdf5. (To create this file, read Appendix: Creating a file.) The very first thing you’ll need to do is to open the file for reading:
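A sketch of such a session:

    >>> import h5py
    >>> f = h5py.File('mytestfile.hdf5', 'r')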

The File object is your starting point. What is stored in this file? Remember that h5py.File acts like a Python dictionary, so we can check the keys:
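For example (the output shown assumes the example file from the appendix):

    >>> list(f.keys())
    ['mydataset']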

Based on our observation, there is one dataset, mydataset, in the file. Let us examine it as a Dataset object:
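Continuing the session:

    >>> dset = f['mydataset']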

The object we obtained isn’t an array, but an HDF5 dataset. Like NumPy arrays, datasets have both a shape and a data type:
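For the example file, something like:

    >>> dset.shape
    (100,)
    >>> dset.dtype
    dtype('int32')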

They also support array-style slicing. This is how you read and write data from a dataset in the file:
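A sketch (assumes import numpy as np; note that writing requires the file to be open in a writable mode such as 'r+' or 'a'):

    >>> import numpy as np
    >>> dset[...] = np.arange(100)
    >>> dset[0]
    0
    >>> dset[10]
    10
    >>> dset[0:100:10]
    array([ 0, 10, 20, 30, 40, 50, 60, 70, 80, 90])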


For more, see File Objects and Datasets.

Appendix: Creating a file

At this point, you may wonder how mytestfile.hdf5 is created. We can create a file by setting the mode to w when the File object is initialized. Some other modes are a (for read/write/create access), and r+ (for read/write access). A full list of file access modes and their meanings is at File Objects.
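A minimal sketch:

    >>> import h5py
    >>> f = h5py.File('mytestfile.hdf5', 'w')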

The File object has a couple of methods which look interesting. One of them is create_dataset, which, as the name suggests, creates a dataset of a given shape and dtype:
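For instance:

    >>> dset = f.create_dataset('mydataset', (100,), dtype='i')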

The File object is a context manager, so the following code works too:
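For example:

    >>> with h5py.File('mytestfile.hdf5', 'w') as f:
    ...     dset = f.create_dataset('mydataset', (100,), dtype='i')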

Groups and hierarchical organization

“HDF” stands for “Hierarchical Data Format”. Every object in an HDF5 file has a name, and they’re arranged in a POSIX-style hierarchy with /-separators:
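For example:

    >>> dset.name
    '/mydataset'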

The “folders” in this system are called groups. The File object we created is itself a group, in this case the root group, named /:
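For instance:

    >>> f.name
    '/'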

Creating a subgroup is accomplished via the aptly-named create_group. But we need to open the file in “append” mode first (read/write if it exists, create otherwise):
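A sketch:

    >>> f = h5py.File('mytestfile.hdf5', 'a')
    >>> grp = f.create_group('subgroup')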

All Group objects also have the create_* methods like File:
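For example:

    >>> dset2 = grp.create_dataset('another_dataset', (50,), dtype='f')
    >>> dset2.name
    '/subgroup/another_dataset'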

By the way, you don’t have to create all the intermediate groups manually. Specifying a full path works just fine:
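For instance:

    >>> dset3 = f.create_dataset('subgroup2/dataset_three', (10,), dtype='i')
    >>> dset3.name
    '/subgroup2/dataset_three'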

Groups support most of the Python dictionary-style interface. You retrieve objects in the file using the item-retrieval syntax:
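For example:

    >>> dataset_three = f['subgroup2/dataset_three']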


Iterating over a group provides the names of its members:
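Continuing with the example file built above:

    >>> for name in f:
    ...     print(name)
    mydataset
    subgroup
    subgroup2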

Membership testing also uses names:
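For example:

    >>> 'mydataset' in f
    True
    >>> 'somethingelse' in f
    False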

You can even use full path names:
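For instance:

    >>> 'subgroup/another_dataset' in f
    True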

There are also the familiar keys(), values(), items() and iter() methods, as well as get().

Since iterating over a group only yields its directly-attached members, iterating over an entire file is accomplished with the Group methods visit() and visititems(), which take a callable:
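A sketch, using a simple callable that prints each name:

    >>> def printname(name):
    ...     print(name)
    >>> f.visit(printname)
    mydataset
    subgroup
    subgroup/another_dataset
    subgroup2
    subgroup2/dataset_three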

For more, see Groups.

Attributes


One of the best features of HDF5 is that you can store metadata right next to the data it describes. All groups and datasets support attached named bits of data called attributes.

Attributes are accessed through the attrs proxy object, which again implements the dictionary interface:
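For example:

    >>> dset.attrs['temperature'] = 99.5
    >>> dset.attrs['temperature']
    99.5
    >>> 'temperature' in dset.attrs
    True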

For more, see Attributes.

Introduction

This guide is intended to help Intel® compiler customers build and use the HDF5 library. HDF5 is the latest generation of the HDF libraries, a general purpose library and associated file formats for storing and sharing scientific data. HDF5 is maintained, promoted, and co-developed along with active community support by The HDF Group (THG). THG is a not-for-profit corporation with the mission to sustain HDF technologies and to provide support to HDF user communities. The homepage for THG and HDF5 can be found at http://www.hdfgroup.org/HDF5/.

Version information

HDF5 1.8.8 and later.
Intel® C++ Compiler for Linux* or Mac OS* X
Intel® Fortran Compilers for Linux* or Mac OS* X

Application Notes

HDF5 is a data format and an associated software library designed to store, access, manage, exchange, and archive diverse, complex data in continuously evolving heterogeneous computing and storage environments. HDF5 is used extensively in scientific research, engineering development, and other data-intensive domains.


This application note demonstrates the framework for building HDF5 with the Intel compilers but does NOT claim to represent all possible configurations and variations of the build for all possible target environments.

Obtaining the Source Code

The HDF5 source files should be obtained from the HDF5 Software downloads page at http://www.hdfgroup.org/HDF5/release/obtain5.html. Please note the External Library requirements for SZIP and ZLIB and download those if you do not already have those libraries.


Obtaining the latest version of Intel C++ Compiler and Intel Fortran Compiler

Licensed users of the Intel compilers may download the most recent versions from the Intel® Registration Center. Other users can download an evaluation copy from https://software.intel.com/en-us/articles/try-buy-tools.


Prerequisites

Software: As mentioned on the HDF5 software downloads page, either the SZIP-2.1 or the ZLIB library can be used for file compression/decompression. Precompiled binaries or sources for these two libraries are available from the HDF5 Software Downloads page.

SZIP: Determine an appropriate location to install SZIP. Directory /usr/local/szip-2.1 may be a reasonable choice. If you wish to build and install szip from the source files, use the procedure shown below:
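A representative BASH sequence (the tarball name and install prefix are illustrative; adjust them to the version you downloaded):

    $ export CC=icc
    $ export CXX=icpc
    $ tar -zxvf szip-2.1.tar.gz
    $ cd szip-2.1
    $ ./configure --prefix=/usr/local/szip-2.1
    $ make
    $ make check
    $ make install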

The above example uses BASH syntax for setting environment variables. For other shells, use the appropriate commands to set CC, CXX, etc. before running configure and make. Confirm that the 'make check' step reports 'All test passed.'

Check the directory specified by your --prefix= setting. This directory should contain lib/ and include/ directories. For more information on building szip, consult the file named INSTALL in the source directory.

ZLIB: The information shown is for zlib version 1.2.7. zlib is a general purpose data compression library and is a prerequisite for building HDF5. Determine an appropriate location to install zlib. /usr/local/zlib-1.2.7 may be a reasonable choice. If you wish to build and install zlib from the source files, use the procedure shown below:
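A representative BASH sequence (again, the tarball name and prefix are illustrative):

    $ export CC=icc
    $ tar -zxvf zlib-1.2.7.tar.gz
    $ cd zlib-1.2.7
    $ ./configure --prefix=/usr/local/zlib-1.2.7
    $ make
    $ make check
    $ make install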

The above example uses BASH syntax for setting environment variables. For other shells, use the appropriate commands to set CC, etc. before running configure and make. Confirm that the 'make check' step reports 'test OK'.

Check the directory specified by your --prefix= setting. This directory should contain lib/, include/, and share/ directories. For more information on building zlib, consult the file named 'README' in the source directory.

Configuration and Setup Information for HDF5

HDF5 uses an Autoconf 'configure' script to determine the build environment and tools and create the necessary build configuration. The first step is to set environment variables to control which compilers are used for the build. These environment variables select the Intel C++ Compiler and the Intel Fortran Compiler.
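In BASH, for example:

    $ export CC=icc
    $ export CXX=icpc
    $ export F9X=ifort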

As shown above, the environment variables CC, CXX and F9X are used to specify which compilers are used to build HDF5. The example shown above uses both the Intel C++ Compiler ( CC=icc CXX=icpc ) and the Intel Fortran Compiler ( F9X=ifort ). Note that the Intel C++ compiler driver is named 'icpc'; do NOT use 'icc' as the C++ compiler. The Intel compilers are GNU-compatible, so you may mix and match the Intel compilers with GNU compilers for C++ and Fortran. However, mixing GNU compilers with Intel compilers has not been tested with this application.

There are environment variables such as CFLAGS to pass compiler options to the C compiler. However, the configure script will automatically detect the Intel compilers and use appropriate optimization options, so you need not specify optimization settings unless you want to override the defaults chosen by configure.

To Extract the Source Files and to Configure and Build HDF5
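A representative sequence (the tarball name, install prefix, and optional language bindings are illustrative; the --with-szlib and --with-zlib paths should match the prefixes used when building those libraries):

    $ tar -zxvf hdf5-1.8.x.tar.gz
    $ cd hdf5-1.8.x
    $ export CC=icc CXX=icpc F9X=ifort
    $ ./configure --prefix=/usr/local/hdf5 \
          --enable-fortran --enable-cxx \
          --with-szlib=/usr/local/szip-2.1 \
          --with-zlib=/usr/local/zlib-1.2.7
    $ make
    $ make check
    $ make install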

The configure script has many options. Refer to the help provided in the output of './configure --help' or read the contents of the file README.

Using HDF5

There is user documentation in the doc/html subdirectory of the source file directory. Look for the file 'index.html' and view this file in a browser. This doc/html directory can be copied over to the installation directory.


In general, user programs include <hdf5.h> and link with -lhdf5. Additional libraries may be necessary. Please see the user documentation for all the details on the use of HDF5. Users are encouraged to use the compiler helper scripts h5cc, h5fc and h5c++ to build their applications. These helper scripts are installed in the bin/ subdirectory of the installation directory.
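For example, assuming the installation's bin/ directory is on your PATH and h5_example.c is a placeholder name for your program, a C application can be built with:

    $ h5cc -o h5_example h5_example.c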

Known Issues and Limitations


  • See the HDF5 website for a list of known issues and limitations