[Hdf-forum] high level API for parallel version of HDF5 library

Nelson, Jarom nelson99 at llnl.gov
Wed Sep 27 15:50:40 CDT 2017


Calls that affect the metadata need to be collective so that each process has a consistent view of what the file metadata should be.

https://support.hdfgroup.org/HDF5/doc/RM/CollectiveCalls.html



Something like this (or the attached):



plist_id = H5Pcreate(H5P_FILE_ACCESS);
H5Pset_fapl_mpio(plist_id, comm, info);
H5Pset_all_coll_metadata_ops(plist_id, true);
file_id = H5Fcreate(H5FILE_NAME, H5F_ACC_TRUNC, H5P_DEFAULT, plist_id);
H5Pclose(plist_id);
for (int procid = 0; procid < mpi_size; ++procid) {
  hid_t gr_id = H5Gcreate(file_id, std::to_string(procid).c_str(), H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
  H5Gclose(gr_id);
}
H5Fclose(file_id);
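
For reference, a self-contained sketch of the same idea (the file name "groups.h5", the use of MPI_COMM_WORLD / MPI_INFO_NULL, and the omission of error checking are just illustrative choices here; the attached h5g_parallel.cpp may differ in the details):

// Every rank opens the file and executes every group-creation call,
// so the metadata-modifying operations are collective across all ranks.
#include <hdf5.h>
#include <mpi.h>
#include <string>

int main(int argc, char* argv[])
{
    MPI_Init(&argc, &argv);

    int mpi_size = 0;
    MPI_Comm_size(MPI_COMM_WORLD, &mpi_size);

    // File access property list: MPI-IO driver plus collective metadata operations.
    hid_t plist_id = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(plist_id, MPI_COMM_WORLD, MPI_INFO_NULL);
    H5Pset_all_coll_metadata_ops(plist_id, true);

    // All ranks create (truncate) the same file collectively.
    hid_t file_id = H5Fcreate("groups.h5", H5F_ACC_TRUNC, H5P_DEFAULT, plist_id);
    H5Pclose(plist_id);

    // Group creation modifies file metadata, so every rank loops over
    // all ranks and makes every H5Gcreate call, not only its "own" one.
    for (int procid = 0; procid < mpi_size; ++procid) {
        hid_t gr_id = H5Gcreate(file_id, std::to_string(procid).c_str(),
                                H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
        H5Gclose(gr_id);
    }

    H5Fclose(file_id);
    MPI_Finalize();
    return 0;
}

Build against a parallel HDF5 install with the MPI compiler wrapper (e.g. mpicxx h5g_parallel.cpp -lhdf5) and run with something like mpirun -np 4 ./a.out; the exact flags depend on your installation.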



-----Original Message-----
From: Hdf-forum [mailto:hdf-forum-bounces at lists.hdfgroup.org] On Behalf Of Rafal Lichwala
Sent: Wednesday, September 27, 2017 12:32 AM
To: hdf-forum at lists.hdfgroup.org
Subject: Re: [Hdf-forum] high level API for parallel version of HDF5 library



Hi Barbara, Hi All,



Thank you for your answer. That clears things up about the H5TBmake_table() call, but...

H5Gcreate() is not a high-level API call, is it?

So why can't I use it in parallel processes?

Maybe I'm just doing something wrong, so could you please provide a short example of how to create a set of groups (one per process, named after the process number) when running 4 parallel MPI processes? You can limit the example code to the sequence of HDF5 calls only...

My current code works fine for just one process, but when I run it with 2 (or more) parallel processes the resulting file is corrupted:



plist_id = H5Pcreate(H5P_FILE_ACCESS);
H5Pset_fapl_mpio(plist_id, comm, info);
H5Pset_all_coll_metadata_ops(plist_id, true);
file_id = H5Fcreate(H5FILE_NAME, H5F_ACC_TRUNC, H5P_DEFAULT, plist_id);
H5Pclose(plist_id);
hid_t gr_id = H5Gcreate(file_id, std::to_string(procid).c_str(), H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
H5Gclose(gr_id);
H5Fclose(file_id);





Best regards,

Rafal





On 2017-09-25 at 22:20, Barbara Jones wrote:

> Hi Rafal,

>

> No, the HDF5 High Level APIs are not supported in the parallel version of HDF5.

>

> -Barbara

> help at hdfgroup.org

>

> -----Original Message-----

> From: Hdf-forum [mailto:hdf-forum-bounces at lists.hdfgroup.org] On Behalf Of Rafal Lichwala

> Sent: Monday, September 18, 2017 8:53 AM

> To: hdf-forum at lists.hdfgroup.org

> Subject: [Hdf-forum] high level API for parallel version of HDF5 library

>

> Hi,

>

> Can I use high level API function calls (H5TBmake_table(...)) in parallel version of the HDF5 library?

> There are no property list parameters for that function calls...

>

> Regards,

> Rafal

>

>






_______________________________________________

Hdf-forum is for HDF software users discussion.

Hdf-forum at lists.hdfgroup.org

http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org

Twitter: https://twitter.com/hdf5
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: h5g_parallel.cpp
URL: <http://lists.hdfgroup.org/pipermail/hdf-forum_lists.hdfgroup.org/attachments/20170927/178e957e/attachment-0001.ksh>

