sfepy.base.ioutils module

class sfepy.base.ioutils.Cached(data)[source]

The wrapper class that marks data that, during saving, should be checked for whether it has already been stored in the HDF5 file; if so, a softlink to the already created instance is written instead of saving the data again.

class sfepy.base.ioutils.DataMarker(data)[source]

The base class for classes that mark data to be handled in a special way during saving to an HDF5 file by write_to_hdf5(). The usage is simple: just “decorate” the desired data element, e.g.:

data = [data1, Cached(data2)]
write_to_hdf5(... , ... , data)
unpack_data()[source]

One can request unpacking of the wrappers during saving.

Returns:
object

The original object, if possible, or self.

This object is written to the HDF5 file as a softlink to the given path. The destination of the softlink should contain only data, so the structure {type: type, data: softlink_to(destination)} is created in the place where the softlink is written.

get_type()[source]
unpack_data()[source]

One can request unpacking of the wrappers during saving.

Returns:
object

The original object, if possible, or self.

write_data(fd, group, cache=None)[source]

Create the softlink to the destination and handle the caching.

class sfepy.base.ioutils.HDF5BaseData[source]

When storing values to HDF5, special classes can be used that wrap the stored data and modify the way the storing is done. This class is the base of those.

unpack_data()[source]

One can request unpacking of the wrappers during saving.

Returns:
object

The original object, if possible, or self.

class sfepy.base.ioutils.HDF5ContextManager(filename, *args, **kwargs)[source]
class sfepy.base.ioutils.HDF5Data[source]

Some data written to the HDF5 file can have a custom format. Descendants of this class should have the method .write_data() or redefine the .write() method.

write(fd, group, name, cache=None)[source]

Write a data structure to the HDF5 file.

Create the following structure in the HDF5 file: {type: self.get_type(), anything written by self.write_data()}

Parameters:
fd: tables.File

The HDF5 file handle the data should be written to.

group: tables.group.Group

The group the data will be stored to

name: str

Name of node that will be appended to group and will contain the data

cache: dict or None, optional

Store of already written objects in the form {id(obj) : '/path/to/obj'}. Can be used to avoid storing the same object twice.

write_data(fd, group)[source]

Write data to the HDF5 file. Redefine this function in sub-classes.

Parameters:
fd: tables.File

The HDF5 file handle the data should be written to.

group: tables.group.Group

The group the data should be stored to.

class sfepy.base.ioutils.InDir(filename)[source]

Store the directory name a file is in, and prepend this name to other files.

Examples

>>> indir = InDir('output/file1')
>>> print(indir('file2'))
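The described behavior can be sketched in a few lines using os.path (a minimal re-implementation for illustration, not the actual sfepy code):

```python
import os.path as op

class InDir:
    """Store the directory a file is in; prepend it to other file names."""
    def __init__(self, filename):
        # keep only the directory part, e.g. 'output/file1' -> 'output'
        self.dir = op.dirname(filename)

    def __call__(self, filename):
        # prepend the stored directory to the given file name
        return op.join(self.dir, filename)

indir = InDir('output/file1')
print(indir('file2'))  # -> output/file2 (with the POSIX path separator)
```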

This object is written to the HDF5 file as a softlink to the given path.

write(fd, group, name, cache=None)[source]

Create the softlink to the destination.

class sfepy.base.ioutils.Uncached(data)[source]

The wrapper class that marks data that should always be stored to the HDF5 file, even if the object has already been stored at a different path in the file and would thus otherwise be stored as a softlink (IGDomain, Mesh and sparse matrices behave this way).
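The marker classes above are thin wrappers around the data they decorate. The pattern can be sketched as follows (illustrative only; the real classes also cooperate with write_to_hdf5() and the caching machinery):

```python
class DataMarker:
    """Wrap a data element to flag special treatment during saving."""
    def __init__(self, data):
        self.data = data

    def unpack_data(self):
        # return the wrapped object; saving code can call this
        # to strip the marker once it has been acted upon
        return self.data

class Cached(DataMarker):
    """Store the data once; further occurrences become softlinks."""

class Uncached(DataMarker):
    """Always store the data, never replace it by a softlink."""

data = [1.0, Cached([2.0, 3.0])]
assert data[1].unpack_data() == [2.0, 3.0]
```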

sfepy.base.ioutils.dec(val, encoding='utf-8')[source]

Decode given bytes using the specified encoding.

sfepy.base.ioutils.edit_filename(filename, prefix='', suffix='', new_ext=None)[source]

Edit a file name by adding a prefix, inserting a suffix in front of the file name extension, or replacing the extension.

Parameters:
filename: str

The file name.

prefix: str

The prefix to be added.

suffix: str

The suffix to be inserted.

new_ext: str, optional

If not None, it replaces the original file name extension.

Returns:
new_filename: str

The new file name.
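Under the stated behavior (prefix goes before the base name, suffix before the extension), the function amounts to the following sketch (not the sfepy source):

```python
import os.path as op

def edit_filename(filename, prefix='', suffix='', new_ext=None):
    """Sketch of the documented behavior of sfepy.base.ioutils.edit_filename."""
    path, fname = op.split(filename)
    base, ext = op.splitext(fname)
    if new_ext is not None:
        ext = new_ext
    return op.join(path, prefix + base + suffix + ext)

print(edit_filename('out/mesh.vtk', suffix='_001'))  # out/mesh_001.vtk on POSIX
print(edit_filename('out/mesh.vtk', prefix='new_', new_ext='.h5'))
```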

sfepy.base.ioutils.enc(string, encoding='utf-8')[source]

Encode given string or bytes using the specified encoding.
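Since enc() accepts "string or bytes", a plausible sketch of the enc/dec pair passes already-converted values through unchanged (an assumption about the pass-through behavior, not confirmed by the docstrings):

```python
def enc(string, encoding='utf-8'):
    """Encode a str to bytes; pass bytes through unchanged (a sketch)."""
    return string.encode(encoding) if isinstance(string, str) else string

def dec(val, encoding='utf-8'):
    """Decode bytes to str; pass other values through unchanged (a sketch)."""
    return val.decode(encoding) if isinstance(val, bytes) else val
```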

sfepy.base.ioutils.ensure_path(filename)[source]

Check if path to filename exists and if not, create the necessary intermediate directories.
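The check-and-create step can be sketched with os.makedirs (a minimal version; the actual function may differ in error handling):

```python
import os

def ensure_path(filename):
    """Create the intermediate directories of `filename` if they are missing."""
    dirname = os.path.dirname(filename)
    if dirname and not os.path.isdir(dirname):
        os.makedirs(dirname)
```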

sfepy.base.ioutils.get_or_create_hdf5_group(fd, path, from_group=None)[source]
sfepy.base.ioutils.get_print_info(n_step, fill=None)[source]

Returns the max. number of digits in range(n_step) and the corresponding format string.

Examples:

>>> get_print_info(11)
(2, '%2d')
>>> get_print_info(8)
(1, '%1d')
>>> get_print_info(100)
(2, '%2d')
>>> get_print_info(101)
(3, '%3d')
>>> get_print_info(101, fill='0')
(3, '%03d')
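The doctest above is reproduced by a few lines of formatting logic; the largest value printed is n_step - 1, which determines the digit count (a sketch consistent with the examples, assuming fill is a printf-style fill character such as '0'):

```python
def get_print_info(n_step, fill=None):
    """Return the max. digits in range(n_step) and the matching format string."""
    # range(n_step) ends at n_step - 1, the widest number printed
    digits = len(str(n_step - 1)) if n_step > 1 else 1
    fmt = '%%%s%dd' % (fill if fill is not None else '', digits)
    return digits, fmt

print(get_print_info(101, fill='0'))  # (3, '%03d')
```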
sfepy.base.ioutils.get_trunk(filename)[source]
sfepy.base.ioutils.locate_files(pattern, root_dir='.', **kwargs)[source]

Locate all files matching the given filename pattern in and below the supplied root directory.

The **kwargs arguments are passed to os.walk().
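A generator over os.walk() with fnmatch filtering matches this description (a sketch of the idea, not the sfepy source):

```python
import fnmatch
import os

def locate_files(pattern, root_dir='.', **kwargs):
    """Yield paths of files matching `pattern` under `root_dir`."""
    for dirpath, dirnames, filenames in os.walk(root_dir, **kwargs):
        # keep only the file names matching the glob pattern
        for filename in fnmatch.filter(filenames, pattern):
            yield os.path.join(dirpath, filename)
```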

sfepy.base.ioutils.look_ahead_line(fd)[source]

Read and return a line from the given file object. Saves the current position in the file before the reading occurs and then, after the reading, restores the saved (original) position.
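The save/restore of the file position is a standard tell/seek pattern (a sketch, assuming a seekable file object):

```python
import io

def look_ahead_line(fd):
    """Return the next line of `fd` without advancing the file position."""
    pos = fd.tell()        # remember where we are
    line = fd.readline()   # read one line
    fd.seek(pos)           # restore the original position
    return line

fd = io.StringIO('first\nsecond\n')
assert look_ahead_line(fd) == 'first\n'
assert fd.readline() == 'first\n'  # the position was restored
```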

sfepy.base.ioutils.path_of_hdf5_group(group)[source]
sfepy.base.ioutils.read_array(fd, n_row, n_col, dtype)[source]

Read a NumPy array of shape (n_row, n_col) from the given file object and cast it to type dtype. If n_col is None, determine the number of columns automatically.

sfepy.base.ioutils.read_dict_hdf5(filename, level=0, group=None, fd=None)[source]
sfepy.base.ioutils.read_from_hdf5(fd, group, cache=None)[source]

Read custom data from a HDF5 file group saved by write_to_hdf5().

The data are stored in a general (possibly nested) structure:

{
    'type' : string type identifier,
    'data' : the stored data,
    'cache': string, optional - another possible location of the object,
}

Parameters:
fd: tables.File

The hdf5 file handle the data should be restored from.

group: tables.group.Group

The group in the hdf5 file the data will be restored from.

cache: dict or None

Some objects (e.g. Mesh instances) can be stored in multiple places in the HDF5 file tree using softlinks, so when the data are restored, the restored objects are stored and looked up in cache so that each is created only once. The keys to cache are the (real) paths of the created objects. Moreover, if some stored object has a ‘cache’ key (see e.g. the DataSoftLink class) and the object with the given ‘path’ has already been created, it is returned instead of creating a new object. Otherwise, the newly created object is associated both with its real path and with the cache key path.

The caching is not active for scalar data types.

Returns:
dataobject

The restored custom data.

sfepy.base.ioutils.read_list(fd, n_item, dtype)[source]
sfepy.base.ioutils.read_sparse_matrix_from_hdf5(fd, group, output_format=None)[source]

Read a sparse matrix from the given data group of an HDF5 file.

Parameters:
fd: tables.File

The hdf5 file handle the matrix will be read from.

group: tables.group.Group

The HDF5 file group the matrix will be read from.

output_format: {‘csr’, ‘csc’, None}, optional

The resulting matrix will be in CSR or CSC format if this parameter is not None (the default); otherwise it will be in the format in which the matrix was stored.

Returns:
scipy.sparse.base.spmatrix

The matrix read from the file.

sfepy.base.ioutils.read_sparse_matrix_hdf5(filename, output_format=None)[source]
sfepy.base.ioutils.read_token(fd)[source]

Read a single token (sequence of non-whitespace characters) from the given file object.

Notes

Consumes the first whitespace character after the token.
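The note above pins down the behavior at the token boundary; a character-by-character sketch (assuming leading whitespace is skipped, which the docstring does not state explicitly):

```python
import io

def read_token(fd):
    """Read one whitespace-delimited token from `fd` (a sketch)."""
    out = ''
    # skip leading whitespace
    while True:
        ch = fd.read(1)
        if not ch:          # EOF before any token
            return out
        if not ch.isspace():
            break
    # accumulate until the next whitespace character, which is consumed
    while ch and not ch.isspace():
        out += ch
        ch = fd.read(1)
    return out

fd = io.StringIO('  alpha beta')
assert read_token(fd) == 'alpha'
assert read_token(fd) == 'beta'
```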

sfepy.base.ioutils.remove_files(root_dir, **kwargs)[source]

Remove all files and directories in the supplied root directory.

The **kwargs arguments are passed to os.walk().

sfepy.base.ioutils.remove_files_patterns(root_dir, patterns, ignores=None, verbose=False)[source]

Remove files with names satisfying the given glob patterns in a supplied root directory. Files with patterns in ignores are omitted.

sfepy.base.ioutils.save_options(filename, options_groups, save_command_line=True, quote_command_line=False)[source]

Save groups of options/parameters into a file.

Each option group has to be a sequence with two items: the group name and the options in {key : value} form.

sfepy.base.ioutils.skip_read_line(fd, no_eof=False)[source]

Read the first non-empty line (if any) from the given file object. Return an empty string at EOF if no_eof is False; if it is True, raise EOFError instead.
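A minimal sketch of the skip loop (assuming the returned line is stripped; any comment handling in the actual function is omitted here):

```python
import io

def skip_read_line(fd, no_eof=False):
    """Return the first stripped non-empty line, '' at EOF (or raise EOFError)."""
    while True:
        line = fd.readline()
        if not line:  # EOF: readline() returns ''
            if no_eof:
                raise EOFError
            return ''
        line = line.strip()
        if line:
            return line

fd = io.StringIO('\n\n  data  \n')
assert skip_read_line(fd) == 'data'
```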

sfepy.base.ioutils.write_dict_hdf5(filename, adict, level=0, group=None, fd=None)[source]
sfepy.base.ioutils.write_sparse_matrix_hdf5(filename, mtx, name='a sparse matrix')[source]

Assume CSR/CSC.

sfepy.base.ioutils.write_sparse_matrix_to_hdf5(fd, group, mtx)[source]

Write a sparse matrix to the given data group of an HDF5 file.

Parameters:
group: tables.group.Group

The HDF5 file group the matrix will be written to.

mtx: scipy.sparse.base.spmatrix

The matrix to be written.

sfepy.base.ioutils.write_to_hdf5(fd, group, name, data, cache=None, unpack_markers=False)[source]

Save custom data to a HDF5 file group to be restored by read_from_hdf5().

Allows saving lists, dicts, numpy arrays, scalars, sparse matrices, meshes and iga domains and all pickleable objects.

Parameters:
fd: tables.File

The HDF5 file handle the data should be written to.

group: tables.group.Group

The group the data will be stored to.

name: str

The name of the node that will be appended to the group and will contain the data.

data: object

Data to be stored in the HDF5 file.

cache: dict or None

The cache where the paths to stored objects (currently meshes and iga domains) are stored, so subsequent attempts to store such objects create only softlinks to the initially stored object. The id() of objects serve as the keys into the cache. Mark the object with Cached() or Uncached() for (no) softlinking.

unpack_markers:

If True, the input data is modified so that Cached and Uncached markers are removed from all sub-elements of the data.

Returns:
tables.group.Group

The HDF5 group the data was stored to.