sfepy.parallel.parallel module¶
Functions for a high-level PETSc-based parallelization.
- sfepy.parallel.parallel.assemble_mtx_to_petsc(pmtx, mtx, pdofs, drange, is_overlap=True, comm=None, verbose=False)[source]¶
Assemble a local CSR matrix to a global PETSc matrix.
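The local-to-global assembly can be illustrated with a dense NumPy analogue (the actual function works on a CSR matrix and a PETSc Mat): local row indices are mapped to global ones through pdofs, and only rows owned by this task (those inside drange) are assembled. The function name and dense layout here are illustrative, not sfepy's implementation.

```python
import numpy as np

def add_local_to_global(gmtx, lmtx, pdofs, drange):
    """Dense sketch of local-to-global assembly: add the rows of the local
    matrix lmtx whose global row numbers fall into drange (the rows owned
    by this task) into the global matrix gmtx, mapping local indices to
    global ones through pdofs."""
    r0, r1 = drange
    for il, ig in enumerate(pdofs):
        if r0 <= ig < r1:  # only owned rows are assembled by this task
            gmtx[ig, pdofs] += lmtx[il]

# a task owning global rows 2-3 assembles its 3x3 local matrix
gmtx = np.zeros((4, 4))
lmtx = np.ones((3, 3))
add_local_to_global(gmtx, lmtx, pdofs=np.array([1, 2, 3]), drange=(2, 4))
```

Rows outside the owned range (here row 1) are left for the neighboring task, so overlapping contributions are not added twice.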
- sfepy.parallel.parallel.assemble_rhs_to_petsc(prhs, rhs, pdofs, drange, is_overlap=True, comm=None, verbose=False)[source]¶
Assemble a local right-hand side vector to a global PETSc vector.
- sfepy.parallel.parallel.call_in_rank_order(fun, comm=None)[source]¶
Call a function fun task by task in the task rank order.
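The rank-ordered call pattern can be sketched without MPI by using a dummy communicator object in place of a real one (FakeComm is purely illustrative; a real run would pass an mpi4py communicator whose barrier synchronizes the tasks):

```python
class FakeComm:
    """Minimal stand-in for an MPI communicator (rank/size/barrier only)."""
    def __init__(self, rank, size):
        self.rank, self.size = rank, size

    def barrier(self):
        pass  # a real MPI barrier would synchronize all tasks here

def call_in_rank_order(fun, comm):
    # task k calls fun only in iteration k; the barrier after each
    # iteration keeps the tasks in lock-step, so the calls happen in
    # rank order across the whole communicator
    for rank in range(comm.size):
        if rank == comm.rank:
            fun(rank, comm)
        comm.barrier()

calls = []
call_in_rank_order(lambda rank, comm: calls.append(rank), FakeComm(1, 3))
```

This pattern is typically used for serializing output, so that prints from different tasks do not interleave.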
- sfepy.parallel.parallel.create_gather_scatter(pdofs, pvec_i, pvec, comm=None)[source]¶
Create the gather() function for updating a global PETSc vector from local ones and the scatter() function for updating local PETSc vectors from the global one.
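A NumPy analogue of the two closures (the real ones operate on PETSc vectors and an index set built from pdofs; this sketch uses a plain index array):

```python
import numpy as np

def create_gather_scatter(pdofs, n_global):
    """Sketch of the gather()/scatter() closures: pdofs maps local vector
    entries to their global (PETSc) indices."""
    def gather(pvec, pvec_i):
        pvec[pdofs] = pvec_i      # push local values into the global vector

    def scatter(pvec_i, pvec):
        pvec_i[:] = pvec[pdofs]   # pull local values from the global vector

    return gather, scatter

pvec = np.zeros(5)
gather, scatter = create_gather_scatter(np.array([0, 2, 4]), 5)
gather(pvec, np.array([1.0, 2.0, 3.0]))
local = np.empty(3)
scatter(local, pvec)
```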
- sfepy.parallel.parallel.create_gather_to_zero(pvec)[source]¶
Create the gather_to_zero() function for collecting the global PETSc vector on the task of rank zero.
- sfepy.parallel.parallel.create_local_petsc_vector(pdofs)[source]¶
Create a local PETSc vector with the size corresponding to pdofs.
- sfepy.parallel.parallel.create_petsc_matrix(sizes, mtx_prealloc=None, comm=None)[source]¶
Create and allocate a PETSc matrix.
- sfepy.parallel.parallel.create_petsc_system(mtx, sizes, pdofs, drange, is_overlap=True, comm=None, verbose=False)[source]¶
Create and pre-allocate (if is_overlap is True) a PETSc matrix and related solution and right-hand side vectors.
- sfepy.parallel.parallel.create_prealloc_data(mtx, pdofs, drange, verbose=False)[source]¶
Create CSR preallocation data for a PETSc matrix based on the owned PETSc DOFs and a local matrix with EBCs not applied.
- sfepy.parallel.parallel.create_task_dof_maps(field, cell_tasks, inter_facets, is_overlap=True, use_expand_dofs=False, save_inter_regions=False, output_dir=None)[source]¶
For each task, list its inner and interface DOFs of the given field and create a PETSc numbering that is consecutive in each subdomain.
For each task, the DOF map has the following structure:
[inner, [own_inter1, own_inter2, ...], [overlap_cells1, overlap_cells2, ...], n_task_total, task_offset]
The overlapping cells are defined so that the system matrix corresponding to each task can be assembled independently, see [1]. TODO: Some “corner” cells may be added even if not needed - filter them out by using the PETSc DOFs range.
When debugging domain partitioning problems, it is advisable to set save_inter_regions to True to save the task interfaces as meshes as well as vertex-based markers - to be used only with moderate problems and small numbers of tasks.
[1] J. Sistek and F. Cirak. Parallel iterative solution of the incompressible Navier-Stokes equations with application to rotating wings. Submitted for publication, 2015.
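A concrete instance of the documented DOF map layout may help; all values below are made up, and the final consistency check assumes n_task_total counts the inner plus owned interface DOFs:

```python
# hypothetical DOF map of one task, following the layout documented above:
# [inner, [own_inter1, ...], [overlap_cells1, ...], n_task_total, task_offset]
dof_map = [
    [0, 1, 2, 3],        # inner DOFs of this task
    [[4, 5], [6]],       # owned interface DOFs shared with two neighbors
    [[10, 11], [12]],    # overlapping cells enabling independent assembly
    7,                   # total number of DOFs owned by this task
    20,                  # offset of this task's DOFs in the PETSc numbering
]

# assumed invariant: owned DOFs = inner DOFs + owned interface DOFs
n_own = len(dof_map[0]) + sum(len(ii) for ii in dof_map[1])
```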
- sfepy.parallel.parallel.distribute_field_dofs(field, gfd, use_expand_dofs=False, comm=None, verbose=False)[source]¶
Distribute the owned cells and DOFs of the given field to all tasks.
The DOFs use the PETSc ordering and are in form of a connectivity, so that each task can easily identify them with the DOFs of the original global ordering or local ordering.
- sfepy.parallel.parallel.distribute_fields_dofs(fields, cell_tasks, is_overlap=True, use_expand_dofs=False, save_inter_regions=False, output_dir=None, comm=None, verbose=False)[source]¶
Distribute the owned cells and DOFs of the given fields to all tasks.
Uses interleaved PETSc numbering in each task, i.e., the PETSc DOFs of each task are consecutive: the first field's DOF block is followed by the second field's block, etc.
Expand DOFs to equations if use_expand_dofs is True.
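The DOF-to-equation expansion can be sketched as follows, assuming the equations of one node are stored contiguously (interleaved components); the function here is an illustration, not sfepy's helper:

```python
import numpy as np

def expand_dofs(dofs, n_components):
    """Expand nodal DOF indices into per-component equation indices,
    assuming the n_components equations of each node are contiguous."""
    edofs = n_components * np.repeat(dofs, n_components)
    edofs += np.tile(np.arange(n_components), len(dofs))
    return edofs

# nodes 2 and 5 of a 2-component (e.g. 2D vector) field
eqs = expand_dofs(np.array([2, 5]), 2)
```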
- sfepy.parallel.parallel.get_composite_sizes(lfds)[source]¶
Get (local, total) sizes of a vector and local equation range for a composite matrix built from field blocks described by lfds, the local field distributions information.
- sfepy.parallel.parallel.get_inter_facets(domain, cell_tasks)[source]¶
For each pair of neighboring task subdomains, get the common boundary (interface) facets.
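The underlying idea can be sketched in a few lines: an inner facet lies on a task interface exactly when its two incident cells belong to different tasks. The facet-to-cell array below is an assumed input layout, not sfepy's internal data structure:

```python
import numpy as np

def find_interface_facets(facet_cells, cell_tasks):
    """Return indices of inner facets whose two incident cells belong to
    different tasks, i.e., the task interface facets."""
    tasks = cell_tasks[facet_cells]        # (n_facets, 2) task numbers
    return np.where(tasks[:, 0] != tasks[:, 1])[0]

# four cells in a row, one facet between each pair, split into two tasks
facet_cells = np.array([[0, 1], [1, 2], [2, 3]])
cell_tasks = np.array([0, 0, 1, 1])
inter = find_interface_facets(facet_cells, cell_tasks)
```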
- sfepy.parallel.parallel.get_local_ordering(field_i, petsc_dofs_conn, use_expand_dofs=False)[source]¶
Get PETSc DOFs in the order of local DOFs of the localized field field_i.
Expand DOFs to equations if use_expand_dofs is True.
- sfepy.parallel.parallel.get_sizes(petsc_dofs_range, n_dof, n_components)[source]¶
Get (local, total) sizes of a vector and local equation range.
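A sketch of the size computation, assuming petsc_dofs_range is the (start, stop) range of scalar DOFs owned by this task and the components are interleaved:

```python
def get_sizes(petsc_dofs_range, n_dof, n_components):
    """Sketch: scale the owned scalar-DOF range by the number of
    components to obtain the local equation range, and return the
    (local, total) vector sizes."""
    drange = tuple(n_components * ii for ii in petsc_dofs_range)
    n_loc = drange[1] - drange[0]
    sizes = (n_loc, n_dof * n_components)
    return sizes, drange

# a task owning scalar DOFs 3..6 of a 10-DOF, 2-component field
sizes, drange = get_sizes((3, 7), n_dof=10, n_components=2)
```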
- sfepy.parallel.parallel.partition_mesh(mesh, n_parts, use_metis=True, verbose=False)[source]¶
Partition the mesh cells into n_parts subdomains using METIS, if available.
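When METIS is not available, a partitioning can only be crude; the following illustrative fallback (not sfepy's code) assigns roughly equal contiguous blocks of cells to the parts, producing a cell_tasks array of the kind consumed by the functions above:

```python
import numpy as np

def partition_cells_naive(n_cells, n_parts):
    """Assign cells to parts in roughly equal contiguous blocks; a crude
    stand-in for the graph partitioning that METIS performs."""
    return np.arange(n_cells) * n_parts // n_cells

cell_tasks = partition_cells_naive(10, 3)
```

Unlike METIS, this ignores the mesh connectivity entirely, so the interfaces it produces can be far from minimal.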
- sfepy.parallel.parallel.setup_composite_dofs(lfds, fields, local_variables, verbose=False)[source]¶
Set up composite DOFs built from field blocks described by lfds, the local field distributions information.
Returns (local, total) sizes of a vector, local equation range for a composite matrix, and the local ordering of composite PETSc DOFs, corresponding to local_variables (must be in the order of fields!).