Chaste Release::3.1
PetscTools Class Reference

#include <PetscTools.hpp>


Static Public Member Functions

static void ResetCache ()
static bool IsSequential ()
static bool IsParallel ()
static unsigned GetNumProcs ()
static unsigned GetMyRank ()
static bool AmMaster ()
static bool AmTopMost ()
static void Barrier (const std::string callerId="")
static void BeginRoundRobin ()
static void EndRoundRobin ()
static void IsolateProcesses (bool isolate=true)
static Vec CreateVec (int size, int localSize=PETSC_DECIDE, bool ignoreOffProcEntries=true)
static Vec CreateVec (std::vector< double > data)
static Vec CreateAndSetVec (int size, double value)
static void SetupMat (Mat &rMat, int numRows, int numColumns, unsigned rowPreallocation, int numLocalRows=PETSC_DECIDE, int numLocalColumns=PETSC_DECIDE, bool ignoreOffProcEntries=true)
static bool ReplicateBool (bool flag)
static void ReplicateException (bool flag)
static void DumpPetscObject (const Mat &rMat, const std::string &rOutputFileFullPath)
static void DumpPetscObject (const Vec &rVec, const std::string &rOutputFileFullPath)
static void ReadPetscObject (Mat &rMat, const std::string &rOutputFileFullPath, Vec rParallelLayout=NULL)
static void ReadPetscObject (Vec &rVec, const std::string &rOutputFileFullPath, Vec rParallelLayout=NULL)
static bool HasParMetis ()
static void Destroy (Vec &rVec)
static void Destroy (Mat &rMat)

Static Public Attributes

static const unsigned MASTER_RANK = 0

Static Private Member Functions

static void CheckCache ()

Static Private Attributes

static bool mPetscIsInitialised = false
static unsigned mNumProcessors = 0
static unsigned mRank = 0
static bool mIsolateProcesses = false

Detailed Description

A helper class of static methods.

Definition at line 86 of file PetscTools.hpp.


Member Function Documentation

bool PetscTools::AmMaster ( ) [static]

Just returns whether it is the master process or not.

If not running in parallel, always returns true.

Definition at line 105 of file PetscTools.cpp.

References CheckCache(), MASTER_RANK, mIsolateProcesses, and mRank.

Referenced by AbstractHdf5Converter< ELEMENT_DIM, SPACE_DIM >::AbstractHdf5Converter(), ParallelColumnDataWriter::AdvanceAlongUnlimitedDimension(), ArchiveOpener< Archive, Stream >::ArchiveOpener(), ParallelColumnDataWriter::Close(), CellBasedPdeHandler< DIM >::CloseResultsFiles(), OutputFileHandler::CommonConstructor(), NumericFileComparison::CompareFiles(), FileComparison::CompareFiles(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ComputeMeshPartitioning(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ConstructCuboid(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ConstructLinearMesh(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ConstructRectangularMesh(), AbstractConvergenceTester< CELL, CARDIAC_PROBLEM, DIM, PROBLEM_DIM >::Converge(), CellMLToSharedLibraryConverter::ConvertCellmlToSo(), OutputFileHandler::CopyFileTo(), HeartConfig::CopySchema(), CardiacSimulation::CreateAndRun(), OutputDirectoryFifoQueue::CreateNextDir(), CellMLToSharedLibraryConverter::CreateOptionsFile(), AbstractCellPopulation< ELEMENT_DIM, SPACE_DIM >::CreateOutputFiles(), CardiacSimulation::CreateResumeXmlFile(), ParallelColumnDataWriter::EndDefineMode(), StreeterFibreGenerator< SPACE_DIM >::GenerateOrthotropicFibreOrientation(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::GetNextBoundaryElement(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::GetNextCableElement(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::GetNextElement(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::GetNextNode(), Hdf5ToTxtConverter< ELEMENT_DIM, SPACE_DIM >::Hdf5ToTxtConverter(), GenericEventHandler< 1, EventHandler >::HeadingsImpl(), AbstractBidomainSolver< ELEMENT_DIM, SPACE_DIM >::InitialiseForSolve(), OutputFileHandler::MakeFoldersAndReturnFullPath(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::MetisLibraryPartitioning(), CellBasedPdeHandler< DIM >::OpenResultsFiles(), NodePartitioner< ELEMENT_DIM, SPACE_DIM 
>::PetscMatrixPartitioning(), ExecutableSupport::Print(), ExecutableSupport::PrintError(), ProgressReporter::PrintFinalising(), ProgressReporter::PrintInitialising(), ProgressReporter::ProgressReporter(), Hdf5DataWriter::PutUnlimitedVariable(), ParallelColumnDataWriter::PutVariable(), ParallelColumnDataWriter::PutVector(), GenericEventHandler< 1, EventHandler >::ReportImpl(), AbstractFileComparison::ResetFiles(), AbstractTetrahedralMesh< SPACE_DIM, SPACE_DIM >::save(), AbstractCardiacTissue< SPACE_DIM >::save(), HeartConfig::save(), CardiacSimulationArchiver< PROBLEM_CLASS >::Save(), AbstractFileComparison::Setup(), OffLatticeSimulation< ELEMENT_DIM, SPACE_DIM >::SetupSolve(), ExecutableSupport::ShowCopyright(), AbstractFileComparison::SkipHeaderLines(), LinearSystem::Solve(), AbstractCellBasedSimulation< ELEMENT_DIM, SPACE_DIM >::Solve(), ProgressReporter::Update(), OffLatticeSimulation< ELEMENT_DIM, SPACE_DIM >::UpdateNodePositions(), HeartConfig::Write(), Hdf5ToMeshalyzerConverter< ELEMENT_DIM, SPACE_DIM >::Write(), Hdf5ToCmguiConverter< ELEMENT_DIM, SPACE_DIM >::Write(), Hdf5ToCmguiConverter< ELEMENT_DIM, SPACE_DIM >::WriteCmguiScript(), AbstractContinuumMechanicsSolver< DIM >::WriteCurrentPressureSolution(), AbstractContinuumMechanicsSolver< DIM >::WriteCurrentSpatialSolution(), AbstractPerElementWriter< ELEMENT_DIM, SPACE_DIM, DATA_SIZE >::WriteData(), VtkMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingMesh(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingMesh(), AbstractMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingMeshReader(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingParallelMesh(), PostProcessingWriter< ELEMENT_DIM, SPACE_DIM >::WriteGenericFileToMeshalyzer(), MonodomainProblem< ELEMENT_DIM, SPACE_DIM >::WriteInfo(), ExtendedBidomainProblem< DIM >::WriteInfo(), BidomainProblem< DIM >::WriteInfo(), HeartGeometryInformation< SPACE_DIM >::WriteLayerForEachNode(), 
AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteNclFile(), CellBasedPdeHandler< DIM >::WritePdeSolution(), PseudoEcgCalculator< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::WritePseudoEcg(), AbstractCentreBasedCellPopulation< ELEMENT_DIM, SPACE_DIM >::WriteTimeAndNodeResultsToFiles(), AbstractCellPopulation< ELEMENT_DIM, SPACE_DIM >::WriteTimeAndNodeResultsToFiles(), OdeSolution::WriteToFile(), PostProcessingWriter< ELEMENT_DIM, SPACE_DIM >::WriteVariablesOverTimeAtNodes(), OffLatticeSimulation< ELEMENT_DIM, SPACE_DIM >::WriteVisualizerSetupFile(), LinearSystem::~LinearSystem(), and ProgressReporter::~ProgressReporter().
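
A typical master-only pattern combines AmMaster() with Barrier(). A hedged sketch (the body of the if is a hypothetical stand-in for any non-collective, master-only work):

```cpp
#include "PetscTools.hpp"

void WriteSummaryFile()
{
    // Only the master process performs the non-collective work...
    if (PetscTools::AmMaster())
    {
        // e.g. create a directory or write a small summary file
    }
    // ...then all processes wait, so the result is visible everywhere afterwards
    PetscTools::Barrier("WriteSummaryFile");
}
```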

void PetscTools::Barrier ( const std::string  callerId = "") [static]

If MPI is set up, perform a barrier synchronisation. If not, this is a no-op.

Parameters:
callerId  only used in debug mode; printed before and after the barrier call

Definition at line 119 of file PetscTools.cpp.

References CheckCache(), GetMyRank(), mIsolateProcesses, and mPetscIsInitialised.

Referenced by ParallelColumnDataWriter::AdvanceAlongUnlimitedDimension(), GenericEventHandler< 1, EventHandler >::BeginEventImpl(), BeginRoundRobin(), ElectrodesStimulusFactory< DIM >::CheckForElectrodesIntersection(), ParallelColumnDataWriter::Close(), OutputFileHandler::CommonConstructor(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ComputeMeshPartitioning(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ConstructFromMeshReader(), OutputFileHandler::CopyFileTo(), OutputDirectoryFifoQueue::CreateNextDir(), CellMLToSharedLibraryConverter::CreateOptionsFile(), GenericEventHandler< 1, EventHandler >::EndEventImpl(), EndRoundRobin(), StreeterFibreGenerator< SPACE_DIM >::GenerateOrthotropicFibreOrientation(), Hdf5ToCmguiConverter< ELEMENT_DIM, SPACE_DIM >::Hdf5ToCmguiConverter(), Hdf5ToMeshalyzerConverter< ELEMENT_DIM, SPACE_DIM >::Hdf5ToMeshalyzerConverter(), GenericEventHandler< 1, EventHandler >::HeadingsImpl(), AbstractCardiacProblem< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::InitialiseWriter(), OutputFileHandler::MakeFoldersAndReturnFullPath(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::PetscMatrixPartitioning(), ExtendedBidomainProblem< DIM >::ProcessExtracellularStimulus(), GenericEventHandler< 1, EventHandler >::ReportImpl(), AbstractTetrahedralMesh< SPACE_DIM, SPACE_DIM >::save(), AbstractCardiacTissue< SPACE_DIM >::save(), HeartConfig::save(), CardiacSimulationArchiver< PROBLEM_CLASS >::Save(), AbstractFileComparison::Setup(), CardiacElectroMechanicsProblem< DIM, ELEC_PROB_DIM >::Solve(), AbstractContinuumMechanicsSolver< DIM >::WriteCurrentPressureSolution(), AbstractContinuumMechanicsSolver< DIM >::WriteCurrentSpatialSolution(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingMesh(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingParallelMesh(), HeartGeometryInformation< SPACE_DIM >::WriteLayerForEachNode(), and AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteNclFile().
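
BeginRoundRobin() and EndRoundRobin() (declared above) build on Barrier() to serialise a block of code across processes. A hedged sketch of the intended usage:

```cpp
#include "PetscTools.hpp"

void AppendToSharedFile()
{
    // Each process executes the enclosed block in turn, in rank order,
    // so concurrent access to a shared resource (e.g. one file) is avoided.
    PetscTools::BeginRoundRobin();
    {
        // e.g. append this process's data to a common output file
    }
    PetscTools::EndRoundRobin();
}
```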

static void PetscTools::CheckCache ( ) [inline, static, private]

Private method that makes sure PETSc has been probed (if this is the first use within a test).

Definition at line 108 of file PetscTools.hpp.

References mNumProcessors, and ResetCache().

Referenced by AmMaster(), AmTopMost(), Barrier(), GetMyRank(), GetNumProcs(), IsParallel(), IsSequential(), and ReplicateBool().

Vec PetscTools::CreateAndSetVec ( int  size, double  value ) [static]

Create a vector of the specified size with all values set to be the given constant. SetFromOptions is called.

Parameters:
size  the size of the vector
value  the value to set each entry

Definition at line 226 of file PetscTools.cpp.

References CreateVec().

Referenced by OdeLinearSystemSolver::OdeLinearSystemSolver(), NodeBasedCellPopulationWithBuskeUpdate< DIM >::UpdateNodeLocations(), and AbstractNonlinearAssemblerSolverHybrid< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::VerifyJacobian().
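
A hedged sketch of creating and cleaning up such a vector (assumes PETSc has been initialised, e.g. within a Chaste test; the sizes and values are illustrative):

```cpp
#include "PetscTools.hpp"

void UseConstantVector()
{
    // Distributed vector of 10 entries, each set to 1.0 (SetFromOptions is called)
    Vec vec = PetscTools::CreateAndSetVec(10, 1.0);

    // ... use the vector ...

    // Destroy() hides the PETSc 3.1/3.2 API difference (see Destroy() below)
    PetscTools::Destroy(vec);
}
```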

Vec PetscTools::CreateVec ( int  size, int  localSize = PETSC_DECIDE, bool  ignoreOffProcEntries = true ) [static]

Create a vector of the specified size. SetFromOptions is called.

Vec PetscTools::CreateVec ( std::vector< double >  data) [static]

Create a Vec from the given data.

Parameters:
data  some data

Definition at line 206 of file PetscTools.cpp.

References CreateVec().

static void PetscTools::Destroy ( Mat rMat) [inline, static]

Destroy method. Note that PETSc 3.1 and earlier destroy based on a PETSc object, whereas PETSc 3.2 and later destroy based on a pointer to a PETSc object.

Parameters:
rMat  a reference to the PETSc object

Definition at line 319 of file PetscTools.hpp.

static void PetscTools::Destroy ( Vec rVec) [inline, static]

Destroy method. Note that PETSc 3.1 and earlier destroy based on a PETSc object, whereas PETSc 3.2 and later destroy based on a pointer to a PETSc object.

Parameters:
rVec  a reference to the PETSc object

Definition at line 303 of file PetscTools.hpp.

Referenced by AbstractContinuumMechanicsSolver< DIM >::AllocateMatrixMemory(), AbstractContinuumMechanicsSolver< DIM >::ApplyDirichletBoundaryConditions(), BoundaryConditionsContainer< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::ApplyDirichletToLinearProblem(), PetscMatTools::CheckEquality(), PetscMatTools::CheckSymmetry(), AbstractNonlinearAssemblerSolverHybrid< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::ComputeJacobianNumerically(), PseudoEcgCalculator< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::ComputePseudoEcgAtOneTimeStep(), PdeAndBoundaryConditions< DIM >::DestroySolution(), DistributedVectorFactory::DistributedVectorFactory(), AbstractExtendedBidomainSolver< ELEMENT_DIM, SPACE_DIM >::FinaliseLinearSystem(), AbstractBidomainSolver< ELEMENT_DIM, SPACE_DIM >::FinaliseLinearSystem(), HasParMetis(), Hdf5ToTxtConverter< ELEMENT_DIM, SPACE_DIM >::Hdf5ToTxtConverter(), Hdf5ToVtkConverter< ELEMENT_DIM, SPACE_DIM >::Hdf5ToVtkConverter(), AbstractCardiacProblem< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::Initialise(), AbstractLinearPdeSolver< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::InitialiseForSolve(), AdaptiveBidomainProblem::InitializeSolutionOnAdaptedMesh(), ExtendedBidomainProblem< DIM >::load(), AbstractCardiacProblem< DIM, DIM, 1 >::load(), PCBlockDiagonal::PCBlockDiagonalCreate(), PCLDUFactorisation::PCLDUFactorisationCreate(), PCTwoLevelsBlockDiagonal::PCTwoLevelsBlockDiagonalCreate(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::PetscMatrixPartitioning(), Hdf5DataWriter::PutStripedVector(), Hdf5DataWriter::PutVector(), ParallelColumnDataWriter::PutVectorStripe(), ReadPetscObject(), ReplicatableVector::RemovePetscContext(), ReplicatableVector::Replicate(), ExtendedBidomainProblem< DIM >::save(), PetscVecTools::SetupInterleavedVectorScatterGather(), AbstractDynamicLinearPdeSolver< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::Solve(), SimplePetscNonlinearSolver::Solve(), SimpleNewtonNonlinearSolver::Solve(), LinearSystem::Solve(), CardiacElectroMechanicsProblem< DIM, ELEC_PROB_DIM 
>::Solve(), AdaptiveBidomainProblem::Solve(), AbstractCardiacProblem< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::Solve(), StokesFlowSolver< DIM >::Solve(), LinearParabolicPdeSystemWithCoupledOdeSystemSolver< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::SolveAndWriteResultsToFile(), OdeLinearSystemSolver::SolveOneTimeStep(), CellBasedPdeHandler< DIM >::SolvePdeAndWriteResultsToFile(), AbstractNonlinearElasticitySolver< DIM >::SolveSnes(), AbstractNonlinearElasticitySolver< DIM >::TakeNewtonStep(), NodeBasedCellPopulationWithBuskeUpdate< DIM >::UpdateNodeLocations(), AbstractNonlinearAssemblerSolverHybrid< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::VerifyJacobian(), VoltageInterpolaterOntoMechanicsMesh< DIM >::VoltageInterpolaterOntoMechanicsMesh(), Hdf5ToMeshalyzerConverter< ELEMENT_DIM, SPACE_DIM >::Write(), Hdf5ToCmguiConverter< ELEMENT_DIM, SPACE_DIM >::Write(), AbstractCardiacProblem< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::WriteExtraVariablesOneStep(), ExtendedBidomainProblem< DIM >::WriteOneStep(), AbstractCardiacProblem< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::~AbstractCardiacProblem(), AbstractContinuumMechanicsSolver< DIM >::~AbstractContinuumMechanicsSolver(), ExtendedBidomainSolver< ELEM_DIM, SPACE_DIM >::~ExtendedBidomainSolver(), Hdf5DataWriter::~Hdf5DataWriter(), LinearSystem::~LinearSystem(), MonodomainSolver< ELEMENT_DIM, SPACE_DIM >::~MonodomainSolver(), OdeLinearSystemSolver::~OdeLinearSystemSolver(), OperatorSplittingMonodomainSolver< ELEMENT_DIM, SPACE_DIM >::~OperatorSplittingMonodomainSolver(), ParallelColumnDataWriter::~ParallelColumnDataWriter(), and PCTwoLevelsBlockDiagonal::~PCTwoLevelsBlockDiagonal().

void PetscTools::DumpPetscObject ( const Mat &  rMat, const std::string &  rOutputFileFullPath ) [static]

Dumps a given PETSc object to disk.

Parameters:
rMat  a matrix
rOutputFileFullPath  where to dump the matrix to disk

Definition at line 299 of file PetscTools.cpp.

References PETSC_DESTROY_PARAM.

void PetscTools::DumpPetscObject ( const Vec &  rVec, const std::string &  rOutputFileFullPath ) [static]

Dumps a given PETSc object to disk.

Parameters:
rVec  a vector
rOutputFileFullPath  where to dump the vector to disk

Definition at line 313 of file PetscTools.cpp.

References PETSC_DESTROY_PARAM.

unsigned PetscTools::GetMyRank ( ) [static]

Return our rank.

If PETSc has not been initialized, returns 0.

Definition at line 99 of file PetscTools.cpp.

References CheckCache(), and mRank.

Referenced by VtkMeshWriter< ELEMENT_DIM, SPACE_DIM >::AddPointData(), ArchiveOpener< Archive, Stream >::ArchiveOpener(), IncompressibleNonlinearElasticitySolver< DIM >::AssembleSystem(), CompressibleNonlinearElasticitySolver< DIM >::AssembleSystem(), Barrier(), BeginRoundRobin(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ConstructCuboid(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ConstructFromMeshReader(), AbstractContinuumMechanicsAssembler< DIM, CAN_ASSEMBLE_VECTOR, CAN_ASSEMBLE_MATRIX >::DoAssemble(), EndRoundRobin(), FormDebugHead(), MixedDimensionMesh< ELEMENT_DIM, SPACE_DIM >::GetCableElement(), AbstractCardiacTissue< ELEMENT_DIM, SPACE_DIM >::GetCardiacCellOrHaloCell(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::GetNodeOrHaloNode(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::MetisLibraryPartitioning(), CardiacSimulationArchiver< PROBLEM_CLASS >::Migrate(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ParMetisLibraryNodeAndElementPartitioning(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::PetscMatrixPartitioning(), ExecutableSupport::PrintError(), TetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ReadNodesPerProcessorFile(), GenericEventHandler< 1, EventHandler >::ReportImpl(), ExecutableSupport::ShowParallelLaunching(), StokesFlowSolver< DIM >::Solve(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::SolveBoundaryElementMapping(), AbstractCardiacTissue< ELEMENT_DIM, SPACE_DIM >::SolveCellSystems(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::SolveElementMapping(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::SolveNodeMapping(), AbstractNonlinearElasticitySolver< DIM >::TakeNewtonStep(), DistanceMapCalculator< ELEMENT_DIM, SPACE_DIM >::UpdateQueueFromRemote(), VtkMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingMesh(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingParallelMesh(), ExecutableSupport::WriteMachineInfoFile(), 
AdaptiveTetrahedralMesh::WriteMeshToDistributedFile(), and ExecutableSupport::WriteProvenanceInfoFile().

unsigned PetscTools::GetNumProcs ( ) [static]

Returns total number of processors.

Definition at line 93 of file PetscTools.cpp.

References CheckCache(), and mNumProcessors.

Referenced by VtkMeshWriter< ELEMENT_DIM, SPACE_DIM >::AddPointData(), AbstractCardiacTissue< ELEMENT_DIM, SPACE_DIM >::CalculateHaloNodesFromNodeExchange(), AbstractTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::CalculateNodeExchange(), DistanceMapCalculator< ELEMENT_DIM, SPACE_DIM >::ComputeDistanceMap(), DistanceMapCalculator< ELEMENT_DIM, SPACE_DIM >::DistanceMapCalculator(), EndRoundRobin(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::MetisLibraryPartitioning(), CardiacSimulationArchiver< PROBLEM_CLASS >::Migrate(), DistributedTetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ParMetisLibraryNodeAndElementPartitioning(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::PetscMatrixPartitioning(), TetrahedralMesh< ELEMENT_DIM, SPACE_DIM >::ReadNodesPerProcessorFile(), GenericEventHandler< 1, EventHandler >::ReportImpl(), DistributedVectorFactory::rGetGlobalLows(), CardiacSimulationArchiver< PROBLEM_CLASS >::Save(), AbstractMesh< ELEMENT_DIM, SPACE_DIM >::SetDistributedVectorFactory(), ExecutableSupport::ShowParallelLaunching(), AbstractCardiacTissue< ELEMENT_DIM, SPACE_DIM >::SolveCellSystems(), DistanceMapCalculator< ELEMENT_DIM, SPACE_DIM >::UpdateQueueFromRemote(), DistanceMapCalculator< ELEMENT_DIM, SPACE_DIM >::WorkOnLocalQueue(), VtkMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingMesh(), AbstractTetrahedralMeshWriter< ELEMENT_DIM, SPACE_DIM >::WriteFilesUsingParallelMesh(), ExecutableSupport::WriteMachineInfoFile(), and AdaptiveTetrahedralMesh::WriteMeshToDistributedFile().

bool PetscTools::HasParMetis ( ) [static]

Checks if PETSc has been configured with ParMetis partitioning support.

Returns:
true if ParMetis partitioning is available

Definition at line 411 of file PetscTools.cpp.

References Destroy(), PETSC_DESTROY_PARAM, and SetupMat().

Referenced by NodePartitioner< ELEMENT_DIM, SPACE_DIM >::PetscMatrixPartitioning().

void PetscTools::IsolateProcesses ( bool  isolate = true) [static]

Where work can be split between isolated processes, it would be nice to be able to do so easily without worrying about collective calls made inside classes such as OutputFileHandler leading to deadlock. This method attempts to enable this behaviour. If the flag is set then AmMaster always returns true, Barrier becomes a no-op, and ReplicateBool doesn't replicate.

Parameters:
isolate  whether to consider processes as isolated

Definition at line 153 of file PetscTools.cpp.

References mIsolateProcesses.
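
A hedged sketch of splitting embarrassingly parallel work across processes using this method (the loop body is a hypothetical stand-in):

```cpp
#include "PetscTools.hpp"

void RunIndependentSimulations(unsigned numSimulations)
{
    // Pretend each process is its own master: AmMaster() returns true,
    // Barrier() becomes a no-op and ReplicateBool() doesn't replicate.
    PetscTools::IsolateProcesses(true);

    for (unsigned i = PetscTools::GetMyRank(); i < numSimulations; i += PetscTools::GetNumProcs())
    {
        // run simulation i; file-handling classes won't deadlock on collective calls
    }

    PetscTools::IsolateProcesses(false); // restore normal collective behaviour
}
```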

void PetscTools::ReadPetscObject ( Mat &  rMat, const std::string &  rOutputFileFullPath, Vec  rParallelLayout = NULL ) [static]

Read a previously dumped PETSc object from disk.

Parameters:
rMat  a matrix
rOutputFileFullPath  where to read the matrix from
rParallelLayout  if provided, rMat will have the same parallel layout. Its content is irrelevant.

Todo:
: #1082 work out appropriate nz allocation.

Definition at line 327 of file PetscTools.cpp.

References Destroy(), PETSC_DESTROY_PARAM, and SetupMat().

void PetscTools::ReadPetscObject ( Vec &  rVec, const std::string &  rOutputFileFullPath, Vec  rParallelLayout = NULL ) [static]

Read a previously dumped PETSc object from disk.

Parameters:
rVec  a vector
rOutputFileFullPath  where to read the vector from
rParallelLayout  if provided, rVec will have the same parallel layout. Its content is irrelevant.

Definition at line 378 of file PetscTools.cpp.

References PETSC_DESTROY_PARAM.
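
A hedged round-trip sketch using the dump/read pair above (the file path is illustrative):

```cpp
#include "PetscTools.hpp"

void CheckpointVector(Vec vec)
{
    // Write the distributed vector to disk...
    PetscTools::DumpPetscObject(vec, "output/saved_vector.dat");

    // ...and later read it back, letting PETSc choose the parallel layout
    Vec restored;
    PetscTools::ReadPetscObject(restored, "output/saved_vector.dat");

    PetscTools::Destroy(restored);
}
```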

void PetscTools::ReplicateException ( bool  flag) [static]

void PetscTools::ResetCache ( ) [static]

Reset our cached values: whether PETSc is initialised, how many processors there are, and which one we are.

Definition at line 54 of file PetscTools.cpp.

References mNumProcessors, mPetscIsInitialised, and mRank.

Referenced by CheckCache().

void PetscTools::SetupMat ( Mat &  rMat, int  numRows, int  numColumns, unsigned  rowPreallocation, int  numLocalRows = PETSC_DECIDE, int  numLocalColumns = PETSC_DECIDE, bool  ignoreOffProcEntries = true ) [static]

Set up a matrix - set the size using the given parameters. The number of local rows and columns is by default PETSC_DECIDE. SetFromOptions is called.

Parameters:
rMat  the matrix
numRows  the number of rows in the matrix
numColumns  the number of columns in the matrix
rowPreallocation  the maximum number of nonzero entries expected on a row. A value of 0 is allowed: no preallocation is then done and the user must preallocate the memory for the matrix themselves.
numLocalRows  the number of local rows (defaults to PETSC_DECIDE)
numLocalColumns  the number of local columns (defaults to PETSC_DECIDE)
ignoreOffProcEntries  tells PETSc to drop off-processor entries

Todo:
#1216 Fix the 0.7 magic number

Definition at line 240 of file PetscTools.cpp.

References IsSequential().

Referenced by AbstractContinuumMechanicsSolver< DIM >::AllocateMatrixMemory(), Hdf5DataWriter::ApplyPermutation(), Hdf5DataWriter::DefineFixedDimensionUsingMatrix(), HasParMetis(), OperatorSplittingMonodomainSolver< ELEMENT_DIM, SPACE_DIM >::InitialiseForSolve(), MonodomainSolver< ELEMENT_DIM, SPACE_DIM >::InitialiseForSolve(), ExtendedBidomainSolver< ELEM_DIM, SPACE_DIM >::InitialiseForSolve(), BidomainSolver< ELEMENT_DIM, SPACE_DIM >::InitialiseForSolve(), LinearSystem::LinearSystem(), NodePartitioner< ELEMENT_DIM, SPACE_DIM >::PetscMatrixPartitioning(), ReadPetscObject(), LinearSystem::SetPrecondMatrixIsDifferentFromLhs(), SimplePetscNonlinearSolver::Solve(), and AbstractNonlinearAssemblerSolverHybrid< ELEMENT_DIM, SPACE_DIM, PROBLEM_DIM >::VerifyJacobian().
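
A hedged sketch of setting up and releasing a small matrix (assumes PETSc is initialised; the sizes and preallocation value are illustrative):

```cpp
#include "PetscTools.hpp"

void BuildMatrix()
{
    Mat mat;
    // 100x100 matrix, at most 9 nonzeros preallocated per row,
    // local row/column counts left to PETSC_DECIDE
    PetscTools::SetupMat(mat, 100, 100, 9u);

    // ... assemble and use the matrix ...

    PetscTools::Destroy(mat);
}
```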


Member Data Documentation

const unsigned PetscTools::MASTER_RANK = 0 [static]

As a convention, we consider processor 0 the master process.

Definition at line 121 of file PetscTools.hpp.

Referenced by AmMaster().

bool PetscTools::mIsolateProcesses = false [static, private]

Whether to pretend that we're just running many master processes independently.

Definition at line 105 of file PetscTools.hpp.

Referenced by AmMaster(), Barrier(), IsolateProcesses(), and ReplicateBool().

unsigned PetscTools::mNumProcessors = 0 [static, private]

The total number of processors.

Definition at line 94 of file PetscTools.hpp.

Referenced by AmTopMost(), CheckCache(), GetNumProcs(), IsParallel(), IsSequential(), and ResetCache().

bool PetscTools::mPetscIsInitialised = false [static, private]

Whether PETSc has been initialised.

Definition at line 91 of file PetscTools.hpp.

Referenced by Barrier(), ReplicateBool(), and ResetCache().

unsigned PetscTools::mRank = 0 [static, private]

Which processor we are.

Definition at line 102 of file PetscTools.hpp.

Referenced by AmMaster(), AmTopMost(), GetMyRank(), and ResetCache().


The documentation for this class was generated from the following files:

PetscTools.hpp
PetscTools.cpp