PETSc comes with a large number of example codes to illustrate usage. The three main sets used below are in src/ts/examples/tutorials, src/snes/examples/tutorials, and src/ksp/ksp/examples/tutorials.
WHAT THIS EXAMPLE DEMONSTRATES: setting solver options on the command line, monitoring the time-stepping (TS) solve and the nonlinear (SNES) and linear (KSP) solves inside it, running in parallel with mpiexec, and changing the grid resolution with -da_grid_x.
FURTHER DETAILS: see the source code in src/ts/examples/tutorials/ex2.c.
DO THE FOLLOWING:
cd petsc/src/ts/examples/tutorials
make ex2
mpiexec -n 1 ./ex2 -ts_max_steps 10 -ts_monitor [Expected output]
mpiexec -n 4 ./ex2 -ts_max_steps 10 -ts_monitor -snes_monitor -ksp_monitor [Expected output]
mpiexec -n 16 ./ex2 -ts_max_steps 10 -ts_monitor -da_grid_x 64 [Expected output]
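These options take effect because ex2, like most PETSc programs, calls TSSetFromOptions() before TSSolve(). A minimal sketch of that pattern, assuming only standard PETSc calls (this is a toy driver for du/dt = -u, not the actual ex2 source):

#include <petscts.h>

/* Toy right-hand side F(t,u) = -u; ex2 itself solves a nonlinear PDE. */
static PetscErrorCode RHSFunction(TS ts, PetscReal t, Vec U, Vec F, void *ctx)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = VecCopy(U, F);CHKERRQ(ierr);
  ierr = VecScale(F, -1.0);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  TS             ts;
  Vec            u;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = VecCreate(PETSC_COMM_WORLD, &u);CHKERRQ(ierr);
  ierr = VecSetSizes(u, PETSC_DECIDE, 16);CHKERRQ(ierr);
  ierr = VecSetFromOptions(u);CHKERRQ(ierr);
  ierr = VecSet(u, 1.0);CHKERRQ(ierr);                  /* initial condition u(0) = 1 */

  ierr = TSCreate(PETSC_COMM_WORLD, &ts);CHKERRQ(ierr);
  ierr = TSSetRHSFunction(ts, NULL, RHSFunction, NULL);CHKERRQ(ierr);
  ierr = TSSetFromOptions(ts);CHKERRQ(ierr);            /* picks up -ts_max_steps, -ts_monitor, ... */
  ierr = TSSolve(ts, u);CHKERRQ(ierr);

  ierr = TSDestroy(&ts);CHKERRQ(ierr);
  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}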
WHAT THIS EXAMPLE DEMONSTRATES: solving a nonlinear problem with SNES on a structured grid refined with -da_refine, switching the preconditioner between the default, geometric multigrid (-pc_type mg), and hypre (-pc_type hypre), and timing the solves with -log_summary.
FURTHER DETAILS: see the source code in src/snes/examples/tutorials/ex19.c.
DO THE FOLLOWING:
cd petsc/src/snes/examples/tutorials
make ex19
mpiexec -n 16 ./ex19 -da_refine 6 -snes_monitor -ksp_monitor -snes_view [Expected output]
mpiexec -n 16 ./ex19 -da_refine 6 -snes_monitor -ksp_monitor -snes_view -pc_type mg [Expected output] Note that this requires many fewer iterations than the default solver.
mpiexec -n 16 ./ex19 -da_refine 6 -snes_monitor -ksp_monitor -snes_view -pc_type hypre [Expected output] Note that this requires many fewer iterations than the default solver, but more than geometric multigrid.
mpiexec -n 4 ./ex19 -da_refine 6 -log_summary [Expected output] Search for the line beginning with SNESSolve; the fourth column gives the time for the nonlinear solve.
mpiexec -n 4 ./ex19 -da_refine 6 -log_summary -pc_type mg [Expected output] Compare the runtime for SNESSolve to the case with the default solver.
mpiexec -n 16 ./ex19 -da_refine 6 -log_summary [Expected output] Compare the runtime for SNESSolve to the 4-processor case with the default solver. What is the speedup?
mpiexec -n 16 ./ex19 -da_refine 6 -log_summary -pc_type mg [Expected output] Compare the runtime for SNESSolve to the 4-processor case with multigrid. What is the speedup? Why is the speedup for multigrid lower than for the default solver?
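The -pc_type option above changes the preconditioner used by the linear solve inside each Newton step; ex19 picks it up through SNESSetFromOptions(). A minimal sketch of the programmatic equivalent, assuming an already-created SNES (the helper name is illustrative, not part of ex19):

#include <petscsnes.h>

/* Switch an existing SNES's preconditioner to geometric multigrid in code,
   matching -pc_type mg above; PCHYPRE here would match -pc_type hypre.     */
static PetscErrorCode UseMultigridPC(SNES snes)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = SNESGetKSP(snes, &ksp);CHKERRQ(ierr);  /* linear solver used in each Newton step */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCMG);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}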
WHAT THIS EXAMPLE DEMONSTRATES: solving a Stokes-type linear system with KSP and accelerating convergence with a fieldsplit preconditioner using a Schur-complement factorization.
FURTHER DETAILS: see the source code in src/ksp/ksp/examples/tutorials/ex42.c.
DO THE FOLLOWING:
cd petsc/src/ksp/ksp/examples/tutorials
make ex42
mpiexec -n 4 ./ex42 -stokes_ksp_monitor [Expected output] Note the poor convergence even for a very small problem.
mpiexec -n 4 ./ex42 -stokes_ksp_monitor -stokes_pc_type fieldsplit -stokes_pc_fieldsplit_type schur [Expected output]
mpiexec -n 16 ./ex42 -mx 40 -stokes_ksp_monitor -stokes_pc_type fieldsplit -stokes_pc_fieldsplit_type schur [Expected output]
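The stokes_ prefix on these options exists because ex42 gives its outer KSP an options prefix; -stokes_pc_type fieldsplit together with -stokes_pc_fieldsplit_type schur then selects a block preconditioner with a Schur-complement factorization. A minimal sketch of the equivalent calls, assuming an already-created KSP (the helper name is illustrative, not taken from ex42):

#include <petscksp.h>

/* Configure a KSP the way the -stokes_pc_type fieldsplit
   -stokes_pc_fieldsplit_type schur options do in the runs above. */
static PetscErrorCode UseSchurFieldSplit(KSP ksp)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPSetOptionsPrefix(ksp, "stokes_");CHKERRQ(ierr);          /* options for this solve start with -stokes_ */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);                  /* split the problem into fields (velocity/pressure) */
  ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);CHKERRQ(ierr);  /* Schur-complement factorization of the block system */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);                       /* command-line options can still override these */
  PetscFunctionReturn(0);
}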
WHAT THIS EXAMPLE DEMONSTRATES: running a finite-volume TS solver on unstructured Exodus meshes, changing the time integrator with -ts_type, and selecting the physics, monitored quantities, reconstruction, and limiter from the command line.
FURTHER DETAILS: see the source code in src/ts/examples/tutorials/ex11.c.
DO THE FOLLOWING:
cd petsc/src/ts/examples/tutorials
make ex11
mpiexec -n 1 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside.exo [Expected output]
mpiexec -n 1 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside.exo -ts_type rosw [Expected output]
mpiexec -n 16 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/annulus-20.exo -monitor Error -advect_sol_type bump -ufv_reconstruct -ufv_limit none [Expected output] Compare to the error obtained after turning off reconstruction.
mpiexec -n 4 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/annulus-20.exo -physics sw -monitor Height,Energy -ufv_reconstruct -ufv_limit minmod [Expected output]
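The second run above replaces the default integrator with a Rosenbrock-W method via -ts_type rosw; the in-code equivalent is TSSetType(). A minimal sketch, assuming an already-created TS (the helper name is illustrative, not part of ex11):

#include <petscts.h>

/* Select the Rosenbrock-W integrator in code, equivalent to -ts_type rosw above. */
static PetscErrorCode UseRosenbrockW(TS ts)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = TSSetType(ts, TSROSW);CHKERRQ(ierr);  /* linearly implicit Rosenbrock-W method */
  ierr = TSSetFromOptions(ts);CHKERRQ(ierr);   /* -ts_type on the command line still takes precedence */
  PetscFunctionReturn(0);
}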