PETSc Hands On
PETSc comes with a large number of example codes that illustrate its usage. Here we highlight a few key ones:
Linear Poisson equation on a 2D grid
an example of a linear equation problem; see also src/ksp/ksp/tutorials
Nonlinear ODE arising from a time-dependent one-dimensional PDE
an example of a time-stepping problem; see also src/ts/tutorials
Nonlinear PDE on a structured grid
an example of a nonlinear PDE; see also src/snes/tutorials
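All of these tutorials share the same basic program skeleton: initialize PETSc, read options from the command line, build and configure solver objects, and finalize. A minimal sketch of that skeleton (illustrative only, not one of the tutorial sources; it assumes a recent PETSc that provides the PetscCall() error-checking macro):

```c
#include <petsc.h>

int main(int argc, char **argv)
{
  PetscInt n = 10; /* overridable at run time with -n <value> */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(PetscOptionsGetInt(NULL, NULL, "-n", &n, NULL));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "running with n = %" PetscInt_FMT "\n", n));
  /* ... create Mat/Vec/KSP/SNES/TS objects, call XXXSetFromOptions(),
     solve, and destroy the objects ... */
  PetscCall(PetscFinalize());
  return 0;
}
```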
Several examples are also included that demonstrate interoperability with other numerical software packages in the xSDK Toolkit. These packages can be installed automatically by PETSc by configuring with --download-trilinos, --download-hypre, and/or --download-superlu_dist.
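For example, to build PETSc with hypre and SuperLU_DIST support, run configure from the top-level PETSc directory (together with whatever compiler and MPI options your installation needs):

$ ./configure --download-hypre --download-superlu_dist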
Nonlinear PDE example using BoomerAMG from hypre
Linear equation example using the direct solver SuperLU_DIST
Example 1: Linear Poisson equation on a 2D grid
WHAT THIS EXAMPLE DEMONSTRATES:
Using command line options
Using Linear Solvers
Handling a simple structured grid
DO THE FOLLOWING:
Compile
src/ksp/ksp/tutorials/ex50.c
$ cd petsc/src/ksp/ksp/tutorials
$ make ex50
Run a 1 processor example with a 4x4 grid (a 3x3 mesh of cells) and view the assembled matrix
$ mpiexec -n 1 ./ex50 -da_grid_x 4 -da_grid_y 4 -mat_view
Expected output:
Mat Object: 1 MPI processes
  type: seqaij
row 0: (0, 0.) (1, 0.) (4, 0.)
row 1: (0, 0.) (1, 0.) (2, 0.) (5, 0.)
row 2: (1, 0.) (2, 0.) (3, 0.) (6, 0.)
row 3: (2, 0.) (3, 0.) (7, 0.)
row 4: (0, 0.) (4, 0.) (5, 0.) (8, 0.)
row 5: (1, 0.) (4, 0.) (5, 0.) (6, 0.) (9, 0.)
row 6: (2, 0.) (5, 0.) (6, 0.) (7, 0.) (10, 0.)
row 7: (3, 0.) (6, 0.) (7, 0.) (11, 0.)
row 8: (4, 0.) (8, 0.) (9, 0.) (12, 0.)
row 9: (5, 0.) (8, 0.) (9, 0.) (10, 0.) (13, 0.)
row 10: (6, 0.) (9, 0.) (10, 0.) (11, 0.) (14, 0.)
row 11: (7, 0.) (10, 0.) (11, 0.) (15, 0.)
row 12: (8, 0.) (12, 0.) (13, 0.)
row 13: (9, 0.) (12, 0.) (13, 0.) (14, 0.)
row 14: (10, 0.) (13, 0.) (14, 0.) (15, 0.)
row 15: (11, 0.) (14, 0.) (15, 0.)
Mat Object: 1 MPI processes
  type: seqaij
row 0: (0, 2.) (1, -1.) (4, -1.)
row 1: (0, -1.) (1, 3.) (2, -1.) (5, -1.)
row 2: (1, -1.) (2, 3.) (3, -1.) (6, -1.)
row 3: (2, -1.) (3, 2.) (7, -1.)
row 4: (0, -1.) (4, 3.) (5, -1.) (8, -1.)
row 5: (1, -1.) (4, -1.) (5, 4.) (6, -1.) (9, -1.)
row 6: (2, -1.) (5, -1.) (6, 4.) (7, -1.) (10, -1.)
row 7: (3, -1.) (6, -1.) (7, 3.) (11, -1.)
row 8: (4, -1.) (8, 3.) (9, -1.) (12, -1.)
row 9: (5, -1.) (8, -1.) (9, 4.) (10, -1.) (13, -1.)
row 10: (6, -1.) (9, -1.) (10, 4.) (11, -1.) (14, -1.)
row 11: (7, -1.) (10, -1.) (11, 3.) (15, -1.)
row 12: (8, -1.) (12, 2.) (13, -1.)
row 13: (9, -1.) (12, -1.) (13, 3.) (14, -1.)
row 14: (10, -1.) (13, -1.) (14, 3.) (15, -1.)
row 15: (11, -1.) (14, -1.) (15, 2.)
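The first matrix shown is the operator's nonzero pattern before values are inserted; the second is the assembled 5-point Laplacian, whose diagonal entries of 2, 3, and 4 correspond to corner, edge, and interior grid points. The pattern ex50 follows is to attach a DMDA to the KSP and register call-backs that PETSc invokes to build the right-hand side and the operator. The sketch below shows that pattern only and is not the ex50 source: ComputeRHS and ComputeMatrix are illustrative names, the boundary rows are a simplified Dirichlet variant (the real ex50 solves a Neumann problem, which is why later output reports an attached null space), and a recent PETSc (3.19 or later, for PETSC_SUCCESS) is assumed.

```c
#include <petscdmda.h>
#include <petscksp.h>

/* Build a 5-point Laplacian with simple identity rows on the boundary. */
static PetscErrorCode ComputeMatrix(KSP ksp, Mat A, Mat P, void *ctx)
{
  DM          da;
  PetscInt    i, j, mx, my, xs, ys, xm, ym;
  MatStencil  row, col[5];
  PetscScalar v[5];

  PetscFunctionBeginUser;
  PetscCall(KSPGetDM(ksp, &da));
  PetscCall(DMDAGetInfo(da, NULL, &mx, &my, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL));
  PetscCall(DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL));
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      row.i = i; row.j = j;
      if (i == 0 || j == 0 || i == mx - 1 || j == my - 1) {
        v[0] = 1.0; /* boundary row: identity (simplification) */
        PetscCall(MatSetValuesStencil(P, 1, &row, 1, &row, v, INSERT_VALUES));
      } else {
        v[0] = -1.0; col[0].i = i;     col[0].j = j - 1;
        v[1] = -1.0; col[1].i = i - 1; col[1].j = j;
        v[2] =  4.0; col[2].i = i;     col[2].j = j;
        v[3] = -1.0; col[3].i = i + 1; col[3].j = j;
        v[4] = -1.0; col[4].i = i;     col[4].j = j + 1;
        PetscCall(MatSetValuesStencil(P, 1, &row, 5, col, v, INSERT_VALUES));
      }
    }
  }
  PetscCall(MatAssemblyBegin(P, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(P, MAT_FINAL_ASSEMBLY));
  PetscFunctionReturn(PETSC_SUCCESS);
}

static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
{
  PetscFunctionBeginUser;
  PetscCall(VecSet(b, 1.0)); /* placeholder forcing term */
  PetscFunctionReturn(PETSC_SUCCESS);
}

int main(int argc, char **argv)
{
  DM  da;
  KSP ksp;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_STAR,
                         4, 4, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, &da));
  PetscCall(DMSetFromOptions(da)); /* -da_grid_x / -da_grid_y are picked up here */
  PetscCall(DMSetUp(da));
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetDM(ksp, da));
  PetscCall(KSPSetComputeRHS(ksp, ComputeRHS, NULL));
  PetscCall(KSPSetComputeOperators(ksp, ComputeMatrix, NULL));
  PetscCall(KSPSetFromOptions(ksp)); /* -ksp_monitor, -pc_type, -mat_view, ... */
  PetscCall(KSPSolve(ksp, NULL, NULL)); /* vectors come from the attached DM */
  PetscCall(KSPDestroy(&ksp));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```

Because the solver consults the attached DMDA, options such as -da_grid_x and -pc_type mg work on this sketch unchanged.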
Run with a 120x120 mesh on 4 processors, using SuperLU_DIST as a parallel direct solver, and view the solver options used (this requires PETSc configured with --download-superlu_dist)
$ mpiexec -n 4 ./ex50 -da_grid_x 120 -da_grid_y 120 -pc_type lu -pc_factor_mat_solver_type superlu_dist -ksp_monitor -ksp_view
Expected output:
0 KSP Residual norm 3.039809126331e+00
1 KSP Residual norm 2.395703277441e-14
KSP Object: 4 MPI processes
  type: gmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 4 MPI processes
  type: lu
    out-of-place factorization
    tolerance for zero pivot 2.22045e-14
    matrix ordering: external
    factor fill ratio given 0., needed 0.
      Factored matrix follows:
        Mat Object: 4 MPI processes
          type: superlu_dist
          rows=14400, cols=14400
          package used to perform factorization: superlu_dist
          total: nonzeros=0, allocated nonzeros=0
          SuperLU_DIST run parameters:
            Process grid nprow 2 x npcol 2
            Equilibrate matrix TRUE
            Replace tiny pivots FALSE
            Use iterative refinement FALSE
            Processors in row 2 col partition 2
            Row permutation LargeDiag_MC64
            Column permutation METIS_AT_PLUS_A
            Parallel symbolic factorization FALSE
            Repeated factorization SamePattern
  linear system matrix = precond matrix:
  Mat Object: 4 MPI processes
    type: mpiaij
    rows=14400, cols=14400
    total: nonzeros=71520, allocated nonzeros=71520
    total number of mallocs used during MatSetValues calls=0
    has attached null space
Run with a 1025x1025 grid on 4 processors using the geometric multigrid solver with 9 multigrid levels; since 1025 = 2^10 + 1, the grid can be coarsened evenly through all the levels
$ mpiexec -n 4 ./ex50 -da_grid_x 1025 -da_grid_y 1025 -pc_type mg -pc_mg_levels 9 -ksp_monitor
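The multigrid cycle can be tuned further from the command line; for example, a full multigrid cycle can be used instead of the default V-cycle (these are standard PCMG options):

$ mpiexec -n 4 ./ex50 -da_grid_x 1025 -da_grid_y 1025 -pc_type mg -pc_mg_levels 9 -pc_mg_type full -ksp_monitor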
Example 2: Nonlinear ODE arising from a time-dependent one-dimensional PDE
WHAT THIS EXAMPLE DEMONSTRATES:
Using command line options
Handling a simple structured grid
Using the ODE integrator
Using call-back functions (see the sketch below)
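The ODE integrator drives user-supplied call-backs of exactly this kind. A minimal sketch of the pattern (illustrative only, not the actual ex2 source; it integrates du/dt = -u from u(0) = 1 with the default integrator, and assumes a recent PETSc):

```c
#include <petscts.h>

/* Right-hand-side call-back: F(t,u) = -u */
static PetscErrorCode RHSFunction(TS ts, PetscReal t, Vec u, Vec F, void *ctx)
{
  PetscFunctionBeginUser;
  PetscCall(VecCopy(u, F));
  PetscCall(VecScale(F, -1.0));
  PetscFunctionReturn(PETSC_SUCCESS);
}

int main(int argc, char **argv)
{
  TS  ts;
  Vec u;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 8, &u));
  PetscCall(VecSet(u, 1.0)); /* initial condition */
  PetscCall(TSCreate(PETSC_COMM_WORLD, &ts));
  PetscCall(TSSetProblemType(ts, TS_NONLINEAR));
  PetscCall(TSSetRHSFunction(ts, NULL, RHSFunction, NULL));
  PetscCall(TSSetTimeStep(ts, 0.01));
  PetscCall(TSSetMaxSteps(ts, 10));
  PetscCall(TSSetFromOptions(ts)); /* -ts_type, -ts_monitor, -ts_max_steps, ... */
  PetscCall(TSSolve(ts, u));
  PetscCall(TSDestroy(&ts));
  PetscCall(VecDestroy(&u));
  PetscCall(PetscFinalize());
  return 0;
}
```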
DO THE FOLLOWING:
Compile
src/ts/tutorials/ex2.c
$ cd petsc/src/ts/tutorials
$ make ex2
Run a 1 processor example on the default grid with all the default solver options
$ mpiexec -n 1 ./ex2 -ts_max_steps 10 -ts_monitor
Expected output:
0 TS dt 0.00847458 time 0.
1 TS dt 0.00847458 time 0.00847458
2 TS dt 0.00847458 time 0.0169492
3 TS dt 0.00847458 time 0.0254237
4 TS dt 0.00847458 time 0.0338983
5 TS dt 0.00847458 time 0.0423729
6 TS dt 0.00847458 time 0.0508475
7 TS dt 0.00847458 time 0.059322
8 TS dt 0.00847458 time 0.0677966
9 TS dt 0.00847458 time 0.0762712
10 TS dt 0.00847458 time 0.0847458
Run with the same options on 4 processors and also monitor the convergence of the nonlinear and linear solvers
$ mpiexec -n 4 ./ex2 -ts_max_steps 10 -ts_monitor -snes_monitor -ksp_monitor
Expected output:
0 TS dt 0.00847458 time 0. 0 SNES Function norm 1.044373877325e+01 0 KSP Residual norm 6.321628301999e-02 1 KSP Residual norm 6.526906698930e-03 2 KSP Residual norm 3.007791706552e-03 3 KSP Residual norm 4.405713554522e-04 4 KSP Residual norm 1.328208424515e-04 5 KSP Residual norm 1.644811353135e-05 6 KSP Residual norm 2.166566242194e-06 7 KSP Residual norm 1.586470130595e-16 1 SNES Function norm 7.337449917266e-04 0 KSP Residual norm 3.134728198020e-06 1 KSP Residual norm 3.906901802932e-07 2 KSP Residual norm 9.017310449502e-08 3 KSP Residual norm 3.016048824996e-08 4 KSP Residual norm 5.545058115802e-09 5 KSP Residual norm 1.173289407756e-09 6 KSP Residual norm 2.688949492463e-10 7 KSP Residual norm 6.425931432155e-21 2 SNES Function norm 5.661678516636e-11 1 TS dt 0.00847458 time 0.00847458 0 SNES Function norm 1.044514631097e+01 0 KSP Residual norm 6.331279355759e-02 1 KSP Residual norm 6.535292747665e-03 2 KSP Residual norm 2.987562112557e-03 3 KSP Residual norm 4.407960732509e-04 4 KSP Residual norm 1.310867990310e-04 5 KSP Residual norm 1.643512035814e-05 6 KSP Residual norm 2.157619515591e-06 7 KSP Residual norm 9.439033917013e-17 1 SNES Function norm 7.215537347936e-04 0 KSP Residual norm 3.094217119938e-06 1 KSP Residual norm 3.853396165643e-07 2 KSP Residual norm 8.869933627178e-08 3 KSP Residual norm 2.968861004222e-08 4 KSP Residual norm 5.442261222566e-09 5 KSP Residual norm 1.152789140987e-09 6 KSP Residual norm 2.628606842614e-10 7 KSP Residual norm 3.895271394296e-21 2 SNES Function norm 5.533643492917e-11 2 TS dt 0.00847458 time 0.0169492 0 SNES Function norm 1.044653068944e+01 0 KSP Residual norm 6.340852763106e-02 1 KSP Residual norm 6.543595844294e-03 2 KSP Residual norm 2.967488018227e-03 3 KSP Residual norm 4.410154015305e-04 4 KSP Residual norm 1.293935090965e-04 5 KSP Residual norm 1.642264605661e-05 6 KSP Residual norm 2.149306177473e-06 7 KSP Residual norm 1.303479711243e-16 1 SNES Function norm 7.096638555936e-04 0 KSP Residual norm 3.054510440042e-06 1 KSP Residual norm 3.801000498387e-07 2 KSP Residual norm 8.725839831972e-08 3 KSP Residual norm 2.922750507411e-08 4 KSP Residual norm 5.341779435787e-09 5 KSP Residual norm 1.132841361166e-09 6 KSP Residual norm 2.569880731718e-10 7 KSP Residual norm 5.983404585705e-21 2 SNES Function norm 5.236235205743e-11 3 TS dt 0.00847458 time 0.0254237 0 SNES Function norm 1.044789247558e+01 0 KSP Residual norm 6.350349403220e-02 1 KSP Residual norm 6.551817537718e-03 2 KSP Residual norm 2.947571742346e-03 3 KSP Residual norm 4.412293238928e-04 4 KSP Residual norm 1.277401751314e-04 5 KSP Residual norm 1.641065933559e-05 6 KSP Residual norm 2.141569821046e-06 7 KSP Residual norm 7.468211737013e-17 1 SNES Function norm 6.980655055866e-04 0 KSP Residual norm 3.015586472511e-06 1 KSP Residual norm 3.749683834693e-07 2 KSP Residual norm 8.584934528817e-08 3 KSP Residual norm 2.877684460640e-08 4 KSP Residual norm 5.243554447127e-09 5 KSP Residual norm 1.113425160917e-09 6 KSP Residual norm 2.512722872950e-10 7 KSP Residual norm 3.506711593308e-21 2 SNES Function norm 5.113451407763e-11 4 TS dt 0.00847458 time 0.0338983 0 SNES Function norm 1.044923221792e+01 0 KSP Residual norm 6.359770143981e-02 1 KSP Residual norm 6.559959330184e-03 2 KSP Residual norm 2.927815370907e-03 3 KSP Residual norm 4.414378305489e-04 4 KSP Residual norm 1.261259895261e-04 5 KSP Residual norm 1.639913030560e-05 6 KSP Residual norm 2.134357967380e-06 7 KSP Residual norm 6.760613069100e-17 1 SNES Function norm 6.867492280866e-04 0 KSP Residual norm 
2.977424249989e-06 1 KSP Residual norm 3.699416246968e-07 2 KSP Residual norm 8.447126428579e-08 3 KSP Residual norm 2.833631246001e-08 4 KSP Residual norm 5.147530491106e-09 5 KSP Residual norm 1.094520636202e-09 6 KSP Residual norm 2.457086187479e-10 7 KSP Residual norm 7.230574224864e-21 2 SNES Function norm 4.752621462354e-11 5 TS dt 0.00847458 time 0.0423729 0 SNES Function norm 1.045055044742e+01 0 KSP Residual norm 6.369115842069e-02 1 KSP Residual norm 6.568022679059e-03 2 KSP Residual norm 2.908220767658e-03 3 KSP Residual norm 4.416409177985e-04 4 KSP Residual norm 1.245501371995e-04 5 KSP Residual norm 1.638803042750e-05 6 KSP Residual norm 2.127621812336e-06 7 KSP Residual norm 7.898425027421e-17 1 SNES Function norm 6.757059470783e-04 0 KSP Residual norm 2.940003517446e-06 1 KSP Residual norm 3.650168880980e-07 2 KSP Residual norm 8.312328182674e-08 3 KSP Residual norm 2.790560484320e-08 4 KSP Residual norm 5.053653328311e-09 5 KSP Residual norm 1.076108839046e-09 6 KSP Residual norm 2.402925346880e-10 7 KSP Residual norm 3.741394720880e-21 2 SNES Function norm 4.666130700591e-11 6 TS dt 0.00847458 time 0.0508475 0 SNES Function norm 1.045184767808e+01 0 KSP Residual norm 6.378387343082e-02 1 KSP Residual norm 6.576008998535e-03 2 KSP Residual norm 2.888789584447e-03 3 KSP Residual norm 4.418385875472e-04 4 KSP Residual norm 1.230117980237e-04 5 KSP Residual norm 1.637733246166e-05 6 KSP Residual norm 2.121315990116e-06 7 KSP Residual norm 9.102989092310e-17 1 SNES Function norm 6.649269539423e-04 0 KSP Residual norm 2.903304720284e-06 1 KSP Residual norm 3.601913872254e-07 2 KSP Residual norm 8.180453791505e-08 3 KSP Residual norm 2.748442911816e-08 4 KSP Residual norm 4.961868923662e-09 5 KSP Residual norm 1.058171714893e-09 6 KSP Residual norm 2.350195851927e-10 7 KSP Residual norm 4.964834968715e-21 2 SNES Function norm 4.471025129852e-11 7 TS dt 0.00847458 time 0.059322 0 SNES Function norm 1.045312440769e+01 0 KSP Residual norm 6.387585481648e-02 1 KSP Residual norm 6.583919661259e-03 2 KSP Residual norm 2.869523271170e-03 3 KSP Residual norm 4.420308468558e-04 4 KSP Residual norm 1.215101490543e-04 5 KSP Residual norm 1.636701041839e-05 6 KSP Residual norm 2.115398352263e-06 7 KSP Residual norm 6.417124125528e-17 1 SNES Function norm 6.544038785722e-04 0 KSP Residual norm 2.867308940893e-06 1 KSP Residual norm 3.554624318597e-07 2 KSP Residual norm 8.051422770910e-08 3 KSP Residual norm 2.707250399844e-08 4 KSP Residual norm 4.872125443546e-09 5 KSP Residual norm 1.040692056328e-09 6 KSP Residual norm 2.298855614716e-10 7 KSP Residual norm 3.140670460028e-21 2 SNES Function norm 4.058847439815e-11 8 TS dt 0.00847458 time 0.0677966 0 SNES Function norm 1.045438111843e+01 0 KSP Residual norm 6.396711081541e-02 1 KSP Residual norm 6.591755999885e-03 2 KSP Residual norm 2.850423085515e-03 3 KSP Residual norm 4.422177075225e-04 4 KSP Residual norm 1.200443665787e-04 5 KSP Residual norm 1.635703950965e-05 6 KSP Residual norm 2.109829761879e-06 7 KSP Residual norm 6.553582414264e-17 1 SNES Function norm 6.441286836585e-04 0 KSP Residual norm 2.831997900324e-06 1 KSP Residual norm 3.508274228910e-07 2 KSP Residual norm 7.925155398311e-08 3 KSP Residual norm 2.666955853517e-08 4 KSP Residual norm 4.784371891515e-09 5 KSP Residual norm 1.023653459714e-09 6 KSP Residual norm 2.248862935554e-10 7 KSP Residual norm 3.952784367220e-21 2 SNES Function norm 4.092050026541e-11 9 TS dt 0.00847458 time 0.0762712 0 SNES Function norm 1.045561827745e+01 0 KSP Residual norm 
6.405764955798e-02 1 KSP Residual norm 6.599519308575e-03 2 KSP Residual norm 2.831490102359e-03 3 KSP Residual norm 4.423991856931e-04 4 KSP Residual norm 1.186136279928e-04 5 KSP Residual norm 1.634739610156e-05 6 KSP Residual norm 2.104573899619e-06 7 KSP Residual norm 5.913882911408e-17 1 SNES Function norm 6.340936409061e-04 0 KSP Residual norm 2.797353906304e-06 1 KSP Residual norm 3.462838484911e-07 2 KSP Residual norm 7.801574775723e-08 3 KSP Residual norm 2.627533147918e-08 4 KSP Residual norm 4.698557254015e-09 5 KSP Residual norm 1.007040261358e-09 6 KSP Residual norm 2.200177803007e-10 7 KSP Residual norm 3.762004175411e-21 2 SNES Function norm 4.009143190166e-11 10 TS dt 0.00847458 time 0.0847458
Run with the same options on 16 processors with 128 grid points
$ mpiexec -n 16 ./ex2 -ts_max_steps 10 -ts_monitor -M 128
Expected output:
0 TS dt 0.00393701 time 0.
1 TS dt 0.00393701 time 0.00393701
2 TS dt 0.00393701 time 0.00787402
3 TS dt 0.00393701 time 0.011811
4 TS dt 0.00393701 time 0.015748
5 TS dt 0.00393701 time 0.019685
6 TS dt 0.00393701 time 0.023622
7 TS dt 0.00393701 time 0.0275591
8 TS dt 0.00393701 time 0.0314961
9 TS dt 0.00393701 time 0.0354331
10 TS dt 0.00393701 time 0.0393701
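The integrator itself can also be switched at run time; assuming the example calls TSSetFromOptions(), as the TS tutorials do, backward Euler can be selected with, for example:

$ mpiexec -n 1 ./ex2 -ts_max_steps 10 -ts_monitor -ts_type beuler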
Example 3: Nonlinear PDE on a structured grid
WHAT THIS EXAMPLE DEMONSTRATES:
Handling a 2D structured grid
Using the nonlinear solvers (see the sketch below)
Changing the default linear solver
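SNES follows the same call-back pattern as KSP and TS. A minimal sketch (illustrative only, not the actual ex19 source, which builds its Jacobian by finite-difference coloring on a DMDA): it solves x_i^2 = 2 componentwise by Newton's method with a hand-coded diagonal Jacobian, assuming a recent PETSc.

```c
#include <petscsnes.h>

/* Residual call-back: f_i = x_i^2 - 2 */
static PetscErrorCode FormFunction(SNES snes, Vec x, Vec f, void *ctx)
{
  PetscFunctionBeginUser;
  PetscCall(VecPointwiseMult(f, x, x));
  PetscCall(VecShift(f, -2.0));
  PetscFunctionReturn(PETSC_SUCCESS);
}

/* Jacobian call-back: J = diag(2 x_i) */
static PetscErrorCode FormJacobian(SNES snes, Vec x, Mat J, Mat P, void *ctx)
{
  const PetscScalar *xx;
  PetscInt           i, lo, hi;

  PetscFunctionBeginUser;
  PetscCall(VecGetOwnershipRange(x, &lo, &hi));
  PetscCall(VecGetArrayRead(x, &xx));
  for (i = lo; i < hi; i++) PetscCall(MatSetValue(P, i, i, 2.0 * xx[i - lo], INSERT_VALUES));
  PetscCall(VecRestoreArrayRead(x, &xx));
  PetscCall(MatAssemblyBegin(P, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(P, MAT_FINAL_ASSEMBLY));
  PetscFunctionReturn(PETSC_SUCCESS);
}

int main(int argc, char **argv)
{
  SNES snes;
  Vec  x, r;
  Mat  J;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 8, &x));
  PetscCall(VecDuplicate(x, &r));
  PetscCall(VecSet(x, 1.0)); /* initial guess */
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 8, 8, 1, NULL, 0, NULL, &J));
  PetscCall(SNESCreate(PETSC_COMM_WORLD, &snes));
  PetscCall(SNESSetFunction(snes, r, FormFunction, NULL));
  PetscCall(SNESSetJacobian(snes, J, J, FormJacobian, NULL));
  PetscCall(SNESSetFromOptions(snes)); /* -snes_monitor, -ksp_monitor, -pc_type, ... */
  PetscCall(SNESSolve(snes, NULL, x)); /* converges to x_i = sqrt(2) */
  PetscCall(SNESDestroy(&snes));
  PetscCall(MatDestroy(&J));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&r));
  PetscCall(PetscFinalize());
  return 0;
}
```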
DO THE FOLLOWING:
Compile
src/snes/tutorials/ex19.c
$ cd petsc/src/snes/tutorials
$ make ex19
Run a 4 processor example with 5 levels of grid refinement, monitor the convergence of the nonlinear and linear solvers, and examine the exact solver used
$ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. 0 SNES Function norm 1.036007954337e-02 0 KSP Residual norm 9.144944502871e-02 1 KSP Residual norm 2.593759906204e-02 2 KSP Residual norm 1.669815200495e-02 3 KSP Residual norm 1.510777951698e-02 4 KSP Residual norm 1.458401237884e-02 5 KSP Residual norm 1.418635322926e-02 6 KSP Residual norm 1.377436725003e-02 7 KSP Residual norm 1.332236907186e-02 8 KSP Residual norm 1.288602527920e-02 9 KSP Residual norm 1.240018288138e-02 10 KSP Residual norm 1.186798872492e-02 11 KSP Residual norm 1.126565820237e-02 12 KSP Residual norm 1.063916289485e-02 13 KSP Residual norm 9.975377414549e-03 14 KSP Residual norm 9.354874256053e-03 15 KSP Residual norm 8.779095086508e-03 16 KSP Residual norm 8.257220099779e-03 17 KSP Residual norm 7.721504294489e-03 18 KSP Residual norm 7.165931362294e-03 19 KSP Residual norm 6.614579158468e-03 20 KSP Residual norm 6.069852590203e-03 21 KSP Residual norm 5.532583715678e-03 22 KSP Residual norm 5.012542062575e-03 23 KSP Residual norm 4.469698743551e-03 24 KSP Residual norm 3.946112491958e-03 25 KSP Residual norm 3.431223373793e-03 26 KSP Residual norm 2.968213186086e-03 27 KSP Residual norm 2.622089571760e-03 28 KSP Residual norm 2.418543969985e-03 29 KSP Residual norm 2.310824403854e-03 30 KSP Residual norm 2.236309852146e-03 31 KSP Residual norm 2.197075836614e-03 32 KSP Residual norm 2.184744877441e-03 33 KSP Residual norm 2.184139801213e-03 34 KSP Residual norm 2.177357542486e-03 35 KSP Residual norm 2.165221418018e-03 36 KSP Residual norm 2.150741735309e-03 37 KSP Residual norm 2.135576712445e-03 38 KSP Residual norm 2.119308727352e-03 39 KSP Residual norm 2.106838711740e-03 40 KSP Residual norm 2.095968914022e-03 41 KSP Residual norm 2.087204866778e-03 42 KSP Residual norm 2.075535849832e-03 43 KSP Residual norm 2.056901852426e-03 44 KSP Residual norm 2.026534413197e-03 45 KSP Residual norm 1.986848633315e-03 46 KSP Residual norm 1.940721994942e-03 47 KSP Residual norm 1.883144499412e-03 48 KSP Residual norm 1.803002891048e-03 49 KSP Residual norm 1.710431551202e-03 50 KSP Residual norm 1.590925854814e-03 51 KSP Residual norm 1.451082323211e-03 52 KSP Residual norm 1.301330299614e-03 53 KSP Residual norm 1.182230084682e-03 54 KSP Residual norm 1.107680018677e-03 55 KSP Residual norm 1.066507531910e-03 56 KSP Residual norm 1.042227387049e-03 57 KSP Residual norm 1.018322560203e-03 58 KSP Residual norm 9.885570705392e-04 59 KSP Residual norm 9.548240372245e-04 60 KSP Residual norm 9.201938192905e-04 61 KSP Residual norm 9.067345082392e-04 62 KSP Residual norm 9.048664588409e-04 63 KSP Residual norm 9.048625048106e-04 64 KSP Residual norm 9.045786478382e-04 65 KSP Residual norm 9.025111529622e-04 66 KSP Residual norm 8.966677177120e-04 67 KSP Residual norm 8.874037824397e-04 68 KSP Residual norm 8.789188093430e-04 69 KSP Residual norm 8.717007141537e-04 70 KSP Residual norm 8.635394193756e-04 71 KSP Residual norm 8.549188412543e-04 72 KSP Residual norm 8.449007643802e-04 73 KSP Residual norm 8.306833639556e-04 74 KSP Residual norm 8.180866477839e-04 75 KSP Residual norm 8.062571853623e-04 76 KSP Residual norm 7.936883824218e-04 77 KSP Residual norm 7.780490917915e-04 78 KSP Residual norm 7.615878046973e-04 79 KSP Residual norm 7.442269316420e-04 80 KSP Residual norm 7.232115811673e-04 81 KSP Residual norm 6.988082322432e-04 82 KSP Residual norm 6.683230529966e-04 83 KSP Residual norm 6.362177125994e-04 84 KSP Residual norm 5.971822394607e-04 85 KSP Residual norm 5.563666831991e-04 86 KSP Residual norm 
5.122378485297e-04 87 KSP Residual norm 4.641141819361e-04 88 KSP Residual norm 4.127674620013e-04 89 KSP Residual norm 3.767940694930e-04 90 KSP Residual norm 3.464891353455e-04 91 KSP Residual norm 3.328326373461e-04 92 KSP Residual norm 3.152057745476e-04 93 KSP Residual norm 3.046565182399e-04 94 KSP Residual norm 3.006541516591e-04 95 KSP Residual norm 2.987345416223e-04 96 KSP Residual norm 2.969106149950e-04 97 KSP Residual norm 2.959679010652e-04 98 KSP Residual norm 2.951910867025e-04 99 KSP Residual norm 2.941123180589e-04 100 KSP Residual norm 2.926765604512e-04 101 KSP Residual norm 2.900410156058e-04 102 KSP Residual norm 2.863626926022e-04 103 KSP Residual norm 2.810952824275e-04 104 KSP Residual norm 2.755878113867e-04 105 KSP Residual norm 2.712557488209e-04 106 KSP Residual norm 2.679296794391e-04 107 KSP Residual norm 2.642026368303e-04 108 KSP Residual norm 2.602137324935e-04 109 KSP Residual norm 2.541382700234e-04 110 KSP Residual norm 2.454344737954e-04 111 KSP Residual norm 2.335628910244e-04 112 KSP Residual norm 2.168004908802e-04 113 KSP Residual norm 1.993544334742e-04 114 KSP Residual norm 1.812661079898e-04 115 KSP Residual norm 1.614155068832e-04 116 KSP Residual norm 1.450176652465e-04 117 KSP Residual norm 1.271814682476e-04 118 KSP Residual norm 1.114893752683e-04 119 KSP Residual norm 1.016403410116e-04 120 KSP Residual norm 9.221659791770e-05 121 KSP Residual norm 8.791115165789e-05 122 KSP Residual norm 8.529594818471e-05 123 KSP Residual norm 8.439823024838e-05 124 KSP Residual norm 8.432330233590e-05 125 KSP Residual norm 8.432313969703e-05 126 KSP Residual norm 8.431433381217e-05 127 KSP Residual norm 8.424348778495e-05 128 KSP Residual norm 8.382806777182e-05 129 KSP Residual norm 8.337135217553e-05 130 KSP Residual norm 8.306671396769e-05 131 KSP Residual norm 8.299046396158e-05 132 KSP Residual norm 8.298022748488e-05 133 KSP Residual norm 8.296556620174e-05 134 KSP Residual norm 8.293318221137e-05 135 KSP Residual norm 8.289997195859e-05 136 KSP Residual norm 8.288650847461e-05 137 KSP Residual norm 8.287793944348e-05 138 KSP Residual norm 8.282009713924e-05 139 KSP Residual norm 8.231936644200e-05 140 KSP Residual norm 8.092917384457e-05 141 KSP Residual norm 7.810875275548e-05 142 KSP Residual norm 7.372335915736e-05 143 KSP Residual norm 6.920873807564e-05 144 KSP Residual norm 6.510777151187e-05 145 KSP Residual norm 6.142132751787e-05 146 KSP Residual norm 5.816161902635e-05 147 KSP Residual norm 5.516213050219e-05 148 KSP Residual norm 5.242284932630e-05 149 KSP Residual norm 4.986057648037e-05 150 KSP Residual norm 4.733288055568e-05 151 KSP Residual norm 4.601506226246e-05 152 KSP Residual norm 4.523121336508e-05 153 KSP Residual norm 4.507733287029e-05 154 KSP Residual norm 4.507097170108e-05 155 KSP Residual norm 4.506961467378e-05 156 KSP Residual norm 4.505696152433e-05 157 KSP Residual norm 4.501836545704e-05 158 KSP Residual norm 4.495626487199e-05 159 KSP Residual norm 4.490012973376e-05 160 KSP Residual norm 4.480136241466e-05 161 KSP Residual norm 4.458312996682e-05 162 KSP Residual norm 4.431767440291e-05 163 KSP Residual norm 4.407964983727e-05 164 KSP Residual norm 4.383992113924e-05 165 KSP Residual norm 4.357188317772e-05 166 KSP Residual norm 4.319522376980e-05 167 KSP Residual norm 4.261206561683e-05 168 KSP Residual norm 4.205899613192e-05 169 KSP Residual norm 4.138430977164e-05 170 KSP Residual norm 4.047346042359e-05 171 KSP Residual norm 3.933307539335e-05 172 KSP Residual norm 3.819767627834e-05 173 KSP Residual norm 
3.702517997681e-05 174 KSP Residual norm 3.583191921804e-05 175 KSP Residual norm 3.458798761674e-05 176 KSP Residual norm 3.316083374306e-05 177 KSP Residual norm 3.173967896731e-05 178 KSP Residual norm 3.016354626802e-05 179 KSP Residual norm 2.866779750173e-05 180 KSP Residual norm 2.702938196877e-05 181 KSP Residual norm 2.618361138750e-05 182 KSP Residual norm 2.522495437254e-05 183 KSP Residual norm 2.426023897276e-05 184 KSP Residual norm 2.355948721907e-05 185 KSP Residual norm 2.319684487218e-05 186 KSP Residual norm 2.289784420766e-05 187 KSP Residual norm 2.267598687625e-05 188 KSP Residual norm 2.240641749204e-05 189 KSP Residual norm 2.212551730336e-05 190 KSP Residual norm 2.170264854588e-05 191 KSP Residual norm 2.112756030054e-05 192 KSP Residual norm 2.038822399814e-05 193 KSP Residual norm 1.962951220216e-05 194 KSP Residual norm 1.884493949304e-05 195 KSP Residual norm 1.799734026963e-05 196 KSP Residual norm 1.722254569823e-05 197 KSP Residual norm 1.660423842819e-05 198 KSP Residual norm 1.621056066730e-05 199 KSP Residual norm 1.591018158958e-05 200 KSP Residual norm 1.557926981647e-05 201 KSP Residual norm 1.510189268164e-05 202 KSP Residual norm 1.440759642876e-05 203 KSP Residual norm 1.349458967348e-05 204 KSP Residual norm 1.240308276374e-05 205 KSP Residual norm 1.118091740362e-05 206 KSP Residual norm 9.955874799398e-06 207 KSP Residual norm 8.667314210234e-06 208 KSP Residual norm 7.389939064823e-06 209 KSP Residual norm 6.261620050378e-06 210 KSP Residual norm 5.246555512523e-06 211 KSP Residual norm 4.721004890241e-06 212 KSP Residual norm 4.239837116741e-06 213 KSP Residual norm 3.816477467422e-06 214 KSP Residual norm 3.501683693279e-06 215 KSP Residual norm 3.305190215185e-06 216 KSP Residual norm 3.206138813817e-06 217 KSP Residual norm 3.174323738414e-06 218 KSP Residual norm 3.169528835126e-06 219 KSP Residual norm 3.169521851846e-06 220 KSP Residual norm 3.165241221321e-06 221 KSP Residual norm 3.145015122355e-06 222 KSP Residual norm 3.096044377523e-06 223 KSP Residual norm 3.018842023098e-06 224 KSP Residual norm 2.964634266861e-06 225 KSP Residual norm 2.957729966340e-06 226 KSP Residual norm 2.953877433705e-06 227 KSP Residual norm 2.925464755647e-06 228 KSP Residual norm 2.868821700731e-06 229 KSP Residual norm 2.782027517577e-06 230 KSP Residual norm 2.646127535134e-06 231 KSP Residual norm 2.482650898676e-06 232 KSP Residual norm 2.309998463210e-06 233 KSP Residual norm 2.154086486854e-06 234 KSP Residual norm 2.002548899717e-06 235 KSP Residual norm 1.885163787351e-06 236 KSP Residual norm 1.820671950047e-06 237 KSP Residual norm 1.781332450628e-06 238 KSP Residual norm 1.751510777513e-06 239 KSP Residual norm 1.723392579686e-06 240 KSP Residual norm 1.694083934428e-06 241 KSP Residual norm 1.677189950467e-06 242 KSP Residual norm 1.673111374168e-06 243 KSP Residual norm 1.671932556435e-06 244 KSP Residual norm 1.670372344826e-06 245 KSP Residual norm 1.668281814293e-06 246 KSP Residual norm 1.664401756910e-06 247 KSP Residual norm 1.655699903087e-06 248 KSP Residual norm 1.644879786465e-06 249 KSP Residual norm 1.638313410510e-06 250 KSP Residual norm 1.634433920669e-06 251 KSP Residual norm 1.632288417000e-06 252 KSP Residual norm 1.630803349524e-06 253 KSP Residual norm 1.629108012046e-06 254 KSP Residual norm 1.625738527055e-06 255 KSP Residual norm 1.620864125655e-06 256 KSP Residual norm 1.616268872661e-06 257 KSP Residual norm 1.611801733029e-06 258 KSP Residual norm 1.602497312803e-06 259 KSP Residual norm 1.575738098262e-06 260 KSP 
Residual norm 1.519104509227e-06 261 KSP Residual norm 1.431857168315e-06 262 KSP Residual norm 1.338744222187e-06 263 KSP Residual norm 1.262050256308e-06 264 KSP Residual norm 1.193091353420e-06 265 KSP Residual norm 1.139031579299e-06 266 KSP Residual norm 1.088471447383e-06 267 KSP Residual norm 1.035130328689e-06 268 KSP Residual norm 9.777098693277e-07 269 KSP Residual norm 9.155430511512e-07 270 KSP Residual norm 8.568922899017e-07 1 SNES Function norm 1.178751989614e-06 0 KSP Residual norm 1.748815715091e-06 1 KSP Residual norm 1.624620439395e-06 2 KSP Residual norm 1.456422465392e-06 3 KSP Residual norm 1.326030517472e-06 4 KSP Residual norm 1.134584001300e-06 5 KSP Residual norm 9.824370585989e-07 6 KSP Residual norm 8.882499873515e-07 7 KSP Residual norm 8.249609129314e-07 8 KSP Residual norm 7.838632525267e-07 9 KSP Residual norm 7.558367537184e-07 10 KSP Residual norm 7.352641428514e-07 11 KSP Residual norm 7.177890029352e-07 12 KSP Residual norm 7.027407189224e-07 13 KSP Residual norm 6.870092629142e-07 14 KSP Residual norm 6.712782681272e-07 15 KSP Residual norm 6.556064942447e-07 16 KSP Residual norm 6.413026840450e-07 17 KSP Residual norm 6.250492096162e-07 18 KSP Residual norm 6.087611627271e-07 19 KSP Residual norm 5.930996641661e-07 20 KSP Residual norm 5.781788025672e-07 21 KSP Residual norm 5.610549351106e-07 22 KSP Residual norm 5.401956055125e-07 23 KSP Residual norm 5.168528059550e-07 24 KSP Residual norm 4.913547400553e-07 25 KSP Residual norm 4.653072018102e-07 26 KSP Residual norm 4.457206633372e-07 27 KSP Residual norm 4.263897743643e-07 28 KSP Residual norm 4.072207343179e-07 29 KSP Residual norm 3.820129426326e-07 30 KSP Residual norm 3.524926057079e-07 31 KSP Residual norm 3.348441637200e-07 32 KSP Residual norm 3.208234358783e-07 33 KSP Residual norm 3.080653563509e-07 34 KSP Residual norm 2.969658623379e-07 35 KSP Residual norm 2.873838557550e-07 36 KSP Residual norm 2.784716738215e-07 37 KSP Residual norm 2.682605079585e-07 38 KSP Residual norm 2.582627271510e-07 39 KSP Residual norm 2.471851529652e-07 40 KSP Residual norm 2.364422933814e-07 41 KSP Residual norm 2.262205960981e-07 42 KSP Residual norm 2.177174676790e-07 43 KSP Residual norm 2.119314681575e-07 44 KSP Residual norm 2.086939643300e-07 45 KSP Residual norm 2.069589867774e-07 46 KSP Residual norm 2.051651483193e-07 47 KSP Residual norm 2.028925351652e-07 48 KSP Residual norm 1.994371389840e-07 49 KSP Residual norm 1.960711107273e-07 50 KSP Residual norm 1.934148114236e-07 51 KSP Residual norm 1.905853168179e-07 52 KSP Residual norm 1.867410083506e-07 53 KSP Residual norm 1.820712301605e-07 54 KSP Residual norm 1.769911631032e-07 55 KSP Residual norm 1.726229506451e-07 56 KSP Residual norm 1.684625473534e-07 57 KSP Residual norm 1.621405677874e-07 58 KSP Residual norm 1.531432035392e-07 59 KSP Residual norm 1.424389275969e-07 60 KSP Residual norm 1.310183104254e-07 61 KSP Residual norm 1.242656204932e-07 62 KSP Residual norm 1.179240873971e-07 63 KSP Residual norm 1.123101522446e-07 64 KSP Residual norm 1.054132720446e-07 65 KSP Residual norm 9.849626843795e-08 66 KSP Residual norm 9.197247865719e-08 67 KSP Residual norm 8.638142369566e-08 68 KSP Residual norm 8.305351990631e-08 69 KSP Residual norm 8.136422492034e-08 70 KSP Residual norm 8.002361575492e-08 71 KSP Residual norm 7.866116430480e-08 72 KSP Residual norm 7.722692892725e-08 73 KSP Residual norm 7.544381224969e-08 74 KSP Residual norm 7.354404089853e-08 75 KSP Residual norm 7.119435912763e-08 76 KSP Residual norm 6.867518671189e-08 
77 KSP Residual norm 6.602411006203e-08 78 KSP Residual norm 6.337351313821e-08 79 KSP Residual norm 6.080224058330e-08 80 KSP Residual norm 5.890231340688e-08 81 KSP Residual norm 5.776735028390e-08 82 KSP Residual norm 5.681661116681e-08 83 KSP Residual norm 5.569418042042e-08 84 KSP Residual norm 5.415755708381e-08 85 KSP Residual norm 5.233260646242e-08 86 KSP Residual norm 5.021186255249e-08 87 KSP Residual norm 4.793874560013e-08 88 KSP Residual norm 4.567033260035e-08 89 KSP Residual norm 4.349813734988e-08 90 KSP Residual norm 4.106532397968e-08 91 KSP Residual norm 3.953311916974e-08 92 KSP Residual norm 3.782549069823e-08 93 KSP Residual norm 3.618014409136e-08 94 KSP Residual norm 3.473408466117e-08 95 KSP Residual norm 3.295298494632e-08 96 KSP Residual norm 3.116362368283e-08 97 KSP Residual norm 2.964729931235e-08 98 KSP Residual norm 2.843092504497e-08 99 KSP Residual norm 2.752698726769e-08 100 KSP Residual norm 2.677449961663e-08 101 KSP Residual norm 2.623140990700e-08 102 KSP Residual norm 2.574444690034e-08 103 KSP Residual norm 2.522405798326e-08 104 KSP Residual norm 2.457253380175e-08 105 KSP Residual norm 2.370264120442e-08 106 KSP Residual norm 2.262668411404e-08 107 KSP Residual norm 2.152259489644e-08 108 KSP Residual norm 2.036137945628e-08 109 KSP Residual norm 1.909454196392e-08 110 KSP Residual norm 1.787445336551e-08 111 KSP Residual norm 1.667394027353e-08 112 KSP Residual norm 1.559924659359e-08 113 KSP Residual norm 1.481283115471e-08 114 KSP Residual norm 1.435158522296e-08 115 KSP Residual norm 1.414878343957e-08 116 KSP Residual norm 1.407393287977e-08 117 KSP Residual norm 1.404447798640e-08 118 KSP Residual norm 1.403314447128e-08 119 KSP Residual norm 1.402765026958e-08 120 KSP Residual norm 1.402395814514e-08 121 KSP Residual norm 1.402351651527e-08 122 KSP Residual norm 1.402351613398e-08 123 KSP Residual norm 1.402327634838e-08 124 KSP Residual norm 1.402226170240e-08 125 KSP Residual norm 1.402020560080e-08 126 KSP Residual norm 1.401585203411e-08 127 KSP Residual norm 1.400213674922e-08 128 KSP Residual norm 1.396738912660e-08 129 KSP Residual norm 1.392184633617e-08 130 KSP Residual norm 1.386006552773e-08 131 KSP Residual norm 1.378213258746e-08 132 KSP Residual norm 1.368010395178e-08 133 KSP Residual norm 1.353666156547e-08 134 KSP Residual norm 1.329698812650e-08 135 KSP Residual norm 1.293880270293e-08 136 KSP Residual norm 1.237318583853e-08 137 KSP Residual norm 1.179161581684e-08 138 KSP Residual norm 1.134206743886e-08 139 KSP Residual norm 1.099888749064e-08 140 KSP Residual norm 1.066259659918e-08 141 KSP Residual norm 1.032068623282e-08 142 KSP Residual norm 9.954236804205e-09 143 KSP Residual norm 9.560912215756e-09 144 KSP Residual norm 9.145545126143e-09 145 KSP Residual norm 8.738420927086e-09 146 KSP Residual norm 8.409260212909e-09 147 KSP Residual norm 8.127321870356e-09 148 KSP Residual norm 7.898748854589e-09 149 KSP Residual norm 7.703865149553e-09 150 KSP Residual norm 7.486870703879e-09 151 KSP Residual norm 7.330560146692e-09 152 KSP Residual norm 7.128689079801e-09 153 KSP Residual norm 6.964286032888e-09 154 KSP Residual norm 6.785137395884e-09 155 KSP Residual norm 6.640351575110e-09 156 KSP Residual norm 6.511156528025e-09 157 KSP Residual norm 6.423541298314e-09 158 KSP Residual norm 6.345751133879e-09 159 KSP Residual norm 6.280901533539e-09 160 KSP Residual norm 6.207943324992e-09 161 KSP Residual norm 6.142867578714e-09 162 KSP Residual norm 6.087418250155e-09 163 KSP Residual norm 6.034648495317e-09 164 KSP 
Residual norm 5.977088198390e-09 165 KSP Residual norm 5.912054132468e-09 166 KSP Residual norm 5.836856906294e-09 167 KSP Residual norm 5.765552516298e-09 168 KSP Residual norm 5.679756378903e-09 169 KSP Residual norm 5.563707770725e-09 170 KSP Residual norm 5.405011361483e-09 171 KSP Residual norm 5.189986018558e-09 172 KSP Residual norm 4.908102634804e-09 173 KSP Residual norm 4.706174551357e-09 174 KSP Residual norm 4.546685413987e-09 175 KSP Residual norm 4.435455313583e-09 176 KSP Residual norm 4.332523827655e-09 177 KSP Residual norm 4.203498282754e-09 178 KSP Residual norm 4.058340805179e-09 179 KSP Residual norm 3.912110698046e-09 180 KSP Residual norm 3.754156543395e-09 181 KSP Residual norm 3.673186018443e-09 182 KSP Residual norm 3.613029839620e-09 183 KSP Residual norm 3.561927158378e-09 184 KSP Residual norm 3.500712553539e-09 185 KSP Residual norm 3.426672494749e-09 186 KSP Residual norm 3.351413827965e-09 187 KSP Residual norm 3.271649485418e-09 188 KSP Residual norm 3.188154724333e-09 189 KSP Residual norm 3.100224644403e-09 190 KSP Residual norm 3.010790734288e-09 191 KSP Residual norm 2.899860376120e-09 192 KSP Residual norm 2.795307867707e-09 193 KSP Residual norm 2.656859285349e-09 194 KSP Residual norm 2.478450141125e-09 195 KSP Residual norm 2.312417968425e-09 196 KSP Residual norm 2.135819348150e-09 197 KSP Residual norm 1.987345509540e-09 198 KSP Residual norm 1.882308219422e-09 199 KSP Residual norm 1.824720724119e-09 200 KSP Residual norm 1.793394737731e-09 201 KSP Residual norm 1.777603206282e-09 202 KSP Residual norm 1.767124708984e-09 203 KSP Residual norm 1.759460100944e-09 204 KSP Residual norm 1.754016726123e-09 205 KSP Residual norm 1.750274824009e-09 206 KSP Residual norm 1.747607840035e-09 207 KSP Residual norm 1.744480855293e-09 208 KSP Residual norm 1.741655356119e-09 209 KSP Residual norm 1.739290578624e-09 210 KSP Residual norm 1.734739843580e-09 211 KSP Residual norm 1.731591096986e-09 212 KSP Residual norm 1.728201695680e-09 213 KSP Residual norm 1.723790864899e-09 214 KSP Residual norm 1.713216135283e-09 215 KSP Residual norm 1.699328282646e-09 216 KSP Residual norm 1.678451518644e-09 217 KSP Residual norm 1.657901783704e-09 218 KSP Residual norm 1.635934083952e-09 219 KSP Residual norm 1.612202184752e-09 220 KSP Residual norm 1.567594145713e-09 221 KSP Residual norm 1.512642314015e-09 222 KSP Residual norm 1.457761603944e-09 223 KSP Residual norm 1.408111435897e-09 224 KSP Residual norm 1.356076746727e-09 225 KSP Residual norm 1.307962107684e-09 226 KSP Residual norm 1.266646107226e-09 227 KSP Residual norm 1.231345481628e-09 228 KSP Residual norm 1.187725733507e-09 229 KSP Residual norm 1.141044747451e-09 230 KSP Residual norm 1.099737269841e-09 231 KSP Residual norm 1.066786397851e-09 232 KSP Residual norm 1.030129878172e-09 233 KSP Residual norm 9.927434935483e-10 234 KSP Residual norm 9.473773325131e-10 235 KSP Residual norm 9.089690854957e-10 236 KSP Residual norm 8.759516453077e-10 237 KSP Residual norm 8.535012664712e-10 238 KSP Residual norm 8.308754136837e-10 239 KSP Residual norm 8.082501666452e-10 240 KSP Residual norm 7.754022320857e-10 241 KSP Residual norm 7.572112123056e-10 242 KSP Residual norm 7.442389885537e-10 243 KSP Residual norm 7.283799305392e-10 244 KSP Residual norm 7.073231969200e-10 245 KSP Residual norm 6.852558048466e-10 246 KSP Residual norm 6.637193841945e-10 247 KSP Residual norm 6.457535438239e-10 248 KSP Residual norm 6.348852218182e-10 249 KSP Residual norm 6.257254477629e-10 250 KSP Residual norm 
6.158564534747e-10 251 KSP Residual norm 6.053446723415e-10 252 KSP Residual norm 5.965943232727e-10 253 KSP Residual norm 5.889879966631e-10 254 KSP Residual norm 5.798723091489e-10 255 KSP Residual norm 5.693837266918e-10 256 KSP Residual norm 5.572444317590e-10 257 KSP Residual norm 5.444694016783e-10 258 KSP Residual norm 5.294254215977e-10 259 KSP Residual norm 5.111579131674e-10 260 KSP Residual norm 4.907812169770e-10 261 KSP Residual norm 4.698943704786e-10 262 KSP Residual norm 4.516736830848e-10 263 KSP Residual norm 4.344099614072e-10 264 KSP Residual norm 4.205651998682e-10 265 KSP Residual norm 4.094400044417e-10 266 KSP Residual norm 4.007111452005e-10 267 KSP Residual norm 3.933386697206e-10 268 KSP Residual norm 3.859019001818e-10 269 KSP Residual norm 3.778293647154e-10 270 KSP Residual norm 3.685811905503e-10 271 KSP Residual norm 3.629152212256e-10 272 KSP Residual norm 3.564840338194e-10 273 KSP Residual norm 3.500495329589e-10 274 KSP Residual norm 3.431898068167e-10 275 KSP Residual norm 3.353565364446e-10 276 KSP Residual norm 3.263783927727e-10 277 KSP Residual norm 3.192763021450e-10 278 KSP Residual norm 3.122085426690e-10 279 KSP Residual norm 3.052760168605e-10 280 KSP Residual norm 2.980518819939e-10 281 KSP Residual norm 2.912439358242e-10 282 KSP Residual norm 2.849441584275e-10 283 KSP Residual norm 2.782808821251e-10 284 KSP Residual norm 2.710684441319e-10 285 KSP Residual norm 2.633874526828e-10 286 KSP Residual norm 2.553129453009e-10 287 KSP Residual norm 2.463101545110e-10 288 KSP Residual norm 2.377718932698e-10 289 KSP Residual norm 2.297278980842e-10 290 KSP Residual norm 2.216396694908e-10 291 KSP Residual norm 2.140852462454e-10 292 KSP Residual norm 2.058870942400e-10 293 KSP Residual norm 1.971167887405e-10 294 KSP Residual norm 1.884340894074e-10 295 KSP Residual norm 1.791613990270e-10 296 KSP Residual norm 1.688633513441e-10 297 KSP Residual norm 1.602704727320e-10 298 KSP Residual norm 1.541392323809e-10 299 KSP Residual norm 1.504896772477e-10 300 KSP Residual norm 1.483487862182e-10 301 KSP Residual norm 1.471763898232e-10 302 KSP Residual norm 1.458511285335e-10 303 KSP Residual norm 1.446834701991e-10 304 KSP Residual norm 1.422626457779e-10 305 KSP Residual norm 1.383238030475e-10 306 KSP Residual norm 1.340609360645e-10 307 KSP Residual norm 1.295566320698e-10 308 KSP Residual norm 1.254740222524e-10 309 KSP Residual norm 1.211504994547e-10 310 KSP Residual norm 1.174647456465e-10 311 KSP Residual norm 1.136881957093e-10 312 KSP Residual norm 1.097588329841e-10 313 KSP Residual norm 1.041565322506e-10 314 KSP Residual norm 9.702228215912e-11 315 KSP Residual norm 8.989484119100e-11 316 KSP Residual norm 8.190664018704e-11 317 KSP Residual norm 7.558818834442e-11 318 KSP Residual norm 6.957179676675e-11 319 KSP Residual norm 6.447734334845e-11 320 KSP Residual norm 6.116288878725e-11 321 KSP Residual norm 5.888979447543e-11 322 KSP Residual norm 5.764019417950e-11 323 KSP Residual norm 5.691872542062e-11 324 KSP Residual norm 5.660800677735e-11 325 KSP Residual norm 5.653667085624e-11 326 KSP Residual norm 5.653626356788e-11 327 KSP Residual norm 5.653235239410e-11 328 KSP Residual norm 5.653229638983e-11 329 KSP Residual norm 5.651625676369e-11 330 KSP Residual norm 5.646144096698e-11 331 KSP Residual norm 5.644627590998e-11 332 KSP Residual norm 5.644531324883e-11 333 KSP Residual norm 5.644497420702e-11 334 KSP Residual norm 5.644493580837e-11 335 KSP Residual norm 5.644470457969e-11 336 KSP Residual norm 5.644168745929e-11 337 KSP 
Residual norm 5.640053714959e-11 338 KSP Residual norm 5.626705339217e-11 339 KSP Residual norm 5.617632737039e-11 340 KSP Residual norm 5.610418525034e-11 341 KSP Residual norm 5.596338060631e-11 342 KSP Residual norm 5.565290829848e-11 343 KSP Residual norm 5.515768341466e-11 344 KSP Residual norm 5.442406254737e-11 345 KSP Residual norm 5.353934703450e-11 346 KSP Residual norm 5.271979463229e-11 347 KSP Residual norm 5.159719023218e-11 348 KSP Residual norm 5.050230595494e-11 349 KSP Residual norm 4.886604439750e-11 350 KSP Residual norm 4.701727019984e-11 351 KSP Residual norm 4.529669235590e-11 352 KSP Residual norm 4.321338221199e-11 353 KSP Residual norm 4.146974082227e-11 354 KSP Residual norm 3.989258303406e-11 355 KSP Residual norm 3.833287369436e-11 356 KSP Residual norm 3.696924118282e-11 357 KSP Residual norm 3.572876582097e-11 358 KSP Residual norm 3.466908142629e-11 359 KSP Residual norm 3.362405494859e-11 360 KSP Residual norm 3.248076664724e-11 361 KSP Residual norm 3.183961506680e-11 362 KSP Residual norm 3.105405824388e-11 363 KSP Residual norm 3.019347608164e-11 364 KSP Residual norm 2.915901620646e-11 365 KSP Residual norm 2.819671262899e-11 366 KSP Residual norm 2.737270822959e-11 367 KSP Residual norm 2.674106883731e-11 368 KSP Residual norm 2.627612843956e-11 369 KSP Residual norm 2.574335369953e-11 370 KSP Residual norm 2.523339840051e-11 371 KSP Residual norm 2.465910296336e-11 372 KSP Residual norm 2.407616725222e-11 373 KSP Residual norm 2.348589790024e-11 374 KSP Residual norm 2.305940230387e-11 375 KSP Residual norm 2.268699192687e-11 376 KSP Residual norm 2.227499543133e-11 377 KSP Residual norm 2.188320344640e-11 378 KSP Residual norm 2.152291481840e-11 379 KSP Residual norm 2.108875033860e-11 380 KSP Residual norm 2.062954609947e-11 381 KSP Residual norm 2.007204894177e-11 382 KSP Residual norm 1.941423359969e-11 383 KSP Residual norm 1.872835639533e-11 384 KSP Residual norm 1.794965148165e-11 385 KSP Residual norm 1.714082018695e-11 2 SNES Function norm 1.215516598864e-11 SNES Object: 4 MPI processes type: newtonls maximum iterations=50, maximum function evaluations=10000 tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 total number of linear solver iterations=655 total number of function evaluations=3 norm schedule ALWAYS Jacobian is built using colored finite differences on a DMDA SNESLineSearch Object: 4 MPI processes type: bt interpolation: cubic alpha=1.000000e-04 maxstep=1.000000e+08, minlambda=1.000000e-12 tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08 maximum iterations=40 KSP Object: 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: bjacobi number of blocks = 4 Local solver information for first block is in the following KSP and PC objects on rank 0: Use -ksp_view ::ascii_info_detail to display information for all blocks KSP Object: (sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (sub_) 1 MPI processes type: ilu out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=9604, cols=9604, bs=4 package used to perform factorization: petsc total: nonzeros=188944, allocated nonzeros=188944 using I-node routines: found 2401 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (sub_) 1 MPI processes type: seqaij rows=9604, cols=9604, bs=4 total: nonzeros=188944, allocated nonzeros=188944 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2401 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=37636, cols=37636, bs=4 total: nonzeros=746512, allocated nonzeros=746512 total number of mallocs used during MatSetValues calls=0 Number of SNES iterations = 2
Run with the same options but use geometric multigrid as the linear solver, in place of the default block Jacobi/ILU preconditioner shown above
$ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view -pc_type mg
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. 0 SNES Function norm 1.036007954337e-02 0 KSP Residual norm 2.388589583549e+00 1 KSP Residual norm 5.715829806981e-01 2 KSP Residual norm 4.623679005936e-02 3 KSP Residual norm 1.143381177646e-02 4 KSP Residual norm 2.015139840224e-03 5 KSP Residual norm 4.356196119798e-04 6 KSP Residual norm 4.240953066710e-05 7 KSP Residual norm 8.848315297175e-06 1 SNES Function norm 9.854304971115e-06 0 KSP Residual norm 3.868049496775e-05 1 KSP Residual norm 7.693574326868e-06 2 KSP Residual norm 1.059429116239e-06 3 KSP Residual norm 4.004524784804e-07 4 KSP Residual norm 1.050186948327e-07 5 KSP Residual norm 5.073180513583e-08 6 KSP Residual norm 2.510513776297e-08 7 KSP Residual norm 1.211886495400e-08 8 KSP Residual norm 1.911963112131e-09 9 KSP Residual norm 3.005260864225e-10 2 SNES Function norm 3.117674497824e-10 0 KSP Residual norm 3.005042584730e-10 1 KSP Residual norm 1.120673922713e-10 2 KSP Residual norm 3.288439453292e-11 3 KSP Residual norm 5.822504321413e-12 4 KSP Residual norm 2.486684466178e-12 5 KSP Residual norm 1.198858055503e-12 6 KSP Residual norm 6.255669709502e-13 7 KSP Residual norm 1.544647758005e-13 8 KSP Residual norm 4.592122224907e-14 9 KSP Residual norm 4.984149547392e-15 10 KSP Residual norm 8.905129652955e-16 3 SNES Function norm 1.045594761851e-14 SNES Object: 4 MPI processes type: newtonls maximum iterations=50, maximum function evaluations=10000 tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 total number of linear solver iterations=26 total number of function evaluations=4 norm schedule ALWAYS Jacobian is built using colored finite differences on a DMDA SNESLineSearch Object: 4 MPI processes type: bt interpolation: cubic alpha=1.000000e-04 maxstep=1.000000e+08, minlambda=1.000000e-12 tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08 maximum iterations=40 KSP Object: 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: mg type is MULTIPLICATIVE, levels=6 cycles=v Cycles per PCApply=1 Not using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 4 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 4 MPI processes type: redundant First (color=0) of 4 PCs follows KSP Object: (mg_coarse_redundant_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_redundant_) 1 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: nd factor fill ratio given 5., needed 1.875 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=64, cols=64, bs=4 package used to perform factorization: petsc total: nonzeros=1920, allocated nonzeros=1920 using I-node routines: found 16 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=64, cols=64, bs=4 total: nonzeros=1024, allocated nonzeros=1024 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 16 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=64, cols=64, bs=4 total: nonzeros=1024, allocated nonzeros=1024 total number of mallocs used during MatSetValues calls=0 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.148269, max = 1.63095 eigenvalues estimate via gmres min 0.144902, max 1.48269 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=196, cols=196, bs=4 total: nonzeros=3472, allocated nonzeros=3472 total number of mallocs used during MatSetValues calls=0 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.149178, max = 1.64096 eigenvalues estimate via gmres min 0.0843938, max 1.49178 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=676, cols=676, bs=4 total: nonzeros=12688, allocated nonzeros=12688 total number of mallocs used during MatSetValues calls=0 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.146454, max = 1.61099 eigenvalues estimate via gmres min 0.0659153, max 1.46454 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=2500, cols=2500, bs=4 total: nonzeros=48400, allocated nonzeros=48400 total number of mallocs used during MatSetValues calls=0 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.141089, max = 1.55197 eigenvalues estimate via gmres min 0.044097, max 1.41089 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_4_esteig_) 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=9604, cols=9604, bs=4 total: nonzeros=188944, allocated nonzeros=188944 total number of mallocs used during MatSetValues calls=0 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 5 ------------------------------- KSP Object: (mg_levels_5_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.127956, max = 1.40751 eigenvalues estimate via gmres min 0.0380398, max 1.27956 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_5_esteig_) 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_5_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=37636, cols=37636, bs=4 total: nonzeros=746512, allocated nonzeros=746512 total number of mallocs used during MatSetValues calls=0 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=37636, cols=37636, bs=4 total: nonzeros=746512, allocated nonzeros=746512 total number of mallocs used during MatSetValues calls=0 Number of SNES iterations = 3
Note that this converges in many fewer iterations than the default solver.
Run with the same options but use algebraic multigrid (hypre’s BoomerAMG) as the linear solver
$ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view -pc_type hypre
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. 0 SNES Function norm 1.036007954337e-02 0 KSP Residual norm 3.529801578944e+00 1 KSP Residual norm 9.549227287830e-01 2 KSP Residual norm 1.833989370926e-01 3 KSP Residual norm 3.207579266155e-02 4 KSP Residual norm 1.205175868015e-02 5 KSP Residual norm 3.633439929268e-03 6 KSP Residual norm 7.859510138803e-04 7 KSP Residual norm 1.906870003209e-04 8 KSP Residual norm 3.576753906104e-05 9 KSP Residual norm 7.330444204372e-06 1 SNES Function norm 3.201130806131e-06 0 KSP Residual norm 4.428492869097e-05 1 KSP Residual norm 1.376439700414e-05 2 KSP Residual norm 4.735862478488e-06 3 KSP Residual norm 1.441618472811e-06 4 KSP Residual norm 5.469745765483e-07 5 KSP Residual norm 9.791769343634e-08 6 KSP Residual norm 3.666388214924e-08 7 KSP Residual norm 1.110122757449e-08 8 KSP Residual norm 2.336948935104e-09 9 KSP Residual norm 4.378689442660e-10 2 SNES Function norm 2.503925085367e-10 0 KSP Residual norm 4.377201912341e-10 1 KSP Residual norm 1.301710740308e-10 2 KSP Residual norm 4.911627444716e-11 3 KSP Residual norm 1.629334185679e-11 4 KSP Residual norm 3.966572220621e-12 5 KSP Residual norm 1.201091398418e-12 6 KSP Residual norm 2.399034074753e-13 7 KSP Residual norm 5.152830599460e-14 8 KSP Residual norm 1.728793265661e-14 9 KSP Residual norm 4.944662929385e-15 10 KSP Residual norm 1.518465265686e-15 3 SNES Function norm 1.014356640724e-14 SNES Object: 4 MPI processes type: newtonls maximum iterations=50, maximum function evaluations=10000 tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 total number of linear solver iterations=28 total number of function evaluations=4 norm schedule ALWAYS Jacobian is built using colored finite differences on a DMDA SNESLineSearch Object: 4 MPI processes type: bt interpolation: cubic alpha=1.000000e-04 maxstep=1.000000e+08, minlambda=1.000000e-12 tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08 maximum iterations=40 KSP Object: 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: hypre HYPRE BoomerAMG preconditioning Cycle type V Maximum number of levels 25 Maximum number of iterations PER hypre call 1 Convergence tolerance PER hypre call 0. Threshold for strong coupling 0.25 Interpolation truncation factor 0. Interpolation: max elements per row 0 Number of levels of aggressive coarsening 0 Number of paths for aggressive coarsening 1 Maximum row sums 0.9 Sweeps down 1 Sweeps up 1 Sweeps on coarse 1 Relax down symmetric-SOR/Jacobi Relax up symmetric-SOR/Jacobi Relax on coarse Gaussian-elimination Relax weight (all) 1. Outer relax weight (all) 1. Using CF-relaxation Not using more complex smoothers. Measure type local Coarsen type Falgout Interpolation type classical linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=37636, cols=37636, bs=4 total: nonzeros=746512, allocated nonzeros=746512 total number of mallocs used during MatSetValues calls=0 Number of SNES iterations = 3
Note that this also takes many fewer iterations than the default solver, but more linear solver iterations than geometric multigrid.
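The same choice can be made in code rather than on the command line. A minimal sketch in C (assuming a configured SNES object snes, as in ex19):

    KSP ksp;
    PC  pc;
    PetscCall(SNESGetKSP(snes, &ksp));          /* the linear solver inside the nonlinear solver */
    PetscCall(KSPGetPC(ksp, &pc));              /* its preconditioner context */
    PetscCall(PCSetType(pc, PCHYPRE));          /* delegate preconditioning to hypre */
    PetscCall(PCHYPRESetType(pc, "boomeramg")); /* select BoomerAMG within hypre */

If SNESSetFromOptions() is called afterwards, command-line options such as those above can still override these settings.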
Run with the same options but use the ML preconditioner from Trilinos
$ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view -pc_type ml
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. 0 SNES Function norm 1.036007954337e-02 0 KSP Residual norm 4.386611885521e-01 1 KSP Residual norm 3.127838547648e-01 2 KSP Residual norm 2.334238799976e-01 3 KSP Residual norm 1.318350769046e-01 4 KSP Residual norm 8.143004995277e-02 5 KSP Residual norm 4.813457707703e-02 6 KSP Residual norm 2.733719484710e-02 7 KSP Residual norm 1.462151166610e-02 8 KSP Residual norm 8.331275159714e-03 9 KSP Residual norm 5.275884606772e-03 10 KSP Residual norm 3.768690746652e-03 11 KSP Residual norm 2.699365250849e-03 12 KSP Residual norm 1.824681696989e-03 13 KSP Residual norm 1.179126381375e-03 14 KSP Residual norm 7.107306905422e-04 15 KSP Residual norm 4.155820345074e-04 16 KSP Residual norm 2.408438645714e-04 17 KSP Residual norm 1.476855736514e-04 18 KSP Residual norm 9.907578845784e-05 19 KSP Residual norm 6.797328938557e-05 20 KSP Residual norm 4.675944545290e-05 21 KSP Residual norm 3.352583819988e-05 22 KSP Residual norm 2.441653843583e-05 23 KSP Residual norm 1.768579774951e-05 24 KSP Residual norm 1.411236998154e-05 25 KSP Residual norm 1.062751951795e-05 26 KSP Residual norm 7.402652132828e-06 27 KSP Residual norm 5.376101942565e-06 28 KSP Residual norm 3.936750059583e-06 1 SNES Function norm 1.539897001872e-06 0 KSP Residual norm 1.219799734452e-05 1 KSP Residual norm 9.351061917402e-06 2 KSP Residual norm 5.158749230924e-06 3 KSP Residual norm 3.800583255559e-06 4 KSP Residual norm 3.358178466662e-06 5 KSP Residual norm 2.704711328549e-06 6 KSP Residual norm 1.965512422061e-06 7 KSP Residual norm 1.383231224585e-06 8 KSP Residual norm 1.016838426400e-06 9 KSP Residual norm 7.817076853999e-07 10 KSP Residual norm 5.714787985538e-07 11 KSP Residual norm 4.070517298179e-07 12 KSP Residual norm 3.199366855723e-07 13 KSP Residual norm 2.498338700906e-07 14 KSP Residual norm 1.774654138694e-07 15 KSP Residual norm 1.220894443119e-07 16 KSP Residual norm 8.191467113032e-08 17 KSP Residual norm 5.471425416521e-08 18 KSP Residual norm 3.800399846106e-08 19 KSP Residual norm 2.624332673801e-08 20 KSP Residual norm 1.817665454829e-08 21 KSP Residual norm 1.193934944948e-08 22 KSP Residual norm 7.058845517318e-09 23 KSP Residual norm 4.801132366324e-09 24 KSP Residual norm 3.136888134931e-09 25 KSP Residual norm 1.908878704398e-09 26 KSP Residual norm 1.179297203589e-09 27 KSP Residual norm 7.450436712605e-10 28 KSP Residual norm 4.727201847673e-10 29 KSP Residual norm 3.071103410289e-10 30 KSP Residual norm 1.954558918512e-10 31 KSP Residual norm 1.476035656576e-10 32 KSP Residual norm 9.544641423593e-11 2 SNES Function norm 2.155536219955e-11 SNES Object: 4 MPI processes type: newtonls maximum iterations=50, maximum function evaluations=10000 tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 total number of linear solver iterations=60 total number of function evaluations=3 norm schedule ALWAYS Jacobian is built using colored finite differences on a DMDA SNESLineSearch Object: 4 MPI processes type: bt interpolation: cubic alpha=1.000000e-04 maxstep=1.000000e+08, minlambda=1.000000e-12 tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08 maximum iterations=40 KSP Object: 4 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: ml type is MULTIPLICATIVE, levels=5 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 4 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 4 MPI processes type: redundant First (color=0) of 4 PCs follows KSP Object: (mg_coarse_redundant_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_redundant_) 1 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=16, cols=16, bs=4 package used to perform factorization: petsc total: nonzeros=256, allocated nonzeros=256 using I-node routines: found 4 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=16, cols=16, bs=4 total: nonzeros=256, allocated nonzeros=256 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 4 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=16, cols=16, bs=4 total: nonzeros=256, allocated nonzeros=256 total number of mallocs used during MatSetValues calls=0 using I-node (on process 0) routines: found 1 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 4 MPI processes type: richardson damping factor=1. maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=88, cols=88, bs=4 total: nonzeros=4896, allocated nonzeros=4896 total number of mallocs used during MatSetValues calls=0 using I-node (on process 0) routines: found 5 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 4 MPI processes type: richardson damping factor=1. maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=772, cols=772, bs=4 total: nonzeros=35364, allocated nonzeros=35364 total number of mallocs used during MatSetValues calls=0 using I-node (on process 0) routines: found 56 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 4 MPI processes type: richardson damping factor=1. maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=7156, cols=7156, bs=4 total: nonzeros=258241, allocated nonzeros=258241 total number of mallocs used during MatSetValues calls=0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 4 MPI processes type: richardson damping factor=1. maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=37636, cols=37636, bs=4 total: nonzeros=746512, allocated nonzeros=746512 total number of mallocs used during MatSetValues calls=0 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=37636, cols=37636, bs=4 total: nonzeros=746512, allocated nonzeros=746512 total number of mallocs used during MatSetValues calls=0 Number of SNES iterations = 2
Run on 1 processor with the default linear solver and profile the run
$ mpiexec -n 1 ./ex19 -da_refine 5 -log_view
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ex19 on a arch-ubuntu-mb-double-extra-opt named pdsbox with 1 processor, by patrick Fri Jul 27 15:30:21 2018 Using Petsc Development GIT revision: v3.9.3-921-gfc2aa81 GIT Date: 2018-07-27 11:07:58 +0200 Max Max/Min Avg Total Time (sec): 3.068e+00 1.000 3.068e+00 Objects: 9.400e+01 1.000 9.400e+01 Flop: 3.195e+09 1.000 3.195e+09 3.195e+09 Flop/sec: 1.041e+09 1.000 1.041e+09 1.041e+09 MPI Messages: 0.000e+00 0.000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ------ --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total Count %Total Avg %Total Count %Total 0: Main Stage: 3.0680e+00 100.0% 3.1952e+09 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent AvgLen: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage ---- Total Max Ratio Max Ratio Max Ratio Mess AvgLen Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage SNESSolve 1 1.0 3.0142e+00 1.0 3.20e+09 1.0 0.0e+00 0.0e+00 0.0e+00 98100 0 0 0 98100 0 0 0 1060 SNESFunctionEval 45 1.0 4.6601e-02 1.0 3.56e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 1 0 0 0 2 1 0 0 0 763 SNESJacobianEval 2 1.0 8.7663e-02 1.0 3.63e+07 1.0 0.0e+00 0.0e+00 0.0e+00 3 1 0 0 0 3 1 0 0 0 414 SNESLineSearch 2 1.0 6.2666e-03 1.0 5.32e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 849 VecDot 2 1.0 7.8201e-05 1.0 1.51e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1925 VecMDot 568 1.0 2.6999e-01 1.0 6.61e+08 1.0 0.0e+00 0.0e+00 0.0e+00 9 21 0 0 0 9 21 0 0 0 2447 VecNorm 593 1.0 1.2326e-02 1.0 4.46e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3621 VecScale 587 1.0 7.1690e-03 1.0 2.21e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3082 VecCopy 63 1.0 1.7498e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 64 1.0 4.0205e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 76 1.0 2.5930e-03 1.0 5.72e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2206 VecWAXPY 2 1.0 2.6870e-04 1.0 7.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 280 VecMAXPY 587 1.0 4.1431e-01 1.0 7.03e+08 1.0 0.0e+00 0.0e+00 0.0e+00 14 22 0 0 0 14 22 0 0 0 1698 VecScatterBegin 45 1.0 7.8702e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecReduceArith 4 1.0 9.0122e-05 1.0 3.01e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3341 VecReduceComm 2 1.0 1.6689e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 587 1.0 2.0009e-02 1.0 6.63e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 3312 MatMult 587 1.0 8.6304e-01 1.0 8.54e+08 1.0 0.0e+00 0.0e+00 0.0e+00 28 27 0 0 0 28 27 0 0 0 990 MatSolve 587 1.0 1.2751e+00 1.0 8.54e+08 1.0 0.0e+00 0.0e+00 0.0e+00 42 27 0 0 0 42 27 0 0 0 670 MatLUFactorNum 2 1.0 5.5508e-02 1.0 1.41e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 254 MatILUFactorSym 1 1.0 9.9506e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 3 1.0 1.4305e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 3 1.0 6.2709e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 1 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 1.2467e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 2 1.0 4.0841e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorCreate 1 1.0 2.3603e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorSetUp 1 1.0 2.7331e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatFDColorApply 2 1.0 5.9345e-02 1.0 3.63e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 1 0 0 0 2 1 0 0 0 611 MatFDColorFunc 42 1.0 4.4162e-02 1.0 3.32e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 752 
KSPSetUp 2 1.0 6.0439e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 2 1.0 2.9187e+00 1.0 3.15e+09 1.0 0.0e+00 0.0e+00 0.0e+00 95 99 0 0 0 95 99 0 0 0 1080 KSPGMRESOrthog 568 1.0 6.6268e-01 1.0 1.32e+09 1.0 0.0e+00 0.0e+00 0.0e+00 22 41 0 0 0 22 41 0 0 0 1994 PCSetUp 2 1.0 6.6768e-02 1.0 1.41e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 211 PCApply 587 1.0 1.2759e+00 1.0 8.54e+08 1.0 0.0e+00 0.0e+00 0.0e+00 42 27 0 0 0 42 27 0 0 0 670 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage SNES 1 1 1372 0. DMSNES 1 1 672 0. SNESLineSearch 1 1 1000 0. Vector 47 47 13320304 0. Matrix 2 2 19277596 0. Matrix FD Coloring 1 1 16612048 0. Distributed Mesh 1 1 5344 0. Index Set 29 29 738052 0. IS L to G Mapping 2 2 189524 0. Star Forest Graph 2 2 1728 0. Discrete System 1 1 932 0. Vec Scatter 2 2 1408 0. Krylov Solver 1 1 18416 0. DMKSP interface 1 1 656 0. Preconditioner 1 1 1008 0. Viewer 1 0 0 0. ======================================================================================================================== Average time to get PetscTime(): 4.76837e-08 #PETSc Option Table entries: -da_refine 5 -log_view #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: PETSC_DIR=/home/patrick/petsc-mb PETSC_ARCH=arch-ubuntu-mb-double-extra-opt --with-debugging=0 --with-precision=double --with-scalar-type=real --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --COPTFLAGS="-g -O3 -march=native " --CXXOPTFLAGS="-g -O3 -march=native " --FOPTFLAGS="-g -O3 -march=native " --download-c2html --download-suitesparse --download-yaml --download-hdf5 --download-scalapack --download-metis --download-parmetis --download-mumps --download-superlu_dist --download-triangle --download-ctetgen --download-sundials --download-ml --download-exodusii --download-hdf5 --download-netcdf --download-pnetcdf ----------------------------------------- Libraries compiled on 2018-07-27 13:01:14 on pdsbox Machine characteristics: Linux-4.13.0-39-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/patrick/petsc-mb Using PETSc arch: arch-ubuntu-mb-double-extra-opt ----------------------------------------- Using C compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden Using Fortran compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument ----------------------------------------- Using include paths: -I/home/patrick/petsc-mb/include -I/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/include ----------------------------------------- Using C linker: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc Using Fortran linker: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 Using libraries: -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -lpetsc -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib 
-L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lpthread -lscalapack -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -lsuperlu_dist -lml -lsundials_cvode -lsundials_nvecserial -lsundials_nvecparallel -llapack -lblas -lexodus -lnetcdf -lpnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lparmetis -lmetis -ltriangle -lm -lctetgen -lpthread -lyaml -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -----------------------------------------
Search for the line beginning with SNESSolve; the fourth column gives the time for the nonlinear solve.
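The profile above lumps everything into one Main Stage; as the header notes, finer-grained stages can be defined with PetscLogStagePush() and PetscLogStagePop(). A minimal sketch (the stage name is illustrative):

    PetscLogStage stage;
    PetscCall(PetscLogStageRegister("My solve phase", &stage)); /* create a named stage */
    PetscCall(PetscLogStagePush(stage));                        /* subsequent events log to this stage */
    /* ... the code to be profiled ... */
    PetscCall(PetscLogStagePop());                              /* return to the enclosing stage */

Each registered stage then gets its own block in the -log_view summary.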
Run on 1 processor with the geometric multigrid linear solver and profile the run
$ mpiexec -n 1 ./ex19 -da_refine 5 -log_view -pc_type mg
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. Number of SNES iterations = 3 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ex19 on a arch-ubuntu-mb-double-extra-opt named pdsbox with 1 processor, by patrick Fri Jul 27 15:30:40 2018 Using Petsc Development GIT revision: v3.9.3-921-gfc2aa81 GIT Date: 2018-07-27 11:07:58 +0200 Max Max/Min Avg Total Time (sec): 6.992e-01 1.000 6.992e-01 Objects: 4.800e+02 1.000 4.800e+02 Flop: 5.237e+08 1.000 5.237e+08 5.237e+08 Flop/sec: 7.490e+08 1.000 7.490e+08 7.490e+08 MPI Messages: 0.000e+00 0.000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ------ --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total Count %Total Avg %Total Count %Total 0: Main Stage: 6.9923e-01 100.0% 5.2371e+08 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent AvgLen: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage ---- Total Max Ratio Max Ratio Max Ratio Mess AvgLen Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage SNESSolve 1 1.0 6.4513e-01 1.0 5.24e+08 1.0 0.0e+00 0.0e+00 0.0e+00 92100 0 0 0 92100 0 0 0 812 SNESFunctionEval 255 1.0 5.6777e-02 1.0 4.71e+07 1.0 0.0e+00 0.0e+00 0.0e+00 8 9 0 0 0 8 9 0 0 0 829 SNESJacobianEval 12 1.0 1.1569e-01 1.0 4.89e+07 1.0 0.0e+00 0.0e+00 0.0e+00 17 9 0 0 0 17 9 0 0 0 422 SNESLineSearch 2 1.0 5.7764e-03 1.0 5.32e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 921 VecDot 2 1.0 7.6056e-05 1.0 1.51e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1979 VecMDot 111 1.0 7.1726e-03 1.0 1.38e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 1930 VecNorm 139 1.0 1.0304e-03 1.0 3.71e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3600 VecScale 123 1.0 6.1989e-04 1.0 1.60e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2586 VecCopy 321 1.0 1.7195e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 382 1.0 4.5128e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecAXPY 252 1.0 3.0298e-03 1.0 4.41e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1455 VecAYPX 520 1.0 1.4517e-02 1.0 6.58e+06 1.0 0.0e+00 0.0e+00 0.0e+00 2 1 0 0 0 2 1 0 0 0 453 VecAXPBYCZ 260 1.0 9.2232e-03 1.0 1.32e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 1427 VecWAXPY 2 1.0 2.7275e-04 1.0 7.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 276 VecMAXPY 123 1.0 1.3335e-02 1.0 1.67e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 1252 VecScatterBegin 265 1.0 1.1539e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecReduceArith 4 1.0 8.3208e-05 1.0 3.01e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3618 VecReduceComm 2 1.0 1.4305e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 123 1.0 1.6305e-03 1.0 4.81e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2949 MatMult 513 1.0 1.9088e-01 1.0 2.10e+08 1.0 0.0e+00 0.0e+00 0.0e+00 27 40 0 0 0 27 40 0 0 0 1100 MatMultAdd 65 1.0 5.0337e-03 1.0 2.93e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 582 MatMultTranspose 70 1.0 5.1179e-03 1.0 3.16e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 617 MatSolve 13 1.0 1.2708e-04 1.0 4.91e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 386 MatSOR 500 1.0 2.4601e-01 1.0 2.00e+08 1.0 0.0e+00 0.0e+00 0.0e+00 35 38 0 0 0 35 38 0 0 0 813 MatLUFactorSym 1 1.0 1.1539e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 2 1.0 1.2755e-04 1.0 1.82e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 143 MatResidual 65 1.0 2.3766e-02 1.0 2.60e+07 1.0 0.0e+00 0.0e+00 0.0e+00 3 5 0 0 0 3 5 0 0 0 1094 MatAssemblyBegin 23 1.0 6.9141e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 23 1.0 8.9450e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatGetRowIJ 1 1.0 1.2636e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 
MatGetOrdering 1 1.0 4.2200e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 12 1.0 5.6696e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorCreate 6 1.0 2.8205e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorSetUp 6 1.0 3.7257e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0 MatFDColorApply 12 1.0 7.7073e-02 1.0 4.89e+07 1.0 0.0e+00 0.0e+00 0.0e+00 11 9 0 0 0 11 9 0 0 0 634 MatFDColorFunc 252 1.0 5.5366e-02 1.0 4.47e+07 1.0 0.0e+00 0.0e+00 0.0e+00 8 9 0 0 0 8 9 0 0 0 807 DMCoarsen 5 1.0 2.9874e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 DMCreateInterp 5 1.0 3.7532e-03 1.0 2.25e+05 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 60 KSPSetUp 19 1.0 9.5367e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 2 1.0 5.5405e-01 1.0 4.81e+08 1.0 0.0e+00 0.0e+00 0.0e+00 79 92 0 0 0 79 92 0 0 0 869 KSPGMRESOrthog 111 1.0 1.8478e-02 1.0 2.77e+07 1.0 0.0e+00 0.0e+00 0.0e+00 3 5 0 0 0 3 5 0 0 0 1498 PCSetUp 2 1.0 5.5341e-02 1.0 1.28e+07 1.0 0.0e+00 0.0e+00 0.0e+00 8 2 0 0 0 8 2 0 0 0 232 PCApply 13 1.0 4.7885e-01 1.0 4.45e+08 1.0 0.0e+00 0.0e+00 0.0e+00 68 85 0 0 0 68 85 0 0 0 928 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage SNES 1 1 1372 0. DMSNES 6 6 4432 0. SNESLineSearch 1 1 1000 0. Vector 206 206 18098080 0. Matrix 22 22 13706952 0. Matrix FD Coloring 6 6 22297904 0. Distributed Mesh 6 6 31664 0. Index Set 159 159 393244 0. IS L to G Mapping 12 12 261444 0. Star Forest Graph 12 12 9728 0. Discrete System 6 6 5572 0. Vec Scatter 17 17 11968 0. Krylov Solver 12 12 177272 0. DMKSP interface 6 6 3936 0. Preconditioner 7 7 6968 0. Viewer 1 0 0 0. 
======================================================================================================================== Average time to get PetscTime(): 4.76837e-08 #PETSc Option Table entries: -da_refine 5 -log_view -pc_type mg #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: PETSC_DIR=/home/patrick/petsc-mb PETSC_ARCH=arch-ubuntu-mb-double-extra-opt --with-debugging=0 --with-precision=double --with-scalar-type=real --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --COPTFLAGS="-g -O3 -march=native " --CXXOPTFLAGS="-g -O3 -march=native " --FOPTFLAGS="-g -O3 -march=native " --download-c2html --download-suitesparse --download-yaml --download-hdf5 --download-scalapack --download-metis --download-parmetis --download-mumps --download-superlu_dist --download-triangle --download-ctetgen --download-sundials --download-ml --download-exodusii --download-hdf5 --download-netcdf --download-pnetcdf ----------------------------------------- Libraries compiled on 2018-07-27 13:01:14 on pdsbox Machine characteristics: Linux-4.13.0-39-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/patrick/petsc-mb Using PETSc arch: arch-ubuntu-mb-double-extra-opt ----------------------------------------- Using C compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden Using Fortran compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument ----------------------------------------- Using include paths: -I/home/patrick/petsc-mb/include -I/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/include ----------------------------------------- Using C linker: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc Using Fortran linker: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 Using libraries: -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -lpetsc -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lpthread -lscalapack -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -lsuperlu_dist -lml -lsundials_cvode -lsundials_nvecserial -lsundials_nvecparallel -llapack -lblas -lexodus -lnetcdf -lpnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lparmetis -lmetis -ltriangle -lm -lctetgen -lpthread -lyaml -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -----------------------------------------
Compare the runtime for SNESSolve to the case with the default solver
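For reference, with the sample timings shown above, the SNESSolve line reports roughly 3.01 seconds for the default solver versus 0.645 seconds for geometric multigrid, about a 4.7x improvement; your numbers will differ by machine.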
Run on 4 processors with the default linear solver and profile the run
$ mpiexec -n 4 ./ex19 -da_refine 5 -log_view
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ex19 on a arch-ubuntu-mb-double-extra-opt named pdsbox with 4 processors, by patrick Fri Jul 27 15:30:00 2018 Using Petsc Development GIT revision: v3.9.3-921-gfc2aa81 GIT Date: 2018-07-27 11:07:58 +0200 Max Max/Min Avg Total Time (sec): 1.200e+00 1.000 1.200e+00 Objects: 9.900e+01 1.000 9.900e+01 Flop: 9.349e+08 1.042 9.158e+08 3.663e+09 Flop/sec: 7.789e+08 1.042 7.631e+08 3.053e+09 MPI Messages: 1.453e+03 1.000 1.453e+03 5.811e+03 MPI Message Lengths: 2.266e+06 1.021 1.544e+03 8.972e+06 MPI Reductions: 1.535e+03 1.000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ------ --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total Count %Total Avg %Total Count %Total 0: Main Stage: 1.2001e+00 100.0% 3.6633e+09 100.0% 5.811e+03 100.0% 1.544e+03 100.0% 1.528e+03 99.5% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent AvgLen: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage ---- Total Max Ratio Max Ratio Max Ratio Mess AvgLen Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSidedF 3 1.0 5.1808e-0314.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SNESSolve 1 1.0 1.1392e+00 1.0 9.35e+08 1.0 5.8e+03 1.6e+03 1.5e+03 95100 99100 98 95100 99100 98 3216 SNESFunctionEval 45 1.0 6.7165e-03 1.0 9.08e+06 1.0 2.4e+01 1.6e+03 0.0e+00 1 1 0 0 0 1 1 0 0 0 5295 SNESJacobianEval 2 1.0 2.6000e-02 1.0 9.26e+06 1.0 3.4e+02 1.5e+03 8.6e+01 2 1 6 6 6 2 1 6 6 6 1395 SNESLineSearch 2 1.0 1.9200e-03 1.0 1.36e+06 1.0 3.2e+01 1.6e+03 8.0e+00 0 0 1 1 1 0 0 1 1 1 2771 VecDot 2 1.0 2.2244e-04 2.2 3.84e+04 1.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 677 VecMDot 655 1.0 2.0498e-01 1.1 1.94e+08 1.0 0.0e+00 0.0e+00 6.6e+02 16 21 0 0 43 16 21 0 0 43 3705 VecNorm 683 1.0 4.9419e-02 1.3 1.31e+07 1.0 0.0e+00 0.0e+00 6.8e+02 4 1 0 0 44 4 1 0 0 45 1040 VecScale 677 1.0 2.5268e-03 1.1 6.50e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 10084 VecCopy 66 1.0 5.6410e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 703 1.0 2.7184e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 82 1.0 5.7197e-04 1.1 1.58e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 10791 VecWAXPY 2 1.0 5.7220e-05 1.1 1.92e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1315 VecMAXPY 677 1.0 1.2249e-01 1.1 2.06e+08 1.0 0.0e+00 0.0e+00 0.0e+00 10 22 0 0 0 10 22 0 0 0 6603 VecScatterBegin 722 1.0 9.7113e-03 1.1 0.00e+00 0.0 5.8e+03 1.6e+03 0.0e+00 1 0 99100 0 1 0 99100 0 0 VecScatterEnd 722 1.0 1.0391e-02 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecReduceArith 4 1.0 4.0293e-05 1.1 7.68e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 7472 VecReduceComm 2 1.0 7.3195e-05 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 677 1.0 5.1487e-02 1.2 1.95e+07 1.0 0.0e+00 0.0e+00 6.8e+02 4 2 0 0 44 4 2 0 0 44 1485 MatMult 677 1.0 3.2710e-01 1.0 2.51e+08 1.0 5.4e+03 1.6e+03 0.0e+00 27 27 93 94 0 27 27 93 94 0 3012 MatSolve 677 1.0 3.9744e-01 1.0 2.49e+08 1.0 0.0e+00 0.0e+00 0.0e+00 33 27 0 0 0 33 27 0 0 0 2458 MatLUFactorNum 2 1.0 9.7592e-03 1.0 3.53e+06 1.1 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 1405 MatILUFactorSym 1 1.0 2.1026e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 3 1.0 5.2419e-0312.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 3 1.0 3.2048e-03 1.0 0.00e+00 0.0 1.6e+01 3.9e+02 8.0e+00 0 0 0 0 1 0 0 0 0 1 0 MatGetRowIJ 1 1.0 3.8147e-06 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 2.5654e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 2 1.0 1.5545e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorCreate 1 1.0 1.1539e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorSetUp 1 1.0 1.1923e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 
8.0e+01 1 0 0 0 5 1 0 0 0 5 0 MatFDColorApply 2 1.0 1.4742e-02 1.1 9.26e+06 1.0 3.4e+02 1.6e+03 3.0e+00 1 1 6 6 0 1 1 6 6 0 2461 MatFDColorFunc 42 1.0 8.3950e-03 1.2 8.47e+06 1.0 3.4e+02 1.6e+03 0.0e+00 1 1 6 6 0 1 1 6 6 0 3954 KSPSetUp 4 1.0 4.4918e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 1 0 0 0 0 1 0 KSPSolve 2 1.0 1.1108e+00 1.0 9.24e+08 1.0 5.4e+03 1.6e+03 1.4e+03 93 99 93 93 91 93 99 93 93 92 3260 KSPGMRESOrthog 655 1.0 3.2223e-01 1.1 3.88e+08 1.0 0.0e+00 0.0e+00 6.6e+02 26 41 0 0 43 26 41 0 0 43 4714 PCSetUp 4 1.0 1.2222e-02 1.0 3.53e+06 1.1 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 1122 PCSetUpOnBlocks 2 1.0 1.2162e-02 1.0 3.53e+06 1.1 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 1128 PCApply 677 1.0 4.1153e-01 1.0 2.49e+08 1.0 0.0e+00 0.0e+00 0.0e+00 34 27 0 0 0 34 27 0 0 0 2374 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage SNES 1 1 1372 0. DMSNES 1 1 672 0. SNESLineSearch 1 1 1000 0. Vector 49 49 3390624 0. Matrix 4 4 5105468 0. Matrix FD Coloring 1 1 4240080 0. Distributed Mesh 1 1 5344 0. Index Set 29 29 207404 0. IS L to G Mapping 1 1 10672 0. Star Forest Graph 2 2 1728 0. Discrete System 1 1 932 0. Vec Scatter 2 2 22184 0. Krylov Solver 2 2 19592 0. DMKSP interface 1 1 656 0. Preconditioner 2 2 1912 0. Viewer 1 0 0 0. ======================================================================================================================== Average time to get PetscTime(): 2.38419e-08 Average time for MPI_Barrier(): 2.91824e-05 Average time for zero size MPI_Send(): 8.88109e-06 #PETSc Option Table entries: -da_refine 5 -log_view #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: PETSC_DIR=/home/patrick/petsc-mb PETSC_ARCH=arch-ubuntu-mb-double-extra-opt --with-debugging=0 --with-precision=double --with-scalar-type=real --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --COPTFLAGS="-g -O3 -march=native " --CXXOPTFLAGS="-g -O3 -march=native " --FOPTFLAGS="-g -O3 -march=native " --download-c2html --download-suitesparse --download-yaml --download-hdf5 --download-scalapack --download-metis --download-parmetis --download-mumps --download-superlu_dist --download-triangle --download-ctetgen --download-sundials --download-ml --download-exodusii --download-hdf5 --download-netcdf --download-pnetcdf ----------------------------------------- Libraries compiled on 2018-07-27 13:01:14 on pdsbox Machine characteristics: Linux-4.13.0-39-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/patrick/petsc-mb Using PETSc arch: arch-ubuntu-mb-double-extra-opt ----------------------------------------- Using C compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden Using Fortran compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument ----------------------------------------- Using include paths: -I/home/patrick/petsc-mb/include -I/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/include ----------------------------------------- Using C linker: 
/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc Using Fortran linker: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 Using libraries: -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -lpetsc -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lpthread -lscalapack -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -lsuperlu_dist -lml -lsundials_cvode -lsundials_nvecserial -lsundials_nvecparallel -llapack -lblas -lexodus -lnetcdf -lpnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lparmetis -lmetis -ltriangle -lm -lctetgen -lpthread -lyaml -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -----------------------------------------
Compare the runtime for SNESSolve to the 1 processor case with the default solver. What is the speedup?
Run on 4 processors with the geometric multigrid linear solver and profile the run
$ mpiexec -n 4 ./ex19 -da_refine 5 -log_view -pc_type mg
Expected output:
lid velocity = 0.000106281, prandtl # = 1., grashof # = 1. Number of SNES iterations = 3 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ex19 on a arch-ubuntu-mb-double-extra-opt named pdsbox with 4 processors, by patrick Fri Jul 27 15:29:45 2018 Using Petsc Development GIT revision: v3.9.3-921-gfc2aa81 GIT Date: 2018-07-27 11:07:58 +0200 Max Max/Min Avg Total Time (sec): 4.796e-01 1.001 4.794e-01 Objects: 5.730e+02 1.000 5.730e+02 Flop: 2.266e+08 1.061 2.201e+08 8.802e+08 Flop/sec: 4.726e+08 1.060 4.590e+08 1.836e+09 MPI Messages: 3.012e+03 1.043 2.950e+03 1.180e+04 MPI Message Lengths: 1.664e+06 1.054 5.490e+02 6.478e+06 MPI Reductions: 1.472e+03 1.000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ------ --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total Count %Total Avg %Total Count %Total 0: Main Stage: 4.7942e-01 100.0% 8.8021e+08 100.0% 1.180e+04 100.0% 5.490e+02 100.0% 1.465e+03 99.5% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent AvgLen: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage ---- Total Max Ratio Max Ratio Max Ratio Mess AvgLen Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSidedF 29 1.0 5.8196e-03 6.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 SNESSolve 1 1.0 4.1890e-01 1.0 2.27e+08 1.1 1.2e+04 5.5e+02 1.4e+03 87100100100 98 87100100100 98 2101 SNESFunctionEval 382 1.0 1.7268e-02 1.4 1.81e+07 1.1 3.2e+01 1.6e+03 0.0e+00 3 8 0 1 0 3 8 0 1 0 4066 SNESJacobianEval 18 1.0 6.6740e-02 1.0 1.89e+07 1.1 3.0e+03 5.2e+02 5.3e+02 14 8 26 24 36 14 8 26 24 36 1098 SNESLineSearch 3 1.0 2.8355e-03 1.0 2.04e+06 1.0 4.8e+01 1.6e+03 1.2e+01 1 1 0 1 1 1 1 0 1 1 2814 VecDot 3 1.0 2.3961e-04 1.3 5.76e+04 1.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 0 0 0 0 0 0 942 VecMDot 170 1.0 2.7825e-02 2.9 5.78e+06 1.1 0.0e+00 0.0e+00 1.7e+02 4 3 0 0 12 4 3 0 0 12 808 VecNorm 211 1.0 1.4362e-02 1.5 1.48e+06 1.1 0.0e+00 0.0e+00 2.1e+02 3 1 0 0 14 3 1 0 0 14 401 VecScale 188 1.0 4.2009e-04 1.2 6.51e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6036 VecCopy 499 1.0 1.2398e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 504 1.0 4.2868e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 378 1.0 6.4588e-04 1.4 1.70e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 10235 VecAYPX 920 1.0 1.0263e-02 1.4 3.00e+06 1.1 0.0e+00 0.0e+00 0.0e+00 2 1 0 0 0 2 1 0 0 0 1134 VecAXPBYCZ 460 1.0 5.8384e-03 1.2 6.00e+06 1.1 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3988 VecWAXPY 3 1.0 1.2279e-04 1.4 2.88e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 920 VecMAXPY 188 1.0 7.0734e-03 1.4 6.95e+06 1.1 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3823 VecScatterBegin 1541 1.0 1.5262e-02 1.1 0.00e+00 0.0 1.1e+04 5.6e+02 0.0e+00 3 0 97 98 0 3 0 97 98 0 0 VecScatterEnd 1541 1.0 4.6712e-02 3.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 6 0 0 0 0 6 0 0 0 0 0 VecReduceArith 6 1.0 6.8903e-05 1.3 1.15e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6554 VecReduceComm 3 1.0 2.2459e-04 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 188 1.0 1.3350e-02 1.6 1.95e+06 1.1 0.0e+00 0.0e+00 1.9e+02 2 1 0 0 13 2 1 0 0 13 570 MatMult 878 1.0 1.2342e-01 1.2 9.29e+07 1.1 6.9e+03 6.4e+02 0.0e+00 23 41 59 68 0 23 41 59 68 0 2925 MatMultAdd 115 1.0 4.8120e-03 1.1 1.32e+06 1.0 5.8e+02 2.6e+02 0.0e+00 1 1 5 2 0 1 1 5 2 0 1078 MatMultTranspose 120 1.0 1.4319e-02 3.1 1.38e+06 1.0 6.0e+02 2.6e+02 0.0e+00 2 1 5 2 0 2 1 5 2 0 378 MatSolve 23 1.0 2.2483e-04 1.1 8.68e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1545 MatSOR 855 1.0 1.5684e-01 1.3 8.70e+07 1.1 0.0e+00 0.0e+00 0.0e+00 28 38 0 0 0 28 38 0 0 0 2154 MatLUFactorSym 1 1.0 1.2636e-04 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 3 1.0 1.7381e-04 1.2 2.73e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 628 MatCopy 2 1.0 3.3379e-06 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 1 1.0 3.6716e-05 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 
0 0 0 0 0 0 MatResidual 115 1.0 1.8372e-02 1.4 1.19e+07 1.1 9.2e+02 6.1e+02 0.0e+00 3 5 8 9 0 3 5 8 9 0 2504 MatAssemblyBegin 30 1.0 6.0785e-03 5.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatAssemblyEnd 30 1.0 8.9223e-03 1.1 0.00e+00 0.0 1.5e+02 9.3e+01 8.8e+01 2 0 1 0 6 2 0 1 0 6 0 MatGetRowIJ 1 1.0 8.3447e-06 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMats 3 1.0 2.5177e-04 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 3.4571e-05 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 18 1.0 4.1199e-04 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorCreate 6 1.0 4.7135e-04 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 1 0 0 0 0 1 0 MatFDColorSetUp 6 1.0 2.7599e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 4.8e+02 6 0 0 0 33 6 0 0 0 33 0 MatFDColorApply 18 1.0 3.8840e-02 1.0 1.89e+07 1.1 3.0e+03 5.2e+02 2.8e+01 8 8 26 24 2 8 8 26 24 2 1887 MatFDColorFunc 378 1.0 2.6123e-02 1.1 1.73e+07 1.1 3.0e+03 5.2e+02 0.0e+00 5 8 26 24 0 5 8 26 24 0 2566 MatRedundantMat 3 1.0 3.0828e-04 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMPIConcateSeq 3 1.0 4.4107e-05 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 DMCoarsen 5 1.0 1.9548e-03 1.0 0.00e+00 0.0 8.0e+01 2.2e+01 5.5e+01 0 0 1 0 4 0 0 1 0 4 0 DMCreateInterp 5 1.0 6.3477e-03 1.0 5.76e+04 1.0 1.2e+02 6.6e+01 1.2e+02 1 0 1 0 8 1 0 1 0 8 36 KSPSetUp 29 1.0 3.4931e-03 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 7.6e+01 1 0 0 0 5 1 0 0 0 5 0 KSPSolve 3 1.0 3.7720e-01 1.0 2.10e+08 1.1 1.1e+04 5.0e+02 1.3e+03 79 93 95 86 91 79 93 95 86 91 2166 KSPGMRESOrthog 170 1.0 3.3011e-02 2.1 1.16e+07 1.1 0.0e+00 0.0e+00 1.7e+02 5 5 0 0 12 5 5 0 0 12 1363 PCSetUp 3 1.0 5.2358e-02 1.0 5.09e+06 1.1 2.9e+03 3.2e+02 8.2e+02 11 2 25 14 56 11 2 25 14 56 367 PCApply 23 1.0 3.0923e-01 1.0 1.94e+08 1.1 8.1e+03 5.4e+02 4.4e+02 63 86 69 69 30 63 86 69 69 30 2434 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage SNES 1 1 1372 0. DMSNES 6 6 4432 0. SNESLineSearch 1 1 1000 0. Vector 231 231 4858432 0. Matrix 56 56 4056840 0. Matrix FD Coloring 6 6 5749152 0. Distributed Mesh 6 6 31664 0. Index Set 185 185 220700 0. IS L to G Mapping 6 6 17912 0. Star Forest Graph 12 12 9728 0. Discrete System 6 6 5572 0. Vec Scatter 29 29 97136 0. Krylov Solver 13 13 178448 0. DMKSP interface 6 6 3936 0. Preconditioner 8 8 7904 0. Viewer 1 0 0 0. 
======================================================================================================================== Average time to get PetscTime(): 4.76837e-08 Average time for MPI_Barrier(): 2.79903e-05 Average time for zero size MPI_Send(): 1.04904e-05 #PETSc Option Table entries: -da_refine 5 -log_view -pc_type mg #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: PETSC_DIR=/home/patrick/petsc-mb PETSC_ARCH=arch-ubuntu-mb-double-extra-opt --with-debugging=0 --with-precision=double --with-scalar-type=real --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --COPTFLAGS="-g -O3 -march=native " --CXXOPTFLAGS="-g -O3 -march=native " --FOPTFLAGS="-g -O3 -march=native " --download-c2html --download-suitesparse --download-yaml --download-hdf5 --download-scalapack --download-metis --download-parmetis --download-mumps --download-superlu_dist --download-triangle --download-ctetgen --download-sundials --download-ml --download-exodusii --download-hdf5 --download-netcdf --download-pnetcdf ----------------------------------------- Libraries compiled on 2018-07-27 13:01:14 on pdsbox Machine characteristics: Linux-4.13.0-39-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/patrick/petsc-mb Using PETSc arch: arch-ubuntu-mb-double-extra-opt ----------------------------------------- Using C compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden Using Fortran compiler: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument ----------------------------------------- Using include paths: -I/home/patrick/petsc-mb/include -I/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/include ----------------------------------------- Using C linker: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpicc Using Fortran linker: /home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/bin/mpif90 Using libraries: -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -lpetsc -Wl,-rpath,/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -L/home/patrick/petsc-mb/arch-ubuntu-mb-double-extra-opt/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lpthread -lscalapack -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -lsuperlu_dist -lml -lsundials_cvode -lsundials_nvecserial -lsundials_nvecparallel -llapack -lblas -lexodus -lnetcdf -lpnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lparmetis -lmetis -ltriangle -lm -lctetgen -lpthread -lyaml -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -----------------------------------------
Compare the runtime for SNESSolve to the 1 processor case with multigrid. What is the speedup? Why is the speedup for multigrid lower than the speedup for the default solver?
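For reference, from the sample outputs above: the default solver's SNESSolve time falls from roughly 3.01 seconds on 1 processor to 1.14 seconds on 4 (a speedup of about 2.6), while multigrid falls from roughly 0.645 seconds to 0.419 seconds (a speedup of about 1.5). One contributing factor: multigrid performs far less arithmetic per iteration, so a larger fraction of its runtime is memory traffic and communication, which do not speed up as well as floating-point work.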
Example 4: Linear Stokes-type PDE on a structured grid#
WHAT THIS EXAMPLE DEMONSTRATES:
Handling a 3D structured grid
Controlling linear solver options
Selecting composable preconditioners
Solving a Stokes problem
Adding your own problem-specific visualization
FURTHER DETAILS:
DO THE FOLLOWING:
Compile
src/ksp/ksp/tutorials/ex42.c
$ cd petsc/src/ksp/ksp/tutorials $ make ex42
Solve with the default solver
$ mpiexec -n 4 ./ex42 -stokes_ksp_monitor
Expected output:
Residual norms for stokes_ solve. 0 KSP Residual norm 1.415106368539e-01 1 KSP Residual norm 1.400940883855e-01 2 KSP Residual norm 1.378888975786e-01 3 KSP Residual norm 1.186604070209e-01 4 KSP Residual norm 1.028998751762e-01 5 KSP Residual norm 7.848909969551e-02 6 KSP Residual norm 5.447220914352e-02 7 KSP Residual norm 3.988770773413e-02 8 KSP Residual norm 3.363854861536e-02 9 KSP Residual norm 2.736827233823e-02 10 KSP Residual norm 1.165737217325e-02 11 KSP Residual norm 4.774812483753e-03 12 KSP Residual norm 3.000086313131e-03 13 KSP Residual norm 2.037587391559e-03 14 KSP Residual norm 1.568521310223e-03 15 KSP Residual norm 1.334625445379e-03 16 KSP Residual norm 1.090851737254e-03 17 KSP Residual norm 8.103224656843e-04 18 KSP Residual norm 5.102610507854e-04 19 KSP Residual norm 3.288211379908e-04 20 KSP Residual norm 2.728521747502e-04 21 KSP Residual norm 2.223350574525e-04 22 KSP Residual norm 1.673811400572e-04 23 KSP Residual norm 1.311135290276e-04 24 KSP Residual norm 7.772166545934e-05 25 KSP Residual norm 5.867750733525e-05 26 KSP Residual norm 4.630194799472e-05 27 KSP Residual norm 3.964093586415e-05 28 KSP Residual norm 3.056223609761e-05 29 KSP Residual norm 1.859145871820e-05 30 KSP Residual norm 1.193441010276e-05 31 KSP Residual norm 1.165803628044e-05 32 KSP Residual norm 1.141616465398e-05 33 KSP Residual norm 1.049268438209e-05 34 KSP Residual norm 8.497378403744e-06 35 KSP Residual norm 7.388268820294e-06 36 KSP Residual norm 5.336285891511e-06 37 KSP Residual norm 3.840054785874e-06 38 KSP Residual norm 2.395520205040e-06 39 KSP Residual norm 1.803336718072e-06 40 KSP Residual norm 1.575082118565e-06 41 KSP Residual norm 1.498075417723e-06 42 KSP Residual norm 1.421906839381e-06 43 KSP Residual norm 1.162676253678e-06
Note the poor convergence even for this very small problem: the Stokes system is an indefinite saddle-point problem, which the default preconditioner handles poorly.
Solve with a solver appropriate for Stokes problems: -stokes_pc_type fieldsplit -stokes_pc_fieldsplit_type schur
$ mpiexec -n 4 ./ex42 -stokes_ksp_monitor -stokes_pc_type fieldsplit -stokes_pc_fieldsplit_type schur
Expected output:
Residual norms for stokes_ solve. 0 KSP Residual norm 2.219696134159e-01 1 KSP Residual norm 2.160496122907e-01 2 KSP Residual norm 2.931931177007e-02 3 KSP Residual norm 3.505372069583e-03 4 KSP Residual norm 5.619667025611e-04 5 KSP Residual norm 9.130887729526e-05 6 KSP Residual norm 1.745745948841e-05 7 KSP Residual norm 2.979709687028e-06 8 KSP Residual norm 5.167316063963e-07
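These options correspond to PETSc's PCFIELDSPLIT preconditioner; a minimal sketch of the equivalent calls in C (assuming the stokes_-prefixed KSP object ksp from ex42):

    PC pc;
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCFIELDSPLIT));                 /* split the velocity and pressure fields */
    PetscCall(PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR)); /* precondition via a Schur-complement factorization */

The field definitions themselves typically come from an attached DM or from index sets supplied with PCFieldSplitSetIS().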
Solve with a finer mesh
$ mpiexec -n 4 ./ex42 -mx 20 -stokes_ksp_monitor -stokes_pc_type fieldsplit -stokes_pc_fieldsplit_type schur
Expected output:
Residual norms for stokes_ solve. 0 KSP Residual norm 5.855527212944e-01 1 KSP Residual norm 5.686237979386e-01 2 KSP Residual norm 3.073755126459e-02 3 KSP Residual norm 3.829303018364e-03 4 KSP Residual norm 6.329352722742e-04 5 KSP Residual norm 1.037681537018e-04 6 KSP Residual norm 1.742269049904e-05 7 KSP Residual norm 3.133590728240e-06
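Note that the iteration count barely changes as the mesh is refined (8 iterations above, 7 here); near mesh-independent convergence is the main attraction of this preconditioner.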
Repeat with -mx 40 and/or more MPI ranks.
Example 5: Nonlinear time-dependent PDE on an unstructured grid#
WHAT THIS EXAMPLE DEMONSTRATES:
Changing the default ODE integrator
Handling unstructured grids
Registering your own interchangeable physics and algorithm modules
FURTHER DETAILS:
DO THE FOLLOWING:
Compile
src/ts/tutorials/ex11.c
$ cd petsc/src/ts/tutorials $ make ex11
Run simple advection through a tiny hybrid mesh
$ mpiexec -n 1 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside.exo
Expected output:
0 time 0 |x| 7.155 1 time 0.7211 |x| 4.386 2 time 1.442 |x| 1.968 3 time 2.163 |x| 1.925 CONVERGED_TIME at time 2.16333 after 3 steps
Run simple advection through a small mesh with a Rosenbrock-W solver
$ mpiexec -n 1 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside.exo -ts_type rosw
Expected output:
0 time 0 |x| 7.155 1 time 0.03915 |x| 7.035 2 time 0.1105 |x| 6.802 3 time 0.1895 |x| 6.525 4 time 0.2751 |x| 6.21 5 time 0.366 |x| 5.862 6 time 0.4606 |x| 5.492 7 time 0.5569 |x| 5.113 8 time 0.6526 |x| 4.736 9 time 0.7492 |x| 4.361 10 time 0.8467 |x| 3.99 11 time 0.9462 |x| 3.622 12 time 1.051 |x| 3.248 13 time 1.166 |x| 2.853 14 time 1.293 |x| 2.444 15 time 1.431 |x| 2.024 16 time 1.581 |x| 1.915 17 time 1.741 |x| 1.939 18 time 1.908 |x| 1.956 19 time 2.075 |x| 1.969 CONVERGED_TIME at time 2.07496 after 19 steps
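The -ts_type rosw option swaps in the Rosenbrock-W integrator at runtime. The programmatic equivalent is a TSSetType() call; here is a minimal sketch (the helper is illustrative, not code from ex11):
#include <petscts.h>

/* Illustrative helper: select the Rosenbrock-W integrator in source. */
PetscErrorCode SelectRosenbrockW(TS ts)
{
  PetscFunctionBeginUser;
  PetscCall(TSSetType(ts, TSROSW)); /* equivalent to -ts_type rosw */
  PetscCall(TSSetFromOptions(ts));  /* still honor command-line overrides */
  PetscFunctionReturn(PETSC_SUCCESS);
}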
Run simple advection through a larger quadrilateral mesh of an annulus with least squares reconstruction and no limiting, monitoring the error
$ mpiexec -n 4 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/annulus-20.exo -monitor Error -advect_sol_type bump -petscfv_type leastsquares -petsclimiter_type sin
Expected output:
0 time 0 |x| 0.8708 Error [ 0, 0] int 0 1 time 0.07526 |x| 0.7882 Error [ 0,0.09434159] int 0.05288794 2 time 0.1505 |x| 0.7343 Error [ 0, 0.2131413] int 0.09120994 3 time 0.2258 |x| 0.6957 Error [ 0, 0.2529671] int 0.1178105 4 time 0.301 |x| 0.6728 Error [ 0, 0.2209969] int 0.1510195 5 time 0.3763 |x| 0.6484 Error [ 0, 0.3418217] int 0.1742279 6 time 0.4516 |x| 0.6259 Error [ 0, 0.2836391] int 0.2025824 7 time 0.5268 |x| 0.6001 Error [ 0, 0.2995109] int 0.2355702 8 time 0.6021 |x| 0.5671 Error [4.018435e-45, 0.4150825] int 0.2669753 9 time 0.6774 |x| 0.5406 Error [1.972482e-39, 0.3607147] int 0.2858846 10 time 0.7526 |x| 0.5171 Error [6.083192e-36, 0.448034] int 0.3098002 11 time 0.8279 |x| 0.506 Error [3.572914e-33, 0.4455137] int 0.3356554 12 time 0.9031 |x| 0.4947 Error [3.404038e-31, 0.4161921] int 0.3553281 13 time 0.9784 |x| 0.4851 Error [2.737647e-29, 0.3831727] int 0.3701208 14 time 1.054 |x| 0.4746 Error [1.595143e-27, 0.4870963] int 0.386669 15 time 1.129 |x| 0.4618 Error [4.855084e-26, 0.4884613] int 0.4033437 16 time 1.204 |x| 0.4501 Error [8.826408e-25, 0.4241629] int 0.4207896 17 time 1.279 |x| 0.4399 Error [1.013076e-23, 0.4846543] int 0.4351298 18 time 1.355 |x| 0.4309 Error [8.357122e-23, 0.5166439] int 0.4563886 19 time 1.43 |x| 0.4192 Error [5.938047e-22, 0.4947022] int 0.4741966 20 time 1.505 |x| 0.412 Error [3.987736e-21, 0.4909573] int 0.4931484 21 time 1.581 |x| 0.4017 Error [2.328432e-20, 0.5286506] int 0.5038741 22 time 1.656 |x| 0.3916 Error [1.188744e-19, 0.518947] int 0.5228681 23 time 1.731 |x| 0.3824 Error [5.375705e-19, 0.5598767] int 0.5402542 24 time 1.806 |x| 0.3774 Error [2.397704e-18, 0.5997036] int 0.5577958 25 time 1.882 |x| 0.3712 Error [1.010082e-17, 0.5943566] int 0.5687406 26 time 1.957 |x| 0.3641 Error [3.837194e-17, 0.5663827] int 0.5766598 27 time 2.032 |x| 0.3569 Error [1.311105e-16, 0.5534062] int 0.5847664 CONVERGED_TIME at time 2.03208 after 27 steps
Compare to the error after turning off reconstruction.
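For example, assuming the first-order upwind finite-volume discretization (which performs no reconstruction) is selected with -petscfv_type upwind:
$ mpiexec -n 4 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/annulus-20.exo -monitor Error -advect_sol_type bump -petscfv_type upwind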
Run shallow water on the larger mesh with least squares reconstruction and minmod limiting, monitoring water Height (integral is conserved) and Energy (not conserved)
$ mpiexec -n 4 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/annulus-20.exo -physics sw -monitor Height,Energy -petscfv_type leastsquares -petsclimiter_type minmod
Expected output:
0 time 0 |x| 2.972 Height [ 1, 2.97161] int 28.12971 Energy [ 0.5, 4.415232] int 17.12186 1 time 0.1597 |x| 2.125 Height [ 1, 2.125435] int 28.12971 Energy [ 0.5, 2.334901] int 17.2733 2 time 0.3193 |x| 1.944 Height [ 1, 1.943793] int 28.12971 Energy [ 0.5, 1.904131] int 17.29508 3 time 0.479 |x| 1.866 Height [ 1, 1.865932] int 28.12971 Energy [ 0.5, 1.741459] int 17.11151 4 time 0.6386 |x| 1.677 Height [ 1, 1.676699] int 28.12971 Energy [ 0.5, 1.420746] int 16.96103 5 time 0.7983 |x| 1.579 Height [ 1, 1.578751] int 28.12971 Energy [ 0.5, 1.266909] int 16.8442 6 time 0.9579 |x| 1.534 Height [ 0.9066044, 1.533904] int 28.12971 Energy [ 0.4148315, 1.211645] int 16.73017 7 time 1.118 |x| 1.522 Height [ 0.8458232, 1.5217] int 28.12971 Energy [ 0.3590109, 1.171816] int 16.64452 8 time 1.277 |x| 1.495 Height [ 0.8767922, 1.494585] int 28.12971 Energy [ 0.3846352, 1.132207] int 16.59414 9 time 1.437 |x| 1.462 Height [ 0.9860658, 1.46217] int 28.12971 Energy [ 0.486527, 1.084667] int 16.56137 10 time 1.597 |x| 1.424 Height [ 1.000033, 1.424019] int 28.12971 Energy [ 0.5000325, 1.024201] int 16.51911 11 time 1.756 |x| 1.381 Height [ 1.000129, 1.381206] int 28.12971 Energy [ 0.5001287, 0.9675924] int 16.46646 12 time 1.916 |x| 1.336 Height [ 1.000441, 1.336399] int 28.12971 Energy [ 0.5004409, 0.9293328] int 16.41464 13 time 2.076 |x| 1.308 Height [ 0.9827839, 1.30794] int 28.12971 Energy [ 0.4835028, 0.8976772] int 16.36243 CONVERGED_TIME at time 2.07552 after 13 steps
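The -physics sw option picks the shallow-water module out of a list that ex11 assembles at startup. The sketch below shows the general PetscFunctionList registration pattern such a driver uses; the symbol names are illustrative rather than the exact ones in ex11:
#include <petscsys.h>

static PetscFunctionList PhysicsList = NULL;

/* Hypothetical constructor for a user-supplied physics module. */
static PetscErrorCode PhysicsCreate_MyModel(void)
{
  PetscFunctionBeginUser;
  /* ... set up fluxes, boundary conditions, monitors, ... */
  PetscFunctionReturn(PETSC_SUCCESS);
}

PetscErrorCode RegisterMyModel(void)
{
  PetscFunctionBeginUser;
  /* After registration, the module can be selected with -physics mymodel. */
  PetscCall(PetscFunctionListAdd(&PhysicsList, "mymodel", PhysicsCreate_MyModel));
  PetscFunctionReturn(PETSC_SUCCESS);
}
A matching PetscFunctionListFind() call then maps the string given to -physics to the registered constructor.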