+ /opt/ddn/mvapich/bin/mpiexec -ppn 8 -np 16 -genv MV2_NUM_HCAS 1 -genv MV2_CPU_BINDING_LEVEL core -genv MV2_CPU_BINDING_POLICY scatter --hosts isc17-c04,isc17-c05 /esfs/jtacquaviva/software/install/ior/git-ddn/bin/ior -i 3 -s 1 -t 102400 -b 17647534080 -D 120 -a POSIX -F -e -g -z -k -o /esfs/jtacquaviva/ioperf/file_write -w
+ tee -a ./output/COUNT:1#NN:2#PPN:8#API:POSIX#T:102400.txt
IOR-3.0.1: MPI Coordinated Test of Parallel I/O
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 0, Success (ior.c:2293)
[cli_1]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 1
[cli_3]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 3
[cli_6]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 6
[cli_0]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
[cli_2]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 2
[cli_4]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 4
[cli_5]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 5
[cli_7]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 7
[cli_8]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 8
[cli_9]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 9
[cli_10]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 10
[cli_11]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 11
[cli_12]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 12
[cli_13]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 13
[cli_14]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 14
[cli_15]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 15
===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 31097 RUNNING AT isc17-c05
=   EXIT CODE: 255
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
[proxy:0:0@isc17-c04] HYDU_sock_write (utils/sock/sock.c:286): write error (Broken pipe)
[proxy:0:0@isc17-c04] main (pm/pmiserv/pmip.c:265): unable to send EXIT_STATUS command upstream
[mpiexec@isc17-c04] HYDT_bscu_wait_for_completion (tools/bootstrap/utils/bscu_wait.c:76): one of the processes terminated badly; aborting
[mpiexec@isc17-c04] HYDT_bsci_wait_for_completion (tools/bootstrap/src/bsci_wait.c:23): launcher returned error waiting for completion
[mpiexec@isc17-c04] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:218): launcher returned error waiting for completion
[mpiexec@isc17-c04] main (ui/mpich/mpiexec.c:344): process manager error waiting for completion
+ /opt/ddn/mvapich/bin/mpiexec -ppn 8 -np 16 -genv MV2_NUM_HCAS 1 -genv MV2_CPU_BINDING_LEVEL core -genv MV2_CPU_BINDING_POLICY scatter --hosts isc17-c04,isc17-c05 /esfs/jtacquaviva/git/ime-evaluation/drop_caches.sh
+ /opt/ddn/mvapich/bin/mpiexec -ppn 8 -np 16 -genv MV2_NUM_HCAS 1 -genv MV2_CPU_BINDING_LEVEL core -genv MV2_CPU_BINDING_POLICY scatter --hosts isc17-c04,isc17-c05 /esfs/jtacquaviva/software/install/ior/git-ddn/bin/ior -i 3 -s 1 -t 102400 -b 17647534080 -D 120 -a POSIX -F -e -g -z -k -o /esfs/jtacquaviva/indread2/file -r
+ tee -a ./output/COUNT:1#NN:2#PPN:8#API:POSIX#T:102400.txt
IOR-3.0.1: MPI Coordinated Test of Parallel I/O
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 0, Success (ior.c:2293)
[cli_8]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 8
[cli_9]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 9
[cli_10]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 10
[cli_11]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 11
[cli_12]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 12
[cli_13]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 13
[cli_14]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 14
[cli_15]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 15
[cli_1]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 1
[cli_2]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 2
[cli_3]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 3
[cli_4]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 4
[cli_5]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 5
[cli_6]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 6
[cli_7]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 7
[cli_0]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 31211 RUNNING AT isc17-c05
=   EXIT CODE: 255
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
[proxy:0:0@isc17-c04] HYDU_sock_write (utils/sock/sock.c:286): write error (Broken pipe)
[proxy:0:0@isc17-c04] main (pm/pmiserv/pmip.c:265): unable to send EXIT_STATUS command upstream
+ set +x
/esfs/jtacquaviva/ioperf
stripe_count:   4
stripe_size:    1048576
stripe_offset:  -1
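Both runs abort at ior.c:2293 because the requested block size (`-b 17647534080`) is not an exact multiple of the transfer size (`-t 102400`), which IOR requires. A minimal shell sketch of the check, and one possible fix that rounds `-b` down to the nearest multiple of `-t` (the variable names here are illustrative, not from the original script):

```shell
#!/bin/sh
# IOR rejects -b/-t pairs where -b is not an exact multiple of -t.
t=102400            # transfer size in bytes (-t)
b=17647534080       # requested block size in bytes (-b)

remainder=$((b % t))
echo "remainder: $remainder"    # non-zero, so IOR aborts with this pair

# Integer division truncates, so (b / t) * t is the largest
# multiple of t that is <= b:
b_fixed=$(( (b / t) * t ))
echo "adjusted -b: $b_fixed"
echo "check: $((b_fixed % t))"  # 0, so IOR accepts the adjusted value
```

Re-running with the adjusted block size (here 17647513600) should get past the argument check; the aggregate file size shrinks by only the discarded remainder per task.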